Earlier this month, a Tesla in Utah crashed into a stopped fire truck at 60 mph. The story got plenty of attention from the media, which prompted billionaire Tesla CEO Elon Musk to have a meltdown on Twitter about biased reporting.
Today, the Associated Press obtained a document on the Utah crash from local police that appears to show the vehicle was under the control of Autopilot at the time and accelerated for a few seconds before the crash. The driver hit the brakes manually a moment before impact.
Data from the Model S electric vehicle show it picked up speed for 3.5 seconds shortly before crashing into a stopped fire truck in suburban Salt Lake City, the report said. The driver manually hit the brakes a fraction of a second before impact.
Police suggested that the car had been following another vehicle and dropped its speed to 55 mph to match the leading vehicle. They say the leading vehicle then likely changed lanes, and the Tesla automatically sped back up to its preset of 60 mph (97 kph) without detecting the stopped cars ahead of it.
Musk’s tweets after news of the crash first emerged took aim at the media, arguing that when other cars crash, they don’t get the same kind of attention:
To an extent, he has a point. Humans are extraordinarily bad at driving, and a few crashed Teslas don’t mean the technology is less safe than human-piloted cars. But the notion that something shouldn’t be scrutinized just because it’s better than what came before is clearly false. More houses burned down before asbestos was a thing, but would lung disease caused by asbestos be the kind of story newspapers shouldn’t have covered in the 1930s? Lead pipes brought clean water to tens of millions of households, driving a new wave of sanitary living conditions that enabled urbanization, but their adverse side effects were still worth studying.
Tesla, by absolute choice and in no small part because of Musk’s never-ending PR tour, is one of the front-runners in deploying driver-assist technologies in cars. Everything from the name — Autopilot — to the promo videos of hands-free driving gives the impression that less attention is needed to drive a Tesla than a regular vehicle. News stories about a new technology failing in a sometimes-fatal way aren’t meant to say that all driver-assist technologies are bad and should be banned; they raise awareness of the consequences of trusting one particular new technology (Autopilot) too much, and occasionally raise the question of whether there’s a slower but safer way to roll out these technologies to the public.
If Musk is really invested in persuading people that his cars are safer, Tesla should release far more data on Autopilot’s safety record. There’s currently just one public statistic comparing the safety of Tesla vehicles before and after Autopilot, and NHTSA, which first released the number, has since said that it’s flawed, at best.
Human drivers are bad. Anyone who has tried to drive in the left lane of I-95 can tell you that, and the fact that people die in car crashes caused by human error on a daily basis isn’t news anymore. Reporting on the failure of a new technology doesn’t imply that the new technology is worse than the status quo; it simply makes people aware of the problems of adopting new technologies before they’re fully ready for the mainstream.