On Friday, March 23rd in Mountain View, California, the driver of a Tesla Model X died in a fiery crash on Highway 101. Tesla later confirmed that its Autopilot feature was engaged at the time of the accident. According to data from the Association for Safe International Road Travel (ASIRT), roughly 3,287 other people around the world died in car accidents on that same day while driving or riding as passengers in cars that did not have Tesla’s Autopilot engaged. Nearly 1.3 million people die each year in car accidents across the globe according to ASIRT, and an additional 20 million to 50 million people are injured or disabled. In the United States alone, more than 37,000 people die annually in automobile accidents, and another 2.35 million are injured or disabled.
What happened to the driver of that Tesla Model X last month in Mountain View was tragic. Of that, there is no question. But in light of all the alarmist media coverage I’ve seen online and on TV in the weeks that followed, I wanted to share some quick thoughts on the incident in an effort to separate fact from fiction.
First and foremost, Autopilot was indeed active at the time of the fatal Model X accident in California last month. Tesla drivers tend to be quick to blame Autopilot when accidents occur that are then widely publicized, but Tesla’s analysis of vehicle logs typically reveals those claims to be false. This time around, however, Tesla has confirmed publicly that Autopilot was enabled when the Model X in question slammed into a barrier on Highway 101. Here is an excerpt from Tesla’s statement:
In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.
This portion of the company’s blog post does come off as a bit insensitive, but it’s also understandable and, to be frank, entirely necessary. Bloggers, online journalists, and TV news anchors launched into a frenzy when news of the fatal accident first broke, pointing a stern finger at Tesla’s Autopilot as the cause of the crash.
In light of Tesla’s statement, we know conclusively that Autopilot was engaged during the accident. But tragic though it may be, it appears that the driver’s irresponsible actions were directly responsible for his death.
Local Northern California ABC affiliate ABC 7 News published a report last week that shared some of the details surrounding the accident. According to that report, the driver of the Model X has been identified as 38-year-old Walter Huang of Foster City, California. Huang had been an engineer at EA for 13 years before taking a job at Apple this past November. The report says he bought his new Tesla Model X to celebrate soon after landing the new job.
The details surrounding the accident are indeed tragic. Huang reportedly leaves behind a wife and two young children. But contained within those details are indications that Huang had been quite irresponsible with his use of Tesla’s Autopilot feature, and that irresponsibility unquestionably played a role in his demise.
This past November, I spent a week with a Tesla Model X P100DL that included the company’s Autopilot option. I wrote about my experience the following month. In describing my time testing Autopilot, I noted that it often got “confused” and would sometimes veer out of a lane. Once, as Autopilot drove the Model X around a curve on a two-lane street, it even began to drift into an oncoming lane.
But none of this erratic Autopilot behavior surprised me at all. Why? Because Tesla tells drivers exactly what to expect from Autopilot long before they ever use the feature themselves.
I spent nearly two hours with a Tesla representative going over the Model X’s features before I drove away from Tesla’s lower Manhattan showroom the day I picked up the car. Much of that time was spent discussing Autopilot. We even went for a test drive, first with the Tesla rep driving the Model X, then with me driving as he sat in the passenger seat and explained things.
My experience was in no way unique. Tesla spends time with each customer when he or she takes delivery of a new car. Should that car be equipped with Autopilot, the customer spends time with a Tesla salesperson learning everything I learned about Autopilot that day.
Long story short, Autopilot is not a refined autonomous driving feature. It’s an experimental feature that is still in its very early stages, and drivers are advised to treat it as such. The driver’s hands should remain on the steering wheel at all times, and the driver’s attention should remain on the road at all times. Period. To use Autopilot any other way is irresponsible. Full stop.
Tesla’s biggest mistake with Autopilot, in my opinion, is calling it “Autopilot.” The name is dangerous. Cadillac offers a similar solution that I tested on its CT6 flagship sedan, but Cadillac calls it “Super Cruise.” That’s how drivers should think about these early-stage autonomous driving features. Tesla’s Autopilot option isn’t a “self-driving” feature — it’s enhanced cruise control.
With that in mind, let’s look once again at a portion of the quote I pasted above from Tesla’s statement. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision,” Tesla wrote. “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”
Translation: the driver wasn’t paying attention. He ignored multiple warnings from Tesla’s Autopilot system, warnings that Tesla very clearly states should result in the driver taking back control of the vehicle.
But wait, it gets even worse.
According to ABC 7 News’ report, Huang’s brother is quoted as saying that he complained “7-10 times the car would swivel toward that same exact barrier during Autopilot. Walter took it into dealership addressing the issue, but they couldn’t duplicate it there.”
In other words, the crash occurred at a spot where Autopilot had previously had difficulties as many as 10 times. And yet when Huang was driving the car that day, he ignored warnings and left Autopilot in control for at least “about five seconds and 150 meters” before the accident occurred.
Again, this is a sad and tragic accident. All fatal car accidents are sad and tragic. The truth of the matter is that the driver of this Model X was not using Tesla’s Autopilot feature responsibly. It stings, yes, but it appears as though Huang had every opportunity to prevent the crash. That doesn’t make this accident any less tragic, and it doesn’t make the loss suffered by Huang’s friends and family any less heartbreaking. But much of the media coverage I’ve seen has been alarmist and irresponsible, and for that reason it’s important that all the details surrounding this incident are made clear.