Late last month, a Model X cruising along a California highway crashed into a highway divider, whereupon the vehicle's battery pack caught fire and engulfed the Model X in flames. Tragically, the driver of the Model X, Walter Huang, did not survive the accident.
A subsequent investigation into the incident revealed that Tesla's Autopilot feature was engaged at the time of the crash. What's more, Tesla, upon examining the car's logs, revealed that the Model X issued several hands-on-the-wheel warnings to Huang in the seconds preceding the crash.
A few weeks removed from the accident, there’s now word that Huang’s family is gearing up to sue Tesla.
During a recent interview with ABC 7, Huang’s wife Sevonne explained: “I just want this tragedy not to happen again to another family.”
Interestingly enough, Huang’s wife goes on to say that her husband had complained about how the Model X’s Autopilot behaved on the same stretch of road where the accident occurred.
Sevonne Huang: “And he want to show me, but a lot of time it doesn’t happen.”
Dan Noyes: “He told you that the car would drive to that same barrier?”
Noyes: “The same barrier that he finally hit?”
Sevonne: “Yeah, that’s why I saw the news. I knew that’s him.”
With a lawsuit looming, Tesla has since provided its strongest response yet, effectively blaming Mr. Huang for the accident:
According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.
The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang’s drive that day.
We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive.
Incidentally, this isn’t the first time Tesla has had to defend its Autopilot software in the midst of an ongoing or impending lawsuit. Just about a year ago, Tesla was hit with a lawsuit from a customer who claimed that Tesla was using drivers as “beta testers of half-baked software that renders Tesla vehicles dangerous.”