Ahead of what was sure to be a high-profile legal battle, Tesla this week opted to settle a lawsuit centered on the safety of the company's Autopilot feature. Autopilot was introduced in 2015 to considerable fanfare, but it wasn't long before Autopilot-related accidents began to emerge.
The case at hand concerns Walter Huang, an Apple engineer and Tesla Model X owner who was tragically killed in 2018 when his vehicle drove head-on into a highway divider. Autopilot was engaged at the time of the crash, which led some to speculate that a software error was responsible for the accident.
A subsequent National Transportation Safety Board (NTSB) investigation, which examined data from Tesla's onboard computer, revealed that Huang's car had a tendency to veer toward the same highway divider on previous trips.
Tesla’s explanation for the crash
The accident prompted a wrongful death lawsuit from Huang's family. In response, Tesla issued what many considered a tone-deaf statement implying that the accident was the result of Huang not paying attention to the road.
The statement reads in part:
According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.
The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang’s drive that day.
About six years later, the lawsuit against Tesla was poised to go to trial. Because a big part of Tesla's defense revolved around Huang not using Autopilot correctly, Tesla sought to obtain information from Apple about Huang's iPhone usage leading up to and during the crash:
Tesla obtained a sworn statement from an Apple engineering manager, James Harding, who analyzed unencrypted telemetry data on Huang’s phone and said it “suggests possible user interaction, which might be a screen touch or button press.”
The Huang family’s lawyers have countered in a court filing that Tesla purposefully hid its questioning of Harding from them until after pretrial fact-finding deadlines. They are now trying to force Apple to provide more information, and the iPhone maker is pushing back, saying that it shouldn’t have to hand over confidential material.
Interestingly, Huang's family wasn't going to dispute that Huang was using his iPhone at the time of the crash. Instead, the family planned to focus on Tesla's hyperbolic claims about its Autopilot feature, alleging that Tesla's boastful claims lead drivers to believe the feature is safer than it actually is.
Terms of the settlement are confidential
With the lawsuit settled, Tesla avoids what could very well have been a public relations disaster. The terms of the settlement are sealed; Tesla notes that the dollar amount is being kept secret because others "may perceive the settlement amount as evidence of Tesla's potential liability for losses, which may have a chilling effect on settlement opportunity in subsequent cases."
Meanwhile, Tesla has in recent years taken steps to more narrowly define the capabilities and limitations of its Autopilot feature.