
NTSB says Tesla was partly to blame for fatal 2016 Autopilot crash

Updated Sep 12th, 2017 5:12PM EDT
BGR


In May 2016, a Tesla Model S being controlled by the Autopilot system crashed into a tractor-trailer in Florida, killing the car’s sole occupant. Since the day of the crash, the National Transportation Safety Board has been working to establish what went wrong. The conclusions are in, and they’re not good for Tesla.

“Operational limitations” of the Autopilot system played a “major role” in the crash, the NTSB found. The problem was not that Autopilot failed to operate as designed, nor that it should have avoided the crash; rather, the driver appears to have relied too heavily on Autopilot, using it outside its design limitations, and that over-confidence (and the accompanying lack of attention to the road) was a major factor in the crash.

“Today’s automation systems augment, rather than replace human drivers. Drivers must always be prepared to take the wheel or apply the brakes,” NTSB Chairman Robert Sumwalt said.

Although Autopilot has sensors that detect whether the driver is holding the wheel, it doesn’t enforce attentive driving with hard stops in the software. It issues visual reminders on the dash to keep holding the wheel and pay attention, but as any number of YouTube videos show, those reminders don’t actually enforce anything.

Autopilot is also designed only for use on divided highways without cross traffic. The fatal crash happened when a tractor-trailer turned left across two lanes of oncoming traffic, exactly the kind of hazard that doesn’t exist on the controlled-access highways Autopilot is designed for. The NTSB noted that Autopilot isn’t designed to detect cross traffic (like that tractor-trailer), but Tesla doesn’t restrict Autopilot use to the roads it was designed for.

“It is clear that Tesla’s system did little to constrain the use of Autopilot to roadways for which it was designed,” Sumwalt said. He added that “Tesla system data provided no indication of driver interaction with the vehicle after the final cruise control speed was set.”

A simulation created by the NTSB gives some idea of how the crash unfolded. There were no visibility problems on the day of the crash, so both vehicles were on a collision course for nearly 10 seconds with neither driver noticing anything amiss.

https://twitter.com/ryanfelton/status/907602073365217281

In any other case, we’d be talking about this as a simple example of dangerous distracted driving, and that seems to be what the NTSB is hinting at with Autopilot. It can be a good system when used properly, but Tesla hasn’t done a standout job of restricting its use, or even of explaining its limitations to drivers.

UPDATE: A Tesla spokesperson contacted BGR via email to supply the following statement in response to the NTSB’s conclusions:

At Tesla, the safety of our customers comes first, and one thing is very clear: Autopilot significantly increases safety, as NHTSA has found that it reduces accident rates by 40%. We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.

Chris Mills, News Editor

Chris Mills has been a news editor and writer for over 15 years, starting at Future Publishing before moving on to Gawker Media and then BGR. He studied at McGill University in Quebec, Canada.