Last week, word emerged that Tesla’s Autopilot software was being investigated by the National Highway Traffic Safety Administration (NHTSA) in the wake of a tragic auto accident in which a Tesla Model S crashed into a tractor-trailer, killing the driver of the Tesla in the process. The accident occurred in Florida this past May but was only brought to the public’s attention a few days ago.
Since then, there’s been something of a whirlwind of debate surrounding Tesla in general and the future of autonomous driving specifically. Compounding matters are reports that the driver of the Model S may have been watching a movie at the time of the crash. While this has not been established as fact, the Florida Highway Patrol has confirmed that a portable DVD player was found in the wreckage.
All that said, here’s what we know of the fatal May 7 crash which, as Tesla points out, is the first fatality in 130 million miles of logged Autopilot driving.
The following graphic from the official police report details how the accident occurred. The Tesla Model S (represented by the red arrow below) was headed east. It’s not yet clear how fast it was going at the time of the crash, but some witnesses have claimed that it may have been traveling in excess of 80 mph.
Meanwhile, a tractor-trailer heading west on the opposite side of the divided highway began a wide left turn. Unfortunately, the truck wasn’t able to clear the roadway in time to avoid the oncoming Model S. Upon impact, the Model S drove underneath the trailer, shearing off the top of the car in the process. At this point, the Model S, which still had Autopilot engaged, veered off into a nearby field before colliding with a power pole.
Of course, the looming question here is why the Tesla didn’t recognize the impending collision and apply any preventative braking.
Addressing the issue, Tesla and Elon Musk have since said that the car’s software likely mistook the truck’s trailer for an overhead highway sign.
Adding more information, Tesla’s official blog post on the incident reads in part:
What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.
Meanwhile, Mobileye — the company that provides some of the brains behind Tesla’s Autopilot software — issued a statement of its own indicating that the current incarnation of its software isn’t yet capable of preventing accidents like the one the Model S was involved in.
We have read the account of what happened in this case. Today’s collision avoidance technology, or Automatic Emergency Braking (AEB) is defined as rear-end collision avoidance, and is designed specifically for that. This incident involved a laterally crossing vehicle, which current-generation AEB systems are not designed to actuate upon. Mobileye systems will include Lateral Turn Across Path (LTAP) detection capabilities beginning in 2018, and the Euro NCAP safety ratings will include this beginning in 2020.
Shortly thereafter, Tesla issued a retort of sorts, saying that Mobileye’s technology comprises just one part of the company’s Autopilot software.
Tesla’s autopilot system was designed in-house and uses a fusion of dozens of internally- and externally-developed component technologies to determine the proper course of action in a given scenario. Since January 2016, Autopilot activates automatic emergency braking in response to any interruption of the ground plane in the path of the vehicle that cross-checks against a consistent radar signature.
In other words, Tesla here seems to be saying that its Autopilot software can in fact detect and avoid a laterally crossing vehicle. It just didn’t do so in this case because the software misinterpreted the looming tractor-trailer as an overhead highway sign rather than treating its radar signature as an obstacle.
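The cross-check Tesla describes can be illustrated with a minimal, purely hypothetical sketch. Every class, function name, and decision here is an illustrative assumption about how a camera/radar cross-check of this kind might be structured, not Tesla’s actual software:

```python
# Hypothetical illustration of the camera/radar cross-check Tesla describes.
# All names and logic are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class CameraDetection:
    interrupts_ground_plane: bool      # obstacle rises from the road surface in the vehicle's path
    classified_as_overhead_sign: bool  # vision system thinks the object is an overhead sign

@dataclass
class RadarDetection:
    consistent_signature: bool  # stable radar return consistent with a solid obstacle ahead

def should_brake(camera: CameraDetection, radar: RadarDetection) -> bool:
    """Trigger automatic emergency braking only when the camera sees an
    interruption of the ground plane in the vehicle's path AND the radar
    return cross-checks it. Targets dismissed as overhead signs are
    filtered out to avoid false-positive braking under bridges and signs."""
    if camera.classified_as_overhead_sign:
        return False  # dismissed as a sign: no braking, the failure mode in this crash
    return camera.interrupts_ground_plane and radar.consistent_signature

# The crash scenario as described: a high-riding trailer is in the path,
# but it is classified as an overhead sign, so braking never triggers.
print(should_brake(
    CameraDetection(interrupts_ground_plane=True, classified_as_overhead_sign=True),
    RadarDetection(consistent_signature=True),
))  # → False
```

The point of the sketch is that a filter meant to suppress false positives (braking for every overpass and sign) can also suppress a true positive when an unusual object, like a high-riding trailer, resembles the filtered category.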
Now, whether or not the driver of the Model S was watching a movie at the time of the crash, as claimed by one witness and the driver of the truck, the broader takeaway here is that autonomous driving technology is far from perfect and that human drivers need to remain cognizant of their surroundings at all times.
As for the NHTSA investigation, it’s worth noting that it is only meant to gather preliminary data and determine whether Tesla’s Autopilot software malfunctioned or functioned appropriately given the circumstances.