
New video shows a Tesla in Autopilot mode almost crash into a highway barrier

Published Apr 6th, 2018 7:52AM EDT
Image: Tesla Autopilot (Ena/AP/REX/Shutterstock)


Late last month, a Model X careened into a highway divider, whereupon the battery pack on Tesla’s electric SUV caught fire. Tragically, the driver did not survive the accident. While details surrounding the crash are still forthcoming, Tesla last week confirmed that the car’s Autopilot feature was engaged at the time of the crash.

As the company noted in a March 30 blog post:

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

Of course, many were quick to question why the vehicle’s Autopilot feature — with five seconds and 150 meters to spare — didn’t take any type of defensive action on its own.

Tackling this issue, a Tesla owner in Chicago recently took his car for a drive and found that, when faced with a similar divider, it nearly crashed as well. What’s more, another Tesla owner went a step further and drove the same stretch of road as the aforementioned fatal accident, encountering similar behavior from the car’s Autopilot feature.

Now yet another video has emerged, providing perhaps our clearest look yet at what might be going on. Originally posted to the Tesla subreddit this week, the video shows the Tesla nearly hurl itself into a highway divider, prompting the driver to quickly retake control of the car.

Interestingly, the driver notes that this type of behavior on Autopilot didn’t manifest until relatively recent software updates.

“Previous autopilot versions didn’t do this,” the driver noted. “10.4 and 12 do this 90% of the time. I didn’t post it until I got .12 and it did it a few times in case it was a bug in 10.4.”

As to what’s going on, it appears that the Tesla is simply trying to center itself in the lane, mistaking the diverging lines surrounding the barrier for a single wide lane.

For what it’s worth, Tesla issued a statement to Jalopnik this week noting that the Autopilot feature is not designed to relieve a driver of all responsibility.

Autopilot does not, as ABC 7’s reporting suggests, make a Tesla an autonomous car or allow a driver to abdicate responsibility. To review it as such reflects a misrepresentation of our system and is exactly the kind of misinformation that threatens to harm consumer safety. We have been very clear that Autopilot is a driver assistance system that requires the driver to pay attention to the road at all times, and it has been found by NHTSA to reduce accident rates by 40%. It would be highly unfortunate if news stories like this influenced people to not use a system that adds to safety.

Yoni Heisler Contributing Writer

Yoni Heisler has been writing about Apple and the tech industry at large for over 15 years. A lifelong Mac user and Apple expert, his writing has appeared in Edible Apple, Network World, MacLife, Macworld UK, and TUAW.

When not analyzing the latest happenings with Apple, Yoni enjoys catching improv shows in Chicago, playing soccer, and cultivating new TV show addictions.