
Watch hackers manipulate a Tesla on Autopilot into accelerating to 85 MPH

Published Feb 19th, 2020 7:00PM EST
Image: Tesla Autopilot hack (Ena/AP/REX/Shutterstock)


A group of researchers from McAfee recently demonstrated how a subtly manipulated speed limit sign can cause a Tesla in Autopilot mode to behave erratically. Specifically, the researchers placed a small sticker on a 35 MPH speed limit sign and, as a result, both a 2016 Tesla Model X and a Model S in Autopilot mode interpreted the speed limit to be 85 MPH. In turn, both vehicles began accelerating toward 85 MPH.
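To get a feel for why a small sticker can have such an outsized effect, here is a minimal, self-contained sketch of the general adversarial-example technique. To be clear, nothing below comes from the McAfee research or Tesla's camera stack: the tiny logistic-regression "classifier," the made-up sign features, and all the numbers are invented for illustration. Real systems use deep vision models, but the core idea is the same: a small, carefully targeted change to the input can flip the model's decision.

```python
# A toy sketch of an adversarial-example attack (illustrative only; this is
# NOT McAfee's method or any real sign-recognition system). A tiny targeted
# perturbation of the input -- the "sticker" -- flips the classifier's read.
import numpy as np

rng = np.random.default_rng(0)
d = 16  # number of hypothetical "sign features" (stand-ins for pixels)

# Synthetic training data: class 0 = "35 MPH" sign, class 1 = "85 MPH" sign.
X0 = rng.normal(-0.5, 0.5, size=(200, d))
X1 = rng.normal(+0.5, 0.5, size=(200, d))
X = np.vstack([X0, X1])
y = np.array([0] * 200 + [1] * 200)

# Train logistic regression by plain gradient descent on cross-entropy loss.
w, b, lr = np.zeros(d), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # P(sign reads "85 MPH")
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

def reads_85(x):
    return (x @ w + b) > 0

# Pick a sign the trained model correctly reads as "35 MPH".
x = next(row for row in X0 if not reads_85(row))
z = x @ w + b  # negative margin: the model is confident this is "35"

# FGSM-style L-infinity attack: for logistic regression the loss gradient
# with respect to the input is proportional to w, so the optimal bounded
# step is epsilon * sign(w). Pick epsilon just past the decision boundary.
eps = 1.1 * abs(z) / np.abs(w).sum()  # the "sticker": small per-feature change
x_adv = x + eps * np.sign(w)

print(f"clean read:     {'85' if reads_85(x) else '35'} MPH")
print(f"perturbed read: {'85' if reads_85(x_adv) else '35'} MPH")
print(f"per-feature change (epsilon): {eps:.3f}")
```

The attack never touches the model's weights; it only nudges the input in the direction the model is most sensitive to, which is why a physical sticker placed in just the right spot can do the same job against a camera-based classifier.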

The experiment, which was obviously conducted in a controlled environment, helps illustrate some of the inherent challenges associated with achieving Level 5 autonomous driving. While the underlying technology can improve by leaps and bounds — and it has over the past few years — there’s no way to fully account for a range of external factors. Notably, the margin for error when driving around in a vehicle that weighs thousands of pounds is simply too small — and the risk of injury too great — to make fully self-driving cars practical, in my opinion.

A video of the demonstration can be viewed below:

Incidentally, MIT Technology Review notes that this is not the first time we’ve seen researchers manage to fool a Tesla in Autopilot mode:

In an 18-month-long research process, [McAfee researchers] Trivedi and Povolny replicated and expanded upon a host of adversarial machine-learning attacks including a study from UC Berkeley professor Dawn Song that used stickers to trick a self-driving car into believing a stop sign was a 45-mile-per-hour speed limit sign. Last year, hackers tricked a Tesla into veering into the wrong lane in traffic by placing stickers on the road in an adversarial attack meant to manipulate the car’s machine-learning algorithms.

Taken together, these experiments show that we have a long way to go before fully autonomous cars become a reality.

Interestingly enough, you might recall that back in 2016 and 2017, Elon Musk promised that the self-driving technology Tesla was developing was advanced enough to potentially allow a car to drive itself from California to New York with zero driver interaction. At the time, Musk said that Tesla would demo this capability before 2018.

“In November or December of this year,” Musk said during a talk in 2017, “we should be able to go all the way from a parking lot in California to a parking lot in New York with no controls touched in the entire journey.” Musk added that the Model S used for the demo would not follow a fixed route, in order to show off the software’s ability to respond to shifts in traffic congestion and other external factors in real time.

Fast forward to 2020 and there’s no denying that Tesla’s self-driving software has improved dramatically. Still, the idea of fully ignoring the road and letting a car drive itself with no driver oversight currently seems implausible, if not downright dangerous.

Yoni Heisler Contributing Writer

Yoni Heisler has been writing about Apple and the tech industry at large for over 15 years. A lifelong Mac user and Apple expert, his writing has appeared in Edible Apple, Network World, MacLife, Macworld UK, and TUAW.

When not analyzing the latest happenings with Apple, Yoni enjoys catching improv shows in Chicago, playing soccer, and cultivating new TV show addictions.