
Model 3 driver falls asleep with Autopilot on, promptly crashes into 11 construction barrels

Published Jul 25th, 2019 11:16AM EDT
Tesla Autopilot
Image: Ena/AP/REX/Shutterstock


While Tesla’s Autopilot software has improved considerably over the past few years, it still has real limitations. Part of the problem is that Tesla owners seem to have a bad habit of using the feature improperly. While Tesla makes a point of telling drivers that they must keep their eyes on the road when Autopilot is engaged, not every Tesla owner actually heeds that warning. And as we’ve seen in recent years, this type of behavior can often result in a serious crash and, in some instances, tragic consequences.

The latest instance of Autopilot gone awry is thankfully a bit more lighthearted, which is to say that no one was killed or even injured. The incident in question involves a Model 3 owner who seemingly opted not to pay attention to the road while Autopilot was engaged. Consequently, his Model 3 drove into a construction zone and plowed through a string of 11 construction barrels.

According to the poster — who goes by the handle Richard FS — the video is meant to show the shortcomings of Tesla’s Autopilot feature. It’s worth noting, though, that Richard FS admits to falling asleep at the wheel.

“Tesla Model 3 with FSD option. Automatic Emergency Braking totally failed me on the one time I needed it most,” the video’s caption reads. “With all the phantom braking events I have experienced in the 2-1/2 months I’ve owned it, it does seem like it would panic when it saw this coming.”

The caption continues with a numbered list of takeaways:

“1. This accident was my fault. I fell asleep at the wheel. I wasn’t sleepy prior to falling asleep or I would have done something about that. That’s actually the scary part.

2. A Tesla driver on Autopilot should not allow oneself to be lured into a false sense of security and should maintain vigilance at all times. I had monitored Autopilot for about an hour prior to the collision without a single hint of an incident.

3. Since Autopilot CAN avoid hitting a small human (either by braking or swerving or both) then I would expect that it could detect a 50-gallon drum in the middle of the road. I say that I expected this because I get sudden braking at times when there is nothing to brake about, and Tesla has told me that it detected something that could have been a hazard so it braked out of an abundance of caution. The conclusion from that is that it is better to brake when not needed than to not brake when needed. And that’s my point. If it’s so conservative on what constitutes a hazard then what about those 10 barrels? Yes, they’re plastic and not picked up by radar but a small child isn’t either, so it’s relying on those 3 forward-facing cameras. Why didn’t it detect the barrels?”

Predictably, the comments on the video were quick to attack Richard FS for not having his eyes on the road.

The larger takeaway here is that Autopilot isn’t anywhere close to being sophisticated enough to enable drivers to fall asleep at the wheel. And on a related note, it’s worth mentioning that Tesla’s Autopilot program appears to be in a bit of disarray, with Elon Musk recently restructuring the team after growing frustrated with its lack of progress in certain performance areas.

Yoni Heisler Contributing Writer

Yoni Heisler has been writing about Apple and the tech industry at large for over 15 years. A lifelong Mac user and Apple expert, his writing has appeared in Edible Apple, Network World, MacLife, Macworld UK, and TUAW.

When not analyzing the latest happenings with Apple, Yoni enjoys catching Improv shows in Chicago, playing soccer, and cultivating new TV show addictions.