
The hysteria over Tesla’s Autopilot has been completely blown out of proportion

Updated Nov 22nd, 2019 4:28AM EST


Over the past few weeks, Tesla’s Autopilot software has been unfairly singled out and scrutinized as a piece of technology that Tesla recklessly deployed before it was 100% ready for day-to-day use. This hysteria over all things Tesla reached a fever pitch this week when Consumer Reports published a self-righteous piece calling for Tesla to disable Autopilot until the company gets the technology figured out. And late on Thursday, we learned that even the U.S. Senate Committee on Commerce, Science and Transportation sent a letter to Elon Musk asking him or a Tesla representative to answer a few questions.

With all of the commotion, speculation and, at times, wild accusations being thrown in Tesla’s direction, you might be forgiven for assuming that Teslas on Autopilot have been running amok like mindless zombies on The Walking Dead and causing accidents by the hundreds.

It’s time to jump back to reality.


Despite an avalanche of hit pieces and misguided op-eds, it’s far too soon to say with any certainty that Tesla’s Autopilot software has been the direct cause of any specific crash. With respect to the fatal Model S crash that happened in Florida last May, one witness claims that the driver may have been watching a portable DVD player at the time of the crash. Additionally, evidence suggests that the driver (a former Navy SEAL who friends say had a “need for speed”) was likely speeding at the time.

More recently, it came to light that the widely publicized Model X crash in Montana involving Autopilot was likely the result of a) the feature being used incorrectly and b) the driver ignoring repeated warning alerts to re-assume control of the vehicle.

Just last night, Tesla disclosed that the Model X that crashed and rolled over on a Pennsylvania highway did not, as it turns out, have Autopilot turned on.

Now, this isn’t meant to blame the victims, but rather to illustrate that perhaps educating drivers about Tesla’s Autopilot software would be more beneficial than blindly writing the technology off as dangerous and to be avoided.

One area where Tesla may have misstepped is the Autopilot name itself. It certainly works well from a marketing perspective, but calling the feature Autopilot arguably gives drivers a false sense of security: the name suggests, even on an unconscious level, that Teslas are futuristic vehicles capable of handling any and all road conditions with ease, all without the need for human assistance.

But in reality, Tesla’s Autopilot software in its current incarnation was only designed to be used in specific conditions, a fact Tesla makes a point of emphasizing.

[Image: Tesla’s Autosteer activation warning]

The Autosteer feature in particular, Tesla informs its drivers, should only be used “on highways that have a center divider and clear lane markings, or when there is a car directly ahead to follow.” Tesla also adds that the feature should not be used “on other kinds of roads or where the highway has very sharp turns or lane markings that are absent, faded, or ambiguous.”

Nonetheless, Tesla has indicated that it will soon publish a blog post that more clearly delineates the capabilities and limitations of its Autopilot software.

Consumer Reports, though, isn’t satisfied and takes Tesla to task for selling consumers a “pile of promises about unproven technology.” Driving the point home, Consumer Reports’ VP of consumer policy and mobilization Laura MacCleery added that “consumers should never be guinea pigs for vehicle safety ‘beta’ programs.”

This is grossly misleading. To date, Tesla vehicles with Autopilot engaged are statistically safer than other cars out on the road. But because Tesla is Tesla, any word of a crash where Autopilot software may or may not have been activated gets blown out of proportion. Thing is, we’ve already been down this road before: remember when the sky was falling because some media outlets were reporting that Tesla cars were statistically more prone to catching on fire?

Consumer Reports suggests that Tesla disable Autopilot software until it’s ready for prime time use. But such a suggestion completely ignores the realities that govern technology as advanced as self-driving automotive software.

The following statement was made in the wake of the Apple Maps release, and I believe the same reasoning applies to Tesla’s Autopilot software.

Unfortunately, like dialect recognition or speech synthesis (think Siri), mapping is one of those technologies that can’t be fully incubated in a lab for a few years and unleashed on several hundred million users in more than 100 countries in a “mature” state.

Autopilot software can only improve when used by real drivers out in the real world. If we as a society want to live in a world where cars can one day drive themselves, the stark reality is that there will be a growth and learning period, and it will have to take place out in the real world.

On another front, Jalopnik raises a good point in stating that we need to start holding drivers accountable for irresponsibly using software designed to make everyone on the road safer.

Consumer Reports argues that Tesla’s Autopilot system, as it sits, is too open for misuse. I don’t buy that. Because nowhere in the system’s marketing, in its instructions, or even its implied promises of capability, does it ever absolve that responsibility that we’ve relied on for over a hundred years. It’s still the responsibility of the person sitting in the driver’s seat, not the responsibility of the company that created it. That’s how we’ve looked at cars since they’ve been invented, and no one is going after Ferrari, for instance, for all the deaths caused when its cars are used improperly. We’re not demanding it electronically limit the speeds of its cars, adjustable only to GPS-informed local speed limits.

And as Tim Stevens of CNET points out, we only hear about Tesla’s Autopilot software when crashes occur, not when they’re avoided.

This controversy is about many things, but at its core, I firmly believe Autopilot is a technology that can and has saved lives. The problem is, we’ll never know how many lives are being saved by the various active safety systems found in modern cars. We’ll only know when they fail to save a life.

But like most stories that travel through the meat grinder that is the news cycle, information about Tesla’s Autopilot software has been completely distorted to serve the conclusory argument that Tesla’s software is half-baked and unsafe for drivers to use.

Tesla’s Autopilot feature is far from perfect, and undoubtedly, Tesla can take any number of steps to improve its performance and to better educate its consumer base about its limitations. If the feature malfunctions during normal use and results in an accident, the company deserves to be criticized. But catapulting weighty accusations at the company based on incomplete and, oftentimes, incorrect information does nothing more than needlessly engender fear.

Lastly, Consumer Reports’ list of four recommendations for Tesla underscores how deeply Tesla’s technology is misunderstood.

Two of the four recommendations read:

  • Disable Autosteer until it can be reprogrammed to require drivers to keep their hands on the steering wheel
  • Test all safety-critical systems fully before public deployment; no more beta releases

As to the first point, requiring drivers to keep their hands on the wheel runs counter to the very notion of what Autosteer is. What a laughable suggestion.

As to the second, Tesla has said publicly that Autopilot is beta software only in the sense that the system has yet to cumulatively log more than 1 billion miles.

Musk later clarified that Tesla will likely reach that milestone sometime within the next six months.

Yoni Heisler Contributing Writer

Yoni Heisler has been writing about Apple and the tech industry at large for over 15 years. A lifelong Mac user and Apple expert, his writing has appeared in Edible Apple, Network World, MacLife, Macworld UK, and TUAW.

When not analyzing the latest happenings with Apple, Yoni enjoys catching improv shows in Chicago, playing soccer, and cultivating new TV show addictions.