When Tesla flipped the switch and activated its Autopilot software late last year, it didn’t take long before Model S owners began pushing the software to absurd limits. In short order, we began seeing videos of Model S owners engaged in downright dangerous behavior, from a driver who decided to eat breakfast while his Model S zoomed down a highway at 90 MPH to an individual who turned on Autopilot and then decided – I kid you not – to go sit in the back seat.
Such activities may make for entertaining viral videos, but when the “toy” you’re playing around with is actually a 5,000-pound car that can cause untold amounts of damage, such behavior is a very real danger to the public at large.
As a result, Tesla said a few months ago that it would implement measures to limit some of the more ridiculous activities Model S owners were engaging in.
“There’s been some fairly crazy videos on YouTube,” Tesla CEO Elon Musk said during a conference call a few months ago. “This is not good. And we will be putting some additional constraints on when autopilot can be activated to minimize the possibility of people doing crazy things with it.”
Tesla did end up restricting Autopilot’s use cases, and now comes word via Electrek that the company plans to implement even more safeguards around the software.
The new safeguards are reportedly a response to recent Tesla accidents that occurred while drivers were misusing the Autopilot feature. In one recent crash in China, for instance, the driver was busy retrieving a cloth from the glove compartment and wiping down his dashboard. In another accident in Montana, Tesla discovered that a Model X driver had ignored repeated warnings to assume control of the car.
That being the case, once Tesla introduces its next-gen Autopilot software, there will be more constraints on when and how drivers can use the feature. As it stands now, Autosteer is designed to turn off when a driver fails to respond after 15 seconds of “visual warnings and audible tones.” With the impending update, those restrictions will tighten a bit further.
According to sources familiar with the Autopilot program, Tesla will add a safety restriction whereby Autopilot will not only disengage after alerts are repeatedly ignored, but will also block the driver from re-engaging the feature after it has been automatically disengaged.
The driver will not be able to reactivate Autopilot until the car is stopped and put in ‘Park’. So far, it looks like the restriction would only affect the Autosteer feature of Autopilot, while traffic-aware cruise control (TACC) would still be available for the duration of the drive.
The goal of the new restriction appears to be to encourage Tesla owners to respond to the visual alerts rather than ignore them.
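To make the described behavior concrete, here is a minimal sketch of the escalation logic as a simple state machine. This is purely illustrative: the class and method names, the assumed "repeatedly ignored" threshold, and everything else here are our own assumptions based on the report, not Tesla's actual implementation. Only the 15-second alert window comes from the article.

```python
# Hypothetical sketch of the reported Autosteer lockout logic.
# All names and thresholds are illustrative assumptions.
from enum import Enum, auto

ALERT_TIMEOUT_S = 15       # per the article: ~15 s of warnings before disengaging
MAX_IGNORED_ALERTS = 3     # assumed threshold for "repeatedly ignored"

class AutosteerState(Enum):
    ENGAGED = auto()
    DISENGAGED = auto()    # driver may re-engage
    LOCKED_OUT = auto()    # re-engagement blocked until the car is parked

class AutosteerGuard:
    def __init__(self):
        self.state = AutosteerState.DISENGAGED
        self.ignored_alerts = 0

    def engage(self, gear: str) -> bool:
        """Attempt to turn Autosteer on; returns False while locked out."""
        if self.state is AutosteerState.LOCKED_OUT:
            # A lockout clears only once the car has stopped and is in Park.
            if gear != "P":
                return False
            self.ignored_alerts = 0
            self.state = AutosteerState.DISENGAGED
        self.state = AutosteerState.ENGAGED
        return True

    def on_driver_unresponsive(self, seconds: float) -> None:
        """Driver failed to respond to visual warnings and audible tones."""
        if self.state is not AutosteerState.ENGAGED:
            return
        if seconds >= ALERT_TIMEOUT_S:
            self.ignored_alerts += 1
            if self.ignored_alerts >= MAX_IGNORED_ALERTS:
                self.state = AutosteerState.LOCKED_OUT  # blocked until Park
            else:
                self.state = AutosteerState.DISENGAGED
```

Note that per the report this guard would cover only Autosteer; TACC would keep working independently of it.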
There’s no official word on when Tesla plans to roll out this update to users.
Additionally, a report from Electrek this past July suggests that an impending update to Tesla’s Autopilot software will, among other things, be able to more precisely display other cars on the digital dash along with the exact direction and angle they are moving in. The publication also adds that Autopilot will soon be able to identify and react appropriately to traffic signs and traffic lights.
Meanwhile, Tesla’s next-gen Autopilot hardware is reportedly in its final testing phases. According to reports, upcoming Tesla models will sport more radar units and enhanced camera systems all around the car, making for much smarter software.