Uber’s self-driving car saw pedestrian before fatal crash, but was programmed not to brake

Published May 24th, 2018, 11:55 AM EDT
Image: AP/REX/Shutterstock


The National Transportation Safety Board has released its preliminary report from the investigation into the fatal collision between an Uber self-driving vehicle and a pedestrian in Tempe, Arizona, earlier this year. According to the report, Uber’s self-driving system classified the pedestrian (who was wheeling a bicycle at the time) as a bicycle, and determined 1.3 seconds before impact that an emergency braking maneuver was needed to avoid a crash. However, the NTSB report says that Uber’s self-driving vehicles are not programmed to perform emergency braking, and instead rely on the human safety driver to intervene in such situations.

The four-page report provides a detailed look at the circumstances surrounding the crash, giving us the most complete picture yet of how the vehicle was configured. The key details of what caused the crash come down to Uber’s safety decisions, and they are buried at the bottom of the second page.

According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

[Emphasis mine]
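For context, some back-of-the-envelope arithmetic (mine, not the NTSB’s): at 43 mph the car covers roughly 63 feet per second, so a first detection about 6 seconds before impact means the system registered the pedestrian at a distance of roughly 380 feet, well over a football field away.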

To be clear: Uber’s own self-driving system determined, 1.3 seconds before impact, that the car needed to hit the brakes to avoid a collision. Uber’s configuration did not allow the car to brake on its own, relying instead entirely on the safety driver. Yet the system gave the driver no visual or audible warning that they needed to intervene.
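To make that logic concrete, here is a minimal, purely hypothetical sketch of a decision flow matching the NTSB’s description. None of these names, types, or thresholds come from Uber’s actual software; it only illustrates the behavior the report describes, where the planner concludes braking is needed but, under computer control, issues no brake command and no alert:

```python
# Hypothetical illustration only -- not Uber's code.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    classification: str    # the report says this churned: "unknown", "vehicle", "bicycle"
    time_to_impact: float  # predicted seconds until collision

def plan_response(obj: TrackedObject, under_computer_control: bool) -> str:
    """Hypothetical planner: decide what to do about a tracked object."""
    # Per the report, at 1.3 seconds before impact the system determined
    # that an emergency braking maneuver was needed.
    if obj.time_to_impact > 1.3:
        return "continue"

    if under_computer_control:
        # Per the report: emergency braking is not enabled while the vehicle
        # is under computer control, and the system does not alert the
        # operator. Nothing happens; the human must notice on their own.
        return "rely_on_human_driver"

    return "emergency_brake"

# 1.3 seconds before impact, vehicle under computer control:
print(plan_response(TrackedObject("bicycle", 1.3), True))  # rely_on_human_driver
```

The striking part is the middle branch: the system knows braking is required, but by design it neither brakes nor warns.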

Making things worse, the report also confirms that the safety driver was expected to use an iPad mounted in the center console to monitor the self-driving system and flag events for later review:

In addition, the operator is responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review.

So Uber disabled the Volvo’s built-in collision-avoidance features, handed control of the vehicle to a self-driving system that is unable to perform emergency braking, and then relied on a human driver, who was also meant to be interacting with a tablet, to avoid any last-minute collisions.

The preliminary report doesn’t lay blame for the crash on any party, but it does spend some time noting that the pedestrian’s toxicology tests came back positive for methamphetamine and marijuana. The report also notes that she “was dressed in dark clothing and that the bicycle did not have any side reflectors,” and that she crossed in a dark section of the roadway.

Chris Mills, News Editor

Chris Mills has been a news editor and writer for over 15 years, starting at Future Publishing before moving on to Gawker Media and then BGR. He studied at McGill University in Quebec, Canada.