
After crashing into a firetruck, self-driving cars just hit another roadblock: The Senate

Published Mar 15th, 2018 8:06PM EDT
Self-driving car legislation
Image: YouTube


Self-driving cars are becoming more and more common. Just today, General Motors announced that it would invest $100 million to start producing a consumer version of the self-driving Chevy Bolt. Semi-autonomous driving technology, like Tesla's infamous Autopilot, is now available in far more car models than it was even a few years ago.

While many drivers eagerly await the day they can fall asleep at the wheel (if their autonomous car even has one) and wake up at their destination, lawmakers in the Senate are taking a look at what this new technology means for traffic and safety laws on the road.

That's why on Wednesday, a handful of Democratic US senators raised concerns about a bill that aims to fast-track the testing and deployment of self-driving cars across the country.

The bill, which passed the House in September, lets self-driving car manufacturers bypass certain rules that require cars to have human controls, like steering wheels and brake pedals. Self-driving cars don't need these features, but manufacturers are stuck following the same rules that apply to, say, your 2008 Honda Civic that makes a strange noise in second gear.

The five Democratic senators, in a letter to the bill’s Republican sponsors, said that “exemptions from current safety standards should be temporary and reviewable” and that the “bill should at least require vehicles with partial automation to also be subject to safety evaluation reports. This would help assure safety.”

The takeaway from this: As self-driving and semi-autonomous cars become more common, legislators will need to find the sweet spot between ensuring the new technology is safe and not overly restricting innovation and convenience.

The concerned senators cited the January crash between a Tesla on Autopilot and a firetruck as evidence of the risk these technologies pose. On the other hand, around 16,000 car crashes happen every day across the country, and human error is cited as a factor in 94% of them.

Proponents of self-driving technology claim that computers are far more reliable than humans, but some people just can't get past the idea of trusting their lives to sensors and lines of code, even though we already do that with other technologies, often without noticing.

While we don't know what the final version of this bill will look like, we can expect the debate to ramp up as these vehicles slowly start taking over the road.

Chris Mills, News Editor

Chris Mills has been a news editor and writer for over 15 years, starting at Future Publishing before moving on to Gawker Media and then BGR. He studied at McGill University in Quebec, Canada.