After crashing into a firetruck, self-driving cars just hit another roadblock: The Senate

March 15th, 2018 at 8:06 PM
Self-driving car legislation

Self-driving cars are only getting more and more common. Just today, General Motors announced that it would invest $100 million to start producing a consumer version of the self-driving Chevy Bolt. Semi-autonomous driving technology, like Tesla’s infamous Autopilot, is now available in many more car models than it was even a few years ago.

While many drivers eagerly await the day they can fall asleep at the wheel in traffic (if their autonomous car even has one) and wake up at their destination, legislators in the Senate are taking a look at what this new technology means for traffic and safety laws on the road.

That’s why on Wednesday, a handful of Democratic US senators raised concerns about a bill that intends to fast-track testing and deployment of self-driving cars across the country.

The bill, which passed the House in September, lets self-driving car manufacturers bypass certain rules that require cars to have human controls, like steering wheels and brake pedals. Self-driving cars don’t need these features, but manufacturers are stuck following the same rules that apply to, say, your 2008 Honda Civic that makes a strange noise in 2nd gear.

The five Democratic senators, in a letter to the bill’s Republican sponsors, said that “exemptions from current safety standards should be temporary and reviewable” and that the “bill should at least require vehicles with partial automation to also be subject to safety evaluation reports. This would help assure safety.”

The takeaway from this: As self-driving and semi-autonomous cars become more common, legislators will need to find the sweet spot between making sure the new technology is safe and not overly restricting innovation and convenience.

The concerned senators cited the January crash between a Tesla on Autopilot and a firetruck as evidence of the risk these technologies pose. On the other hand, around 16,000 car crashes happen every day across the country, and human error is cited as a factor in 94% of them.

Proponents of self-driving technology claim that computers are way more reliable than humans, but some people just can’t get past the idea of trusting their lives to sensors and lines of code, even if we do that inadvertently with other technologies, often without even noticing.

While we don’t know what the final version of this bill will look like, we can expect the debate to ramp up as these vehicles slowly start taking over the road.
