The death of a pedestrian in Arizona, struck at night by a self-driving Uber vehicle, has Democratic senators scrutinizing the AV START Act, a bill that supports the “development of highly automated vehicle safety technologies.”

Legal concerns for riders

The bill could enable more widespread testing of autonomous vehicles, or self-driving cars. Critics say the legislation allows forced-arbitration clauses in the terms-of-service agreements between autonomous vehicle manufacturers and the consumers who ride in their cars.

Prior to the accident in early March, a group of 27 people representing bicyclists, pedestrians, and environmentalists sent a letter to Senate Majority Leader Mitch McConnell and Senate Minority Leader Chuck Schumer outlining their objections to the bill, including:

  • allowing the public sale of unproven autonomous vehicle technologies,
  • granting automakers broad and unsafe exemptions from existing federal standards, and
  • ignoring the need for the US DOT to issue minimum safety requirements.

Legislative and compliance activity surrounding the testing of autonomous vehicles has intensified.

In 2017, 33 states introduced legislation on the topic, up from 20 states the year before.

Selling safety

Safety is a selling point for the makers of autonomous vehicles. The goal is to reduce crashes caused by problems like driver fatigue, drunk driving, and distraction.

Waymo, the self-driving car company owned by Google’s parent, Alphabet, says its mission is to “make it safe and easy for everyone to get around—without the need for anyone in the driver’s seat.”

In a safety report, General Motors (GM) writes that its mission is to realize a world of “zero crashes, zero emissions, and zero [road] congestion.”

According to an article from the Franklin Institute, the cars’ ability to function flawlessly depends on three technologies: sensors, connectivity, and software/control algorithms. Sensors and connectivity provide input and monitor the car’s surroundings, while the software continually processes that data and makes decisions about steering, braking, and route selection.
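The sense-process-decide loop described above can be sketched in a few lines of code. This is a deliberately simplified illustration, not any manufacturer’s actual control software; every class, function, and threshold here is a hypothetical stand-in.

```python
# Minimal sketch of the sense -> process -> decide loop described above.
# All names and numbers are hypothetical illustrations, not a real AV API.

from dataclasses import dataclass

@dataclass
class SensorReading:
    obstacle_distance_m: float  # nearest obstacle ahead, e.g. from radar/LiDAR fusion
    obstacle_closing: bool      # is the obstacle moving into the vehicle's path?

def decide(reading: SensorReading, speed_mph: float) -> str:
    """Toy control policy: brake when an obstacle is close or closing."""
    # Rough stopping-distance heuristic (illustrative only, not physics-accurate).
    stopping_distance_m = (speed_mph / 10) ** 2
    if reading.obstacle_closing or reading.obstacle_distance_m < stopping_distance_m:
        return "brake"
    return "continue"

# A pedestrian 10 m ahead, moving into the path of a car traveling 40 mph:
print(decide(SensorReading(obstacle_distance_m=10.0, obstacle_closing=True), 40.0))
# -> brake
```

Real systems replace the single heuristic with layered perception, prediction, and planning modules, but the overall shape — sensor input in, a driving decision out, repeated many times per second — is the same.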

A detailed infographic from GM describes how machine learning, perception, and behavioral controls are integrated to create a safe journey, allowing the car “to identify pedestrians in a crosswalk, or an object darting suddenly into its path, and to respond accordingly.”

Key factors in the Uber collision

But is anything fail-safe?

The Uber vehicle was traveling at about 40 miles per hour when it struck the pedestrian, who was walking her bicycle across the road outside a crosswalk, according to Tempe police. Police Chief Sylvia Moir said the crash would have been difficult to avoid in any situation, and for any vehicle, since the victim, 49-year-old Elaine Herzberg, had stepped out of the shadows and into the path of the car.

That statement calls into question the likelihood of achieving GM’s vision of zero crashes.

BMW’s top engineer has said that the radar and LiDAR technologies on board the Uber Volvo failed and that the safety operator was distracted.

Self-driving vehicles are expected to become commercially viable in the coming decade as testing intensifies. A handful of Democratic senators need convincing not just that the technology is safe but that the companies won’t shift the burden of proof in an accident to the consumer.