A fatal autonomous vehicle accident in Arizona this week throws the spotlight on regulators and testing regimens.

Stephen Moore

March 20, 2018

How safe are self-driving vehicles?

Fact: According to National Highway Traffic Safety Administration data, 37,461 lives were lost on U.S. roads in 2016, including 5,987 pedestrian fatalities. This equates to a fatality rate of 1.18 deaths per 100 million VMT (vehicle miles travelled). The data also include one death that occurred while a car was operating in self-drive mode: a Tesla collided with a truck, killing the driver.

While the 2017 data has yet to be released, the 2018 data will sadly include the first fatality attributed to the testing of an autonomous vehicle: a pedestrian crossing the road in Tempe, AZ, was struck and killed by an Uber self-driving vehicle, despite a human monitor being behind the wheel.

Considering that industry leader Waymo's vehicles drove 2 million miles in self-driving mode in 2017, and Uber has driven a little over 1 million miles on public roads to date, the safety numbers already don't look good for autonomous vehicle testing this year, even if many more miles are racked up.
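The comparison above can be sketched as rough back-of-envelope arithmetic. This is illustrative only: the 3-million-mile aggregate (Waymo's ~2 million plus Uber's ~1 million) and the single fatality are the figures cited in this article, not an official per-mile safety statistic.

```python
# Illustrative back-of-envelope comparison using figures cited in the article.
# Not official data: AV test fleets' mileage is a crude aggregate.

def fatality_rate_per_100m_miles(deaths, miles):
    """Deaths per 100 million vehicle miles travelled (VMT)."""
    return deaths / miles * 100_000_000

HUMAN_RATE_2016 = 1.18  # NHTSA: deaths per 100M VMT on U.S. roads, 2016

# One fatality over the combined autonomous test miles mentioned in the
# article (Waymo ~2M in 2017, Uber ~1M+ to date).
av_rate = fatality_rate_per_100m_miles(deaths=1, miles=3_000_000)

print(round(av_rate, 1))                  # ~33.3 deaths per 100M VMT
print(round(av_rate / HUMAN_RATE_2016))   # ~28x the 2016 human-driver rate
```

However crude, the gap of more than an order of magnitude is why even many more test miles this year would not bring the testing fatality rate near the human baseline.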

Billions of dollars are being poured into the development of autonomous vehicles, which are seen as the path to future cities built on a shared economy: less congestion and pollution, and less need for parking spaces, freeing up vast tracts of urban land for better use. There's no doubt that self-driving cars are here for the long run. The big question now is how to regulate and supervise the extensive testing that will be required before this mode of transportation becomes safer than human driving.

There’s also a moral dilemma that must be resolved. Faced with careering into multiple pedestrians crossing the road illegally or veering off the road and smashing into a single innocent bystander, what does the AI in an autonomous vehicle choose to do?

Anthony Foxx, who served as U.S. Secretary of Transportation under former President Barack Obama, tweeted: “There is still so much to know about the Tempe driverless car accident resulting in a loss of life. That said, this is a wake-up call to the entire AV industry and government to put a high priority on safety.”

And he’s right. Autonomous vehicle technology may be being deployed too rapidly for its own good. According to the California Department of Motor Vehicles’ Autonomous Vehicle Disengagement Reports for 2017, human interventions, known as disengagements, are a regular occurrence. Waymo, for example, reported 63 driver-initiated disengagements between Dec. 1, 2016, and Nov. 30, 2017, over 352,544.6 autonomous miles on public roads. Most of these were due to “hardware discrepancies,” “unwanted maneuver of the vehicle” and “perception discrepancy.” Perhaps it’s time to take a back seat, review the pace of testing on public roads and come up with a safe and secure mechanism for transforming the promise of self-driving into reality.

About the Author(s)

Stephen Moore

Stephen has been with PlasticsToday and its preceding publications Modern Plastics and Injection Molding since 1992, throughout this time based in the Asia Pacific region, including stints in Japan, Australia, and his current location Singapore. His current beat focuses on automotive. Stephen is an avid folding bicycle rider, often taking his bike on overseas business trips, and is a proud dachshund owner.
