
Pittsburgh, Pennsylvania


Eric T. Chaffin
Attorney • (888) 480-1123

Uber Self-Driving Crash in AZ Raises Questions About Technology


Fatal Uber Self-Driving Crash

Uber has halted testing of self-driving cars once again following a crash in Arizona.

This is the second time the company has had to stop and re-evaluate its technology following a crash. The first time occurred in March 2017, when a Honda crashed into a self-driving Volvo operated by Uber. No one was hurt. The company halted testing for only a few days before resuming the following Monday, after determining that everything on the self-driving car had worked properly and that the driver of the Honda was at fault.

This time things are different because a person was killed in the incident.

Self-Driving Car Strikes and Kills a Pedestrian

According to Bloomberg, Uber has halted testing again because one of its self-driving cars struck and killed a pedestrian in Tempe, Arizona. The 49-year-old woman was crossing the road at about 10:00 p.m. when the Uber car—which was operating in self-driving mode at the time, but did have a supervisory driver inside—struck her. She was taken to the hospital, where she later died of her injuries.

In response to the incident, which is still under investigation, Uber halted testing of its self-driving cars not only in Arizona, but in Pittsburgh, San Francisco, Toronto, and Phoenix, too.

Meanwhile, new questions are coming up about the safety and regulation of self-driving vehicles. So far, all tests have involved a driver behind the wheel, but the New York Times reports that starting in April, companies in California will be allowed to test these vehicles on the road without drivers.

Arizona has already taken that step, with self-driving company Waymo currently using cars without human drivers in the state.

Autonomous Car Crash Raises Safety Questions

Technology proponents are convinced that self-driving cars will be much safer than human-driven cars on the road, once all the bugs are worked out.

The RAND Corporation released a report a few months ago noting that more lives would be saved under a more “permissive” policy—one that allows self-driving vehicles on the road when their safety performance is just 10 percent better than that of the average human driver—than under a more restrictive policy that requires them to be 75 to 90 percent safer.

Yet, there are reminders that the technology isn’t perfect. In addition to the two Uber crashes, there have been others. In May 2016, a Tesla was operating in self-driving mode when a crash occurred, killing the driver. A tractor-trailer made a left turn in front of the Tesla, and the car failed to apply the brakes.

A Google autonomous car was also involved in an accident in California. A Google Lexus AV was trying to maneuver around sandbags surrounding a storm drain when it crashed into a public transit bus. The vehicle was damaged, but there were no injuries. Google later reported that a software problem was to blame for the accident.

The Washington Post reports that this new crash in Arizona raises questions about how these cars will respond when faced with human behavior that “may not always align with the letter of the law.” The pedestrian in this case was walking outside of the crosswalk. Slate reported that the eight-lane road has only one crosswalk in nearly two miles, “making jaywalking a requirement of the urban design.”
