A self-driving Uber SUV struck and killed a pedestrian in suburban Phoenix in the first death involving a fully autonomous test vehicle — a crash that could have far-reaching consequences for the new technology.
The fatality Sunday night in Tempe was the event many in the auto and technology industries were dreading but knew was inevitable.
Uber immediately suspended all road-testing of such autos in the Phoenix area, Pittsburgh, San Francisco and Toronto. The testing has been going on for months as automakers and technology companies like the ride-hailing service compete to be the first with cars that operate on their own.
The Volvo was in self-driving mode with a human backup driver at the wheel when it hit 49-year-old Elaine Herzberg as she was walking a bicycle outside the lines of a crosswalk, police said. She died at a hospital.
Uber CEO Dara Khosrowshahi expressed condolences on his Twitter account and said the company is working with local law enforcement on the investigation.
The National Transportation Safety Board, which makes recommendations for preventing crashes, and the National Highway Traffic Safety Administration, which can enact regulations, sent investigators.
Tempe police Sgt. Ronald Elcock said local authorities haven't drawn any conclusions about who is at fault but urged people to use crosswalks. He told reporters at a news conference Monday that the Uber vehicle was traveling around 40 mph when it struck Herzberg as she stepped onto the street.
Neither she nor the backup driver showed signs of impairment, he said.
The public's image of the vehicles will be defined by stories like the crash in Tempe, said Bryant Walker Smith, a University of South Carolina law professor who studies self-driving vehicles.
Although the Uber vehicle and its human backup could be at fault, it may turn out that there was nothing either could have done to stop the crash, he said.
Either way, the fatality could hurt the technology's image and lead to a push for more regulations at the state and federal levels, Smith said.
Autonomous vehicles with laser, radar and camera sensors and sophisticated computers have been billed as the way to reduce the more than 40,000 traffic deaths a year in the U.S. alone. Ninety-four percent of crashes are caused by human error, the government says.
Autonomous vehicles don't drive drunk, don't get sleepy and aren't easily distracted. But they do have faults.
"We should be concerned about automated driving," Smith said. "We should be terrified about human driving."
In 2016, the latest year available, more than 6,000 U.S. pedestrians were killed by vehicles.
The federal government has voluntary guidelines for companies that want to test autonomous vehicles, leaving much of the regulation up to states.
Many states, including Michigan and Arizona, have taken a largely hands-off approach, hoping to gain jobs from the new technology, while California and others have taken a harder line.
California is among states that require manufacturers to report any incidents during the testing phase. As of early March, the state's motor vehicle agency had received 59 such reports.
Arizona Gov. Doug Ducey used light regulations to entice Uber to the state after the company had a shaky rollout of test cars in San Francisco. Arizona has no reporting requirements.
Hundreds of vehicles with automated driving systems have been on Arizona's roads.
Ducey's office expressed sympathy for Herzberg's family and said safety is the top priority.
The crash in Arizona isn't the first involving an Uber autonomous test vehicle. In March 2017, an Uber SUV flipped onto its side, also in Tempe. No serious injuries were reported, and the driver of the other car was cited for a violation.
Herzberg's death is the first involving an autonomous test vehicle but not the first in a car with some self-driving features. The driver of a Tesla Model S was killed in 2016 when his car, operating on its Autopilot system, crashed into a tractor-trailer in Florida.
The NTSB said that driver inattention was to blame but that design limitations with the system played a major role in the crash.
The U.S. Transportation Department is considering further voluntary guidelines that it says would help foster innovation. Proposals also are pending in Congress, including one that would stop states from regulating autonomous vehicles, Smith said.
Peter Kurdock, director of regulatory affairs for Advocates for Highway and Auto Safety in Washington, said the group sent a letter Monday to Transportation Secretary Elaine Chao saying it is concerned about a lack of action and oversight by the department as autonomous vehicles are developed. That letter was planned before the crash.
Kurdock said the deadly crash should serve as a "startling reminder" to members of Congress that they need to "think through all the issues to put together the best bill they can to hopefully prevent more of these tragedies from occurring."
Krisher reported from Detroit, Fonseca reported from Flagstaff, Arizona. Susan Montoya Bryan in Albuquerque, New Mexico, and Jacques Billeaud in Phoenix contributed to this story.
This story has been corrected to show that federal investigators found Tesla's Autopilot system was a factor in the deadly Florida crash.
Copyright © The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.