A self-driving Uber recently struck and killed a woman in an Arizona suburb. While the local Chief of Police placed the blame largely on the pedestrian, two experts who viewed footage of the incident argue that the self-driving Uber should have “seen” the woman, suggesting that the vehicle itself may have been at fault.
Pedestrian Killed in Arizona
On the evening of March 18, 2018, a Volvo XC90 SUV outfitted with Uber’s sensing system was heading northbound on Curry Road at approximately 38 miles per hour, approaching a busy intersection in Tempe, AZ, a suburb of Phoenix. The Volvo was part of a test of self-driving vehicles being conducted by Uber. While there was a safety driver behind the wheel, the vehicle was traveling in self-driving mode at the time.
As the self-driving Uber was approaching the intersection, 49-year-old Elaine Herzberg was crossing the street with her bicycle, roughly 100 yards from the crosswalk. When Ms. Herzberg stepped down from a concrete median and into a lane of traffic, the self-driving Uber struck and killed her, apparently without slowing down.
This self-driving Uber accident is thought to be the first fatal pedestrian crash involving an autonomous vehicle. In response to the incident, Uber immediately suspended all road-testing of autonomous vehicles in the Phoenix area, Pittsburgh, San Francisco, and Toronto pending a full investigation by the National Transportation Safety Board (NTSB).
Was the Self-Driving Uber to Blame?
While the NTSB has not yet reached a conclusion as to the cause of the crash, an early statement from Tempe Chief of Police Sylvia Moir suggested that the pedestrian was likely at fault.
“It’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway,” said Chief Moir. “It is dangerous to cross roadways in the evening hour when well-illuminated managed crosswalks are available. The driver said it was like a flash, the person walked out in front of them. His first alert to the collision was the sound of the collision.”
Despite Chief Moir’s apparent certainty, however, two experts who viewed video footage of the incident recently told the Associated Press that the blame in the matter is not so obvious. The self-driving Uber’s laser and radar sensors should have spotted Ms. Herzberg and her bicycle in time to brake, they contend, meaning that the autonomous vehicle may have been more at fault.
“The victim did not come out of nowhere,” said Bryant Walker Smith, a University of South Carolina law professor who studies autonomous vehicles. “She’s moving on a dark road, but it’s an open road, so Lidar (laser) and radar should have detected and classified her” as a human being.
Mr. Smith added that, while the video may not show the complete picture of what happened, “this is strongly suggestive of multiple failures of Uber and its system, its automated system, and its safety driver.”
Sam Abuelsamid, an analyst for Navigant Research who also follows autonomous vehicle technology, agreed, arguing that laser and radar systems can see in the dark much better than humans or cameras and that Ms. Herzberg was well within range to be detected.
“It absolutely should have been able to pick her up,” said Mr. Abuelsamid. “From what I see in the video it sure looks like the car is at fault, not the pedestrian.”
Additionally, Mr. Smith suggested that the Uber’s safety driver, 44-year-old Rafaela Vasquez, appeared to have been depending too much on the self-driving system, as she was not looking up at the road.
“The safety driver is clearly relying on the fact that the car is driving itself,” said Mr. Smith. “It’s the old adage that if everyone is responsible no one is responsible. This is everything gone wrong that these systems, if responsibly implemented, are supposed to prevent.”
Defective Product Litigation
We are surrounded by products in our modern world. They are manufactured by people we have never met in parts of the world we have never seen. We trust that these products are safe, that they are manufactured with the best materials and under the best conditions by companies with the best of intentions.
But what if they are not? What if you or a loved one is severely injured or even killed by a malfunctioning self-driving vehicle or a common household item? Who should be held accountable in the case of a defective product injury?
At times like these, injured people may find that they could benefit from the guidance of an experienced and knowledgeable lawyer, like the Chicago personal injury lawyers at GWC Injury Lawyers, Illinois’ largest Personal Injury and Workers’ Compensation law firm.
If you have been wrongfully injured, whether by a self-driving vehicle or in some other way, please contact GWC today to schedule a free consultation with one of our attorneys. Call our office at (312) 464-1234 or click here to chat with one of our representatives.