Earlier this year, an autonomous vehicle that was involved in a trial conducted by Uber struck and killed a woman in what is thought to have been the first self-driving car death involving a pedestrian. While the company reached a settlement with the woman’s family, a recent report suggests that its executives potentially ignored a warning raising concerns about the safety of the Uber self-driving car program just days before the fatal crash.
Uber’s “System-Design Failure”
On the evening of March 18, 2018, a Volvo XC90 SUV outfitted with Uber’s sensing system was approaching a busy intersection in Tempe, AZ. The Volvo was part of a test of self-driving vehicles that Uber had been conducting. While there was a human test driver behind the wheel, the vehicle was traveling in self-driving mode at the time.
As the vehicle was heading towards the intersection, 49-year-old Elaine Herzberg was crossing the street with her bicycle, roughly 100 yards from the crosswalk. When she stepped down from the concrete median and into a lane of traffic, the Uber vehicle struck and killed her, apparently without having slowed down at all.
Soon after, Uber reached a wrongful death settlement with the pedestrian’s family. The rideshare giant also permanently shuttered its self-driving car test program in Arizona and temporarily halted similar initiatives in Pittsburgh, San Francisco, and Toronto until full investigations by the National Transportation Safety Board (NTSB) and local police into the Uber self-driving car crash could be completed.
Following its investigation, the NTSB placed significant blame for the crash on a “system-design failure” on the part of Uber’s vehicle. Essentially, while the vehicle detected the pedestrian’s presence six seconds before the crash, it struggled to properly identify her until 1.3 seconds before impact, when it determined that emergency braking was needed. However, because the emergency braking system had been disabled to prevent conflict with the self-driving system, braking became the responsibility of the vehicle’s test driver, who was reportedly streaming The Voice on her phone and failed to stop in time.
Uber Self-Driving Car Death Comes Days after Warning
As tragic as they are, fatal auto accidents happen every day. Statistics show that 94 percent of accidents are the result of human error. Advocates of autonomous vehicle technology are quick to point out that this demonstrates the benefit of self-driving systems, which promise to eliminate the majority of negligent actions. But this technology is still developing, so it may be important to ask: Could Uber have prevented Elaine Herzberg’s death? According to an internal email recently made public by tech news website The Information, high-ranking executives within the company had been warned of safety issues with its self-driving vehicles and should have anticipated the possibility of such an incident.
On March 13, five days before the fatal crash, Robbie Miller, a manager in the testing-operations group, sent an 890-word email to Eric Meyhofer, the head of Uber’s autonomous vehicle unit, and six other Uber executives and attorneys, warning that the self-driving cars being tested were “routinely in accidents resulting in damage…usually the result of poor behavior of the operator or the AV [autonomous vehicle] technology,” with “several of the drivers [appearing] not to have been properly vetted or trained.”
Miller’s email also noted that the company’s autonomous vehicles were “hitting things every 15,000 miles,” that an Uber self-driving car was damaged “nearly every other day in February,” that “near-miss” incidents occurred as often as every hundred miles, and that test drivers had to take control of the vehicles every one to three miles.
Miller described multiple incidents, including one in which a self-driving car “drove on the sidewalk” and another in which a crash was averted solely because the driver of another vehicle swerved to avoid Uber’s vehicle. He also complained that proper investigation of such incidents often lagged, sometimes by as long as two weeks, when the incidents were not ignored altogether.
Uber Expanding Too Quickly?
In Miller’s assessment, the more miles the autonomous vehicles drove, the more likely these potentially dangerous incidents were to occur. That outcome seemed inevitable in a corporate culture like Uber’s, which, The Information speculates, emphasized the number of miles driven as a gauge of how advanced its software was becoming. For example, the company boasted that its self-driving fleet had already driven two million miles by December 2017, just 100 days after crossing the one-million-mile threshold.
To reduce the likelihood of a serious or even fatal Uber self-driving car crash, Miller proposed a significant reduction of on-the-road vehicle testing, claiming that the number of vehicles could be reduced by as much as 85 percent without slowing the development of the software because the company’s cars were already collecting “more than enough data.” He also recommended that Uber reinstate its former policy of having two test drivers in each vehicle.
Miller reportedly did not receive a direct response to his email, though he was assured by his manager that the company would look into the issues that he raised.
Just five days later, Elaine Herzberg was killed.
Chicago Uber Accident Attorneys
With over $2 billion successfully recovered for our clients, GWC is one of Illinois’ most respected Personal Injury and Workers’ Compensation law firms.
If you have been injured in an Uber accident, please contact GWC today to schedule a free consultation with one of our Chicago Uber accident attorneys. Call our office at (312) 999-9999 or click here to chat with one of our representatives at any time.