NTSB Blames Self-Driving Car Death on Uber’s “System-Design Failure”


A recent National Transportation Safety Board (NTSB) report attributes the first self-driving car death involving a pedestrian to a “system-design failure” on Uber’s part. The report came just one day after Uber announced a permanent shutdown of its autonomous vehicle test program in Arizona in the wake of the recent self-driving car death.

First Self-Driving Car Death Involving Pedestrian

On the evening of March 18, 2018, a Volvo XC90 SUV outfitted with Uber’s sensing system was approaching a busy intersection in Tempe, AZ at approximately 38 miles per hour. The Volvo was part of a test of self-driving vehicles being conducted by the rideshare company. While there was a test driver behind the wheel, the vehicle was traveling in self-driving mode at the time.

As the Uber vehicle was approaching the intersection, 49-year-old Elaine Herzberg was crossing the street with her bicycle, roughly 100 yards from the crosswalk. When Ms. Herzberg stepped down from a concrete median and into a lane of traffic, the self-driving Uber struck and killed her, apparently without slowing down. This incident was thought to be the first self-driving car death involving a pedestrian.

Soon after, Uber reportedly reached a wrongful death settlement with Ms. Herzberg’s husband and daughter for an undisclosed amount.

Uber Shutters Arizona Test Program

In response to the incident, Uber halted all road-testing of autonomous vehicles in the Phoenix area, Pittsburgh, San Francisco, and Toronto pending full investigations by the NTSB and the Tempe Police Department. Arizona Gov. Doug Ducey subsequently suspended the company’s ability to test self-driving cars on public roads in the state.

On May 23, two months after the incident, Uber announced that it had permanently shuttered its self-driving testing program in Arizona. The rideshare giant reportedly laid off close to 300 workers in the state, most of them test drivers.

While it has ended its Arizona test program, Uber still plans to restart self-driving operations in other markets, including Pittsburgh and San Francisco, though “in a much more limited way.”

“We’re committed to self-driving technology, and we look forward to returning to public roads in the near future,” said an Uber spokesperson, who pointed to the company’s recent hire of former NTSB Chair Christopher Hart as evidence of Uber’s commitment to a “top-to-bottom” review of its “overall safety culture.”

Uber also announced that its other autonomous vehicle operations would not resume until investigations into the Arizona self-driving car death were complete.

NTSB Finds “System-Design Failure”

Uber did not have to wait long. On May 24, one day after the company’s announcement, the NTSB issued its preliminary report on the self-driving car death. While a final report is still forthcoming, the NTSB found fault with what could be characterized as Uber’s “system-design failure.”

Like other autonomous vehicles, Uber’s self-driving computer system consists of three modules.

The Perception Module uses information gathered by the vehicle’s sensors to identify relevant objects nearby. The Uber vehicle that struck and killed the Arizona pedestrian was equipped with the following Perception Module sensors:

  • Cameras, which can spot such features as traffic lights, lane markings, and road signs;
  • Radar, which measures the speed of nearby objects; and
  • LIDAR, which works on a principle similar to radar but uses invisible pulses of laser light to map the shape of the vehicle’s surroundings in great detail, even in the dark.

The self-driving computer system combines readings from these sensors to build a model of the world, which machine-learning systems use to identify cars, pedestrians, bicycles, and other objects.

The Prediction Module forecasts how these objects will behave in the next few seconds – for instance, will the pedestrian walking with the bicycle step into the road?

Finally, the Driving Policy Module determines what actions the vehicle should take in response to these predictions.
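The three-module flow described above can be sketched as a simple sense-predict-act loop. This is a hypothetical illustration only; the class and function names below are invented for this sketch and are not Uber’s actual code.

```python
from dataclasses import dataclass
from enum import Enum, auto

class ObjectClass(Enum):
    UNKNOWN = auto()
    VEHICLE = auto()
    PEDESTRIAN = auto()
    BICYCLE = auto()

@dataclass
class TrackedObject:
    kind: ObjectClass
    distance_m: float      # distance ahead of the vehicle
    closing_speed_mps: float  # speed toward the vehicle's path

def perceive(sensor_readings) -> list:
    """Perception Module: fuse camera, radar, and LIDAR readings into
    classified objects. (Stubbed here; a real system runs machine-learning
    classifiers over the fused sensor model.)"""
    return [TrackedObject(ObjectClass.PEDESTRIAN, 40.0, 1.5)]

def predict(obj: TrackedObject) -> bool:
    """Prediction Module: will this object enter the vehicle's path soon?"""
    return obj.closing_speed_mps > 0

def drive_policy(objects) -> str:
    """Driving Policy Module: choose an action based on the predictions."""
    for obj in objects:
        if predict(obj) and obj.distance_m < 50:
            return "brake"
    return "continue"

print(drive_policy(perceive(sensor_readings=None)))  # prints "brake"
```

The key point of the structure, as the NTSB report illustrates, is that each stage depends on the one before it: if the Perception Module misclassifies an object, the Prediction and Driving Policy Modules are working from bad inputs.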

Experts working in autonomous-vehicle technology typically argue that the Perception Module is the most difficult to build – a claim that the NTSB report seems to support.

Even though the Uber vehicle’s radar and LIDAR sensors detected Ms. Herzberg six seconds before the crash, the Perception Module struggled to identify her, classifying her first as an unknown object, then as a motor vehicle, and finally as a bicycle whose path the Prediction Module could not determine.

Finally, just 1.3 seconds before impact, the Driving Policy Module determined that emergency braking was needed, but the emergency braking system had been disabled to prevent conflict with the self-driving system. Braking was instead the responsibility of the test driver, who was looking down at the vehicle’s display screen at the time of this incident and failed to stop, resulting in the tragic self-driving car death.
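A rough back-of-the-envelope calculation shows how little margin remained by that point. The 38 mph speed and the 1.3-second figure come from the article above; the braking deceleration used here (7 m/s², roughly a hard stop on dry pavement) is an assumed illustrative value, not a figure from the NTSB report.

```python
MPH_TO_MPS = 0.44704  # conversion factor: miles per hour to meters per second

def stopping_distance(speed_mps: float, decel_mps2: float) -> float:
    """Distance needed to brake from speed_mps to a full stop
    at a constant deceleration (v^2 / 2a)."""
    return speed_mps ** 2 / (2 * decel_mps2)

speed = 38 * MPH_TO_MPS            # ~17.0 m/s
remaining = speed * 1.3            # ~22.1 m left when braking was deemed needed
needed = stopping_distance(speed, 7.0)  # ~20.6 m to stop under hard braking

print(f"distance remaining: {remaining:.1f} m")   # prints "distance remaining: 22.1 m"
print(f"distance needed:    {needed:.1f} m")      # prints "distance needed:    20.6 m"
```

Under these assumptions, an immediate automated emergency stop at the 1.3-second mark would have been marginal at best, which underscores how costly the disabled braking system and the earlier six seconds of misclassification were.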

Human Failsafe Fails to Keep Pedestrian Safe

The circumstances leading to this self-driving car death may be complex, but the details as laid out in the NTSB report ultimately point towards a system-design failure as the chief culprit.

While a self-driving vehicle should slow down when its Perception Module becomes confused, unexpected braking brings its own problems, as vehicles are frequently rear-ended after suddenly decelerating. For this reason, self-driving vehicles such as Uber’s often delegate the responsibility for braking to human test drivers, who are there to compensate for self-driving system imperfections and ensure overall safety. However, this backup system only works if the human failsafe is paying attention to the road at all times, which the test driver in this incident was not.

Autonomous vehicle technology evangelists promote a vision of a world in which traffic injuries are rare, pointing out that 94 percent of accidents are caused by human error, something that self-driving systems would eliminate. Nevertheless, as long as self-driving cars continue to rely on human backups to compensate for system-design flaws, self-driving car death is likely to remain with us for the foreseeable future.

Chicago Uber Accident Attorneys

With more than $2 BILLION successfully recovered for our clients, GWC Injury Lawyers is Illinois’ largest Personal Injury and Workers’ Compensation law firm.

If you have been injured in an Uber accident, please contact GWC today to schedule a free consultation with one of our Chicago Uber accident attorneys. Call our office at (312) 464-1234 or click here to chat with one of our representatives.

