The so-called self-driving car is looking more and more like a reality each day. While manufacturers argue that the self-driving car would make the roads safer by reducing human error, that claim was called into question last year following a fatal crash in which a Tesla failed to apply the brakes while in self-steering mode. Though a federal investigative agency recently determined that driver actions were the chief cause of the crash, it also noted that “operational limitations” of the Tesla itself played a “major role,” raising concerns about how new technology and human nature may collide, sometimes quite literally, in the near future.
Autonomy vs. Self-Driving in Practice: A Fatal Trip
On May 7, 2016, Joshua Brown was operating his Tesla Model S eastbound on U.S. Highway 27, outside of Williston, FL. Mr. Brown was traveling at an estimated 74 miles per hour with the vehicle’s semi-autonomous Autopilot system engaged. At that time, Frank Baressi was driving a semi-trailer truck hauling blueberries in the opposite direction. He then made a left turn in front of Mr. Brown, who, it was later determined, took no evasive action and made no attempt to slow down. The middle section of the semi-trailer sheared off the windshield and roof of Mr. Brown’s vehicle. Mr. Brown was pronounced dead at the scene, the first recorded fatality in the United States involving what is frequently, if inaccurately, referred to as a self-driving car.
Tesla Autopilot: The Basics
Tesla includes self-driving hardware in all of its current vehicles, including the Model S. Known as Tesla Autopilot, this semi-autonomous driver-assist system uses cameras, radar, and other sensors to handle some speed control and steering functions and to provide lane-departure warnings. Perhaps most relevant to this incident, Autopilot reportedly includes automatic emergency braking, which is designed to detect objects in the vehicle’s path and apply the brakes to avoid or mitigate a collision.
Tesla plans to produce fully self-driving vehicles by the end of 2017, a prospect that raises serious safety and legal concerns for critics of the technology. Will this self-driving technology malfunction, some wonder, by failing to detect and avoid other vehicles? And if so, who would be legally responsible for any injuries in a self-driving car crash?
The NHTSA Weighs In
Tesla has cautioned that Autopilot should only be used under the close supervision of the driver, much like the Federal Aviation Administration mandates that pilots monitor any aircraft operating in autopilot mode. The company has further argued that its Autopilot technology, which alerts drivers to potential hazards and can take emergency evasive action, actually reduces the likelihood of traffic collisions.
For its part, the National Highway Traffic Safety Administration (NHTSA), a federal agency within the Department of Transportation, largely agreed. In January 2017, following six months of investigation into the causes of the Florida crash, the agency issued a report of its findings. In addition to declining to order a recall of Tesla’s semi-autonomous cars, the NHTSA noted that crash rates involving the vehicles had dropped nearly forty percent since Autopilot’s wide introduction.
The agency also reported that its investigation did not uncover any defects in the design or implementation of the vehicle’s Autopilot cruise capabilities or its automatic emergency braking system, and further concluded that Tesla both anticipated the potential effects of driver misuse and incorporated them into the final design of Autopilot.
The NHTSA ultimately determined that Mr. Baressi was at fault for the fatal collision because of his failure to yield the right of way when turning left, though it also noted that Mr. Brown was “not attentive” and “failed to take any evasive action.”
Though the NHTSA ultimately exonerated Tesla in this incident, a more recent report by another government agency argued that the company shared at least some of the blame for what happened by failing to address the elephant in the room: the inattentive driver.
The Inattentive Driver and All-Too-Human Nature
The National Transportation Safety Board (NTSB) is an independent federal agency tasked with investigating civil transportation accidents, including certain types of highway crashes. On September 12, 2017, the NTSB issued its own report on the Tesla fatality. Like the NHTSA before it, the NTSB determined that the direct causes of the crash were the truck driver’s failure to yield and the “inattentive” Tesla driver’s overreliance on the vehicle’s cutting-edge technology. But it was that last part – the very nature of the inattentive driver – that led the agency to shift at least some of the blame onto the automaker itself.
Distracted driving is one of the leading causes of motor vehicle collisions. According to one recent study by Cambridge Mobile Telematics, distracted driving was involved in an estimated 52 percent of motor vehicle crashes in the United States. Far too often, drivers get so comfortable within their vehicles that they simply do not pay enough attention to the road ahead of them. (On a related note, Mr. Baressi alleged that Mr. Brown was watching a “Harry Potter movie” on a screen in his Tesla at the time of the crash, an allegation that no investigator has been able to confirm.) This common fact of the road, the inattentive driver, can sometimes lead to tragic results. And, ironically, Tesla’s use of Autopilot to combat inattentive driving may in some ways encourage it.
For example, the NTSB noted that, during the 37.5 minutes his vehicle’s cruise control and lane-keeping systems were in use before the crash, Mr. Brown had his hands on the wheel for only 25 seconds. This suggests an especially low level of attentiveness, one that the vehicle’s semi-autonomous features may themselves have encouraged. The board recommended that all automakers, not just Tesla, incorporate safeguards that keep drivers’ attention engaged in order to combat the all-too-human tendency to let the mind wander.
The NTSB also concluded that Tesla and other auto manufacturers need to implement greater protections to ensure that their “lifesaving” technology is used only as intended, so that it can actually save lives.
In the case of the crash in question, Mr. Brown was driving a Tesla Model S, which is classified as “Level 2” on the standard 0-to-5 scale of driving automation. While Level 5 vehicles can operate more or less autonomously in virtually every circumstance, Level 2 vehicles are far more limited. For example, they are designed to detect vehicles directly ahead of them in order to prevent rear-end collisions, but, as with Mr. Brown’s vehicle, their radar and cameras are not able to detect vehicles turning across their paths. As such, drivers of Level 2 vehicles are instructed to monitor their vehicles continuously, so that they can take control quickly, and to limit their use of the technology to roadways without intersections – something that some drivers, unfortunately, fail to do.
While Tesla explicitly informed Model S owners that Autopilot should only be used on limited-access highways, it did not incorporate protections against its use on other types of roads. The NTSB also reissued its recommendation that the government require automakers to equip new vehicles with technology that wirelessly transmits crucial information to other vehicles in order to prevent collisions.
“In this crash, Tesla’s system worked as designed, but it was designed to perform limited tasks in a limited range of environments,” said NTSB Chairman Robert Sumwalt. “Tesla allowed the driver to use the system outside of the environment for which it was designed.”
To put it another way: Technology may be ever-changing, but human nature remains stubbornly the same.
Car Accident Litigation
As this case makes all too clear, even with the most modern of safeguards, a driver’s attention can still drift, causing serious injury to others on the road through no fault of their own. When this happens, the injured parties may benefit from the guidance of an experienced car accident lawyer to help them pursue the financial compensation they deserve, like the car accident lawyers at GWC Injury Lawyers, Illinois’ largest Personal Injury and Workers’ Compensation law firm.
If you or your loved one has been injured, whether from a self-driving car or by some other means, please contact GWC today to schedule a free consultation with one of our attorneys. Call our office at (312) 464-1234 or click here to chat with one of our representatives.