A federal agency’s recent actions suggest that Teslas with Autopilot, a semi-autonomous computer system, may be closer to facing a recall.
On June 9, 2022, the National Highway Traffic Safety Administration (NHTSA) announced that it was expanding its ongoing probe into the safety of Teslas with Autopilot to include an “engineering analysis.”
Documents posted on the NHTSA’s website also raised serious concerns about Tesla’s semi-autonomous technology. Though Autopilot performs certain driving tasks in equipped vehicles and has sometimes been described as “self-driving,” the agency was quick to note that there are no vehicles available for purchase today that can actually drive themselves.
“Every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for operation of their vehicles,” the NHTSA said in a statement. While driver-assist systems can help avoid collisions, the agency stressed that they must be used correctly and responsibly.
Nevertheless, the NHTSA documents revealed that Tesla drivers are using Autopilot in areas where its capabilities are limited and often failing to take action to avoid crashes despite warnings from the system.
Sixteen Crashes with Emergency Vehicles or Trucks
The NHTSA began its investigation in August 2021, following a series of crashes since 2018 in which Teslas with Autopilot or with the Traffic Aware Cruise Control system struck vehicles at scenes where first responders had deployed flashing lights, flares, illuminated arrow boards, or cones to alert drivers of hazards.
The agency initially investigated sixteen crashes into either emergency vehicles or trucks with warning signs, which resulted in fifteen injuries and one death. In the majority of these cases, the Teslas issued collision alerts just before impact, while automatic emergency braking intervened to slow the cars down in approximately half the cases. Autopilot typically gave up control of the vehicle less than a second before impact.
The NHTSA later expanded its probe to include collisions involving similar patterns that did not also involve trucks with warning signs or emergency vehicles. In total, the regulatory body considered 191 potential crashes for its investigation, though it dismissed 85 of them either because other drivers were involved or because there was insufficient information for a definite assessment.
The NHTSA has not revealed how many victims of these collisions have retained car accident attorneys to assist them with personal injury litigation.
Do Teslas With Autopilot Have a Safety Defect?
During the “engineering analysis” phase, NHTSA investigators will be evaluating additional data, studying vehicle performance, and exploring “the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks, undermining the effectiveness of the driver’s supervision.”
An engineering analysis typically marks the final stage of an NHTSA investigation. In most cases, the agency decides whether to implement a recall or close its probe within a year after it completes an engineering analysis. A recall of Teslas with Autopilot would impact approximately 830,000 vehicles, representing nearly everything that the company has sold in the United States since 2014.
Before pursuing a recall, the NHTSA will have to determine if Autopilot has a safety defect.
So far, the agency has found that the drivers in the crashes under investigation typically failed to take action to avoid a collision, even though they usually had their hands on the steering wheel at the time. This pattern suggests that Autopilot does not do enough to keep drivers attentive.
In crashes for which video footage was available, the NHTSA noted that the drivers should have seen first responder vehicles approximately eight seconds before impact. Nevertheless, investigators also concluded that a driver’s use or misuse of the monitoring system “or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”
Autopilot Used in Unsafe Areas or Conditions
In about one-quarter of the cases studied in the probe, the main cause of the crashes appeared to be drivers running Autopilot in areas where the system has limitations or in conditions that would interfere with its operation, including “operation on roadways other than limited access highways, or operation in low traction or visibility environments such as rain, snow, or ice.”
At the conclusion of its own investigation, the National Transportation Safety Board (NTSB) recommended that the NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate. Other auto manufacturers already restrict the deployment of their own driving systems to limited-access divided highways.
The NTSB also called for new technology to help ensure that Tesla drivers concentrate on the road.
The NHTSA has not yet acted on these recommendations, which the NTSB has no power to enforce.
Strong Advocacy for Car Accident Victims
No matter what type of vehicle is responsible for a collision, car accident victims often need strong legal advocacy if they hope to obtain full and fair compensation for their injuries. For this reason, if you have been hurt in a crash, consider doing what so many others have done before you and reach out to the dedicated car accident attorneys at GWC Injury Lawyers LLC.
With more than $2 billion recovered in verdicts and settlements, GWC is one of the premier Personal Injury and Workers’ Compensation law firms in Illinois. Our car accident attorneys have the experience, the determination, the resources, and the reputation necessary to get you and your family the justice you deserve.
To schedule a free, no-obligation consultation with a knowledgeable car accident attorney, contact GWC today. You may call our office at (312) 464-1234 or click here to chat with a representative at any time.