‘Critical Safety Gap’: NHTSA Tesla Autopilot Probe Finds Elon Musk’s Driver Engagement Software Is Weak

Elon Musk, founder, CEO, and lead designer of SpaceX, speaks at a news conference (AP Photo/John Raoux)

The National Highway Traffic Safety Administration (NHTSA) has closed its long-standing investigation into Tesla’s Autopilot driver assistance system after reviewing hundreds of crashes involving the technology’s misuse, including 13 that were fatal.

TechCrunch reports that the NHTSA’s Office of Defects Investigation released documents on Friday detailing the completion of an extensive body of work, which found that Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.

California Tesla Crash (Contra Costa County Fire Protection District via AP)

Florida Tesla Crash (Florida Highway Patrol)

According to the NHTSA, “This mismatch resulted in a critical safety gap between drivers’ expectations of [Autopilot’s] operating capabilities and the system’s true capabilities. This gap led to foreseeable misuse and avoidable crashes.”

The investigation, which began in 2021, reviewed 953 crashes reported through August 30, 2023. Approximately half of these crashes lacked sufficient data, involved another vehicle at fault, occurred with Autopilot not in use, or were otherwise unrelated to the probe. The remaining 467 crashes were categorized into three groups: 211 crashes in which the Tesla struck another vehicle or obstacle despite adequate time for an attentive driver to respond, 145 crashes involving roadway departures in low-traction conditions, and 111 crashes involving roadway departures in which the driver’s inputs inadvertently disengaged Autopilot.

Although Tesla instructs drivers to pay attention to the road and keep their hands on the wheel while using Autopilot, NHTSA and other safety groups have said these warnings and checks are inadequate. In December, NHTSA stated that these measures were “insufficient to prevent misuse.” Tesla agreed to issue a recall via a software update intended to increase driver monitoring, but the update did not appear to significantly change Autopilot’s behavior.

Concurrent with the closure of the initial probe, NHTSA is opening a new investigation to evaluate the effectiveness of the Autopilot recall fix Tesla implemented in December. According to NHTSA, parts of the recall fix require owners to opt in, and Tesla allows drivers to readily reverse some of the safeguards.

Read more at TechCrunch here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
