Self-Driving Disaster: Elon Musk’s Tesla ‘Autopilot’ System Linked to 736 Crashes and 17 Fatalities Since 2019

Elon Musk watches SpaceX launch (Joe Raedle/Getty)

Tesla’s Autopilot system has been linked to 736 crashes and 17 fatalities in the United States since 2019, raising serious concerns about the safety of Elon Musk’s driver-assistance technology.

The Washington Post reports that Tesla’s Autopilot system has been linked to 736 crashes and 17 fatalities in the U.S. since 2019, prompting serious questions about the safety of the company’s driver-assistance technology.


Elon Musk’s Halloween costume (Taylor Hill /Getty)

Florida Tesla Crash (Florida Highway Patrol)


The data reveals a significant rise in accidents and fatalities linked to Autopilot over the past four years, fueling debate about the safety of autonomous driving technology. Of the 17 fatal incidents associated with Autopilot, 11 have occurred since May 2022, a sharp increase from the three deaths documented as of June 2022.

In one serious incident, 17-year-old Tillman Mitchell was struck by a Tesla Model Y, allegedly operating in Autopilot mode, as he stepped off a school bus. The car never slowed down and hit Mitchell at 45 mph. According to Mitchell’s great-aunt, Dorothy Lynch, the teenager was thrown into the windshield, flew into the air, and landed face down in the road. Mitchell survived the crash but was left with memory issues, a broken leg, and a fractured neck.

“If it had been a smaller child,” Lynch said, “the child would be dead.” Lynch expressed her concern about the technology, saying, “I pray that this is a learning process. People are too trusting when it comes to a piece of machinery.”

Despite the mounting concerns, Tesla CEO Elon Musk continues to defend the Autopilot system, arguing that Autopilot-equipped vehicles are safer than those operated solely by human drivers. “At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people,” Musk said last year. “Because the people whose lives you saved don’t know that their lives were saved. And the people who do occasionally die or get injured, they definitely know — or their state does.”

Critics counter that the data clearly demonstrates the technology’s shortcomings. Missy Cummings, a professor at George Mason University’s College of Engineering and Computing and a former senior safety adviser at the National Highway Traffic Safety Administration, claimed that “Tesla is having more severe — and fatal — crashes than people in a normal data set.”

The data also calls into question Tesla’s decision to remove radar sensors from new vehicles and disable them in vehicles already on the road, a choice that some experts say may have contributed to the rise in incidents.

As the debate continues, some are calling for a ban on automated driving. Lynch, reflecting on her great-nephew’s accident, said, “I think they need to ban automated driving. I think it should be banned.”

Read more at the Washington Post here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan

