Regulators Want ‘Suicide’ Function for Tesla

AP Photo/Ng Han Guan

Tesla Motors stock jumped $5 at the opening on January 11 in response to an update to its hands-free driving software, but then plunged $10 on ethical concerns that drivers would do crazy things with it.

In October, Tesla released a wireless update called “Autopilot” to its Version 7.0 vehicle operating software. The functionality was advertised as allowing properly equipped cars to steer, switch lanes, and manage speed on their own–to “assist drivers.” Techies loved it, and urban planners claimed it would make driving safer and more efficient, eventually cutting freeway congestion by over 30 percent.

But government transportation regulators started panicking when YouTube videos showed Tesla drivers pulling stunts that could cause mass fatalities–like setting the car on Autopilot, then climbing into the back seat to play video games.

A Model S driver in one video admitted ignoring audio warnings until the vehicle automatically swerved over the double-yellow dividing lines toward an oncoming vehicle. He said, “Had I not reacted quickly to jerk the steering wheel in the opposite direction, a devastating head-on collision would have occurred.”

Tesla’s Autopilot system uses a combination of forward radar, a forward-looking camera, 12 long-range ultrasonic sensors, and fast processors. It is advertised to be able to handle straight-ahead, predictable highway driving.

But Tesla’s marketing effort, by using the name “Autopilot,” is clearly equating its functionality with the type of capability that Google is trying to achieve with completely autonomous vehicles that can drive themselves in all situations, even without passengers.

Regulators are concerned that there are huge ethical questions about Autopilot’s algorithmic decision-making in a crisis. A recent MIT Technology Review article titled “Why Self-Driving Cars Must Be Programmed to Kill” warned that Autopilot, in a potentially deadly crash involving a single Model S driver and a car carrying a mother and small children, would have to face a “greater good” choice: should the $100,000 Model S sacrifice its own driver to spare the children, or sacrifice the kids for the Tesla “client”?
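The dilemma can be reduced to a toy utilitarian calculation. The sketch below is purely illustrative–the cost rule, the `Outcome` structure, and the `choose_maneuver` function are invented for this example and do not reflect any actual Tesla software:

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    """Hypothetical result of one evasive maneuver."""
    label: str
    occupants_at_risk: int   # people inside the Model S
    others_at_risk: int      # people in the other vehicle

def expected_harm(o: Outcome) -> int:
    # A purely utilitarian rule: every life counts equally,
    # whether the Tesla "client" or a bystander.
    return o.occupants_at_risk + o.others_at_risk

def choose_maneuver(options: list[Outcome]) -> Outcome:
    # Pick whichever maneuver minimizes total expected harm --
    # the "greater good" choice the article warns about.
    return min(options, key=expected_harm)

# The scenario described above: one driver vs. a mother and two children.
swerve = Outcome("swerve off road, sacrifice driver",
                 occupants_at_risk=1, others_at_risk=0)
brake = Outcome("brake straight, hit oncoming car",
                occupants_at_risk=0, others_at_risk=3)
print(choose_maneuver([swerve, brake]).label)  # → swerve off road, sacrifice driver
```

Under this rule the car sacrifices its own driver–which is precisely why regulators, and presumably Tesla’s customers, find the question so uncomfortable.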

Since 2013, Mercedes-Benz S-Class and E-Class sedans have featured “Distronic Plus,” a suite of assisted-driving technologies that closely resembles the Tesla Autopilot system, but without automatic steering. Mercedes commonly refers to the technology as “assistive,” but never uses words like “auto,” “automatic,” or “autonomous.”

Many Tesla drivers fail to understand that the car is not meant to operate on Autopilot like a helpful smartphone app. Model S drivers must recognize that the car can slip from positive lane tracking into a gray area when it is confused or does not know where it is going.

The Tesla users’ website has a number of comments noting that Autopilot requires frequent small corrections to maintain lane position and address other problems. The biggest issue is that the car defaults to lane center–even when a wide vehicle is in the adjacent lane or hugging the lane line on corners.

Although Tesla Chief Executive Elon Musk said at last week’s Consumer Electronics Show in Las Vegas that he was not aware of any accidents caused by the earlier version of the software, he did want to minimize the possibility of people doing “crazy things.”
