Police: Drunk Driver Asleep at Wheel as Tesla Autopilot Cruises at 70MPH

LUCAS NOLAN

A drunk driver recently slept behind the wheel of his Tesla Model S traveling 70 miles per hour on the highway, forcing police to use innovative tactics to stop the car.

TechSpot reports that police spotted a driver in a gray Tesla Model S on Highway 101 traveling at approximately 70 miles per hour. Because he was breaking the speed limit, police attempted to pull him over, then realized that the man appeared to be asleep behind the wheel of his vehicle, with the Tesla in Autopilot mode.

The police pulled up behind the vehicle and activated their lights and sirens in an attempt to have him pull over, but he was unresponsive, according to California Highway Patrol Public Information Officer Art Montiel. After another police car drove in front of the Tesla and slowed down, the car came to a stop on the highway.

It then took officers some time to wake the driver, who was taken to a local gas station, where he failed a sobriety test and was arrested on suspicion of driving under the influence of alcohol. Tesla's Autopilot system is designed to warn drivers if they do not have their hands on the wheel while the mode is engaged, eventually slowing and stopping the car if the driver does not respond. In this case, however, the driver appeared to have a hand resting on the bottom of the wheel, allowing the car to travel approximately seven miles while he slept.

Tesla has stated on multiple occasions that its Autopilot system is designed to be used on highways with a center divider and clear lane markings, and that the mode does not turn a Tesla into a self-driving vehicle. However, there have been multiple accidents involving the feature, often caused by drivers taking the "Autopilot" name literally and believing the car can drive itself. In practice, the feature is closer to an advanced cruise control than an autonomous driving solution.

In January, a Tesla vehicle in autopilot mode crashed into a fire engine. The driver of the car said that he had the autopilot system engaged when he hit the fire engine traveling at approximately 65 miles per hour. The fire engine was reportedly on the shoulder of the road while assisting at the scene of another accident. The Culver City Firefighters Twitter account tweeted out photos of the crash, stating “Amazingly there were no injuries.”

In May, another Tesla vehicle hit a police car. "The officer was not in the police SUV at the time but handling a call for service about 1,000 yards away," KTLA reported, adding, "The Tesla driver, a 65-year-old man, sustained minor injuries, but declined transport to a hospital, the sergeant said."

However, not everyone involved in a Tesla Autopilot crash has been lucky enough to escape with minor injuries. In April, the company blamed the driver of one of its vehicles for his Autopilot-related death. In a statement provided to Business Insider, Tesla said:

According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.

The spokesperson added that “the fundamental premise of both moral and legal liability is a broken promise, and there was none here.” The spokesperson further stated that Tesla was “extremely clear that Autopilot requires the driver to be alert and have hands on the wheel” and enforced that warning “every single time Autopilot is engaged.” The spokesperson further added:

We empathize with Mr. Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive (emphasis added).

The sleeping driver should count himself lucky that, in his case, the Tesla was stopped before it crashed.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or email him at lnolan@breitbart.com