Hackers have reportedly begun using stickers that confuse Tesla’s Autopilot system, tricking the cars into driving into the wrong lane or towards oncoming traffic.
Forbes reports that Keen Labs, a Chinese cybersecurity research group regarded as one of the best in the world, has developed two different methods of confusing Tesla Autopilot’s lane-recognition technology. The first method the researchers tried was to alter the lane markings on the road, adding patches to the surface to make the markings appear blurred. This worked, but the patches were conspicuous and easily spotted by drivers, so Keen Labs concluded the attack would be too difficult to mount in the real world.
Keen Labs then attempted to create a “fake lane” after discovering that Tesla’s Autopilot would detect a lane once it tracked three small stickers placed on the ground. The researchers tested this theory on a test track by placing the stickers at an intersection, reasoning that they could trick the Tesla into treating the patches as the continuation of the right-hand lane. This proved correct: the car turned into the real left lane, which on a public road could easily have caused an accident.
Keen Labs wrote in a paper: “Our experiments proved that this architecture has security risks and reverse-lane recognition is one of the necessary functions for autonomous driving in non-closed roads. In the scene we build, if the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident.”
Keen Labs also developed methods to control Tesla’s windscreen wipers and to take control of the car’s steering using a game control pad, but both of these attacks were impractical to carry out and so did not pose a serious security threat. Keen Labs warned, however, that the fake-lane hack would be easy to implement using cheap materials.
Tesla told Forbes that the issue relating to the remote-control gamepad had been fixed before Keen Labs ever looked into it. As for the other findings: “The rest of the findings are all based on scenarios in which the physical environment around the vehicle is artificially altered to make the automatic windshield wipers or Autopilot system behave differently, which is not a realistic concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so, and can manually operate the windshield wiper settings at all times.”
Tesla has had a number of issues relating to its Autopilot system, some of which have allegedly proven fatal. Some analysts have called Tesla CEO Elon Musk’s comments about the capabilities of the company’s Autopilot system “reckless.”