
Oh dear. With fully self-driving cars becoming more of a reality, stuff like this scares me. A lot.
Researchers at cyber-security firm Keen Lab have managed to make a Tesla’s self-driving feature swerve off course and into the wrong lane, using just a few stickers they placed on the road. If you’re among the 79% of the population who never want to get in a self-driving car, you’ve got nothing to worry about. And if you DO happen to own a Tesla, Elon Musk is offering new Model 3s as a reward for finding bugs like these.

In a paper published last week, the researchers first tried to throw the Tesla off course by blurring out the markings on the left lane. The system handled this easily, and the team concluded that “it is difficult for an attacker to deploy some unobtrusive markings in the physical world to disable the lane recognition function of a moving Tesla vehicle.”
However, the researchers then went on to conduct a “fake lane attack,” with much better (depending on your perspective) results. By placing a mere three stickers on the road, they were able to trick the Tesla into moving into the opposite lane, and potentially into oncoming traffic.
As the researchers put it in the paper: “Misleading the autopilot vehicle to the wrong direction with some patches made by a malicious attacker… is more dangerous than making it fail to recognize the lane. Tesla autopilot module’s lane recognition function has a good robustness in an ordinary external environment (no strong light, rain, snow, sand, and dust interference), but it still doesn’t handle the situation correctly in our test scenario.”
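To see why a few stickers can drag a lane estimate sideways, here’s a deliberately simplified toy sketch. This is NOT Tesla’s actual vision pipeline (which is proprietary and far more sophisticated); it’s just a naive lane estimator that averages the positions of bright “lane marking” pixels, showing how a handful of carefully placed bright patches can pull the estimated lane centre toward them.

```python
import numpy as np

def detect_lane_center(img, thresh=0.5):
    """Toy lane detector: estimate the lane centre as the mean
    column of all bright (lane-marking-like) pixels."""
    ys, xs = np.nonzero(img > thresh)
    return xs.mean()

# A 100x100 "road" image with one bright lane marking at column 50.
img = np.zeros((100, 100))
img[:, 50] = 1.0
baseline = detect_lane_center(img)  # centre sits on the real marking

# Three small bright "stickers" placed well to the right of the lane.
attacked = img.copy()
for y in (20, 50, 80):
    attacked[y, 90] = 1.0

shifted = detect_lane_center(attacked)
# Just three extra pixels are enough to bias the estimate rightward,
# which a naive controller would steer toward.
print(baseline, shifted)
```

A real perception stack would fit lane curves with far more robust methods, but the underlying failure mode the researchers exploited is the same in spirit: the system treats attacker-placed marks as legitimate lane evidence.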
What’s so worrying about this is that anyone, and I mean anyone, can deploy this attack. You don’t even need to connect to the vehicle. As the paper notes, “This kind of attack is simple to deploy, and the materials are easy to obtain.”