
Tesla’s Autopilot System Fooled by “Phantom Objects”

February 5, 2020

Homeland & Cyber Security

Popular Mechanics — Researchers from Ben-Gurion University’s Cyber Security Research Center have managed to fool Tesla’s Autopilot system with a $300 trick.

An inexpensive projector that displays false speed limit signs on trees or casts a slender, human-like figure onto the road can force Tesla’s Autopilot to change its behavior, adjusting speed to match the phantom “road signs” and slowing down for what it thinks might be a pedestrian.

BGU Ph.D. student Ben Nassi, lead author of the projector paper, used a battery-operated projector and a drone to cast an image of a pedestrian onto the pavement. He wanted to see if he could create a scenario that any hacker could easily replicate without having to reveal their identity.

Nassi tested his theory against Tesla’s Autopilot, as well as the Mobileye 630 PRO, one of the most advanced driver-assistance systems on the market, used in cars such as the Mazda 3. He projected an image of a vehicle onto the street, which the Model X picked up on; created false speed limit signs, which were detected; and even projected fake lane lines that forced the Tesla to switch lanes.

These so-called “phantom objects” show that computer vision still has a long way to go before self-driving cars can truly be reliable alternatives to mass transit or personal car ownership. Accordingly, the researchers refer to their efforts as a “perceptual challenge.”

“We show how attackers can exploit this perceptual challenge to apply phantom attacks … without the need to physically approach the attack scene, by projecting a phantom via a drone equipped with a portable projector or by presenting a phantom on a hacked digital billboard that faces the internet and is located near roads,” they write in the abstract.

Nassi says phantom objects aren’t just a concern with projector methods like his own. These false positives could also be embedded in digital billboards, which are often in a car’s field of vision. The phantom image may appear for only milliseconds, yet it can still cause a car to speed up or slow down suddenly.
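To make that millisecond-flash point concrete, here is a minimal, illustrative Python sketch of a per-frame decision loop with no temporal validation. It is not Tesla’s or Mobileye’s actual pipeline; the Frame class, the detect_speed_limit stand-in, and all speed values are hypothetical. It simply shows how a phantom sign visible for just a few frames (roughly 100 milliseconds at 30 fps) is enough to change the vehicle’s target speed.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Frame:
    index: int
    phantom_speed_limit: Optional[int]  # km/h value a flashed phantom shows this frame, if any


def detect_speed_limit(frame: Frame) -> Optional[int]:
    """Stand-in for a vision model: report whatever sign value is visible in this frame."""
    return frame.phantom_speed_limit


def drive(frames) -> int:
    target_speed = 100  # current cruise set speed in km/h
    for frame in frames:
        limit = detect_speed_limit(frame)
        if limit is not None:
            # Naive per-frame reaction: a single detection is enough to change behavior.
            target_speed = limit
            print(f"frame {frame.index}: sign detected, target speed -> {limit} km/h")
    return target_speed


# A phantom "30 km/h" sign flashed on a billboard for just 3 frames (~100 ms at 30 fps).
frames = [Frame(i, 30 if 10 <= i < 13 else None) for i in range(60)]
print("final target speed:", drive(frames), "km/h")
```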

Nassi and his team refer to this inability of automated vehicles to double-check what they’re seeing as the “validation gap.” The solution is simple, the researchers posit: manufacturers of automated driving systems should be working on communication systems that help the computer vision system verify that what it sees is real, as in the sketch below.
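As one illustration of what narrowing that validation gap might look like, the following Python sketch only trusts a detected speed-limit sign once it persists across several consecutive frames and roughly agrees with an independent source such as map or vehicle-to-infrastructure data. This is a hypothetical example, not the researchers’ or any manufacturer’s actual countermeasure; the SignValidator class, the persistence window, and the map value are all assumptions.

```python
from collections import deque
from typing import Optional

PERSISTENCE_FRAMES = 15       # ~0.5 s at 30 fps before a detected sign is trusted
MAP_SPEED_LIMIT_KMH = 100     # hypothetical value from an offline map / V2I broadcast


class SignValidator:
    def __init__(self) -> None:
        self.recent = deque(maxlen=PERSISTENCE_FRAMES)

    def update(self, detected_limit: Optional[int]) -> Optional[int]:
        """Return a validated speed limit, or None if the detection is not yet trustworthy."""
        self.recent.append(detected_limit)
        if detected_limit is None or len(self.recent) < PERSISTENCE_FRAMES:
            return None
        if any(value != detected_limit for value in self.recent):
            return None  # not stable across the window -> likely a flashed phantom
        if abs(detected_limit - MAP_SPEED_LIMIT_KMH) > 30:
            return None  # wildly inconsistent with independent map data -> leave to the driver
        return detected_limit


validator = SignValidator()
# A phantom "30" flashed for 3 frames never survives validation;
# a real "90" sign held in view for 15+ frames (and plausible vs. the map) does.
stream = [30] * 3 + [None] * 10 + [90] * 20
for limit in stream:
    accepted = validator.update(limit)
    if accepted is not None:
        print("accepted speed limit:", accepted, "km/h")
```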

Until such communication systems hit the mass market, definitely keep your eyes open and alert while driving your Tesla.

Read more on the Popular Mechanics website >>