Security researchers have demonstrated how hijacked billboards could be used to confuse self-driving cars -- forcing them to slam on the brakes, or worse.
Autonomous driving systems have come on in leaps and bounds in recent years, but not without mistakes, confusion, and accidents along the way.
Vehicle intelligence has a long way to go before it can be considered fully autonomous and safe to use without the supervision of a human driver. As technology firms continue to refine their platforms, the focus tends to be on weather conditions, mapping, and how cars should respond to hazardous objects -- such as people in the road or other cars.
See also: Tesla's Elon Musk: Some 'expert, careful' drivers get beta Full Self-Driving next week[1]
However, as reported by Wired[2], there may be other hazards that humans cannot detect with the naked eye.
New research conducted by academics from Israel's Ben Gurion University of the Negev suggests that so-called "phantom" images -- such as a stop sign created from flickering lights on an electronic billboard -- could confuse AI systems and trigger unintended responses, such as braking or swerving.
This could cause not only traffic jams but also more serious road accidents, with hackers leaving little evidence of their activities -- and drivers left perplexed as to why their smart vehicle suddenly changed its behavior.
CNET: Tesla Model S price drops to $69,420, seven-seat Model Y coming soon[3]
Phantom images displayed for only a few frames on an electronic billboard could cause cars to "brake or swerve," security researcher Yisroel Mirsky told the publication, adding, "so somebody's car will just react, and they won't understand why."
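To illustrate why a phantom visible for only a fraction of a second can be enough, the sketch below (hypothetical code, not taken from the research or any vendor's software) models a naive frame-by-frame pipeline that reacts to the first frame in which a stop sign is detected; the detector and the frame labels are stand-ins for a real camera feed.

```python
# A minimal sketch of why a split-second phantom can trigger a reaction:
# a pipeline that acts on any single detection has no notion of how long
# an object has actually been visible. The detector here is a stub, and
# string labels stand in for camera frames.

from typing import Optional

def detect_sign(frame: str) -> Optional[str]:
    """Stand-in for a per-frame object detector (hypothetical)."""
    return "stop_sign" if frame == "phantom_stop_sign" else None

def naive_controller(frames: list[str]) -> None:
    """Reacts the moment any single frame contains a stop sign."""
    for i, frame in enumerate(frames):
        if detect_sign(frame) == "stop_sign":
            print(f"frame {i}: stop sign detected -> apply brakes")
            return
    print("no reaction")

# A billboard injects a phantom for just two frames -- tens of milliseconds,
# too brief for a human to notice, yet enough for this naive pipeline.
video = ["road"] * 30 + ["phantom_stop_sign"] * 2 + ["road"] * 30
naive_controller(video)   # -> frame 30: stop sign detected -> apply brakes
```

A pipeline that required a detection to persist across many consecutive frames before acting would ignore this particular phantom, though that is a design trade-off rather than anything described in the article.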
Tests were performed on a vehicle running the latest version of Tesla's Autopilot, as well as on a Mobileye system. According