"GhostStripe Attack Haunts Self-Driving Cars by Making Them Ignore Road Signs"

A team of researchers at Singapore-based universities has demonstrated that autonomous vehicles relying on camera-based computer vision can be tricked into missing road signs. The technique, dubbed "GhostStripe," is undetectable to the human eye and could affect Tesla and Baidu Apollo vehicles, as it exploits the CMOS camera sensors used by both brands. The method uses LEDs to shine rapidly flickering light patterns on road signs; the sensor's rolling-shutter readout captures the flicker as stripes across the sign, causing the cars' self-driving software to misread or fail to recognize the signs. It is essentially a classic adversarial attack on Machine Learning (ML) software. This article continues to discuss findings regarding the potential GhostStripe attack.
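The rolling-shutter effect at the heart of the attack is easy to see in a simulation. The sketch below (Python/NumPy) models a sensor that reads out rows one at a time while an LED flickers far faster than the eye can follow, so the captured frame ends up striped even though the light looks steady to a human observer. All values here (row readout time, flicker rate, the LED color schedule) are illustrative assumptions, not figures from the researchers' paper, and the sketch only shows why stripes appear, not the optimization the attack uses to pick stripe colors that fool a specific classifier.

```python
import numpy as np

# Simulated rolling-shutter capture of a scene lit by a rapidly
# flickering RGB LED. Rows are read out sequentially, so each row
# samples the LED at a slightly different time; flicker that averages
# out to the human eye shows up as colored stripes in the frame.
# All parameter values are illustrative assumptions, not figures
# from the GhostStripe research.

ROWS, COLS = 480, 640          # frame size
ROW_READOUT_S = 30e-6          # time to read out one sensor row (assumed)
FLICKER_HZ = 2000              # LED flicker rate, far above human flicker fusion

# A plain gray "road sign" scene before any attack.
scene = np.full((ROWS, COLS, 3), 0.6)

def led_color(t: float) -> np.ndarray:
    """Color of the flickering LED at time t: cycles R -> G -> B."""
    phase = (t * FLICKER_HZ) % 1.0
    if phase < 1 / 3:
        return np.array([1.0, 0.2, 0.2])
    elif phase < 2 / 3:
        return np.array([0.2, 1.0, 0.2])
    return np.array([0.2, 0.2, 1.0])

def rolling_shutter_capture(scene: np.ndarray) -> np.ndarray:
    """Expose each row at its own readout time, so the frame records
    the LED's momentary color as horizontal stripes."""
    frame = np.empty_like(scene)
    for row in range(ROWS):
        t = row * ROW_READOUT_S
        # Scene modulated by whatever color the LED shows at this row's readout time.
        frame[row] = scene[row] * led_color(t)
    return frame

frame = rolling_shutter_capture(scene)

# Nearby rows land in different flicker phases and come out sharply
# different in color, even though a human eye, integrating over many
# flicker periods, would perceive a steady light.
print("row  0 mean RGB:", frame[0].mean(axis=0).round(2))   # red phase
print("row  6 mean RGB:", frame[6].mean(axis=0).round(2))   # green phase
print("row 12 mean RGB:", frame[12].mean(axis=0).round(2))  # blue phase
```

With these assumed timings, one flicker period spans roughly 17 rows, so the stripes are a few pixels tall; in the reported attack the flicker would be tuned so the stripes land on the sign in a pattern the vehicle's sign classifier misreads.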

The Register reports "GhostStripe Attack Haunts Self-Driving Cars by Making Them Ignore Road Signs"

Submitted by grigby1 CPVI