"UCI Researchers: Autonomous Vehicles Can be Tricked Into Dangerous Driving Behavior"

Researchers at the University of California, Irvine have discovered that autonomous vehicles can be tricked into an abrupt halt or other undesired driving behavior by the placement of an ordinary object on the side of the road. The researchers stated that a box, bicycle, or traffic cone may be all that is necessary to scare a driverless vehicle into a dangerous stop in the middle of the street or on a freeway off-ramp, creating a hazard for other motorists and pedestrians. Autonomous vehicles cannot distinguish between objects left on the road by accident and those placed intentionally as part of a physical denial-of-service attack; both can trigger erratic driving behavior.

The researchers focused their investigation on security vulnerabilities specific to the planning module, the part of the software that governs an autonomous driving system's decision-making: when to cruise, change lanes, slow down, and stop, among other functions. They noted that the planning module is, logically, designed with an abundance of caution, since a driverless vehicle should never roll around out of control. Their testing found, however, that the software can err on the side of being overly conservative, causing a car to become a traffic obstruction, or worse.
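
The failure mode described above lends itself to a small illustration. The sketch below is hypothetical, not the researchers' code or any production planner: it shows how a decision rule that halts for any obstacle within a fixed clearance of the planned path, the kind of over-conservative logic the researchers describe, can be triggered by a harmless static object left at the roadside. The `Obstacle` type, the `plan_action` function, and the 5-meter `SAFETY_MARGIN_M` threshold are all invented for the example.

```python
from dataclasses import dataclass

# Hypothetical clearance threshold -- not a value from the UCI study.
SAFETY_MARGIN_M = 5.0

@dataclass
class Obstacle:
    lateral_offset_m: float  # distance from the planned path's centerline
    is_static: bool          # True for a parked box, cone, or bicycle

def plan_action(obstacles: list[Obstacle]) -> str:
    """Toy decision rule for an over-conservative planner.

    It commands a full stop for ANY obstacle inside the safety margin,
    even a static object that never enters the travel lane -- the
    behavior a roadside-object attack could exploit.
    """
    for obs in obstacles:
        if abs(obs.lateral_offset_m) < SAFETY_MARGIN_M:
            return "STOP"  # hard brake, even mid-street or on an off-ramp
    return "CRUISE"

# A cardboard box 3 m off the path still triggers a full stop:
print(plan_action([Obstacle(lateral_offset_m=3.0, is_static=True)]))  # STOP
```

A less exploitable rule would also check whether the object's position and motion actually intersect the planned trajectory before committing to a stop, which is the trade-off between safety margins and denial-of-service exposure that the research highlights.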


UCI News reports: "UCI Researchers: Autonomous Vehicles Can Be Tricked Into Dangerous Driving Behavior"
