"A Laser Pointer Could Hack Your Voice-Controlled Virtual Assistant"
A team of researchers from the University of Michigan and the University of Electro-Communications in Tokyo has demonstrated that attackers can use a laser beam to trick voice-controlled virtual assistants, including Siri, Alexa, and Google Assistant, into registering light as audio commands. Using these "Light Commands," the team showed that attackers can carry out a range of malicious activities, such as unlocking smart lock-protected front doors, opening connected garage doors, making purchases on victims' e-commerce accounts, unlocking connected vehicles, and more. The researchers hijacked smart home devices, phones, and tablets using up to 60 milliwatts of laser power. They are working with Google, Apple, and Amazon to help implement hardware and software fixes that protect users from Light Commands. This article continues to discuss the vulnerability of voice assistants to Light Commands, the risks associated with such attacks, and how users can prevent these attacks.
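The core idea behind Light Commands is that the MEMS microphones in these devices respond to rapid fluctuations in light intensity much as they do to sound pressure, so a spoken command can be encoded in the brightness of a laser beam aimed at the microphone port. The Python sketch below is a minimal, conceptual illustration of that amplitude-modulation step only; the synthetic "command" waveform, sample rate, bias power, and modulation depth are illustrative assumptions, not values from the researchers' actual setup.

```python
import numpy as np

# Conceptual sketch of the signal-injection idea behind Light Commands:
# an audio command is mapped onto the intensity of a laser beam.
# All concrete values (sample rate, bias, modulation depth) are
# illustrative assumptions, not parameters from the published research.

SAMPLE_RATE = 44_100          # audio sample rate in Hz (assumed)
LASER_BIAS_MW = 30.0          # DC operating point of the laser, in mW (assumed)
MODULATION_DEPTH_MW = 25.0    # peak intensity swing around the bias (assumed)


def load_command_waveform(duration_s: float = 1.0) -> np.ndarray:
    """Stand-in for a recorded voice command, normalized to [-1, 1].

    A real attack would use a recording of the target phrase; a simple
    tone is used here so the example is self-contained.
    """
    t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    return np.sin(2 * np.pi * 440.0 * t)


def modulate_laser_intensity(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform onto a laser intensity signal (in mW).

    The microphone's diaphragm reacts to these intensity fluctuations
    as if they were sound pressure, so the assistant "hears" the
    encoded command.
    """
    intensity = LASER_BIAS_MW + MODULATION_DEPTH_MW * audio
    return np.clip(intensity, 0.0, None)  # light intensity cannot be negative


if __name__ == "__main__":
    command = load_command_waveform()
    drive_signal = modulate_laser_intensity(command)
    print(f"Drive signal spans {drive_signal.min():.1f} mW to "
          f"{drive_signal.max():.1f} mW over {len(drive_signal)} samples")
```

In practice the modulated signal would drive a laser current driver rather than being printed, but the sketch captures why a microphone that transduces light fluctuations ends up delivering a valid voice command to the assistant.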