Sep 7, 2017
'Dolphin' attacks fool Amazon, Google voice assistants
Voice-controlled assistants from Amazon, Apple and Google could be hijacked by ultrasonic audio commands that humans cannot hear, research suggests.

Google told the BBC it was investigating the claims presented in the research. The attack would not work on systems trained to respond to only one person's voice, a feature Google offers on its assistant. Apple and Google both allow their "wake words" to be switched off so the assistants cannot be activated without permission.

A Google spokesman said: "We take user privacy and security very seriously at Google, and we're reviewing the claims made."
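The underlying technique (dubbed "DolphinAttack" by the researchers) is amplitude modulation: a spoken command is shifted onto an ultrasonic carrier above roughly 20 kHz, the upper limit of human hearing, and nonlinearity in the assistant's microphone hardware demodulates it back into the audible band. A minimal sketch of that modulation step, using a pure tone as a stand-in for a recorded voice command and an illustrative 25 kHz carrier (the specific parameters are assumptions, not values from the article):

```python
import numpy as np

def modulate_ultrasonic(voice, fs, carrier_hz=25_000):
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic
    carrier. Microphone nonlinearity can demodulate the result back to
    audible speech, while humans hear nothing. The 25 kHz carrier is an
    illustrative choice for this sketch."""
    t = np.arange(len(voice)) / fs
    # Classic AM: (1 + m(t)) * cos(2*pi*fc*t)
    return (1.0 + voice) * np.cos(2 * np.pi * carrier_hz * t)

# Toy 'voice': a 1 kHz tone standing in for speech, sampled at 96 kHz
# so the ultrasonic carrier can be represented without aliasing.
fs = 96_000
t = np.arange(fs) / fs                      # one second of samples
voice = 0.5 * np.sin(2 * np.pi * 1_000 * t)
signal = modulate_ultrasonic(voice, fs)

# All spectral energy now sits near 24-26 kHz, above human hearing.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
peak_hz = freqs[np.argmax(spectrum)]
print(int(peak_hz))
```

In the published attack the demodulation happens in the target device itself, so the inaudible broadcast is transcribed as an ordinary voice command; per-user voice matching, as the article notes, blocks it.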