How confident are you that your virtual assistant is only listening to your commands? Think there's any chance Siri or Google Now could be duped into performing actions on iPhones and Android handsets that you haven't actually asked for? It turns out security researchers have found one more way to compromise iPhones and Android phones, either to spy on users or to trigger secondary malware activity.
According to Wired, researchers at ANSSI, a French government agency working on information security, have discovered they can beam radio commands to Siri and Google Now without the smartphone owner noticing anything. The virtual assistants could then be fooled into performing a range of actions, such as opening web pages that might deploy additional malware, texting or calling premium phone numbers that would generate income for hackers, calling a third party so that the user can be eavesdropped on, or sending spam and phishing messages via email, Facebook or Twitter.
An iPhone or Android user would notice the attack only by looking at the screen, or after the fact, if any traces were left behind. Even the risk of discovery wouldn't deter attackers targeting large groups of users in crowded places such as bars, where phones usually sit in pockets or purses.
“The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts,” researchers José Lopes Esteves and Chaouki Kasmi wrote in a paper published by the IEEE. Or as Vincent Strubel, the director of their research group at ANSSI, puts it more simply, “The sky is the limit here. Everything you can do through the voice interface you can do remotely and discreetly through electromagnetic waves.”
There is one condition that has to be met for the hack to work: the iPhone or Android device has to have headphones with a microphone plugged in. The attack turns the headphone cord into an ad-hoc antenna: electromagnetic waves sent by the attackers induce electrical signals on the cable, which the phone's audio input picks up as if they were voice coming from the microphone, and which Siri or Google Now then interprets as genuine commands.
The electromagnetic waves are generated with the help of a computer running open-source GNU Radio, a USRP software-defined radio, an amplifier and an antenna. The whole setup could be assembled inside a backpack and would have a range of about six and a half feet. A bigger antenna placed in a van would extend the range to more than 16 feet.
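For the curious, here is a minimal sketch of what the transmit side of such a setup could look like in GNU Radio's Python API: a recorded voice command is amplitude-modulated and handed to a USRP sink, which upconverts it to a radio carrier. To be clear, this is only a conceptual outline based on the tools the researchers named; the carrier frequency, gain, modulation index, file name and class name below are illustrative assumptions, not values from the ANSSI paper.

```python
#!/usr/bin/env python3
# Conceptual sketch of an AM transmit flowgraph of the kind described above.
# Block names follow GNU Radio's Python API; the carrier frequency (103 MHz),
# gain, modulation index and "command.wav" file are illustrative assumptions,
# not the parameters used by the ANSSI researchers. Transmitting on these
# frequencies requires a license; treat this as an outline only.
from gnuradio import gr, blocks, uhd

class AmVoiceTx(gr.top_block):  # hypothetical class name
    def __init__(self, wav_file="command.wav", carrier_hz=103.0e6):
        gr.top_block.__init__(self, "AM voice transmitter (sketch)")

        samp_rate = 48000  # audio rate of the WAV; a real USRP may need resampling

        # The spoken command ("Hey Siri ...", "OK Google ...") as a mono WAV file.
        src = blocks.wavfile_source(wav_file, repeat=False)

        # Classic AM: s(t) = 1 + m * a(t), with modulation index m = 0.8.
        scale = blocks.multiply_const_ff(0.8)
        add_dc = blocks.add_const_ff(1.0)

        # The USRP expects complex baseband samples; the real AM envelope
        # becomes the in-phase component.
        to_complex = blocks.float_to_complex()

        # Software-defined radio sink: the hardware upconverts the baseband
        # signal to the tuned carrier; the amplifier and antenna mentioned
        # in the article would sit after this stage.
        sink = uhd.usrp_sink("", uhd.stream_args(cpu_format="fc32", channels=[0]))
        sink.set_samp_rate(samp_rate)
        sink.set_center_freq(carrier_hz)
        sink.set_gain(60)

        self.connect(src, scale, add_dc, to_complex, sink)

if __name__ == "__main__":
    tb = AmVoiceTx()
    tb.run()
```

The flowgraph itself is textbook AM transmission; the hard part of the real attack is the coupling, since the signal has to be strong enough to induce an audible parasitic voltage on the headphone cord's microphone line, which is what limits the range to a few feet.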
Obviously, if Siri or Google Now is disabled, the hack won't work.