Hackers find a sneaky new way to attack your phone through Siri and Google Now

October 14th, 2015 at 5:35 PM

How confident are you that your virtual assistant only listens to your commands? Is there any chance Siri or Google Now could be duped into performing actions on iPhones and Android handsets that you haven't actually asked for? As it turns out, some clever hackers have found one more way to compromise the security of iPhones and Android devices, either to spy on users or to initiate secondary malware activity.


According to Wired, researchers at ANSSI, a French government agency working on information security, have discovered they can beam radio commands to Siri and Google Now without the smartphone owner noticing anything. The virtual assistants could then be fooled into performing various actions, such as opening web pages that might deploy additional malware, texting or calling premium phone numbers that would generate income for hackers, calling a third party so that the user can be spied upon, or sending spam and phishing messages via email, Facebook or Twitter.

An iPhone or Android user would notice the attack only by looking at the screen – or after the fact, if any traces are left behind. Even if it risks discovery, the hack would still be useful to attackers targeting large groups of users in crowded areas such as bars, where phones are usually kept in pockets or purses.

“The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts,” researchers José Lopes Esteves and Chaouki Kasmi wrote in a paper published by the IEEE. Or as Vincent Strubel, the director of their research group at ANSSI puts it more simply, “The sky is the limit here. Everything you can do through the voice interface you can do remotely and discreetly through electromagnetic waves.”

There is one condition that has to be met for the hack to work: the iPhone or Android device has to have its headphones (with microphone) plugged in. The headphone cable acts as an ad-hoc antenna, picking up the electromagnetic waves sent by the attackers and feeding them into the microphone input as electrical signals, which Siri or Google Now then interprets as genuine voice commands.
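To see why a dangling headphone cable makes a plausible antenna, here is a rough back-of-the-envelope sketch; the cable length and the quarter-wave assumption are illustrative, not figures from the article:

```python
# Back-of-the-envelope illustration (not from the article): a wire of
# length L behaves as a reasonable quarter-wave receiving antenna at the
# frequency whose wavelength is 4 * L.

C = 299_792_458  # speed of light, m/s

def quarter_wave_frequency_hz(cable_length_m: float) -> float:
    """Frequency (Hz) at which a wire of the given length is a quarter-wave antenna."""
    wavelength_m = 4 * cable_length_m
    return C / wavelength_m

# A typical ~1 m headphone cable resonates near the FM broadcast band.
freq = quarter_wave_frequency_hz(1.0)
print(f"{freq / 1e6:.1f} MHz")  # prints "74.9 MHz"
```

In other words, an ordinary headphone cable is already close to the right length to pick up VHF-range radio energy without any modification.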

The electromagnetic waves are generated with the help of a computer running open-source GNU Radio, a USRP software-defined radio, an amplifier and an antenna. The whole setup could be assembled inside a backpack and would have a range of about six and a half feet. A bigger antenna placed in a van would extend the range to more than 16 feet.
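At its core, the software-defined-radio setup described above performs frequency modulation of an audio signal in software before the USRP transmits it. The sketch below shows that basic operation with NumPy; the sample rate, carrier, deviation, and helper function are illustrative assumptions, not details from the researchers' setup:

```python
import numpy as np

# Illustrative FM-modulation sketch (assumed parameters, not from the
# paper): a baseband "voice" signal is modulated onto a carrier, the
# same basic operation a GNU Radio + USRP transmit chain performs.

FS = 1_000_000        # sample rate, Hz (assumption for this sketch)
CARRIER_HZ = 100_000  # carrier frequency, kept low so it is representable at FS
DEVIATION_HZ = 5_000  # peak frequency deviation

def fm_modulate(baseband: np.ndarray, fs: float = FS) -> np.ndarray:
    """Return a real-valued FM signal carrying `baseband` (values in [-1, 1])."""
    t = np.arange(len(baseband)) / fs
    # Instantaneous phase = carrier phase + integral of the baseband signal.
    phase = (2 * np.pi * CARRIER_HZ * t
             + 2 * np.pi * DEVIATION_HZ * np.cumsum(baseband) / fs)
    return np.cos(phase)

# Stand-in for a recorded voice command: a 440 Hz tone, 10 ms long.
n = FS // 100
audio = np.sin(2 * np.pi * 440 * np.arange(n) / FS)
tx = fm_modulate(audio)
```

In a real GNU Radio flowgraph the modulated samples would be handed to the USRP driver for amplification and transmission; here `tx` simply holds the samples in memory.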

Obviously, if Siri or Google Now are disabled, the hack won’t work.

Chris Smith started writing about gadgets as a hobby, and before he knew it he was sharing his views on tech stuff with readers around the world. Whenever he's not writing about gadgets he miserably fails to stay away from them, although he desperately tries. But that's not necessarily a bad thing.