
Google Maps security report shows us how Google stops sneaky scams

Published Mar 31st, 2023 11:00AM EDT
Google Maps reviews | Image: Google


Navigation is easily one of the most common ways smartphones are used, with Google Maps being a fan favorite. In addition to providing directions to most destinations, Google Maps lets you discover and explore your neighborhood or your next travel destination. These features make Google Maps an obvious target for malicious individuals looking to scam unsuspecting users.

Thankfully, Google has both manual and advanced automated defenses that prevent Google Maps from delivering potentially harmful content. The company just recapped the actions it took to fight fake content on Google Maps last year, revealing the massive scope of Google Maps attacks that malicious individuals attempt.

It’s easy to understand why Google Maps is such a target for scammers. The app supports user-generated content, like photos of places and business information. Fraudsters can use these avenues to upload content that looks legitimate, tricking users into tapping malicious links or calling bogus phone numbers.

Thankfully, Google is watching and preventing many of these malicious attempts from going through. The figures for last year are impressive:

  • 115 million policy-violating reviews blocked – 20% more than in 2021
  • 200 million photos and 7 million videos blocked – the content was blurry, low-quality, or violated policies
  • 20 million attempts to create fake Business Profiles blocked – 8 million more than in 2021
  • 185,000 businesses protected after suspicious activity detection

Google also detailed advanced malicious Google Maps campaigns that machine learning (ML) intelligence helped discover:

Last year, we launched a significant update to our machine learning models that helped us identify novel abuse trends many times faster than previous years. For example, our automated systems detected a sudden uptick in Business Profiles with websites that ended in .design or .top, something that would be difficult to spot manually across millions of profiles. Our team of analysts quickly confirmed that these websites were fake, and we were able to remove them and disable the associated accounts quickly.
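Google hasn’t published how its models actually work, but the kind of pattern it describes can be illustrated with a simple frequency check. The Python sketch below flags top-level domains whose share of newly created Business Profile websites suddenly jumps versus a historical baseline; the function names, thresholds, and data sources are all hypothetical, not Google’s pipeline.

```python
from collections import Counter
from urllib.parse import urlparse

def tld_of(url: str) -> str:
    # Pull the host out of the URL, tolerating inputs with no scheme.
    host = urlparse(url).netloc or urlparse("//" + url).netloc
    return host.rsplit(".", 1)[-1].lower() if "." in host else ""

def flag_tld_spikes(recent_urls, baseline_urls, ratio_threshold=5.0, min_count=50):
    """Flag TLDs whose share of new Business Profile websites jumped sharply
    versus a historical baseline (hypothetical thresholds)."""
    recent = Counter(tld_of(u) for u in recent_urls)
    baseline = Counter(tld_of(u) for u in baseline_urls)
    total_recent = sum(recent.values()) or 1
    total_baseline = sum(baseline.values()) or 1
    flagged = []
    for tld, count in recent.items():
        recent_share = count / total_recent
        baseline_share = (baseline.get(tld, 0) + 1) / total_baseline  # +1 smoothing
        if count >= min_count and recent_share / baseline_share >= ratio_threshold:
            flagged.append((tld, count, round(recent_share / baseline_share, 1)))
    return flagged

# Usage (hypothetical data): flag_tld_spikes(new_profile_sites, last_quarter_sites)
# might return [("top", 1200, 37.4)], which would then go to human analysts for review.
```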

Similarly, Google caught fraudulent imagery with the help of machine learning:

In some places, scammers started overlaying inaccurate phone numbers on top of contributed photos, hoping to trick unsuspecting victims into calling the fraudster instead of the actual business. To combat this issue, we deployed a new ML model that could recognize numbers overlaid on contributed images by analyzing specific visual details and the layouts of photos. With this model, we successfully detected and blocked the vast majority of these fraudulent and policy-violating images before they were published.
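Again purely as an illustration, one way such a check could work is to run OCR over contributed photos and compare any phone-number-like text against the business’s listed number. The sketch below uses the open-source pytesseract library as a stand-in for Google’s internal model; the function, inputs, and regex are assumptions, not Google’s actual system.

```python
import re
from PIL import Image    # pillow
import pytesseract       # open-source OCR, standing in for Google's internal model

# Loose pattern for phone-number-like strings (hypothetical; real systems are locale-aware).
PHONE_RE = re.compile(r"\+?\d[\d\-\s().]{7,}\d")

def digits_only(s: str) -> str:
    return re.sub(r"\D", "", s)

def find_overlaid_numbers(image_path: str, listed_phone: str) -> list:
    """Return phone-like strings OCR'd from a contributed photo that don't match
    the business's listed number (a hypothetical pre-publication check)."""
    text = pytesseract.image_to_string(Image.open(image_path))
    suspects = [m.group(0).strip() for m in PHONE_RE.finditer(text)]
    return [s for s in suspects if digits_only(s) != digits_only(listed_phone)]

# Usage: find_overlaid_numbers("contributed_photo.jpg", "+1 415 555 0100")
# Any mismatching number would be a signal to hold the photo for review.
```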

Finally, Google says it took a group of Google Maps fraudsters to court last year. The group was impersonating Google and attempting to sell fake Google Maps reviews. The company also says it has been sharing insights about deceptive activity on Google Maps with the FTC and government bodies in other countries.

That doesn’t mean Google catches all fraudulent activity on Google Maps, so you should still pay attention to what you tap on in the app. But Google does a great job protecting Google Maps users, and these yearly recaps are a reminder of how easy it is to take that security for granted: a lot of work happens behind the scenes to keep users safe.

Chris Smith Senior Writer

Chris Smith has been covering consumer electronics ever since the iPhone revolutionized the industry in 2008. When he’s not writing about the most recent tech news for BGR, he brings his entertainment expertise to Marvel’s Cinematic Universe and other blockbuster franchises.

Outside of work, you’ll catch him streaming almost every new movie and TV show release as soon as it's available.