
3 AI scams to watch out for in 2024

Published Jan 8th, 2024 3:46PM EST
Image: ChatGPT photo illustration (Rafael Henrique/SOPA Images/LightRocket via Getty Images)

In 2024, we’ll find out whether artificial intelligence is just a passing craze or truly the future of computing. In the meantime, while there are plenty of legitimate and intriguing use cases for AI, the technology will also be central to countless scams in the coming months. We are all curious about AI, and scammers are going to use that to their advantage.

On Monday, anti-fraud experts from Scams.info shared a list of three AI scams that we should all be on the lookout for in 2024. As a general rule of thumb, if it sounds too good to be true, it likely is, but these specific cons should be on your radar as we enter the new year.

1. Investment fraud AI scams

Companies such as Google, Microsoft, and OpenAI have already poured millions of dollars into AI and will invest even more this year. Scammers will use this fact as a springboard to convince you to invest in shady opportunities. If someone on social media tries to convince you that AI is going to boost your ROI on an investment, think twice before opening your wallet.

“Watch out for investments that promise high returns with little risk and be sure to do comprehensive research before handing over money,” warns Nicholas Crouch at Scams.info. “Budding investors should also be aware of opportunities that ask you to recruit new investors; these often operate as Ponzi or pyramid schemes that while benefiting those at the top of the pyramid, very rarely benefit others involved.”

2. Impersonating loved ones

A common tactic scammers have been using for years is posing as a friend or family member and asking for money. These calls aren’t especially convincing when the voice on the other end of the line sounds nothing like your nephew or your grandmother, but AI could be a game-changer for this scam. It would take real effort, but scammers could potentially use AI to replicate the voice of a loved one. All they’d need is a YouTube video or Facebook post containing the person’s voice to train the AI. Would you be able to tell the difference on a phone call?

“It’s vital that people protect their social media accounts to prevent scammers having access to recordings of your voice and details of your wider family,” says Crouch.

3. Bypassing security with voice

Some banks use voice identification to verify customers when they bank over the phone. For the same reasons discussed above, this is suddenly far less secure than it was a few years ago. If you post videos or audio clips of yourself anywhere on the internet, a bad actor can use those clips to clone your voice. As Crouch notes, banks also use other data to verify your identity, but this brings scammers one step closer to breaking into your bank account.

AI could fundamentally change our lives and the way we interact with our devices. It’s also the latest tool that hackers and scammers will be using to take advantage of us. Be smart, stay alert, and always do your research before engaging with AI.

Jacob Siegal, Associate Editor

Jacob Siegal is Associate Editor at BGR, having joined the news team in 2013. He has over a decade of professional writing and editing experience, and helps to lead our technology and entertainment product launch and movie release coverage.