AI tools are making it easier than ever for scammers to defraud consumers.
The FBI reports that cybercrime is on the rise, with more than 1 million complaints last year totaling $20.877 billion in losses, a 26% increase over the previous year.
“There has never been more fraud and more different types of fraud than there is now, and the primary factor is the availability and accessibility of AI tools,” says Mason Wilder, a certified fraud examiner and research director at the Association of Certified Fraud Examiners.
Think about how much easier AI makes your everyday tasks; scammers get the same boost. They’re using these tools to impersonate people you know on audio and video, and to collect data on and target potential victims far more efficiently than before.
To stay safe, fraud experts recommend the following strategies.
Treat incoming calls, emails and texts with suspicion
Scammers can use AI tools to automate outreach methods and collect personal information at rapid speed, says Eva Velasquez, CEO of the nonprofit Identity Theft Resource Center.
“They’re harvesting all of this rich information about a person which previously they had to put more time and effort into,” she says. It’s so easy that “everybody is a target.”
As a result, we need to be more vigilant. Treat all inbound texts, calls or emails — even if they appear to be from a legitimate source — with a degree of skepticism.
“If you didn’t initiate the contact, go to the source,” Velasquez says. That means calling the number you know to be correct to verify the identity of anyone reaching out to you.
Get familiar with AI video and audio capabilities
Scammers can clone voices so a phone call sounds like a relative, celebrity or government official, says Chuck Bell, programs director for advocacy at Consumer Reports.
“Many scams we see are imposter scams where they are impersonating your bank or service provider, FedEx, Amazon, a government agency like the IRS, or a tech company,” Bell says.
Scammers can use AI to alter video and audio so it appears authentic. They usually reach out to you via phone call, WhatsApp message, text or email. Bell says they might invent a story to gain access to your computer, personal information and finances.
Naivety is not your friend in the age of AI. Deepfaked audio and video can look and sound real, but if you’re aware of these new tactics, they can be easier to spot, Velasquez says.
“Most of us are wired to respond to law enforcement,” Velasquez says, which is why scammers use that identity so often.
Velasquez also encourages people to bring up these kinds of AI scams with older family members, so they know what to watch for. Some scammers will impersonate grandchildren claiming to be in trouble with the law and needing financial help immediately.
Be wary of unsolicited investing advice
In the past, scammers would connect with targets over social media or dating sites and request money. Now, Wilder says, a new method involves creating a convincing but fake investing platform.
“They convince people to put more and more money into the platform, and it looks like it’s going up and up,” Wilder says.
The scam, he explains, is enhanced at every step with AI. First, scammers use AI to automate the initial outreach to targets. Once they make contact, they use AI-augmented video or audio to gain the victim’s trust. Then, they use AI to build a convincing online platform to collect their money.
Finally, the platform disappears, along with the target’s money.
Freeze your credit and opt into alerts
Freezing your credit with the three major credit bureaus blocks scammers from opening new accounts in your name, and you can temporarily lift the freeze whenever you apply for credit yourself.
Wilder also suggests signing up for transaction alerts on your financial accounts so you get a notification every time your card is used for a purchase. That way, he says, you can quickly dispute any suspicious charges.
Given all of the leverage AI offers to scammers, consumers need all of the help they can get.