Don't Believe Your Ears: How to Spot an AI Voice Scam

FindMyIP Team · 9 min read

The phone rang at 11:17 on a Monday morning.

Jennifer DeStefano picked up. On the other end, a voice she would have recognized anywhere — her 15-year-old daughter's — broke into sobs.

"Mom, I messed up."

Before Jennifer could respond, a man's voice cut in. Aggressive. Controlled. He said he had her daughter. He said he'd have his way with her and drop her off in Mexico. He demanded $1 million. Then, when Jennifer said that was impossible, he came down: $50,000. He told her not to call the police.

The call felt completely real. The voice sounded completely real.

Her daughter was at home the entire time, safe, oblivious. The voice on the phone was a fake — built by AI from audio scraped off the internet in less time than it takes to make a cup of coffee.

Jennifer's story ended well. She didn't lose a cent. But she's the exception.


This Is Happening Right Now

In 2024, Americans reported losing $12.5 billion to fraud — a record high, according to the FTC. Imposter scams — where someone pretends to be a person or institution you trust — accounted for nearly $3 billion of that. And increasingly, the impostor has your family member's exact voice.

A survey by McAfee found that 70% of people weren't confident they could tell a real voice from an AI-cloned one. And of those who said they'd been targeted by an AI voice-clone message, 77% reported losing money.

The technology making this possible isn't exotic. It isn't expensive. It's available to anyone with a credit card and a Wi-Fi connection — for as little as $5 a month.


How They Build a Fake Voice in Minutes

You don't need to understand the technology to protect yourself from it. But you do need to understand how easy it is.

Here's the process, step by step:

Step 1: They find you. Scammers start with social media, data broker websites, and public records. They're looking for people with family connections they can exploit — a parent, a grandparent, someone with kids who post online.

Step 2: They collect voice samples. This is where it gets unsettling. TikToks, Instagram reels, YouTube videos, voicemails posted online, TV interviews — any audio where someone speaks. Modern AI voice cloning tools need as little as 30 seconds of clean audio to build a working clone. One video of your college kid at a birthday party is more than enough.

Step 3: They clone the voice. Using tools that cost $5 to $11 per month — the kind of subscription you'd use for a streaming service — they feed the audio in and get a cloned voice out. Type any words, and the tool speaks them in that person's voice.

Step 4: They call you. The call comes in looking like it might be from your family member's area code. Or from an unknown number. The voice on the other end is your son, your daughter, your grandchild. Frightened. In trouble. Desperate.

Step 5: They create a crisis. Car accident. Arrest. Stranded abroad. Kidnapped. The story varies, but the formula doesn't: the situation is urgent, it's embarrassing, and you must act right now.

Step 6: They ask for money in a way that can't be traced. Wire transfers. Gift cards. Cryptocurrency. They know law enforcement can't easily follow those trails. They'll tell you the money is to make bail, pay a lawyer, cover a fine. They'll tell you not to tell anyone else — it'll embarrass your family member.

That last part is the lock on the trap.


Real People, Real Losses

Jennifer DeStefano's case became national news because she got lucky — people around her were able to verify her daughter was safe before she sent anything.

Not everyone is that lucky.

In 2025, a federal grand jury indicted 25 Canadians for running a grandparent scam ring that stole $21 million from hundreds of American seniors. The scheme ran for years before arrests were made.

In Florida in 2024, six people were arrested for using AI-generated voices to impersonate grandchildren and steal nearly $250,000 from seniors across four counties.

In Canada, one individual reportedly used AI voice cloning to steal $200,000 from eight victims in just three days.

These aren't sophisticated nation-state attacks. They're small criminal operations using cheap, widely available tools — targeting ordinary people, often elderly ones, in moments of manufactured panic.


How to Spot the Scam While It's Happening

The most powerful weapon scammers have is urgency. When you believe your child is in danger, your brain stops analyzing and starts reacting. That's exactly what they're counting on.

Here are the warning signs to force yourself to notice, even in a panic:

The call comes with a crisis already in progress. Real emergencies unfold. Fake ones arrive pre-packaged. If someone calls and immediately launches into "I'm in trouble, I need help, don't call anyone," pause.

They tell you not to tell anyone. This is almost always part of the script. Real people in real trouble want you to call for help. Scammers need to isolate you from anyone who might reality-check the story.

They push you toward untraceable payment. No legitimate bail bondsman, hospital, lawyer, or police department will ask you to pay in gift cards or cryptocurrency. This is the single clearest tell in the entire scam.

The story doesn't hold up under questions. Ask specific things only your family member would know — a nickname, a pet's name, a shared memory. AI-generated calls are scripted. Improvisation is hard.

The voice sounds almost right, but not quite. Flat emotional tone. Unusual pauses. No background noise. No stumbling over words. AI voices are getting better, but they still tend to be smoother and more even than real human speech under stress.

The caller ID looks familiar but something's off. Caller ID can be faked. A familiar area code or even a name you recognize means nothing.


What to Do in the Moment

If you get a call like this, do one thing before anything else: hang up and call back.

Not the number that called you. A number you already have — your family member's actual cell, or a number you look up yourself. Take 60 seconds to verify. If the call was real, 60 seconds won't matter. If it was fake, those 60 seconds just saved you everything.

A few more rules to keep by your phone:

  • Set up a family code word today. Pick something random and memorable — a color and an animal, anything. If someone calls claiming to be family and can't say the code word, they're not family.
  • Never send gift cards, wire money, or pay with crypto based on a phone call. Ever. Full stop. No legitimate emergency works that way.
  • Call another family member to verify before you do anything. Even a two-minute text chain can break the spell.

How to Make Yourself a Harder Target

Scammers pick targets based on what they can find. The more of your family's voice and personal information that's publicly available online, the easier you are to exploit.

A few habits that reduce your exposure:

Audit your social media privacy settings. Public posts with your family members talking on camera are a goldmine for voice cloning. Locking them to friends-only doesn't eliminate the risk, but it raises the bar significantly.

Think before you post video. This is especially true for parents of teenagers, and for elderly relatives who may not realize their grandkids' videos contain usable audio.

Be aware of what's already out there about you. Data brokers collect and sell personal information — your name, address, phone number, family connections — and this data feeds directly into how scammers find and profile their targets. Tools like FindMyIP's IP Lookup and WHOIS lookup can help you understand what's visible about you online, and it's worth checking what data brokers have listed under your name (sites like Spokeo, Whitepages, and BeenVerified aggregate this data — most allow opt-outs).

Use a VPN on public networks. When you connect to public Wi-Fi — airports, coffee shops, hotels — your unencrypted traffic can be observed by others on the same network. A VPN encrypts your connection and masks your IP address, making it significantly harder for bad actors to link your online activity to your real-world identity. If you regularly travel or use public networks, a VPN is one of the simplest ways to shrink your digital footprint. NordVPN and ExpressVPN are two well-regarded options that work across phones, laptops, and tablets without requiring any technical knowledge.


The Pause That Protects You

AI voice scams work because they exploit the part of your brain that loves your family more than it doubts anything. That's not a weakness. It's human.

But a single pause — 60 seconds to hang up and call back — is enough to break the entire attack. Scammers can fake a voice. They can't fake a number you dial yourself.

Jennifer DeStefano was surrounded by people who helped her slow down long enough to check. Most of us won't have that in the moment.

So the preparation has to happen now, before the phone rings.

Set up the code word. Save your kids' and parents' numbers. Decide now that gift cards are never the answer to anything on a phone call.

The voice might sound exactly right. That's the point.

Hang up anyway.


If you or someone you know has been targeted by a voice scam, report it to the FTC at ReportFraud.ftc.gov and to the FBI's Internet Crime Complaint Center at IC3.gov.


Sources:

  • FTC Consumer Sentinel Network 2024 Data Book — ftc.gov
  • FBI IC3 2024 Annual Report — ic3.gov
  • FBI IC3 PSA on AI Generative Fraud, December 2024 — ic3.gov/PSA/2024/PSA241203
  • FTC Consumer Alert: Scammers Use AI to Enhance Family Emergency Schemes, March 2023 — consumer.ftc.gov
  • McAfee: Artificial Imposters — Cybercriminals Turn to AI Voice Cloning, 2023 — mcafee.com
  • Jennifer DeStefano case — AZFamily.com, CNN, Global News
  • Canadian grandparent scam indictment — NPR, March 2025
  • Florida AI voice scam arrests — Florida AG reporting, June 2024
  • ElevenLabs voice cloning documentation — elevenlabs.io
  • FCC Makes AI-Generated Voices in Robocalls Illegal, February 2024 — fcc.gov