Scammers use AI to clone the voices of family members, executives, or authority figures from just seconds of audio. Voice cloning has crossed the 'indistinguishable threshold' — clones now include natural breathing, pauses, and emotion. Deepfake-as-a-service platforms make this accessible to anyone for under $2. Global losses exceeded $200M in Q1 2025 alone.
Reported Losses: $200M+ in Q1 2025 alone (global); individual losses from $15K to $25.6M per incident
Avg Loss / Victim: $15,000-$100,000 (family emergency variant); millions for corporate variants
Primary Vector: Phone calls, video calls, voice messages
Peak Season: Year-round
AI can now clone anyone's voice from just a few seconds of audio — a voicemail, social media video, or podcast clip. The clone reproduces tone, emotion, breathing patterns, and speech rhythm convincingly enough to fool family members. Scammers use this to impersonate relatives in 'emergency' calls (car accident, kidnapping, arrest) or executives in business contexts. Deepfake-as-a-service platforms cost under $2 per clone, making this accessible to any criminal.
Each red flag in the example below is explained beneath the transcript.
[Phone call with cloned daughter's voice, crying] "Mom, I was in a car accident. I lost the baby. They're saying I might go to jail. I need you to send $15,000 right now. Please, Mom, don't tell anyone; my lawyer said it could make things worse."
Red flag: a large, specific amount demanded immediately. Real emergencies involve police and hospitals, not cash demands.
Red flag: 'don't tell anyone' is an isolation tactic; it prevents the victim from calling the real person to verify.
Emergency call from a loved one demanding money immediately
Hang up and call them back on their known number. If they're really in trouble, you'll reach them. If not, you've confirmed the scam.
Caller says 'don't tell anyone' or 'keep this between us'
Isolation is a core tactic: scammers need to prevent you from verifying their identity.
Specific dollar amount with urgent payment method (wire, crypto, cash courier)
Real emergencies involve official channels. Police, hospitals, and lawyers don't demand cash over the phone.
Voice sounds right but the situation is bizarre or extreme
Trust the logic of the situation, not just the voice. AI can fake the voice, but the story usually has holes.
Caller can't answer personal verification questions
Ask something only the real person would know: a family code word, a shared memory, a pet's name.
If a family member is truly in an emergency, you can reach them by calling their phone number directly. Hospitals and police stations have verifiable numbers. Lawyers don't call family members demanding cash payments. Establish a family code word or verification question for exactly this scenario.
Just 3-10 seconds of audio is enough. Sources include social media videos, voicemail greetings, YouTube content, phone recordings, or even a brief phone conversation. If your voice is anywhere online, it can be cloned.
Establish a family code word or question that only your family knows. If someone calls claiming to be a relative in an emergency, ask for the code word. Also: always hang up and call back on a known number. Never trust an inbound call, even if it sounds exactly like someone you know.
Yes. In 2024, an engineering firm lost $25.6 million after scammers used deepfake video in a live conference call to impersonate executives. Real-time video deepfakes are improving rapidly; researchers predict fully interactive AI video agents by 2026.
Establish a secret family code word that any caller must provide before you send money in an emergency. Additionally, set social media profiles to private to prevent scammers from mining voice audio from public videos; AI cloning tools need only a few seconds of audio.
Received a suspicious message? Paste it in and get an instant analysis: free, private, no account needed.
Analyze a Message