AI-Powered Scams
Deepfake CEO Fraud
Criminals use AI-generated video or audio of company executives to authorize fraudulent wire transfers, often targeting finance employees in real-time video calls.
Reported Losses
$2.9 billion in BEC losses (FBI 2024) — deepfake cases growing rapidly
Primary Targets
Finance departments, CFOs, accountants, employees with wire transfer authority
Last Updated
2026-01-07
Also Known As
Business Email Compromise 2.0
How This Scam Works
This is one of the most sophisticated corporate scams, using AI to impersonate executives in real-time.
**How it works:**
1. Criminals research a company: names, roles, and communication styles
2. They gather video and audio of executives from earnings calls, YouTube, and interviews
3. Using AI, they create real-time deepfakes or convincing audio clones
4. A finance employee receives an urgent request from the "CEO" or "CFO"
5. It may include a video call where the executive appears live (but is AI-generated)
6. The employee is instructed to wire funds urgently for an acquisition, deal, or emergency
7. The money goes to criminal-controlled accounts
**Real case (2024):** A Hong Kong finance worker joined a video call with the company's CFO and other executives discussing a secret acquisition. He transferred $25 million. Every person on the call was a deepfake — he was the only real human.
**Why it's devastating:**
- Employees trust video more than email
- Urgency and secrecy prevent verification
- AI quality is now nearly indistinguishable from real video
- Large wire transfers happen before anyone realizes
Red Flags to Watch For
- ⚠️ Urgent, secret request for a large wire transfer
- ⚠️ Request to bypass normal approval procedures
- ⚠️ Video quality slightly off: lighting, lip sync, unnatural movements
- ⚠️ Executive behaves slightly differently than usual
- ⚠️ Request comes at an unusual time (late Friday, before a holiday)
- ⚠️ Pressure not to discuss with others or verify through normal channels
- ⚠️ New or unusual bank account for the transfer
- ⚠️ Audio or video has subtle glitches or artifacts
📝 Real Victim Account
"The CFO called me on Zoom and explained we were acquiring a company confidentially. He looked and sounded exactly like himself. Three other executives joined the call. They instructed me to wire $25.6 million immediately. I did. Later, I called the real CFO on his mobile. He had no idea what I was talking about. Every person on that Zoom call had been an AI deepfake."
— Hong Kong Police, CNN Report February 2024
How to Protect Yourself
1. Implement multi-person authorization for large transfers (see the sketch after this list)
2. Create verification code words with executives for emergencies
3. Always verify through a second channel (call their known number)
4. Be suspicious of any request to bypass normal procedures
5. Train finance teams specifically on deepfake threats
6. Establish clear wire transfer protocols that can't be overridden by urgency
7. If a video call seems off, ask the person to turn their head or make unusual movements
8. Consider AI detection tools for critical communications
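For organizations that route wire requests through internal tooling, the controls in items 1, 3, and 6 can be enforced in software instead of being left to judgment under pressure. The sketch below is a minimal illustration only, assuming a hypothetical internal payments system: the `WireRequest` class, the `DUAL_APPROVAL_THRESHOLD`, and the `can_release` check are invented names, not a real API, and a real deployment would integrate with the bank's and company's existing approval platforms.

```python
# Minimal policy sketch (illustrative only): large or unusual wire requests
# cannot be released on urgency alone. All names and thresholds here are
# hypothetical assumptions, not a real payments API.
from dataclasses import dataclass, field

DUAL_APPROVAL_THRESHOLD = 50_000  # assumed cutoff above which two approvers are required


@dataclass
class WireRequest:
    requester: str                   # employee who entered the request
    beneficiary_account: str         # destination account identifier
    amount: float
    approvals: set[str] = field(default_factory=set)  # IDs of independent approvers
    callback_verified: bool = False  # True only after calling the executive on a known number


def approve(req: WireRequest, approver: str) -> None:
    """Record an approval; the requester can never approve their own transfer."""
    if approver == req.requester:
        raise ValueError("requester cannot self-approve")
    req.approvals.add(approver)


def can_release(req: WireRequest, known_accounts: set[str]) -> bool:
    """Apply the policy: no urgency flag or executive title can bypass these checks."""
    if req.amount >= DUAL_APPROVAL_THRESHOLD:
        if len(req.approvals) < 2:
            return False             # needs two independent approvers
        if not req.callback_verified:
            return False             # needs out-of-band verification on a known number
    if req.beneficiary_account not in known_accounts and not req.callback_verified:
        return False                 # new or unusual account always needs a callback
    return True


if __name__ == "__main__":
    # A request resembling the Hong Kong case: huge amount, new account, one approver.
    req = WireRequest("finance_clerk", "NEW-OFFSHORE-001", 25_600_000)
    approve(req, "colleague_1")
    print(can_release(req, known_accounts={"VENDOR-123"}))  # False: missing approver and callback
```

The point of this design is that the release check has no override: even a convincing live video call from the "CFO" cannot satisfy it without a second approver and a callback to a number already on file.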
🆘 What to Do If You're a Victim
1. Contact your bank immediately: wire recalls are sometimes possible within hours
2. Notify law enforcement immediately: the FBI handles BEC cases
3. Preserve all evidence: emails, call recordings, transaction records
4. Report to the FBI IC3 at ic3.gov
5. Engage incident response and legal teams
6. Determine whether other employees were targeted
7. Consider engaging forensic investigators
🔗 Related Scams
Scammers use AI to clone the voice of a family member, then call claiming to be ...
Highly personalized phishing emails written by AI that are more convincing and h...
AI-generated videos impersonating celebrities, executives, or trusted figures to...
Think You've Encountered This Scam?
Use our AI-powered scanner to analyze suspicious URLs, emails, or messages.