AI-backed Deepfake Impersonations Are Getting Harder to Detect, FBI Warns

Deepfakes are images, videos, or audio that imitate known or trusted people.
The Federal Bureau of Investigation in Washington on Aug. 7, 2025. Madalina Kilroy/The Epoch Times
Criminals are exploiting increasingly hard-to-detect deepfake content created with artificial intelligence to impersonate trusted individuals, the FBI and the American Bankers Association (ABA) said in a report published on Sept. 3.
In its “Deepfake Media Scams” infographic, the FBI said scams targeting Americans are surging. Since 2020, the agency has received more than 4.2 million fraud reports, amounting to $50.5 billion in losses. “Imposter scams in particular are on the rise. ... Criminals are using deepfakes, or media that is generated or manipulated by AI, to gain your trust and scam you out of your hard-earned money,” the infographic said.