The FBI's 2025 Internet Crime Report reveals that AI voice cloning scams caused over $1 billion in reported losses in the United States in 2025, a 400% increase over 2024. The agency calls it the fastest-growing category of fraud it has ever tracked, driven by the plummeting cost and rising quality of deepfake audio tools.
The Scale
Voice cloning fraud has exploded because the tools are now trivially accessible. A convincing voice clone can be generated from as little as three seconds of audio using free or cheap online tools. Social media videos, voicemail greetings, and recorded conference calls provide ample source material. The resulting clone can speak any text in real time with the target's voice, intonation, and speech patterns.
The FBI received over 80,000 complaints related to AI voice fraud in 2025, up from approximately 15,000 in 2024. The average reported loss was $12,500 per incident, which across 80,000 complaints accounts for the $1 billion headline total; losses in business email compromise (BEC) variants averaged over $100,000.
How the Scams Work
The most common variant targets families. A scammer clones a family member's voice and calls a relative — often an elderly parent — claiming to be in an emergency: a car accident, an arrest, a medical crisis. The caller requests an urgent wire transfer or gift cards. The voice is convincing enough that victims comply before verifying.
Business variants are more sophisticated. Scammers clone a CEO's or CFO's voice and call a finance employee to authorize an urgent transfer. In one case highlighted in the report, a Houston-based company lost $4.3 million after an employee authorized a wire transfer following a call from what appeared to be the CEO's voice.
The most advanced attacks combine voice cloning with spoofed caller ID, making the call appear to come from the real person's phone number. Caller ID provides no authentication of its own: the displayed number is asserted by the originating system and can be set to almost any value, so a familiar number on screen proves nothing about who is actually calling.
Why It's Getting Worse
Three trends are converging. First, voice cloning quality has crossed the "good enough" threshold — even careful listeners struggle to distinguish clones from real voices in phone call conditions. Second, the tools are widely available, with several open-source voice cloning models freely downloadable. Third, the raw material is everywhere: most people have hours of their voice publicly available on social media, podcasts, or video calls.
"Two years ago, voice cloning required minutes of clean audio and technical skill," said an FBI supervisory special agent. "Today, three seconds of audio and a web browser is enough."
Regulatory Response
The FCC has issued declaratory rulings classifying AI-generated voice calls as "artificial" under the Telephone Consumer Protection Act, making robocalls using cloned voices illegal. Several states have passed or proposed laws specifically criminalizing voice cloning for fraud, with penalties ranging from 5 to 20 years in prison.
At the federal level, the DEEPFAKES Accountability Act, which would require AI-generated audio and video to carry digital watermarks, is currently in committee. Privacy advocates have raised concerns that watermarking requirements could burden legitimate AI research.
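The act does not prescribe a particular watermarking scheme, and production systems are considerably more sophisticated, but the underlying idea can be illustrated with a toy additive spread-spectrum mark: a key-derived noise pattern is mixed into the audio at inaudible strength, and a detector holding the same key recovers it by correlation. The sketch below is a minimal illustration under those assumptions; the function names and parameters are hypothetical, not drawn from the bill or any real tool.

```python
import numpy as np

def embed_watermark(audio: np.ndarray, key: int, strength: float = 0.005) -> np.ndarray:
    """Mix a key-derived pseudorandom +/-1 'chip' sequence into the signal
    at low amplitude, well below typical speech level."""
    rng = np.random.default_rng(key)
    chips = rng.choice([-1.0, 1.0], size=audio.shape)
    return audio + strength * chips

def detect_watermark(audio: np.ndarray, key: int, threshold: float = 0.0025) -> bool:
    """Correlate the signal against the same chip sequence. Unmarked audio
    correlates near zero; marked audio correlates near the embed strength."""
    rng = np.random.default_rng(key)
    chips = rng.choice([-1.0, 1.0], size=audio.shape)
    return float(np.mean(audio * chips)) > threshold

# Stand-in for one second of 16 kHz speech:
speech = np.random.default_rng(0).normal(0.0, 0.1, 16_000)
marked = embed_watermark(speech, key=1234)
print(detect_watermark(marked, key=1234))  # True: the key holder finds the mark
print(detect_watermark(speech, key=1234))  # False: clean audio shows no mark
```

A deployable watermark also has to survive compression, resampling, and re-recording, which is where much of the engineering effort, and the compliance burden privacy advocates point to, actually lies.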
Protection Measures
The FBI's primary recommendation is simple: establish a family code word. A pre-arranged word or phrase that family members can use to verify identity during unexpected calls defeats most voice cloning attacks, because the scammer does not know the code word regardless of how convincing the voice clone is.
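In security terms, the code word is a shared secret layered on top of an unreliable biometric: the clone can reproduce the voice, but it cannot produce a secret the scammer never learned. A minimal sketch of that check, purely illustrative and with all names hypothetical:

```python
import hmac

def code_word_matches(spoken: str, expected: str) -> bool:
    """Shared-secret check. Normalize casing and whitespace so a word
    spoken over a noisy line still matches; compare in constant time."""
    return hmac.compare_digest(spoken.strip().lower().encode(),
                               expected.strip().lower().encode())

def handle_urgent_call(asks_for_money: bool, spoken_word: str | None,
                       family_code_word: str) -> str:
    """Encode the FBI's advice as a rule: no secret, no money."""
    if not asks_for_money:
        return "proceed normally"
    if spoken_word is not None and code_word_matches(spoken_word, family_code_word):
        return "identity verified"
    return "hang up and call back on a number you already have"
```

The point of the sketch is the structure, not the code: the decision never turns on how the voice sounds.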
For businesses, the FBI recommends multi-factor authentication for all financial transactions — no transfer should be authorized based solely on a phone call, regardless of who appears to be calling.
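That advice can be stated as a dual-control policy: a transfer executes only after approval from multiple people over channels that authenticate more than a voice, so a phone call alone can never satisfy the rule. Below is a minimal sketch of such a policy check; the channel names and threshold are illustrative assumptions, not anything the report specifies.

```python
from dataclasses import dataclass

# Channels that authenticate more than a voice. A phone call is deliberately
# absent: both the voice and the caller ID on a call can be faked.
STRONG_CHANNELS = {"hardware_token", "signed_email", "in_person",
                   "callback_to_known_number"}

@dataclass(frozen=True)
class Approval:
    approver: str
    channel: str  # e.g. "phone_call", "hardware_token"

def transfer_authorized(approvals: list[Approval], required: int = 2) -> bool:
    """Dual-control rule: require `required` distinct approvers, each
    confirming over a strong channel."""
    strong_approvers = {a.approver for a in approvals
                        if a.channel in STRONG_CHANNELS}
    return len(strong_approvers) >= required

# A convincing call "from the CEO" plus one real approval still fails:
attempt = [Approval("ceo", "phone_call"),
           Approval("controller", "hardware_token")]
assert not transfer_authorized(attempt)
```

Under a rule like this, the $4.3 million Houston transfer described above would have required a second, independent confirmation before release.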