Your voice can be cloned in 10 seconds
“The initial caller’s voice sounded very much like my nephew’s. He knew family details, pleaded with me not to call his father, and promised to pay me back as soon as he got home.”
“The voice on the other end sounded just like my grandson, and it said, ‘Gramie, I've been in an accident.’”
“I received a call and heard my daughter crying hysterically! She wasn't making sense, so an ‘officer’ took over the call. He stated I needed to come right away but would not answer my questions.”
“I received a phone call from my grandson explaining that he was in a car accident at college and needed $5,000. He sounded scared and upset and asked that I not tell his parents. So I went to my bank to get the money.”
These quotes are a small sample of the hundreds of stories that Americans shared with Consumer Reports about audio deepfake scams.
Today’s AI tools need less than 10 seconds of your voice to create a clone that sounds just like you.
Earlier this summer, an imposter did just that, creating a voice-based deepfake of Secretary of State Marco Rubio. The impersonator sent voice and text messages that closely mimicked Rubio’s voice and writing style to foreign ministers, a U.S. governor, and members of Congress.
Rubio wasn’t the only one. The FBI has warned of a concerning rise in audio deepfakes impersonating senior U.S. officials. “If you receive a message claiming to be from a senior US official, do not assume it is authentic,” the bureau said in an announcement in May.
This week’s newsletter looks at the rise of voice cloning and deepfakes—how the technology works, the risks it poses, and how you can protect yourself.
// When fake news sounds like you
It’s never been easier to clone a voice. An online search turns up dozens of companies offering the service. What once required sophisticated equipment and expertise can now be done on a smartphone with a free app.
While there are harmless uses for voice clones (e.g., narrating an audiobook or producing a podcast), the technology has also been harnessed for nefarious purposes.
A report released by Consumer Reports (CR) earlier this year found two types of threats from voice clones:
- Bad actors use voice clones to impersonate everyday Americans, often posing as a loved one. Since ChatGPT’s launch in November 2022, deepfakes have surged more than twentyfold.
- Bad actors use voice clones to impersonate trusted public figures. The voices of influencers, celebrities, and politicians have said the darndest things (most of which were untrue). A 2024 ProPublica investigation found videos and audio on Facebook and Instagram that mimicked the distinctive tones of President Trump and President Biden, offering cash handouts to people who completed a survey. A deepfake of Taylor Swift in 2024 deceived fans with a fake Le Creuset giveaway. An analysis by The New York Times concluded that deepfakes of Elon Musk have led to billions of dollars of fraud. And yet companies like Parrot exist, selling tools to “make a celebrity say anything.” In a world flooded with fakes, they’re handing out megaphones.
The risks are amplified when voice clones are used in an attempt to sway elections.
// The rise of voice-based deepfakes
Imposter scams are not new, but AI technology makes them more believable.
A 2024 report by Deloitte predicted that losses from AI-enabled fraud could grow at a 32% compound annual rate, exceeding $40 billion by 2027 (see graph below).
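For a rough sense of that arithmetic, the sketch below simply compounds a 32% annual growth rate over four years. The 2023 baseline of about $12.3 billion is an assumption used here for illustration; only the growth rate and the roughly $40 billion 2027 figure come from the report cited above.

```python
# Back-of-the-envelope sketch of the projection's arithmetic.
# Assumption for illustration: ~$12.3B in fraud losses in 2023.
baseline_2023 = 12.3  # billions of dollars (assumed baseline)
growth_rate = 0.32    # 32% year-over-year growth

losses = baseline_2023
for year in range(2024, 2028):
    losses *= 1 + growth_rate
    print(f"{year}: ${losses:.1f}B")

# The loop ends around $37B for 2027, in the neighborhood of the
# report's ~$40B figure once rounding in the published rate is considered.
```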