Fortune
Steve Mollman

Scammers are using voice-cloning A.I. tools to sound like victims’ relatives in desperate need of financial help. It’s working.


You may very well get a call in the near future from a relative in dire need of help, asking you to send them money quickly. And you might be convinced it’s them because, well, you know their voice. 

Artificial intelligence changes that. New generative A.I. tools can create all manner of output from simple text prompts, including essays written in a particular author’s style, images worthy of art prizes, and—with just a snippet of someone’s voice to work with—speech that sounds convincingly like a particular person.

In January, Microsoft researchers demonstrated a text-to-speech A.I. tool that, when given just a three-second audio sample, can closely simulate a person’s voice. They did not share the code for others to play around with; instead, they warned that the tool, called VALL-E, “may carry potential risks in misuse…such as spoofing voice identification or impersonating a specific speaker.”

But similar technology is already out in the wild—and scammers are taking advantage of it. If they can find 30 seconds of your voice somewhere online, there’s a good chance they can clone it—and make it say anything. 

“Two years ago, even a year ago, you needed a lot of audio to clone a person’s voice. Now…if you have a Facebook page…or if you’ve recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice,” Hany Farid, a digital forensics professor at the University of California at Berkeley, told the Washington Post.

'The money’s gone'

The Post reported this weekend on the peril, describing how one Canadian family fell victim to scammers using A.I. voice cloning—and lost thousands of dollars. The elderly parents were told by a “lawyer” that their son had killed an American diplomat in a car accident, was in jail, and needed money for legal fees.

The supposed attorney then purportedly handed the phone over to the son, who told the parents he loved and appreciated them and needed the money. The cloned voice sounded “close enough for my parents to truly believe they did speak with me,” the son, Benjamin Perkin, told the Post.

The parents sent more than $15,000 through a Bitcoin terminal to—well, to scammers, not to their son, as they thought. 

“The money’s gone,” Perkin told the paper. “There’s no insurance. There’s no getting it back. It’s gone.”

One company that offers a generative A.I. voice tool, ElevenLabs, tweeted on Jan. 30 that it was seeing “an increasing number of voice cloning misuse cases.” The next day, it announced the voice cloning capability would no longer be available to users of the free version of its tool, VoiceLab.

“Almost all of the malicious content was generated by free, anonymous accounts,” it wrote. “Additional identity verification is necessary. For this reason, VoiceLab will only be available on paid tiers.” (Subscriptions start at $5 per month.)

Card verification won’t stop every bad actor, the company acknowledged, but it would make users less anonymous and “force them to think twice.”

Fortune reached out to the company for comment but did not receive an immediate reply.
