Axios
National
Kaveh Waddell

A group of companies is trying to deploy deepfakes as a force for good

Illustration: Sarah Grillo/Axios

What do you do with a technology that could restore the voices of people who have lost theirs — but also sow chaos and incite violence?

What's happening: A growing group of companies is walking this tightrope, betting they can deploy deepfakes — videos, audio and photos that are altered or generated by AI — for good, or at least non-malign, purposes, while keeping the technology away from those who would use it to do harm.


These entrepreneurs are playing with fire. Experts have long warned that the power to convincingly alter or invent video or audio could be a dangerous weapon in the wrong hands.

  • Easily forged videos of world leaders could supercharge fake news or help trolls set off political crises from the comfort of their homes.
  • But some argue that there is no stopping deepfakes. "The technology exists," says Danika Laszuk, who leads Betaworks Camp, a New York City startup accelerator. "There are no genie-back-in-the-bottle moments."

The big picture: Deepfakes — or "synthetic media" — have largely been the purview of academics and online trolls for the few years they've been around.

Details: Betaworks is convening 7 synthetic media startups for a 3-month program this summer — and investing $200,000 in each.

  • They include Radical, which turns 2D videos into 3D scenes; Auxuman, which has an AI-generated avatar that plays AI-generated music; and Dzomo, which wants to replace expensive stock photography with deepfake images.
  • They will join a slowly growing field of synthetic media companies. Synthesia, a new startup co-founded by a former Stanford professor, can convincingly dub videos into new languages. In one demo, British soccer legend David Beckham delivers a PSA about malaria in nine languages — most of which he does not actually speak.
  • Perhaps the best example of deepfakes for good: Lyrebird, a company that creates digital voices that mimic actual speakers, is cloning the voices of people with ALS in order to allow them to continue communicating once they can no longer speak.

Making money off deepfakes requires extreme care, says Hany Farid, a Dartmouth professor and leading expert on synthetic media. Companies must build in safeguards from the very beginning, he says.

"The abuses of social media should be a cautionary tale — the model of 'move fast and break things' is fatally flawed, and we should adopt a mantra of move slowly, innovate and don’t break things."
Hany Farid, Dartmouth

For now, Lyrebird and Synthesia are relying mostly on ethics policies: They say they won't alter a video or audio clip of a person without their express consent.

  • Laszuk says that testing how to keep the tech safe will be a top priority for the Betaworks startups.
  • One participant is developing technology to detect deepfakes, and its work — plus advice from outside ethics experts — is meant to push the founders to build in systems that prevent the exploitation of their discoveries.