A deepfake video of Boris Johnson appearing to endorse his rival, Jeremy Corbyn, for Prime Minister has been posted on Twitter, showing how easily the truth can be distorted.
The video was posted on Twitter by Future Advocacy, and appears to show Johnson supporting his rival.
While the video is extremely convincing, it’s actually a deepfake: a video in which a person’s face and voice are digitally manipulated so that they appear to say whatever its creator wants.
In the deepfake video, Johnson appears to say: “Hi folks, I am here with a very special message. Since that momentous day in 2016, division has coursed through our country as we argue with fantastic passion, vim and vigour about Brexit.
“My friends, I wish to rise above this divide, and endorse my worthy opponent, the Right Honourable Jeremy Corbyn, to be Prime Minister of our United Kingdom.”

Future Advocacy also posted a deepfake video appearing to show Jeremy Corbyn endorsing Boris Johnson.
In that deepfake, Corbyn appears to say: “Once upon a time, I called for a kinder, gentler politics. However, we, the political class here in Westminster, have failed, and the consequences have been disastrous for our society.
“That’s why I’m taking on the toxic culture in Parliament. I’m urging all Labour members and supporters to consider people before privilege and back Boris Johnson to continue as our Prime Minister.”
While the videos were likely created for laughs, they could easily be mistaken for the real thing by viewers.
For example, earlier this year, a digitally altered video appeared to show Nancy Pelosi, the Speaker of the US House of Representatives, slurring drunkenly through a speech.
The video was widely shared on Facebook and YouTube, before being tweeted by President Donald Trump with the caption: "PELOSI STAMMERS THROUGH NEWS CONFERENCE".
The video was debunked, but not before it had been viewed millions of times. Trump has still not deleted the tweet, which has been retweeted over 30,000 times.
The current approach of social media companies is to down-rank and reduce the distribution of deepfake videos, rather than remove them outright, unless they are pornographic.
This can result in victims suffering severe reputational damage, not to mention ongoing humiliation and ridicule from viewers.
"Deepfakes are one of the most alarming trends I have witnessed as a Congresswoman to date," said US Congresswoman Yvette Clarke in a recent article for Quartz.
"If the American public can be made to believe and trust altered videos of presidential candidates, our democracy is in grave danger.
"We need to work together to stop deepfakes from becoming the defining feature of the 2020 elections."