Manipulated videos are becoming more pervasive online and often spread misinformation and disinformation.
Misinformation is false or deceptive information shared unintentionally, while disinformation is shared with the intent to deceive.
Some altered videos may seem harmless, such as ones that appear to show celebrities performing stunts. Others have a nefarious intent behind them, such as when a world leader's likeness is copied for political or propaganda purposes.
There are several types of doctored videos that can spread false information. Two of the most common are deepfakes and shallowfakes. A deepfake video is made using artificial intelligence technologies, like programs that can replace or synthesize a person's face, speech or expressions of emotion.
Shallowfakes are created with basic video editing software that crudely splices together existing footage.
Here are four key points to look for when analyzing a video, and three examples of deepfake and shallowfake videos that we fact-checked using the key points.
4 key points to spot false videos
Here are questions to ask yourself when determining whether a video is real or fake:
- Movement - How is the subject moving in the video? Do their body language and facial expressions appear odd?
- Background - What does the background look like? Is it blurry, static or does it appear out of place?
- Source - What is the source of the video? Does the video have a watermark or logo?
- Context - Is there enough context to explain what is happening in the video? Does historical context align with what you’re seeing?
In each of the three examples below, the VERIFY team used the key points, along with online tools such as RevEye, InVID and TinEye, to determine that these videos are fake.
Example 1: President Joe Biden shallowfake video
This video appears to show President Joe Biden authorizing a fourth stimulus check, but the president never did that. The video is actually a shallowfake, because it was crudely edited using actual footage of Biden.
The video footage was edited from a speech Biden delivered at COP26, the United Nations climate change conference held in November 2021. Here’s how we know this video was false:
- Movement: The president’s movements appear slower than usual, and his lip movements don’t match the audio.
- Background: The flags and podium signage are the same as what could be seen in C-SPAN coverage from the COP26 summit. If Biden were authorizing a fourth stimulus check, he would likely do that from the White House, not at an event like the COP26 summit.
- Source: The White House never officially released this footage; it was shared to a non-political Facebook page.
- Context: Congress and the White House repeatedly pushed back against the idea of a fourth stimulus check, so a video claiming to show Biden authorizing one was unlikely.
Example 2: Ukrainian President Volodymyr Zelenskyy deepfake
A video claiming to show Ukrainian President Volodymyr Zelenskyy calling on citizens to surrender to Russia in March went viral on social media worldwide after it was shared by Russian state media; it has since been deleted from most major social media sites. The video was a sophisticated deepfake. Here’s how we know it’s fake:
- Movement: In the video that was manipulated, Zelenskyy’s body never moves while speaking.
- Background: The background is static, making it appear as though he didn’t move at all. This, along with his body language, suggests the video may have originated by manipulating a still image.
- Source: The source of the video was unknown, but some Ukrainian networks did air it on television in the country. Zelenskyy also posted daily updates on his social media accounts, and this video didn’t appear on any official government channels.
- Context: Before the war in Ukraine began, U.S. officials warned Russian operatives could use misinformation or disinformation like deepfakes. Experts warned the public to be cautious of videos posted to social media. It’s unlikely that Zelenskyy, who has repeatedly encouraged his people to fight, would have called for a surrender.
Example 3: Celebrities reacting to Will Smith-Chris Rock slap deepfakes
These deepfake videos of actors Clint Eastwood and Morgan Freeman appear to show the two reacting after Will Smith slapped Chris Rock during the 2022 Academy Awards. Here’s how we know they are not real:
- Movement: In this case, we are looking for discoloration and signs of image enhancement. The jaws were discolored in both videos, and the faces were digitally altered to make the actors appear younger.
- Background: Both videos show the same background, a bookshelf loaded with CDs. While these celebrities could run in the same circles, it’s unlikely they were in the same room recording their reactions.
- Source: These videos were posted to Twitter, but originally came from the TikTok account of @themanofmanyvoices who, according to their account, is a celebrity impressionist.
- Context: Are Morgan Freeman or Clint Eastwood the type to post a video to social media, just to react to Smith slapping Rock? Probably not.
The MIT Media Lab, a research laboratory at the Massachusetts Institute of Technology, offers these additional tips to help detect deepfakes:
- Pay attention to the eyes and eyebrows. Do shadows appear in places that you would expect?
- Pay attention to the glasses. Is there any glare? Is there too much glare? Does the angle of the glare change when the person moves?
- Pay attention to the facial hair or lack thereof. Does this facial hair look real? Deepfakes might add or remove a mustache, sideburns or beard.
- Pay attention to blinking. Does the person blink enough or too much?
- Pay attention to the size and color of the lips. Does the size and color match the rest of the person's face?
If you have questions or want something confirmed, the VERIFY team is here for you. Send your questions to firstname.lastname@example.org if you want the team to fact-check any claims you see online or hear in person.