Doubts about Trump video show how hard deepfakes are to detect

Some Twitter users insist that a recent video of President Trump acknowledging his election defeat is a deepfake, despite a lack of evidence. The episode shows how difficult deepfakes are to detect.

A recent video posted to Twitter by President Donald Trump has some users suspicious that it's a deepfake, underscoring the difficulty of telling what's real and what's fake in social media videos and highlighting the erosion of public trust in government and the media.

In the nearly three-minute-long video, posted Jan. 7, Trump acknowledges his defeat to Democrat Joe Biden in the presidential election, and thanks Republicans for their "loyalty."

The video came as lawmakers weigh impeaching the president after he helped incite a violent mob of supporters to storm the Capitol Building in Washington, D.C., on Jan. 6.

Trump is standing behind a podium in the video, speaking directly to the camera. For some Twitter users, however, something seems off.

"Impressive deepfake video!" one Twitter user posted. "No way that's the real DJT."

"This video is a deepfake," posted another user.

Possible explanations

Forrester analyst Brandon Purcell said Trump and his team may have used a virtual background, similar to fake backgrounds available on Zoom and Microsoft Teams, in the video, making it look somewhat fake.

[Screenshot of the Donald Trump video. Some believe the video is a deepfake, but there is no proof.]

"For people who are already mistrustful of the government, many of whom didn't want to hear a concession, this was probably enough to provoke suspicions of a deepfake," Purcell said.

Deepfake refers to images, videos or audio that have been manipulated using sophisticated machine learning and AI tools. The technology to create deepfakes has grown more powerful and easier to use over the past few years, leading to a proliferation of deepfake images and videos online.

"It is easier than ever for anyone to make a deepfake video -- whereas even a year ago, it was harder and less convincing. So in a sense, the ability to make misleading videos has been commoditized," said Alan Pelz-Sharpe, principal analyst and founder at Deep Analysis.

In particular, professionally made videos are easier to manipulate, as they are shot with high-quality lighting, cameras and sound, he continued.

"It's ideal for faking and altering," Pelz-Sharpe said.

Eroding trust

While deepfakes are commonly created for entertainment purposes, foreign and domestic political agents and others also create deepfakes or other manipulated images and videos to influence elections and public opinion.


On Dec. 25, 2020, the U.K.'s Channel 4 television station aired a deepfake video by an animation and visual effects studio featuring a fake Queen Elizabeth dancing. The video followed the queen's annual Christmas address and, according to Channel 4, was intended to warn viewers that not everything they see and hear is real.

After years of eroding public trust, politicians and governments likely face an uphill battle convincing people what is real and what is a deepfake.

Social media platforms, including Facebook and Twitter, are developing technology to detect deepfake content on their platforms, Purcell noted. But, he said, as detection technology advances, so, too, will deepfake technology, likely leaving detection efforts a few steps behind.

In the absence of detection technology, the best defense against deepfakes is a reliable, trusted press, despite waning public trust in the news media, Purcell said.

Even so, he noted, there are still a few ways people can attempt to detect deepfakes.

For example, the eyes of the subject of a deepfake may be asymmetrical, or the lines between the subject and background may appear blurry in the video. Also, the voice may not match the actual subject's voice because deepfake audio lags video in maturity, Purcell noted.
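Those visual cues can be checked crudely in code. The sketch below is a minimal, hypothetical example, assuming OpenCV and its bundled Haar face cascade; it is not a tool Purcell or the social media platforms describe. It compares the sharpness of the detected face region against the whole frame, since a blended deepfake face is often softer than its surroundings.

```python
# A minimal sketch, assuming OpenCV is installed (pip install opencv-python).
# It illustrates the "blurry boundary" cue: compare Laplacian variance (a
# standard sharpness proxy) of the face region against the full frame.
# This is a rough heuristic for illustration, not a production detector.
import cv2


def face_vs_frame_sharpness(image_path: str) -> tuple[float, float]:
    """Return (face_sharpness, frame_sharpness) as Laplacian variances."""
    frame = cv2.imread(image_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Detect the largest face using OpenCV's bundled Haar cascade.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError("no face found in frame")
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])

    face_sharpness = cv2.Laplacian(gray[y:y + h, x:x + w], cv2.CV_64F).var()
    frame_sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return face_sharpness, frame_sharpness


# Hypothetical usage: a face region far softer than the surrounding frame
# is one weak signal that the frame deserves a closer look.
face_sharp, frame_sharp = face_vs_frame_sharpness("video_frame.jpg")
print(f"face sharpness: {face_sharp:.1f}, frame sharpness: {frame_sharp:.1f}")
```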

Still, Purcell said, these methods are far from foolproof, and people should always consider the source of the content and try to corroborate it with other reputable sources. 
