Russian Disinformation: AI Deepfakes Target Olympics & Ukraine Support

by Alex Carter - Sports Editor

A deepfake video circulating online falsely depicts Olympics chief Kirsty Coventry stating she was “shocked” by the presence of Ukrainian athletes at the Winter Olympics in Milan, claiming they were there for “crazy political PR” and behaving aggressively. The fabricated footage, identified as part of a Russian disinformation network, utilizes artificial intelligence to overlay a deepfake narration onto real footage of a press conference, according to BBC Verify reporting.

The tactic involves seamlessly switching from genuine video of Coventry speaking at a Euronews press conference to stock footage accompanied by an AI-generated voice mimicking her tone and delivery. The AI-Coventry then delivers fabricated statements, including a claim that she had “never encountered people this irritating, I swear,” referring to the Ukrainian team. Footage of the actual press conference demonstrates Coventry made none of these assertions.

Pablo Maristany de las Casas, from the Institute of Strategic Dialogue (ISD) think tank, highlighted the network’s key innovation: “What truly sets Matryoshka apart is the use of AI voiceovers to impersonate the voices of trusted figures.” Darren Linvill, a media forensics expert at Clemson University, explained the technique further, stating, “They take a real video of a real person but part-way through they switch to stock footage overlaid with a deepfake narration that sounds just like the real person so that they can insert absurd lies that appear more authentic.”

The disinformation campaign, dubbed “Matryoshka,” has also targeted other figures covering the Winter Olympics. BBC Verify reported the same method was used to create a deepfake of an American commentator, and the Canadian Broadcasting Corporation (CBC) debunked an AI video impersonating one of their journalists. While the individual reach of these fabricated videos has been limited, analysts say they collectively demonstrate a deliberate effort to undermine support for Ukraine.

This is not the first instance of the network employing AI voice cloning. Last year, the same operation successfully cloned the voice of a British 999 call handler, according to BBC Verify. Maristany de las Casas of the ISD noted that the network understands content is perceived as more credible when delivered by a seemingly trustworthy source. “The operators of Matryoshka know that its content is more credible when it’s delivered, seemingly, by a trusted person,” he said.

The network’s activities coincide with reporting from the BBC and others documenting Russian disinformation efforts targeting Ukraine, including a recent campaign identified during the Winter Olympics.
