Deepfake Video of Zelensky Surrendering: How It Spread in 2022

Origin: RU Language: EN

THE CLAIM

Ukrainian President Volodymyr Zelensky appeared in a video calling on Ukrainian soldiers to lay down their arms and surrender to Russia.

DEBUNKED

On March 16, 2022, three weeks into Russia’s full-scale invasion of Ukraine, a deepfake video appeared showing Ukrainian President Volodymyr Zelensky apparently calling on his soldiers to lay down their arms and surrender. The video was posted to the hacked website of a Ukrainian news outlet and spread on social media before being removed by Meta, YouTube, and Twitter. It was debunked within hours. Zelensky himself promptly posted an authentic video response from Kyiv. The incident is documented as the first intentional use of deepfake video in an active armed conflict.

The Claim

The video showed a figure resembling Zelensky, speaking in Ukrainian, appearing to announce that Ukrainian forces should surrender to the Russian military. It was shared on the hacked website of Ukraine 24, a major Ukrainian news channel, and simultaneously pushed across social media platforms. The production quality was notably poor: the head was disproportionate to the body, the voice did not match Zelensky’s known speaking pattern or regional accent, and the figure appeared stiff and unnaturally lit. Despite these obvious artifacts, the clip was shared by accounts asserting it was authentic, particularly on Telegram channels operating in pro-Russian information spaces.

How It Spread

The distribution was coordinated across multiple vectors simultaneously. The Ukraine 24 website hack gave the video an initial veneer of legitimacy — it appeared, briefly, to have been broadcast by a credible news outlet. At the same time, the video was pushed to social media via networks of accounts on Telegram, Facebook, and Twitter. Meta confirmed to TechCrunch that it removed the video under its policy against misleading manipulated media, and YouTube and Twitter acted within hours of the video’s initial spread. The Atlantic Council’s Digital Forensic Research Lab (DFRLab) documented the coordinated spread pattern. Critically, the Ukrainian government had issued an advance warning on March 2, 2022 — two weeks before the video appeared — that Russia might be preparing exactly this kind of deepfake, giving Ukrainian media and the public a pre-prepared debunking frame. Reuters and BBC Verify both covered the incident in detail.

The Truth

The video was a deepfake — a synthetic media product generated using AI-based face and voice synthesis technology applied to footage of a different person. Zelensky had not surrendered, had not made any such announcement, and was not in the location or setting depicted in the video. He posted a real video response within hours from the government quarter in Kyiv, calling on Ukrainians to continue defending the country.

The technical quality of the deepfake was relatively low by the standards of the technology available in 2022. Digital forensics experts noted the following artifacts: the head size was inconsistent with the body, the lighting on the face did not match the background environment, the voice synthesis was distinguishable from Zelensky’s known speech patterns, and the lip synchronization was imprecise. The poor quality was itself informative: it suggested the operation prioritized speed of deployment over convincingness, betting that initial viral spread would do damage before quality-based debunking could reach the same audience. As the Euronews analysis noted, this was the first documented use of deepfake video as a deliberate operational tool in an active armed conflict — a threshold with significant implications for information warfare doctrine.

How to Spot It

  • Proportionality and lighting artifacts: Deepfake videos produced under time pressure often show head-to-body proportion mismatches, inconsistent skin texture between face and neck, and lighting on the synthesized face that does not match the scene’s light source.
  • Voice synthesis tells: AI voice cloning in 2022 struggled with regional accents, emotional cadence, and breath patterns. A voice that sounds like a smooth approximation of someone rather than that person is a warning sign.
  • Cross-platform verification: When a video of a world leader making a major announcement appears on social media before any official channel, government press office, or credible wire service, the probability of inauthenticity is very high. Official announcements leave official trails.
  • Advance warning preparation: The Ukrainian government’s two-week advance warning that such a video might be produced demonstrates the value of prebunking — preparing audiences to critically evaluate claims before they encounter them. When you know the playbook in advance, you are harder to deceive.
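The checklist above can be expressed as a simple scoring heuristic. The sketch below is purely illustrative: the warning signs, weights, and threshold are invented for this example and do not come from any real deepfake-detection system. It only shows how several weak signals, combined, push a video toward "treat as inauthentic until verified."

```python
# Toy checklist scorer: each warning sign carries a hypothetical weight.
# Provenance gaps weigh more than visual artifacts, reflecting the point
# above that official announcements leave official trails.
WARNING_SIGNS = {
    "head_body_proportion_mismatch": 2,     # visual artifact
    "lighting_inconsistent_with_scene": 2,  # visual artifact
    "voice_lacks_accent_or_cadence": 2,     # audio artifact
    "no_official_channel_confirmation": 3,  # provenance gap
}

def deepfake_risk_score(observed_signs):
    """Sum the weights of the warning signs observed in a video."""
    return sum(WARNING_SIGNS[sign] for sign in observed_signs)

def assess(observed_signs, threshold=5):
    """Label a video 'high risk' once the combined score crosses the threshold."""
    score = deepfake_risk_score(observed_signs)
    return "high risk" if score >= threshold else "needs more checks"

# The March 2022 video exhibited every sign at once:
signs = [
    "head_body_proportion_mismatch",
    "lighting_inconsistent_with_scene",
    "voice_lacks_accent_or_cadence",
    "no_official_channel_confirmation",
]
print(assess(signs))  # high risk
```

No single artifact is conclusive on its own; the design point is that the provenance check alone nearly crosses the threshold, while visual artifacts only matter in combination.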

Classification

This is a state-adjacent deepfake operation used as a wartime psychological operation (psyop). The goal was not primarily to convince large numbers of people that Zelensky had surrendered — the video’s low quality made that difficult — but to inject doubt, generate confusion, and undermine the morale of Ukrainian defenders and supporters during a critical period of the conflict. The simultaneous use of a hacked news website alongside social media distribution reflects a hybrid information-warfare approach: compromising a trusted source as a vector for inauthentic content. The Zelensky deepfake remains the canonical reference case for the use of synthetic media in armed conflict.

Katharina Berger

Media Literacy Researcher & Editor

Katharina has spent a decade studying digital misinformation, fact-checking methodology, and media education. She reviews all cases published on Fake Off.