Telltale Signs of Russian Disinformation
A wave of disinformation has emerged since Russia invaded Ukraine, especially from the Russian side. We examined the most common patterns of how disinformation and propaganda are created and spread. This is a guide for journalists and anyone who hates to be taken for a fool…
Editor's note: This is by no means a comprehensive framework of all the disinformation that has emerged over the past month. But in this two-part post we cover many of the most common techniques deployed, and we describe in detail how to take them apart with open-source intelligence (#OSINT).
Part 1 — April 1, 2022
Audio manipulation: Russian propaganda with videos
For years, Russian actors, including the Russian government, have used #disinformation and half-truths to attack and destabilize other countries. But the Russian approach is much broader than simply sowing lies and denial, such as maintaining that Russian troops and equipment are not located where they are, a technique mentioned in a 2016 NATO handbook on Russian information warfare.
A new wave of Russian disinformation emerged in 2022 that tried to justify military action against civilians. Together with a range of investigative journalists, including Brecht Castel from Knack and journalists at the Indian news outlet Alt News (@akhmxt and @ArchitMeta), I (@techjournalisto) analyzed this graphic video, which turns out to be manipulated. This and other videos gave Russia a narrative for invading Ukrainian cities.
The video was posted by a number of unknown accounts around February 27, three days into the invasion. In it, a man is apparently shot by members of a Kyiv street patrol in broad daylight.
TL;DR & editor's note: The video was shot in Kyiv, dated 27 February or earlier. The shooters were members of the reserve of the Ukrainian army, or civilians; most likely they were given weapons by the government. The shooting itself is not in the picture. The audio of the gunfire was added to the video later on, proving it was manipulated.
Unknowns: Audio manipulation of staged and unstaged footage has been used frequently for Russian propaganda purposes. Is this video fully staged? Or was the sound of the shots added to real footage for dramatic effect? Were rubber bullets used? We cannot answer these questions with the information we have been able to gather so far.
The street patrol we see in the video are allegedly members of the “Kyiv Territorial Defence Unit”. They wear bands on their arms for identification. We have verified that the video was shot in Kyiv, Ukraine.
Why is this video important in Russia's information war? The video spread like wildfire through Telegram, Facebook, VK and YouTube. Shortly after it was first posted, on February 27 at 12:54, it garnered no less than 600,000 views over the course of a few hours. Almost immediately after it was posted, Russian state TV broadcast it. The Kremlin used it to justify Russia's offensive into cities, including Kyiv, which killed many of the civilians who remained. On the Russian TV channel Perviy Kanal, as many as one hundred million Russians may have watched this footage.
On the news in Russia (newsreader's statement): “The authorities in Kyiv continue to organise panic by arming everyone uncontrollably. Uncontrolled and uneducated citizens occupy residential intersections, control the streets and empower themselves to conduct counter-espionage and track down saboteurs. Here you can see how one of these so-called controllers shoots an unarmed citizen.”
Russia's narrative: The troops must attack cities like Kyiv to protect the civilian public from lawless individuals who received weapons (including machine guns) from the Zelensky government.
Shooting date: Metadata suggests the upload date was the 27th, though metadata can be changed. Satellite data indicate there is little chance snow covered the ground on that day (or the day before or after). Yet this alone isn't enough to confirm the video was shot on that day.
So far it has not been possible to verify that the person in the video was really shot. We do not see the man being hit by the bullet, as the camera turns away at the wrong moment. Was this video tinkered with?
The audio raised a number of questions. The quality is poor and the sound appears “unnatural”, according to a number of expert sources we shared it with.
Is the Audio real?
So our small team of journalists, including Brecht Castel, showed the video to audio specialists.
What we hear and see:
- The sound of bullet casings falling after the shots — at second 0:18 — is very clear and distinct as if recorded at close range.
- In contrast, the sound of the approaching car at the beginning is quiet and muffled. So the audio perspectives do not seem to match.
We had the audio analyzed by sound designer Ciaran Walsh, who looked at the spectrogram of the video for us. (A spectrogram is a visual representation of sound in which pitch or frequency, on the y-axis, is plotted against time on the x-axis.)
The red dotted line on the video's spectrogram indicates a critical feature. Above that line, the spectrogram is completely black. It is not cloudy (as it is around the siren), so no ambient noise is present there.
Sound experts dismissed the audio track as fabricated. Explanation: “These sounds could not have been part of the original recording”.
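The “black band above the red line” observation can be illustrated numerically: if a stretch of audio contains virtually no energy above a certain frequency, that is one hint it was low-pass filtered, resampled, or pasted in from another source. Below is a minimal sketch in pure Python, using a naive DFT on a synthetic signal; the function names are our own illustration, not part of any forensic toolkit, and real analyses use proper spectrogram software.

```python
import cmath
import math

def magnitude_spectrum(samples):
    """Naive DFT magnitude spectrum (fine for short illustrative signals)."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]

def high_band_energy_ratio(samples, cutoff_bin):
    """Share of spectral energy above a cutoff frequency bin.

    A near-zero value (the 'completely black' band in a spectrogram)
    suggests the audio lacks the high-frequency ambient content a real
    outdoor recording would normally have."""
    spec = magnitude_spectrum(samples)
    total = sum(m * m for m in spec) or 1.0
    high = sum(m * m for m in spec[cutoff_bin:])
    return high / total

# Synthetic "suspicious" signal: a pure low-frequency tone with no
# content above bin 4, analogous to the black region above the red line.
n = 256
lowpassed = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
print(high_band_energy_ratio(lowpassed, 16))  # near zero
```

A genuine street recording would show broadband noise (traffic, wind, echoes) well into the high bins; a ratio this close to zero is what makes experts suspicious.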
This type of manipulation is not new. In fact, the platform TikTok makes it relatively simple to plant audio onto a video, as Abbie Richards, who researches disinformation on TikTok for ARC, points out:
Our example of the shooting shows how simple it was for the Kremlin to misuse a manipulated video to misinform the Russian public about the invasion of civilian areas. This is only one example that raises human rights questions. We expect more similar examples to emerge as troops march on.
Bots are nothing new in Russian disinformation campaigns. But on Twitter especially, they seem to have been employed to carry the message of Russian disinformation far across Western social networks.
Hard evidence shows that an armada of pro-Russian accounts was created shortly before and during the invasion. Tim Graham, lecturer at the QUT Digital Media Research Centre in Australia, analyzed pro-Russian accounts and their creation dates, going all the way back to 2009.
After parsing thousands of accounts through the API of a bot-identification algorithm (which admittedly has its limits and cannot be 100% accurate), he found that during the past weeks thousands of accounts had been created to lift the apparent reach and clout of official social media channels organized by the Kremlin.
My own analysis of bots supporting and following the Twitter account of a Russian embassy in Europe (I withhold the name of the embassy) shows there are plenty of examples:
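The core of a creation-date analysis like Graham's can be sketched in a few lines. The function below is a simplified, hypothetical illustration (names and thresholds are ours, not his methodology): it buckets account creation dates into weekly windows and flags windows far above the median, a crude signal of coordinated account creation that real analyses combine with bot-scoring.

```python
from collections import Counter
from datetime import date, timedelta

def creation_bursts(creation_dates, window_days=7, threshold=5.0):
    """Flag time windows with suspiciously many account creations.

    Buckets dates into windows of `window_days` and returns the start
    date of every window whose count exceeds `threshold` times the
    median window count."""
    if not creation_dates:
        return []
    start = min(creation_dates)
    buckets = Counter((d - start).days // window_days for d in creation_dates)
    counts = sorted(buckets.values())
    median = counts[len(counts) // 2]
    return [start + timedelta(days=w * window_days)
            for w, c in buckets.items() if c > threshold * median]

# Toy data: a trickle of old accounts since 2020, then a spike of new
# accounts created in the days around the invasion.
dates = [date(2020, 1, 1) + timedelta(days=30 * i) for i in range(24)]
dates += [date(2022, 2, 20) + timedelta(days=i % 5) for i in range(60)]
print(creation_bursts(dates))  # windows in late February 2022
```

On real data one would pull `created_at` timestamps from the Twitter API for an account's followers and feed them in the same way.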
False news agency reports: Russian disinformation sometimes appears a lot blunter in retrospect. Take the reports by the Russian news agency TASS weeks before the invasion. Here the Russian media lashes out at Western news media, calling the military buildup verified by satellite images (which later really occurred) “fake news”.
Russia thereby used a strategy similar to that of Western media, which fact-checks the news. This creates an “it is their word against ours” mentality, which allows discrediting news from verified sources. It is becoming increasingly commonplace, not only in Russia but also in China, to falsely label verified reporting as disinformation in order to attack Western media and cause confusion.
In one report from January 10, TASS alleged that media outlets were spreading disinformation about invasion plans, plans which, as we all know, turned into reality only a couple of weeks later. In other examples, official Russian social media channels use fake-news clip art as a stamp.
Alleging that Western fact-checked media is “fake news” reminds us of narratives communicated by specific Chinese officials' accounts. Lijian Zhao, a spokesperson for China's Ministry of Foreign Affairs, presented Western media similarly at the start of the Covid-19 pandemic.
TASS also claimed that Ukrainian President Zelensky had abandoned Ukraine. This is easy to dismiss: we can check images and news reporting by reputable sources, or verify videos posted by Zelensky himself. TASS also published the fabricated bioweapon-labs narrative. All this contributed to Reuters' decision to remove TASS from its content marketplace.
Propaganda ≠ Disinformation: Propaganda doesn't equal disinformation. One aims to convince people of something; the other is an attempt to sow confusion, distrust, and irritation with verified information. The Kremlin excels in both disciplines.
Example: “The west staged this…”:
Vasily Nebenzya, the Russian ambassador to the UN, boldly claimed that the bombing of a maternity ward, which left three dead and 17 wounded, was staged.
We should not ignore that most Russian propaganda and disinformation is aimed at domestic Russians. That notion is certainly worth contesting, and there are examples here and there, such as the false claim that a Russian national was killed in Germany (reported by ZDF), that suggest otherwise.
Yet several things suggest that most Russian disinformation is not directed outside of Russia. First, there is the question of why Putin should keep caring about what the West thinks, and why he should spend more resources on it. Second, history offers examples that reveal how Russian disinformation efforts are distributed. Staff at the Internet Research Agency, a Russian troll farm, numbered several hundred people, and only a small share, around 10 percent, worked on English-speaking targets. So most of the effort was directed at the non-English-speaking world.
Take the example above: this type of propaganda is common in official Russian communication channels. Russia's narrative aims to convince people that the invasion of Ukraine is “just”, and not a war but a so-called “special military operation”, and that the soldiers fighting in it ought to be supported and celebrated.
The Russian Defense Ministry's propaganda claims: “…huge number of people support our servicemen…”. In reality, the crowd of lights most likely consisted of no more than a few hundred people, filmed in Volgograd, 350 km east of the Ukrainian border.
Showing off weapons for civilian destruction
More recently, war propaganda has concentrated on the sheer threat to civilians, in the form of showing off how capable Russian forces are of attacking civilian targets.
With the help of burn marks (the ground around the firing site ignites, as seen in the video) it was later possible to geolocate the launch site no more than 6 km east of besieged Mariupol, exactly within range to fire the improved version. The city faced a violent humanitarian crisis and hundreds of buildings were partly or completely destroyed (you can see here the data on which buildings in the city were affected; source: UNITAR).
Showing the destructive power of the TOS-1a:
Weeks after British intelligence confirmed the presence of thermobaric weaponry, which can cause unspeakable effects among civilians when fired at residential targets, Russian state media boasted with a video of a TOS-1a thermobaric MLRS being fired.
“Staged with actors” allegations
More zealous attempts at disinformation occurred around severe crimes against civilians. After the attack on the maternity hospital in Mariupol, fake accounts falsely claimed that an injured woman, Marianna Podgurskaya, was an actor who pretended to be pregnant. The woman was in fact a blogger, but she really was pregnant and was injured by the strike.
Other accounts followed up, alleging that the women in the photos taken were one and the same person, only with different makeup and clothing. Facial recognition analysis shows beyond doubt that this is false.
On March 10, even the Twitter account of the Russian Embassy to the UK followed up with a post repeating the claim and trying to corroborate the false narrative.
Twitter deleted these posts because they violated its guidelines. A quick look at Podgurskaya's Instagram also reveals that the woman was pregnant (as Eliot Higgins from Bellingcat pointed out), a fact that would have taken the embassy's social media team two seconds to verify. But judging from other missteps, a more sinister reality, that of deliberate disinformation, is more likely at play here:
Another example shows how pro-Putin accounts retell the (false) story of a Ukrainian military staff member who supposedly played the role of a Russian prisoner of war (POW). Using images of two vaguely similar people is a recurring theme that facial-feature comparison can dismiss. Sadly, by the time fact-checkers arrived on the scene, the narrative had spread across the web.
There are also misleading claims about humanitarian aid deliveries, such as one video of Russian soldiers handing out “120 tonnes of food”. The convoy of 13 small trucks is quickly analyzed. A classic Kamaz 6x4 carries a maximum payload of 12,000 kg. But these trucks are half full, and many carry diapers and toilet paper as well as food. So let's generously assume half the maximum payload: 13 × 6 tonnes equals 78 tonnes of humanitarian aid, well below the 120 tonnes claimed in the post.
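The back-of-the-envelope estimate above can be written out explicitly. A tiny sketch (the numbers come from the example; the function name is ours):

```python
def convoy_payload_tonnes(trucks, max_payload_kg, fill_ratio):
    """Rough upper bound on the cargo a convoy can deliver, in tonnes."""
    return trucks * max_payload_kg * fill_ratio / 1000

# 13 Kamaz 6x4 trucks (max payload 12,000 kg), generously assumed half full.
estimate = convoy_payload_tonnes(13, 12_000, 0.5)
print(estimate)        # 78.0 tonnes
print(estimate < 120)  # True: well below the claimed "120 tonnes"
```

Even the most generous assumptions cannot make the claim add up, which is the point of this kind of sanity check.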
At the same time, official Russian accounts try to spread disinformation about real posts to refute material by Western media, such as here, claiming (falsely) that an injured woman is a staff member of the 72nd Info Warfare PsyOps Center in Ukraine. Such claims are entirely fabricated; the image originates from AFP, which runs an extensive fact-checking service.
Spelling and language
Small spelling errors make a big difference in Russian state propaganda, it seems. In the past weeks, the hashtag #Berdyansk trended on social media after Russian troops took control of the city of 100,000 people. In early March, a video was posted by the Russian MoD channel, reading:
The problem is that the city name is spelled wrong. Aid was not sent to “Berdyansk” but to “Berdyanske”, a tiny village. Whether this was simply a spelling error is not clear. The verification is straightforward: the video shows the village's town sign. With Yandex reverse image search, we can pull out the name and verify it via Google Maps.
(There are other freely accessible OCR tools that support Ukrainian.)
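The one-letter difference between the two place names is easy to quantify with an edit-distance check, the kind of comparison that can be automated when screening place names pulled from signs via OCR. A self-contained sketch using the transliterated names and a standard Levenshtein implementation:

```python
def levenshtein(a, b):
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# One trailing letter separates the port city from the tiny village.
print(levenshtein("Berdyansk", "Berdyanske"))  # 1
```

A distance of 1 between two distinct settlements is exactly the kind of near-collision worth flagging for manual review, since it is easy to confuse, whether by accident or by design.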
Not misspelled but misread
Other examples show how Russian media intentionally misread documents. Reverse-image OCR processing of documents with Yandex Images reveals false claims about alleged evidence that “Kyiv planned an attack on the Donbas region”. Had the document been read correctly, the military training ground it mentions would have been identified as an area hundreds of kilometers away from the area in question.
Satellite image verification: Claiming destruction of buildings happened much earlier
A recurring theme is claiming that destroyed buildings were not destroyed in the Russian invasion but had been demolished much earlier. Often it takes no more than checking the archive of satellite images to dismiss such claims:
Military planned Ukraine invasion
Examples such as that of the Russian soldier Valery Berezovsky show that at least before the invasion, Russian generals and troops knew what was going to happen.
Berezovsky was stationed in Ukraine in 2014, where he took part in hostilities under a contract. A social media post revealed that he was asked to travel on a “business trip” on 02/21/22, days before Russia’s offensive started.
Indicators to verify time and date:
We have seen a few examples in which we can tell when video footage was produced. Take the simple reverse image search of pictures of the Ukrainian president in military uniform, which have been used in propaganda to show Zelensky fighting alongside his country as the invasion took place. In reality, these pictures are old and have no link to the Russian invasion of Ukraine.
Save the image and upload it to reverse image search services (Bing, Yandex, Google Images, TinEye, etc.). We find the images are from 2021, taken by news agencies and used in their reporting.
There are other ways to spot false or misleading claims about when a video or photo was posted. In the following case, a video distributed by the social media channels of the Russian Ministry of Defense was posted at least a few hours after it was recorded, judging by the watch Colonel General Mizintsev wears.
Other examples (above) show how indicators such as watches appearing in videos allow telling when videos published by Russian media channels were recorded. But now that this is known to those who produce these videos as well, it is worth questioning such indicators, as they can be tampered with (e.g. by intentionally setting the wrong time on clocks or watches).
Quality of footage
A low-quality video of Putin meeting Aeroflot staff, uploaded to and compressed by Twitter, was used to claim that Russia had produced disinformation. Ukrainian President Zelensky repeated this claim, which is dangerous. A high-resolution version of the video shows the truth. How to find better-quality versions of a video? YouTube allows filtering by resolution, and you can also search for a better-resolution version on Google.