Russian Disinfo Campaigns During the 2024 Election

During the 2024 U.S. presidential election, Russian actors were shown to have engaged in targeted disinformation campaigns aimed at undermining the election’s legitimacy and promoting Donald Trump by spreading false claims of voter fraud. A notable incident involved a fabricated video purporting to show election fraud in Arizona, part of a broader strategy to incite distrust and discord among voters.

Techjournalist
15 min read · Nov 24, 2024
Careful: fake video alert

The intelligence community was actively warning Americans to brace for a disinformation storm. That is how US media framed it at the beginning of November, when at least two fake videos appeared on the internet, designed to divide, deceive, and spread anger. After careful analysis, the videos were found to have been created by Russian actors. It is these and other examples that will remain with us in this review of OSINT cases from the 2024 US presidential election.

Not long before the election, a Homeland Security bulletin warned that domestic extremist groups could plan to sabotage election infrastructure, including ballot drop boxes, so-called soft targets. A little more than two weeks before election day, a federal intelligence alert warned that domestic violent extremists consider election equipment such a soft target.

In the run-up

A leak on the 2022 midterms featured Telegram chats of known members trying to convince others how to work the field to intimidate voters. The DDoSecrets leak called “Paramilitary Election Interferences”, which covers leaked chat logs and files detailing alleged efforts by U.S. paramilitary and far-right groups to disrupt elections, offers a peek behind right-wing extremist groups in the US and shows how plans were laid to affect the 2022 midterms. At its center is the American Patriots Three Percent militia group (AP3 or APIII).

American Patriots Three Percent: AP3 or APIII

American Patriots Three Percent (AP3), a far-right militia group aligned with the Three Percenters movement, has actively used social media platforms like Facebook for recruitment. A 2021 leak from a right-wing website exposed the identities of numerous members, shedding light on the group’s network. Founded by Scot Seddon during Barack Obama’s first presidential term, AP3 continued to grow, with Seddon turning to TikTok in July 2024 to recruit new members after an assassination attempt on Donald Trump.

Jesse Eisinger, a ProPublica editor, called Seddon in August “one of the most dangerous men in America”. He features in the files, and we can search for him and his details.

In the roughly 200GB of data from DDoSecrets, a search for the key members of the right-wing group America First Precinct Project (on a system called Aleph) highlights the group’s practical motivation for election interference.

Sweet-talking Carolyn

In one message, Carolyn S., one of the key influencers, when asked “How to get started,” wrote: “…I imagine there are drop boxes in rural areas as well to help those who cannot or don’t want to drive to the city to get the ballots in.”

Chat logs (leak by DDoSecrets)

In another exchange, when asked if the strategy is illegal, Carolyn S. replies: “There is a boundary line to the boxes, and it is documenting and recording those attempting to drop ballots after ballots in the boxes. States are also training Americans to work the polls and be poll watchers. That’s what we will be doing. Nothing illegal about that…”

The bottom line

It turns out that this freely available data is quite the goldmine for exposing right-wing election interference. Several investigations used it to show that organized far-right movements weaponized voter intimidation and election fraud myths to manipulate and undermine election outcomes. It’s not just rhetoric: it’s documented plans, coordinated efforts, and detailed evidence of action.

Let us dig into the data in Aleph by opening Hunter, DDoSecrets’ data-sharing platform. Hunter is similar to the interface OCCRP provides its users. The Aleph data interface is a powerful platform designed to help investigative journalists search, analyze, and cross-reference large datasets of leaked documents, public records, and structured information. It allows journalists to uncover hidden connections between people, companies, and assets by linking entities across various sources, such as leaked emails, corporate registries, and sanctions lists. Journalists use Aleph to streamline complex investigations, identify leads, and expose corruption, organized crime, and financial mismanagement.

Website to do OSINT searches: https://hunter.ddosecrets.com/
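Because Hunter runs on OCCRP’s open-source Aleph software, searches can also be scripted. The sketch below is a hypothetical example assuming Hunter mirrors Aleph’s standard `/api/2/entities` search endpoint; the exact path, parameters, and `ApiKey` auth scheme are assumptions drawn from Aleph, not confirmed for Hunter specifically.

```python
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

# Assumption: Hunter exposes the same API surface as a stock Aleph install.
ALEPH_BASE = "https://hunter.ddosecrets.com/api/2"

def build_search_url(query, schema=None, limit=30):
    """Assemble an Aleph-style entity search URL."""
    params = {"q": query, "limit": limit}
    if schema:
        params["filter:schema"] = schema  # e.g. "Person", "Email"
    return f"{ALEPH_BASE}/entities?{urlencode(params)}"

def extract_captions(response_json):
    """Pull the human-readable caption out of each search hit."""
    return [hit.get("caption", "") for hit in response_json.get("results", [])]

def search(query, api_key=None):
    """Run a search against the (assumed) Hunter endpoint. Needs network access."""
    req = Request(build_search_url(query))
    if api_key:
        req.add_header("Authorization", f"ApiKey {api_key}")
    with urlopen(req) as resp:
        return extract_captions(json.load(resp))
```

The same pattern works against any Aleph instance (e.g. OCCRP’s own) by swapping the base URL.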

The content

Let us talk about the big issue: voter suppression efforts. The data shows detailed plans to monitor polling stations with the intent of intimidating specific voter groups, including but not limited to minorities, immigrant communities, and marginalized populations. Leaked documents showed direct calls for an armed presence at ballot drop boxes to deter voters.

Moreover, there is visual evidence: photographs from these communications show individuals in tactical gear near polling sites.

The data shows disinformation campaigns: evidence of false narratives about election fraud spread in targeted communities, and coordinated sharing of templates for social media posts and flyers designed to mislead voters about voting dates, locations, or eligibility requirements.

The logs also reveal internal group communications: step-by-step strategies to execute voter intimidation, such as wearing uniforms to look official and filming voters to scare them, and chat discussions encouraging members to focus on precincts with high turnout from opposing demographics.

There was also evidence of election system infiltration: documents outlining efforts to take over local election boards or influence precinct-level operations, leveraging Steve Bannon’s “precinct strategy.” The files also name individuals tasked with infiltrating and disrupting election certification processes. We used these names in the search process to find more data on those actors.

The chat logs show strong evidence of media influence: coordination with right-wing media outlets to amplify conspiracies, framing the interference as “patriotic” election oversight. The groups also used media narratives to recruit members for their election interference operations.

Violent rhetoric is also visible in the data: explicit discussions about using intimidation tactics or even violence to “secure elections,” and threats directed at election officials, volunteers, and journalists who opposed their views.

The research is accompanied by videos meant to recruit people to the AP3 group:

Video from AP3 used for recruiting

In a lengthy monologue, Scot Seddon details his reasons for founding AP3 and describes how he formed an intelligence team to counter the fallout from the 2021 membership leak. Following the exposure, members of the group were contacted by the media, with some being publicly “doxxed.”

Seddon assured his followers, stating, “Anyone who has an issue with someone contacting them, AP3 has your back.” This declaration extended to journalists who reported on Seddon or his team, implying potential retaliation.

Video that leaked in the DDoSecrets media files

Election Fraud Videos

Shortly before election day, a video emerged on social media platforms, including X and Telegram, featuring an anonymous individual alleging large-scale voter fraud in Arizona. The video claimed that election officials were involved in a coordinated effort to manipulate votes against Donald Trump. This content quickly gained traction among conspiracy theorists and influencers.

U.S. federal agencies, including the FBI and the Cybersecurity and Infrastructure Security Agency (CISA), identified this video as part of a Russian disinformation campaign aimed at undermining public confidence in the electoral process.

The Arizona Secretary of State’s office refuted the video’s claims, confirming them to be false.


Link: CISA Director Jen Easterly says, “foreign adversaries are attempting to influence the election, you see it in the indictment of Russia”

The indictment reads that “the investigation has revealed that Doppelganger purchased numerous social media advertisements targeting U.S. politicians and relied on artificial intelligence to generate the content”. In focus since 2022: Sergei Kiriyenko, the First Deputy Chief of Staff of the Presidential Administration of Russia.

Indictment explaining how Facebook was used to spread disinformation

Above all, one person was named in the indictment documents: Sergei Vladilenovich Kiriyenko. He appears 33 times in the 277-page document. According to London-based researcher Kamil Galeev, Kiriyenko, born Sergey Israitel into a mixed Russian-Jewish family, rose from modest beginnings in subtropical Sochi to become a pivotal figure in Russian politics. Galeev has written an excellent X thread on Kiriyenko.

After adopting his mother’s Slavic surname, likely to improve career prospects in the USSR, he began his bureaucratic path as a Komsomol leader, benefiting from the organization’s influence during the Soviet-to-post-Soviet transition.

A shipbuilding graduate and former army officer, Kiriyenko leveraged his connections in Nizhny Novgorod’s political and business circles, including Boris Nemtsov, to secure key roles in Moscow. Appointed Prime Minister at just 35, his tenure faced economic catastrophe during the 1998 default, but his career revived under Vladimir Putin’s technocratic governance. Known for his pragmatism and adaptability, Kiriyenko remains a central figure in Russia’s managerial elite. Since 2022, the US has imposed sanctions on Kiriyenko, along with his son Vladimir, for their connections to the Russian government.

Another fake video, and another

“The FBI said it is ‘aware’ of two fake videos claiming to be from the agency and related to the 2024 election,” several US media reported in early November. The FBI then issued warnings about the two fake videos circulating online, part of a larger Russian disinformation campaign targeting the 2024 U.S. presidential election.

The videos, flagged for spreading false claims about ballot fraud and Vice President Kamala Harris’ husband, Doug Emhoff, are part of a sophisticated network producing hundreds of similar fakes this year.

The video featuring a supposed Haitian immigrant claiming intentions to vote multiple times in Georgia has amassed over half a million views and became a talking point for Republican politician Amy Kremer, who used it to cast doubt on the integrity of the U.S. democratic process. However, OSINT investigations have revealed the video as part of a larger Russian disinformation operation designed to destabilize trust in the electoral system.

Source: media reporting

On November 1st, the Office of the Director of National Intelligence (ODNI), the FBI, and the Cybersecurity and Infrastructure Security Agency (CISA) announced on Twitter that they could link the videos to Russian influence actors, who “manufactured a recent video that falsely depicted individuals claiming to be from Haiti and voting illegally in multiple counties in Georgia”.

Another fake video, again with FBI and CIA branding (shared by Juan Jose Garcia on X). The video, showing alleged FBI personnel, was debunked by the FBI on its X account and viewed 1.1 million times.

OSINT Analysis

A week ago, I caught up with an old friend about the US election. He mentioned watching a stream on the conservative platform Rumble and confidently asserted, “It’s all rigged anyway.” I reassured him that the U.S. election system is widely regarded as secure, with multiple studies, reports, and expert evaluations supporting its resilience against large-scale fraud.

This backdrop makes a recent CNN report all the more concerning: a pro-Trump influencer from Massachusetts, using the X handle @AlphaFox78, was reportedly paid by Russian propagandists to circulate misleading videos promoting election fraud narratives. One video, now widely debunked, featured a supposed Haitian immigrant claiming “he planned to vote twice in Georgia for Vice President Kamala Harris”.

@AlphaFox78 on Twitter

Simeon Boikov, aka “Aussie Cossack”, a Russian propagandist podcaster who was granted Russian citizenship by a decree signed by President Vladimir Putin last year, allegedly offered AlphaFox78, whose X account has 650,000 followers, $100 to post the fake video.

I scanned the account @AlphaFox78 for details. The website in the X bio is completely redacted, so no real news there. The account maintains an online shop with questionable merch, and its X and YouTube accounts show no signs of intel on the person behind it. More telling is leak data. It unearthed the name “Josh K.” and an email that leads to two further email addresses (one for a gaming account), a link to East Longmeadow, a prominent Cash App account, an account at the conservative streaming platform Rumble (AlphaFox1978), 24 items sold on eBay and, interestingly, a now-private “Bible” account, as well as Google images from a remote lake in Maine.

WHOIS investigation: who owned his website?

On at least one occasion, AlphaFox used a false ID/persona under the name “Bob M.”. Additionally, and perhaps telling, K. owns an Adobe account, which might be used for creating family images but might also be used to create videos, and possibly not merely for “sharing videos” as claimed, but for producing videos himself with the purpose of misleading and misinforming.

For Simeon’s publicly shared email account, I came across some stiff-lipped Russian thumbnails featuring Russian buildings and uniforms (see above). On his Google Reviews account, one review read (redacted for privacy):

“Really nice girl works at XXX shoes (a shoe store in Austr.) who is very polite and beautiful. She is very knowledgeable and helpful. I will definitely buying my next pair of Orthopedic shoes from XXX. P.S. I promise not to be jealous when the other customers come to buy shoes from her”.

Network of Russian disinfo videos linked to new fake videos for the US election

Story by the BBC: over 300 similar videos were discovered by Logically, a UK tech company using AI to detect disinformation on the internet. Guillaume Kuster of CheckFirst said that his team can link the operation behind the fake videos to Russia “thanks to assets that we know were produced by the Russian company, a company registered in the country”.

To link it to Russia, analysts need to compare it to other material. Such material exists, in the form of over 300 videos found by the online research firm Logically. The fake video, which even featured the FBI logo, matched the style of those videos. Foremost, the use of “convincing graphics and text to look like content from US government agencies as well as more than 50 news organizations” convinced the OSINT researchers.

CheckFirst, the Finland-based analytics company, independently investigated the network behind the videos and traced their origins to a Russian marketing agency and a Russian-associated IP address.

In a report from September (PDF), an identifying feature was the following: “…Content amalgamation, or the blending of different content formats to create multi-layered stories, remains a key tactic of the campaign”.

“CheckFirst found that the style, messages and themes of the videos align with other operations connected to the Kremlin, an assessment backed up by BBC Verify research.” BBC Reporting

Among the organizations targeted by the so-called “Operation Overload” back then was, interestingly, also the German media, among them Süddeutsche Zeitung, according to CheckFirst:

CheckFirst speaks of another clue that links the videos to Russia: “The other source of evidence is a data set we got access to that proves that one of the machines that was used to send emails [by the group] was located in Russia”, says Guillaume Kuster from CheckFirst.

But how? Professional tactics suggest this can be done through metadata that reveals an email’s point of origin, WHOIS data that ties domains to a Russian agency, or network analysis that links an IP address to Russian-controlled infrastructure. Here is a tool set for this:

Metadata Analysis of Emails: MXToolbox Email Header Analyzer (By analyzing the email headers, investigators can trace the email’s journey and identify the originating server’s IP address, which can then be geolocated to determine its origin)

WHOIS Lookup for Domain Registration: ICANN WHOIS Lookup (Investigators can uncover the ownership and registration details of domains used in disinformation campaigns, potentially linking them to known entities or regions)

Reverse DNS and IP Address Tracing: Shodan (By inputting an IP address, investigators can discover associated domains, services running on the IP, and other connected devices, helping to map the infrastructure behind a disinformation campaign.)

Network Traffic and Botnet Analysis: Maltego (Investigators can map relationships between domains, IP addresses, email addresses, and other entities to identify patterns indicative of coordinated disinformation efforts.)

Data Set Correlation and Infrastructure Analysis: VirusTotal (By submitting suspicious files or URLs, investigators can determine if they are part of known malicious campaigns and identify commonalities in infrastructure used across different disinformation efforts)
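As a taste of the first technique, email header analysis, here is a minimal stdlib-only Python sketch that walks the Received chain of a raw message, newest hop first. The sample headers and IP addresses below are fabricated for illustration; real investigations work on exported originals, and bear in mind that the earliest Received hops can be forged by the sender.

```python
import re
from email import message_from_string

# Fabricated sample message: two relay hops, then headers, then body.
RAW = (
    "Received: from mail.example.ru (mail.example.ru [203.0.113.50])\n"
    " by mx.target.com with ESMTP; Fri, 1 Nov 2024 10:00:00 +0000\n"
    "Received: from sender-pc ([198.51.100.7])\n"
    " by mail.example.ru with SMTP; Fri, 1 Nov 2024 09:59:58 +0000\n"
    "From: actor@example.ru\n"
    "Subject: test\n"
    "\n"
    "body\n"
)

# IPv4 address inside square brackets, as relays usually record it.
IP_RE = re.compile(r"\[(\d{1,3}(?:\.\d{1,3}){3})\]")

def received_ips(raw_message):
    """Return the relay IPs from each Received header, newest hop first.
    The last entry is usually closest to the true point of origin."""
    msg = message_from_string(raw_message)
    ips = []
    for hop in msg.get_all("Received", []):
        m = IP_RE.search(hop)
        if m:
            ips.append(m.group(1))
    return ips

print(received_ips(RAW))  # -> ['203.0.113.50', '198.51.100.7']
```

The resulting origin candidate can then be fed into the WHOIS and Shodan steps listed above to test whether it sits on Russian-controlled infrastructure.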

Efforts to fight disinfo

Not much can be done once a video has been shared millions of times. For the US government, it then becomes a matter of damage control. That means, according to officials, flooding the scene with counter material: footage and information that outperform the fake stuff. One such initiative is CISA’s following campaign:

CISA is fighting disinformation during the election period with a website called Rumor vs. Reality. It is supposed to answer false suspicions such as “A malicious actor can easily defraud an election by printing and sending in extra mail-in ballots” by responding with: “Committing fraud through photocopied or home-printed ballots would be highly difficult to do successfully…. (by applying) information checks, barcodes, watermarks, and precise paper weights.”

The thing with AI

Research from the Institute for Strategic Dialogue (ISD) found interesting evidence that AI-generated content itself is not the problem. Much more so is the fact that voters doubt the authenticity of content online, most often of content that was authentic.

ISD found that during the U.S. election, the primary issue was not the prevalence of AI-generated content, but rather the widespread misidentification of authentic material as AI-fabricated. ISD’s analysis found that users misidentified content in 52% of cases, often claiming authentic content was AI-generated and justifying their assessments with flawed OSINT strategies or unreliable online tools. (ISD Global) This trend underscores a growing skepticism among voters, leading to doubts about the authenticity of genuine information and highlighting the need for improved digital literacy and critical evaluation skills.

I collected a number of OSINT tools to spot false or AI-generated content.

Since a friend asked me recently about verifying the authenticity of images or videos — especially in the age of AI-generated content — here are 6 free tools/platforms for forensic image analysis:

Forensically: an online suite offering tools like clone detection, error level analysis, and metadata extraction to analyze digital images. https://lnkd.in/dnm3HBBY

VideoCleaner: A free, open-source forensic video enhancement software used by law enforcement and investigators to detect tampering and enhance video quality. https://lnkd.in/d8zfJhYH

Ghiro: An open-source digital image forensic analysis tool that automates the process of analyzing image files and extracting metadata. (Image Forensic) https://lnkd.in/dXBND5x6

Sherloq: An open-source image forensic toolset designed to analyze and detect anomalies in digital images, including potential manipulations. (GitHub) https://lnkd.in/dqZn_hX7

PhotoRec: A free, open-source utility for recovering lost files, including images and videos, from various storage media. It can be useful in forensic investigations to retrieve deleted media. PhotoRec: https://lnkd.in/d6W7ZUC9

InVID-WeVerify: A browser extension that assists in verifying the authenticity of images and videos shared online by providing tools for reverse image search, keyframe extraction, and metadata analysis. https://lnkd.in/diqsrMVH
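As a taste of what suites like Forensically or Ghiro automate, here is a small stdlib-only Python sketch of one basic forensic signal: whether a JPEG still carries an EXIF (APP1) segment at all. Absent or rewritten metadata is not proof of manipulation, but it is one of the clues these tools surface, since most camera originals contain EXIF while many editing and export pipelines strip or replace it.

```python
import struct

def jpeg_segments(data):
    """Yield (marker, payload) pairs for the metadata segments of a JPEG."""
    if data[:2] != b"\xff\xd8":  # SOI marker: not a JPEG at all
        raise ValueError("not a JPEG")
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:      # lost sync with the segment structure
            break
        marker = data[i + 1]
        if marker == 0xDA:       # SOS: compressed image data follows
            break
        # Segment length is big-endian and includes its own two bytes.
        (length,) = struct.unpack(">H", data[i + 2:i + 4])
        yield marker, data[i + 4:i + 2 + length]
        i += 2 + length

def has_exif(data):
    """True if an APP1 segment with the EXIF identifier is present."""
    return any(marker == 0xE1 and payload.startswith(b"Exif\x00\x00")
               for marker, payload in jpeg_segments(data))
```

For example, `has_exif(open("photo.jpg", "rb").read())` answers the question for a file on disk; full tools go on to parse the EXIF tags themselves (camera model, timestamps, GPS) and cross-check them for inconsistencies.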

The verdict on CISA’s campaign: not greatly effective. While direct data on user engagement is lacking, the “Rumor vs. Reality” webpage serves as a vital component of CISA’s strategy to counter election disinformation, providing reliable information to both the public and election officials.

The Russian embassy, as expected, dismissed the allegations of fake videos linked to Russian actors as “baseless” and “slander.”

However, the impact of these videos was minimal. They garnered only a few thousand views and were primarily spread by likely bot accounts with negligible influence on the platforms mentioned.

On the other hand, the recognition that Russian actors are trying to meddle with the US democratic process may be even more important. Just check out the post by Scripps News national correspondent Elizabeth Landers, a former Vice reporter, when she announced that the video of ballots cast for Trump allegedly being ripped up in Bucks County was manufactured and disseminated by Russia: almost 3 million impressions online. As if Russia hopes to be recognised as a disruptive factor.

Post by US reporter, shared widely on X

The final verdict: If Russia intends to influence a U.S. election of this magnitude, it must deploy far more sophisticated strategies than poorly crafted fake videos mimicking legitimate media reporting. Even with AI-generated content, such efforts fail to make a significant impact.

However, these seemingly small attempts send a loud and troubling message: that democracy is under attack by Russia. This signal, though subtle, carries dangerous implications that should not be underestimated diplomatically.

On a positive note, many of these videos were swiftly removed from the internet — a notable achievement for tech companies like X and YouTube, which have historically struggled with such enforcement. In this instance, they acted decisively, aligning closely with the directives of election and intelligence authorities.

TJ


Written by Techjournalist

Investigative journalist with a technical edge, interested in open-source investigations, satellite imagery, R, Python, AI, data journalism and injustice
