
Under the Radar: Unmasking the Coordinated Reach of Russian Doppelgänger Bots

A new generation of information warfare tools still poses the same threats to Ukraine, unity in Europe, and the US elections

Russian President Vladimir Putin delivers his state-of-the-nation address in Moscow, Thursday 29 February 2024. Photo: Alexander Zemlianichenko/Associated Press/Alamy


The Intelligence Committee under the President of Ukraine warned in a statement on 27 February 2024 of Russia’s plans to undermine Ukraine, using information operations to divert global attention from the ongoing war. According to the statement, the objectives include spreading false narratives about Ukraine and its partners, demoralizing Ukrainians, sowing panic, and creating divisions in the military and civilian spheres.

According to the same statement, Russian special services spent $1.5 billion on information operations in November 2023, including $250 million for the ‘Maidan-3’ special operation on Telegram. Maidan-3 is due to peak in March-May 2024, intensifying destructive narratives globally: questioning the legitimacy of the Ukrainian government, spreading panic, creating artificial divisions, sowing discord with allies, and promoting conspiracy theories. Ukrainian security services are calling for joint resistance and comprehensive security measures, especially in the information space, to counter the global threats of Russia’s ongoing hybrid war.

One of Russia’s most widespread tactics is the “Doppelgänger bot network” in which state actors utilize Doppelgänger bots on X/Twitter to disseminate misleading narratives, sow discord, and influence public opinion globally. Researchers have revealed the scale, methods, and adaptability of this disinformation campaign, emphasizing its impact on Western democracies.



“A Dark Double”

In 2022, EU DisinfoLab revealed a Russia-based influence operation active in Europe, aptly named Doppelgänger. In German folk legend, a “doppelgänger” is a shadow self, a malevolent double that trades in deception and lies. In a less romantic twist, Kremlin state-sponsored actors mimicked authentic publications, cloning at least 17 genuine media outlets, including Bild and The Guardian. They used fake, look-alike domain names and designs to create “a double”, garnished with fake articles, videos, and polls.

The Doppelgänger network uses a cross-platform approach, operating across web pages and social media networks (Facebook, X/Twitter, YouTube) and employing diverse formats such as videos and online ads to build a mirror labyrinth, with repetition as the mechanism for forming a parallel reality.

At the core of the operations are fake websites. They post the original fabrication, and social media platforms jump in, echoing and amplifying the impact. The content usually comprises three elements, coordinated between Doppelgänger assets across the platforms.

Since 2022, several research groups have been tracking thousands of Doppelgänger bots operating on various platforms. A US group, dTeam, followed Twitter/X campaigns in Ukrainian, German, French, and English, providing the discovered data to various US government partners, including the FBI and the DSG (Department of Security and Governance). The Insikt Group identified several Twitter/X campaigns targeting various demographics in early November 2023. WIRED obtained data from two disinformation research groups revealing a coordinated Doppelgänger bot effort on Telegram and X/Twitter.

All of these researchers arrived at mind-boggling findings.


How Doppelgänger Bot Networks Operate

Attacks aim to undermine trust in the authenticity of social media interactions, manipulate online conversations, and influence public sentiment. Artificially inflated follower counts may also be leveraged for deceptive purposes.

The network can generate up to 5,000 new Doppelgänger bot accounts within a few hours. Doppelgänger bots clone the profiles of ordinary users to create realistic-looking fake identities. The network initially faked celebrity accounts: in its latest campaign, it created and disseminated memes in many languages depicting stars denouncing Western aid to Ukraine. The majority of engineered accounts, however, mimic regular users, making it challenging for the average social media user to discern genuine profiles from fake ones.

There are two types of accounts. Content poster accounts, also known as primary bots, generate posts, memes, and videos; quote bot accounts amplify the impact by replying to and reposting the original content. The operational structure involves the content poster accounts creating a three-layered post on X/Twitter: an initial link to an article, a follow-up comment, and a subsequent link to visual content (see the sketch below). This approach enhances the reach and impact of the disinformation.
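To make that structure concrete, here is a minimal sketch in Python. It is purely illustrative: the class, field names, handles, and URLs are invented for this article, not drawn from the researchers’ data.

```python
# Illustrative model of the reported posting pattern; every name and
# value below is invented for demonstration.
from dataclasses import dataclass, field

@dataclass
class LayeredPost:
    """The three-layered post a content poster (primary bot) publishes."""
    article_link: str   # layer 1: initial link to an article
    comment: str        # layer 2: follow-up comment framing the story
    visual_link: str    # layer 3: subsequent link to a meme or video
    amplifiers: list[str] = field(default_factory=list)  # quote-bot handles

post = LayeredPost(
    article_link="https://fake-outlet.example/story",
    comment="Western aid is drying up...",
    visual_link="https://cdn.example/meme.png",
)
# Quote bots amplify by replying to and reposting the original content.
post.amplifiers += ["jeffb82934", "donnak44192"]
print(len(post.amplifiers), "amplifying accounts")
```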

The network demonstrates continual activity, without a single break exceeding three days. For instance, during the last week of January 2024, approximately 1,000 bots were suspended by X/Twitter. After a pause, they resumed activity on 31 January, unveiling new content and a fresh set of bots. When X/Twitter suspends the accounts, the researchers use the downtime to continue tracking the network, collecting data, and documenting it comprehensively.
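That “no break longer than three days” pattern is simple to verify against a feed of posting times. Below is a minimal sketch, assuming one has a list of timestamps for the network’s posts; the dates are invented for illustration.

```python
# A minimal sketch: find the longest silence in a stream of post timestamps.
from datetime import datetime, timedelta

def longest_gap(timestamps: list[datetime]) -> timedelta:
    """Return the longest pause between consecutive posts."""
    ts = sorted(timestamps)
    return max((b - a for a, b in zip(ts, ts[1:])), default=timedelta(0))

posts = [
    datetime(2024, 1, 25),
    datetime(2024, 1, 28),  # mass suspension around this time
    datetime(2024, 1, 31),  # activity resumes with a fresh set of bots
    datetime(2024, 2, 1),
]
print(longest_gap(posts) <= timedelta(days=3))  # True: no break over 3 days
```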

The campaign adapts narratives to target diverse demographics, impacting many regions around the globe.



Objectives and Narratives

All of Russia’s malicious networks share the objective of sowing unrest and chaos in Western democracies, but the narratives vary depending on the demographic targeted. The networks create, promote, and amplify narratives supporting far-left and far-right groups and causes. Campaigns attack the EU, NATO, the US, the UK, and Canada. They disseminate narratives portraying Ukraine as “not a democracy”, “corrupt”, and “losing the war”, and emphasize “forced conscription”. They attempt to influence politicians and the global community to withdraw support for Ukraine by playing on domestic grievances, such as “our tax dollars support corrupt Ukraine, leaving veterans homeless”.


Targeting Ukraine

A campaign impersonating reputable Ukrainian news organizations, including the Ukrainian Independent Information Agency of News (UNIAN), Obozrevatel, and RBC-Ukraine, targeted Ukrainian citizens. According to Insikt Group research, the tactics employed included “brandjacking and utilizing well-crafted domains to mimic authentic news organizations”.

The researchers observed common themes across the articles: emphasizing Ukrainian military struggles and portraying Ukraine’s western allies as unreliable. Some accounts promoted images and memes critical of President Zelensky, attempted to spread a rumor that Ukraine’s former commander-in-chief, General Valeriy Zaluzhny, was planning to run for president in 2024, and alleged that Zaluzhny would win if elections were held.

The researchers found that the campaign impersonating Obozrevatel created and amplified malign narratives: emphasizing the alleged US prioritization of Israel over Ukraine, questioning the EU’s ability to manage multiple conflicts, and predicting a discouraging future for Ukraine’s military prospects.

In a fake version of UNIAN, the content creators portrayed Ukraine’s military strategy as weak, expressed doubts about Ukraine’s ability to win, and exaggerated the casualty rate among Ukrainian soldiers.

In the fake RBC-Ukraine articles, the impersonators implied that western support for Ukraine is waning, suggested that the US is considering redirecting military aid from Ukraine to Israel, and criticized Ukraine’s alleged expansion of military recruitment amid personnel shortages.

Insikt Group identified over 800 social media accounts engaged in automated Coordinated Inauthentic Behavior (CIB). Almost every account followed the same username model: firstname[letter][roughly 5 or 6 digits]. The majority of accounts recycled common Western first names such as “jeff”, “donald”, “donna”, or “dorothy”, and all were registered on the platform in October 2023. The network also relied on two stages of link obfuscation, likely aimed at evading detection.
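A username model that regular is straightforward to flag in code. The sketch below is a hedged illustration, not Insikt Group’s actual tooling; the name list is a small sample from the report, and the regex details are assumptions.

```python
# Flag handles matching the reported model: a common first name, one extra
# letter, then roughly five or six digits (e.g. "jeffb82934"). The exact
# rules are assumptions; real analysis would add registration dates and
# behavioural signals, since this pattern alone also catches real users.
import re

COMMON_NAMES = {"jeff", "donald", "donna", "dorothy"}  # sample from the report
PATTERN = re.compile(r"^([a-z]+)([a-z])(\d{5,6})$")

def matches_cib_model(handle: str) -> bool:
    m = PATTERN.match(handle.lower())
    return bool(m) and m.group(1) in COMMON_NAMES

for handle in ["jeffb82934", "dorothym55012", "realuser99"]:
    print(handle, matches_cib_model(handle))  # True, True, False
```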



Targeting the US

In the US, Russian state-sponsored malicious actors are attempting to influence the 2024 presidential election through inauthentic news outlets focused on US politics and elections. They usually claim to be a non-partisan source to build credibility, the Insikt Group reported. Articles, potentially AI-generated, criticize the Biden administration and suggest that its popularity is waning. One fake outlet, MyPride, published inauthentic editorial content critical of LGBTQ+ rights in the US, fueling hostile rhetoric with articles portraying LGBTQ+ activism negatively and criticizing the US military’s inclusivity.

The data WIRED obtained from the two disinformation research groups reveals a coordinated Russian effort on Telegram and X/Twitter to exacerbate tensions by pushing the narrative that the US is on the brink of civil war.

Starting in late January 2024, after the US Supreme Court ruled in favor of President Joe Biden’s administration on a Texas border issue, Russian politicians spoke of a “bloody civil war”, “the weakening of US hegemony”, and a “People’s Republic of Texas”. Russian state media, influencers, and bloggers amplified these narratives.

A network of bot accounts previously associated with the Doppelgänger campaign joined the discussion of the Texas border issue on X/Twitter. Unlike previous Doppelgänger campaigns, which shared links to fake websites mimicking legitimate ones, the Texas border campaign linked to original websites run by Doppelgänger operatives, which published articles tailored to the desired narrative. For instance, an article on the fake site Warfare Insider claimed Texas had become a symbol of the clash between state and federal authorities.




Targeting Germany

In Germany, Insikt Group research showed, disinformation campaigns focused on European and global migration challenges: disseminating content that stokes nationalist and anti-immigrant sentiment, predicting further migration driven by environmental disasters and the war in Ukraine, portraying the German government as failing to manage the refugee crisis, exploiting political polarization, and casting Ukraine as a source of German political divisions.

The campaign aimed to cast doubt on NATO’s unity and effectiveness, criticizing German leadership and military spending. Narratives of impending economic decline, doubts about the effectiveness of sanctions against Russia, and portrayals of the rise of the far-right Eurosceptic party AfD as a response to government failures were prevalent. The networks used the German-language inauthentic news outlets Besuchszweck, Grenzezank, and Haüyne Scherben.


Countermeasures: Addressing the Doppelgänger Threat

X/Twitter suspension efforts are ongoing but have proved insufficient to prevent the creation of new bot batches. The strategy of suspending fake accounts after their content has been posted is deemed ineffective, as by then the damage has been inflicted. The network’s resilience in generating new accounts, and its adaptability, pose a challenge to mitigation efforts.

The Doppelgänger bot strategy aims to change the political landscape by influencing public opinion and behaviour, and it is broadly employed as a weapon in Russian information warfare. According to researchers, the tactics continue to evolve and now include innovative techniques such as the use of first- and second-stage websites, the creation of original but inauthentic news organizations, and the use of AI.
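The “first- and second-stage website” technique means the link a bot shares is rarely the final destination: readers are routed through an intermediate page before reaching the fake article. Below is a minimal sketch of exposing such a chain, assuming the stages are implemented as ordinary HTTP redirects; the URL is a placeholder.

```python
# A minimal sketch: follow HTTP redirects to expose where a shared
# first-stage link ultimately lands. Real campaigns may also use
# JavaScript-based redirects that this approach would not catch.
import requests

def redirect_chain(url: str) -> list[str]:
    """Return every hop from the shared link to the final destination."""
    resp = requests.get(url, timeout=10, allow_redirects=True)
    return [r.url for r in resp.history] + [resp.url]

for hop in redirect_chain("https://example.com"):  # placeholder URL
    print(hop)
```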

Urgent countermeasures are essential. Increased awareness, information warfare literacy, and further research are crucial components in mitigating the impact of the Russian Doppelgänger threat. Researchers call for urgent, adaptive countermeasures against this persistent and increasingly sophisticated threat to democratic institutions.

