Big Lies & Rotten Herrings: 17 Kremlin Disinformation Techniques You Need to Know Now
As the US Presidential elections approach, Russian Intelligence expert Zarina Zabrisky provides a comprehensive guide to Putin’s latest propaganda ploys.
The Kremlin’s disinformation playbook for waging its self-declared global information war was originally developed by Soviet military strategists. During the Cold War, the Soviets conducted over 10,000 disinformation operations — among them the fabrication and dissemination of conspiracy theories that the FBI and CIA were involved in the assassination of John F. Kennedy and that the US Government had invented the AIDS virus. At the height of the Cold War, up to 15,000 KGB officers are said to have worked on psychological and disinformation warfare, which they learnt in a special “active measures” course created by then KGB head Yuri Andropov.
Russia’s disinformation campaigns evolved with the times to include spreading disinformation through international news agencies, TV channels and online news websites. Today, they have become increasingly sophisticated, moving on from outright falsehoods to more subtle forms of audience manipulation — all with the help of RT, Sputnik and others in the network.
According to experts, Russia has been trying to destabilise Europe through disinformation since 2015. However, it was the aftermath of the 2016 US Presidential elections that attracted international attention to the issue as US intelligence agencies called out Russia’s propagation of false news and inflammatory media stories meant to sway the vote in Donald Trump’s favour. There’s one principal goal to Russia’s disinformation efforts: weaken the Western democratic system from within. While RT and Sputnik use a plethora of disinformation techniques derived from the Soviet era, we are concentrating on the most pertinent ones in the Kremlin’s playbook.
1. Pushing Kremlin Narratives
The Kremlin has been working to create a new “post-Western world order” narrative, in which Western countries and institutions no longer enjoy the power and influence they once did. Major Kremlin narratives fall into two groups: negative narratives, which are anti-elite, anti-EU and anti-NATO and promote a discourse of danger (the supposedly rising threat of extremism, migration and Islamic terrorism); and positive narratives, which cast Russia and Putin as saviours of traditional values.
2. Amplifying Extreme Voices
This is one of the Kremlin network’s most used tactics, with a variety of far-right activists, conspiracy theorists and extremists featured as commentators on its media outlets. Manuel Ochsenreiter, the editor of the neo-Nazi magazine Zuerst!, is a frequent guest speaker on German politics on RT.
Other extreme voices featured on RT include the neo-Nazi and white supremacist Richard Spencer. A textbook taught in Russian military education institutions explains the types of groups and opinions these extreme voices are meant to amplify: “Encouraging all kinds of separatism and ethnic, social and racial conflicts, actively supporting all dissident movements — extremist, racist, and sectarian groups, thus destabilizing internal political processes.”
3. Rapid Fire Conspiracy/Overload
This method floods the audience with images of violent protests, vandalism, fires, injuries and deaths, intended to shock the nervous system and cause stress in the target audience. The repetition of triggering words and images, the rapid switching of stories and inconsistency are intended to produce depression, panic, fear and confusion.

During global information campaigns, such as those run by RT and Sputnik, large amounts of conflicting information disorient the target audience, and their programmes appear deliberately designed to create an atmosphere of danger and crisis. With sufficient exposure, symptoms resembling PTSD can develop, impairing the ability to think critically and act rationally.

The propagandists then offer solutions at the moment of perceived crisis. Physiological changes to the brain alter individual and collective behaviour. During the Yellow Vests movement, for example, RT, Ruptly and Sputnik streamed live reports from protests in France, Belgium, Germany and other countries on YouTube and Facebook, focusing on violence and on graphic, disturbing images. What distinguishes the coverage of violent events by RT, Sputnik and others from that of most international news outlets is the sheer volume of disturbing material, designed to overwhelm the viewer, and the deliberate strategy of audience manipulation behind it.
4. Influence by Suggestion
Suggestion is used to target groups that lack critical thinking, such as the very religious, the under-educated, conformists, the young, and people in a state of panic, fear or stress. Using the “authority aura” effect, recognized leaders deliver emotionally charged speeches. Group environments, such as rallies and online forums, create the effect of “psychological contagion” and help to consolidate people around the ideas presented.
For example, in 2016, trolls from the Internet Research Agency (IRA), a St Petersburg troll farm, organized pro-Trump rallies in Florida, Idaho and Philadelphia, targeting miners and low-income demographics.
5. Influence by Persuasion
This method targets groups prone to thinking critically, with a special focus on leaders and decision-makers in society, a group RT editor-in-chief Margarita Simonyan has said the outlet is winning over in Europe. It uses logic, comparisons, arguments, multiple points of view, sociological polls and statistics, but its reporting is selective, omitting key facts in order to create a biased but persuasive story.
6. The 60/40 Method
Invented by Joseph Goebbels, Hitler’s Minister of Propaganda in the Third Reich, the method holds that 60% of coverage should be objective reporting that establishes trust, while the remaining 40% injects disinformation at critical moments, exploiting the trust that has been built to convince the audience. The proportion may vary depending on the specific propaganda outlet, but the principle remains the same.
7. “Angle”/Perspective
This centres on creating a single perspective by presenting only one point of view. According to disinformation expert Ben Nimmo, the angle tactic involves selecting an “interviewee on the basis of their beliefs, rather than their credibility, and not providing balancing coverage. […] Rather than misreporting facts, RT relies on the selective use of interviewees and quotes, giving substantial air time to pundits who validate the Kremlin narrative, and little or no air time to its opponents.”
8. Distraction/Deflection
This reflexive control technique shifts attention away from stories unfavourable to the Kremlin and towards school shootings, murders, catastrophes, terrorist attacks and natural disasters.
For instance, in the wake of the Parkland, Florida, school shooting, troll and bot-tracking sites reported an immediate rise in related tweets from Russia-linked Twitter accounts, bots and trolls. This took place in the same week that the media started reporting a story, unfavourable to the Kremlin, on the ties between Donald Trump’s former campaign manager Paul Manafort, Russian oligarch Oleg Deripaska and the Russian government. The Kremlin’s ire about the media coverage is illustrated by its threat to block access to YouTube and Instagram in Russia if the sites didn’t remove video and photographs related to the story.
9. Trolling
Trolling people whom Russia perceives as its opponents in the West is a long-established ploy across forums, social networks, portals, chat rooms and news sites. This includes aggressive, mocking or abusive messages.
Trolls can cause significant damage: spoiling online public discussions, spreading harmful or destructive ideas, destroying a sense of mutual trust in a community and discrediting opponents. RT’s use of trolling methods is acknowledged by its editor-in-chief, Margarita Simonyan:
“Actually, I am just trolling them [Western governments and press] … Since we were registered as Foreign Agents, we made it into a trolling show. At all conferences and forums, we put up a giant banner “Foreign Agent.” We changed the perception of the meaning of the word. Now everyone knows it is delirium imposed on us by the US government, there is nothing terrible or ominous about us and we are just laughing at it. These days, it is the best method out of these situations – to laugh at yourself and make everyone laugh at our opponent.”
A good example is a video produced by RT, with a telling name: “RT exposed in leaked video: Watch how evil ‘Kremlin propaganda bullhorn’ REALLY works.” By using sarcasm, RT dismisses allegations, discredits opponents and distracts from its offensive campaign.
10. Repetition
One of the basic and most effective methods of propaganda is the tireless repetition of the same statements, so that people get used to them and begin to accept them, relying not on critical thinking, but on faith mechanisms. Disinformation is spread via seemingly unrelated sources and echo chambers to create credibility and an effect of omnipresence. The methods below all use repetition to achieve this effect.
11. Positive Image Engineering
This involves creating a positive image of politicians or parties in order to popularize Russia’s foreign and domestic policy agenda among foreign audiences.
For example, in 2017, Ruptly produced a video making a fake claim that a New York restaurant served a burger named after Putin to celebrate his 65th birthday. In 2016, Ruptly, RT and Sputnik reported Putin’s announcement that Russia had developed and registered the most effective Ebola vaccine in the world, a claim described as “shocking” and “kind of mind-boggling” by Professor Ira Longini, who helped develop the only Ebola vaccine to have passed the highest stage of testing.
12. Rotten Herring
This propaganda technique prompts a direct association between a person, group or institution and a scandalous accusation.
The sequence can be the following: first, a false accusation is circulated on social media. Then, bots and trolls push the false accusation, through related hashtags, for example. Subsequently, the mainstream media reports the accusation as a story, which is then echoed by many outlets. A public discussion ensues, with experts, prominent public figures and laymen joining in with their opinions. After a certain number of repetitions, a negative emotional response forms towards the person, group or institution being falsely accused.
Even if the fake story is debunked, the negative association with the object of the accusation remains, lingering like the smell of rotten herring.
This tactic was used against Hillary Clinton during the 2016 US presidential elections and, less successfully, against Emmanuel Macron in the 2017 French presidential elections.
13. The Big Lie
This tactic is similar to the rotten herring technique, but in this case the fabricated narrative must be of monstrous proportions.
Examples include Sputnik circulating a false story during the French Presidential elections about Emmanuel Macron being a “US agent” lobbying in favour of banks; and the “Lisa Case” in Germany in 2016, where Russian media outlets widely circulated a fake news story of a 13-year-old Russian-German girl kidnapped and raped by three men of Middle Eastern or North African origin. Russian Foreign Minister Sergey Lavrov then falsely accused German officials of a politically-motivated cover-up.
According to cybersecurity expert Bruce Schneier, a single “big lie” was more common in Soviet disinformation during the 1980s, but “today it is more about many contradictory alternative truths—a ‘firehose of falsehood’—that distorts the political debate.”
14. Rumour
Rumours can significantly reinforce certain stereotypes among target population groups.
“Spetz-propagandists classify rumours into categories such as absolutely false; unreliable with elements of credibility; plausible and authentic rumors with elements of improbability… Rumours are also divided into categories by their emotional origin: rumour-desire; rumour-scare; aggressive-rumour. Rumours are released at certain intervals (10-12 days), and are designed using certain rules, like layering.”
15. Half-Truths
This ploy involves reporting only part of the truth in order to mislead the audience and fit the Kremlin narrative: RT emphasizes negative events in the West and ignores positive developments, while hardly ever covering or criticising the Russian government. According to Sara Firth, a former RT reporter:
“Crucial information is regularly omitted from stories, and often because those in charge are not capable of identifying what makes a strong news story. They’re not interested in fact checking and creating valuable, balanced journalism. Their main agenda is that it fits the narrative.”
16. False Equivalence
This involves comparing two or more things that appear to be logically equivalent, but in fact are not.
Disinformation expert Ben Nimmo describes the use of this technique by RT’s editor-in-chief Margarita Simonyan, who “often cited the BBC charter, stating that one of the broadcaster’s missions is to ‘bring British values to the world’, and accused it of British propaganda as a result.
In fact, the BBC Charter is worded slightly differently: ‘To reflect the United Kingdom, its culture and values to the world: the BBC should provide high-quality news coverage to international audiences, firmly based on British values of accuracy, impartiality, and fairness. Its international services should put the United Kingdom in a world context, aiding understanding of the United Kingdom as a whole, including its nations and regions where appropriate. It should ensure that it produces output and services which will be enjoyed by people in the United Kingdom and globally.’ […]
Simonyan’s parallel is false. The BBC Charter focuses on culture and values, not government policies. The values themselves — accuracy, impartiality and fairness — are core attributes of bona fide journalism. This is light years away from Simonyan’s description of RT as “waging the information war” on behalf of the Russian government.”
17. Gaslighting
Gaslighting makes the opponent doubt reality and facts. The technique has been used by Baltnews, for example, which described a visit by US Vice-President Mike Pence to Estonia as a “collective psychotherapy” session to deal with a non-existent threat from Russia. Baltnews’ depiction is classic gaslighting given that Estonia suffered cyber attacks on its Parliament and other major institutions in 2007, which it suspected the Kremlin of orchestrating, and in light of Russia’s aggressive expansionism, including its military intervention in Ukraine and annexation of Crimea.
Another form of gaslighting concerns Russia’s self-declared information war. The Russian government has accused the West of conspiracy — “orchestrated campaigns […] aimed at discrediting the Russian press” with the aim of “suppressing the voice of the Russian media.” Following Russia’s use of disinformation to influence the US elections, Putin’s press secretary Dmitry Peskov claimed “that this was not an information war of Russia’s choosing; it was a ‘counteraction,’” in an interview with the New York Times.