This week, British Prime Minister Theresa May came out and attacked Russia’s attempts to “weaponize information” in hostile actions against Western states. This comes on the back of a wave of news coverage of “fake news” and the U.S. elections. At the latest count, Russia-linked Facebook posts reached 126 million users during the U.S. election period. This makes for great headlines and fascinating reading, but what does it all mean?
We must remember that the use of social media bots is nothing new. Nor is election interference; even using social media to influence the outcome of an election isn’t particularly new.
Our latest paper covers four areas:
- Disinformation is different from fake news;
- Disinformation campaigns have financial motivations too;
- There are a wide range of tools available, which extend beyond social media;
- Understanding these motivations and tools allows us to disrupt disinformation campaigns.
FAKE NEWS VERSUS DISINFORMATION
Let’s get a boring, semantic (yet important) clarification out of the way. Fake news and disinformation are different, albeit related, terms. The confusion between the two terms holds us back from having a sensible conversation.
Fake news refers to all manner of things, including disinformation campaigns, partisanship, and honest journalistic errors. Disinformation campaigns are specifically those that deliberately spread false information in order to deceive their target or audience.
One of the greatest quotes on this comes from the former director of Department X of the East German foreign intelligence service: “Our friends in Moscow call it ‘dezinformatsiya.’ Our enemies in America call it ‘active measures,’ and I, dear friends, call it ‘my favorite pastime.’”
This need not be limited to the geopolitical sphere; it can apply to ideological and financial motivations too.
It would be wrong to assume that the sole targets of disinformation campaigns are the electorate and political parties. Given how easy it is to access and wield these online tools, organizations can easily be slandered and their share prices manipulated. We’ve seen such activities already, particularly surrounding biotech companies and accusations about the role of Martin Shkreli and an online actor named Art Doyle.
Actors might not even need to get into the weeds of these tools. TheInsider is a dark web “pump and dump” service that encourages users to invest in its scheme. The scheme itself looks to manipulate interest in cryptocurrencies, pumping up the price so that holdings can be sold for profit.
Regardless of the motivation behind disinformation campaigns, they do not happen in isolation. Instead, malicious actors take advantage of a wide range of tools available at a very low barrier to entry.
DIGITAL SHADOWS’ DISINFORMATION CAMPAIGN TAXONOMY
Digital Shadows’ Disinformation Campaign Taxonomy is based on a three-stage attack chain (creation, publication, and circulation), which includes an overview of the methods, tactics and tools associated with running such an operation.
By using Digital Shadows’ Disinformation Campaign Taxonomy, we can see that there are different stages that defenders can target to help disrupt disinformation campaigns in their infancy. Early identification of these campaigns is critical to increase the likelihood of successful disruption.
Download a copy of our latest research report, The Business of Disinformation: A Taxonomy, to see tools actors can turn to when waging disinformation campaigns and what it means for organizations in the next year.