We still do not know the full extent of Russian influence activities on social media platforms during the run-up to the 2016 U.S. election. A full accounting may not even be possible, but that does not mean we shouldn’t try. What has been publicly disclosed about Russian actions reveals an adversary with deep knowledge of the level of polarization in America today, an intuitive understanding of the psychology of politics, and no intention of stopping.
Russia’s “active measures,” the term used to refer to Soviet-style political warfare, focused on the most polarized topics, targeted the most ideologically driven groups, and exploited the way information is propagated.
In early November 2017, representatives from Google, Facebook and Twitter testified before Congress about Russian activities in the run-up to the election. Much of the testimony focused on political ads known to be sponsored by entities associated with Russia’s propaganda arm, the Internet Research Agency (IRA).
Certainly, the fact that political advertising could run without disclosing who paid for it (lumped in as sponsored content like any other ad) is something that should be corrected immediately. However, the relatively paltry sums spent on these ads, approximately $150,000 on Facebook ads and $274,000 on Twitter posts, do not account for two things: first, the extent to which these ads and similar content were further spread by others, and second, the extent to which content was organically propagated after being planted by the IRA and its associates.
This is important because of the psychological principles behind how we are influenced. When reading through social media feeds, ads are marked as such, and many people skip over them. Some of the ads, however, are designed to look either like organically generated meme content of the sort often seen on sites like imgur, Reddit and 4chan, or like official, authoritative content with high production values, full of flags, eagles and other insignia.
The information disclosed by Facebook and Twitter shows that many of these ads tap into the principles of commitment (asking people to share or like if they agree), consistency (pushing the same messaging on a routine basis), and consensus (propagation by groups that appear to represent larger movements, which confers legitimacy through sheer numbers). United Muslims of America, Black Matters and Army of Jesus, for example, were later discovered to be Russians impersonating American activists of all political persuasions.
The importance of social proof cannot be overstated, and it is further amplified by two factors. The first is authority (another principle of influence), conferred on the fake Russian content by retweets and likes from government and campaign officials. The second is the unique way in which algorithms serve up content.
What appears in our Twitter and Facebook feeds is, in large part, the result of what our friends and family engage with — but whether we like to admit it or not, our social media feeds are not wild forests of intellectual diversity but essentially walled gardens of like-minded individuals. By targeting the most ideologically driven groups, the IRA ensured that its content had the greatest appeal and the best chance of spawning direct shares or derivative content.
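To make that amplification mechanism concrete, here is a minimal sketch of an engagement-weighted feed ranker in Python. It is purely illustrative and not any platform’s actual algorithm; the Post fields, scoring weights and time decay are all assumptions, chosen only to show how ranking by the engagement of a viewer’s own network surfaces whatever that like-minded network already favors.

```python
from dataclasses import dataclass

# Hypothetical post record: these fields are assumptions for illustration only.
@dataclass
class Post:
    text: str
    likes_from_friends: int    # engagement from the viewer's own network
    shares_from_friends: int
    age_hours: float

def feed_score(post: Post) -> float:
    """Toy scoring rule: friend engagement boosts a post, age decays it.

    The weights (shares counting double) and the decay are arbitrary
    assumptions; the point is that the ranking signal comes entirely
    from the viewer's like-minded network.
    """
    engagement = post.likes_from_friends + 2.0 * post.shares_from_friends
    return engagement / (1.0 + post.age_hours)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order candidate posts by score, highest first."""
    return sorted(posts, key=feed_score, reverse=True)

feed = rank_feed([
    Post("divisive meme", likes_from_friends=120, shares_from_friends=45, age_hours=3.0),
    Post("neutral news item", likes_from_friends=10, shares_from_friends=1, age_hours=3.0),
])
print([p.text for p in feed])  # ['divisive meme', 'neutral news item']
```

In this toy model, the divisive meme outranks the neutral item purely because the viewer’s network engaged with it more; no judgment about accuracy enters the loop, which is exactly the property an influence operation exploits.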
Such “preaching to the choir” content can push deliberate disinformation to the surface, which appears to be what happened when the Sutherland Springs shooter was falsely identified as a member of antifa.
Those less ideologically driven are not immune either — again, the brilliance of the content generated by the IRA and its associates is that it plays into three aspects of our collective psychology: first, our drive to maintain consistent beliefs; second, our tendency to seek out information that confirms those beliefs; and third, our tendency to start with a desired conclusion and reason backward to justify it.
The case of “Alice Donovan,” a “freelance” writer who pretended to be a legitimate journalist but was likely a Russian propagandist fanning the flames of discontent with Hillary Clinton, is illustrative for two reasons. First, the target audience likely believed that this planted content was both genuine and organic, feeding the illusion of growing anti-Clinton sentiment. Second, it demonstrates a level of knowledge of American media, politics and culture that speaks to the sophistication of the overall influence operation, not just the individual operator.
Not even the most careful consumer of internet content would have realized that Alice Donovan was not who she claimed to be. This wasn’t the first time a Russian operation against the United States showed such deep knowledge of its target: in 2010, the “Illegals” spy ring, a program in which numerous Russian operatives led seemingly all-American lives, was broken up and its members were expelled from the United States. It likely won’t be the last.
Exactly how the Russians interfered with our mental machinery might never be known with certainty. But Russian intentions were clear — to sow discontent and further divide an already-divided populace.
Russian bots on Twitter continue to echo divisive topics, making it appear that there are more clamoring voices than actually exist. And while the unilateral disarmament of American information operations capability probably hasn’t helped deter future Russian action, getting back on a Cold War-like footing is not going to solve our near-term disinformation problem, which is likely to rear its head again in the 2018 midterm elections.
Beyond holding social media platforms to account and having them enact policies to limit the spread of disinformation, it is ultimately up to each individual information consumer to act more responsibly.
While this is a tall order, psychologists have had some success in training people to be more “actively open-minded,” putting them in a frame of mind where they are more likely to seek out disconfirming information. Such training, not quite as invisible as a “nudge” but not as intensive as an entire online course, could go a long way toward making people savvier information consumers.
Asking people to take this extra step of evaluating the source of the information they engage with online could reveal that what they are about to share is not rooted in reality. Such active open-mindedness may be exactly what we all need in order to resist Russian active measures.