Propaganda is changing in a Digital Age. Audiences are no longer passive consumers of persuasive content but active participants in its creation and spread, helping to further the agenda of propagandists whose messaging resonates with their world view.
Participatory propaganda moves beyond a one-way form of communication (the propagandist using mass media to persuade a passive target audience), to a “one-to-many-to-many more” form of communication (the propagandist engaging in dialogue with the target audience such that more people are recruited to spread persuasive messaging to others, essentially snowballing the effect). Participatory propaganda offers the ability to truly dominate the information space through volume of messaging, delivered through a mix of real people and automated accounts, effectively making it difficult to discern where fake ends and authenticity begins.
A modern political campaign fits the model of traditional propaganda as defined by Jowett and O’Donnell, namely the “deliberate, systematic attempt to shape perceptions” (e.g. popular opinions of Trump and Clinton) such that it “directs behaviour to achieve a response” (e.g. support for Trump in the form of online participation and voting) furthering “the desired intent of the propagandist” (e.g. the Trump campaign).
In the Digital Age, this traditional approach is evolving into a participatory propaganda model in which the target audience is no longer merely passively consuming persuasive messaging but is also becoming active in producing and distributing such content. The original propaganda message triggers, reinforces, or exacerbates pre-existing sentiments associated with the message in a way that prompts the consumer to actively engage in its propagation through available social networks, both on- and offline. Even if modified through the consumer’s own interpretation, the core message remains intact, and even acquires ‘new life’. At the same time, online monitoring tools enable the original propagandist to follow and assess the spread of his or her messaging, adapting strategies in a constant feedback loop.
In this context, then, a more appropriate definition might be:
Participatory propaganda is the deliberate and systematic attempt to shape perceptions, manipulate cognitions, and direct behaviour – co-opting grassroots movements and recruiting audience members to actively engage in the spread of persuasive communications – to achieve a response that furthers the desired intent of the propagandist.
In reviewing the Donald Trump 2016 presidential election campaign, seven steps emerged that clearly demonstrated the application of a Participatory Propaganda model:
- Conduct hyper-targeted audience analysis;
- Develop inflammatory content that erodes faith in the opponent and manipulates audience cognitive biases: Fake news; Memes; Data Leaks/Hacks;
- Inject this content into echo chambers identified through audience analysis;
- Manipulate Feed and Search Algorithms;
- Mobilize followers to action;
- Win media attention: Be a trend; Stage a Scandal; or Commune with the news; and
- Rinse and Repeat.
In analysing this participatory propaganda model, what follows combines a literature review – bringing together research conducted by others on the Trump campaign and background information on each of the seven steps – with ongoing work aimed at filling in the gaps. Specifically, this original research includes social network and content analysis of Facebook pages, including three that supported Trump during the election, as well as seven conservative-leaning and seven liberal-leaning media outlets. The pages analysed are as follows:
The three pro-Trump pages were chosen as a sampling of those supporting his candidacy, with one showing its open support through the name (Citizens for Trump), another having been found spreading fake news supporting Trump (Eagle Rising), and a third standing out as a node in initial, exploratory network analysis (Wake Up & Reclaim America).
Drawing from a Pew Research Center survey on Political Polarization and Media Habits, seven media outlets trusted consistently by respondents who self-identified as liberal and seven trusted by self-identified conservatives were selected. One substitution was made on the conservative-trusted side: Infowars, given the role it played in the election.
The following publicly available data for all of these pages was collected using Netvizz:
- Facebook Page Like networks. Beginning with an initial “seed” page, all of the other Facebook pages liked by the seed are collected in a directed network of pages, meaning the data shows which page likes which. Using an analytical tool called Gephi, these networks can be visualised. The data in this pull also included information regarding page categories, follower numbers, and rates of engagement.
- Facebook page posts. All of the posts made by these pages during the month leading up to the election (7 October to 7 November 2016) were also collected, including information regarding the type of post, engagement rates and embedded links.
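The page-like data described above is, in essence, a directed edge list. A minimal sketch of how such a network can be represented and queried – using hypothetical page names, not the actual Netvizz export:

```python
from collections import defaultdict

# Hypothetical directed "page likes page" edges: (liker, liked).
# A real Netvizz export also carries categories, follower counts,
# and engagement rates alongside each page.
edges = [
    ("Page A", "Page B"),
    ("Page A", "Page C"),
    ("Page D", "Page B"),
    ("Page B", "Page C"),
]

def in_degree(edges):
    """Count how many pages like each page (incoming edges)."""
    counts = defaultdict(int)
    for _liker, liked in edges:
        counts[liked] += 1
    return dict(counts)

# Pages with high in-degree stand out as nodes in the network,
# the way Wake Up & Reclaim America did in exploratory analysis.
print(in_degree(edges))
```

The same edge list, saved as CSV, is exactly what Gephi ingests to produce the network visualisations described in the text.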
With that, let’s look at each step of the Participatory Propaganda model in more detail.
1. Conduct hyper-targeted audience analysis
Everything you do online leaves a trace. Have you ever searched for something in Google – maybe information about a town you planned to visit or something you wanted to buy – and then suddenly noticed ads for those very same things beginning to appear in your Facebook feed? That highly targeted ad placement is part of the wonderful world of behavioural advertising.
In the emerging field of behavioural advertising, marketers collect information about what you do online in order to position extremely targeted ads in front of you. Activities that can be tracked include the websites you visit, the Facebook pages you have liked, and the things you’ve searched for online. This information is cross-referenced against your online profiles to match demographic data such as geographic location, age, gender and other publicized interests. Armed with this information, behavioural advertisers help match those who have something to sell with those who are most likely to buy it.
The use of online trackers to capture this information is pretty common. In looking at 114 conspiracy websites supporting Trump during the election, researcher Jonathan Albright found that he had inadvertently connected to 474 so-called third parties. This means that in visiting those 114 sites, Albright wasn’t just reading the online content in front of him, but unwittingly receiving upwards of 3000 tracking cookies, such that 474 other entities could now follow whatever else he might do on the internet.
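Albright’s finding can be illustrated with a small sketch: given a mapping from visited sites to the third-party domains they load (the site and tracker names here are invented), the set of distinct trackers that can now follow a visitor is simply the union:

```python
# Hypothetical mapping of visited sites to the third-party tracker
# domains they load – illustrative only, not Albright's actual data.
trackers_by_site = {
    "conspiracy-site-1.example": {"ads.example", "pixel.example"},
    "conspiracy-site-2.example": {"pixel.example", "social.example"},
    "conspiracy-site-3.example": {"ads.example"},
}

def unique_third_parties(trackers_by_site):
    """Union of all third-party domains seen across visited sites."""
    seen = set()
    for domains in trackers_by_site.values():
        seen |= domains
    return seen

third_parties = unique_third_parties(trackers_by_site)
print(len(third_parties))  # → 3 distinct trackers in this toy example
```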
Nearly every website analysed by Albright was connected back to Facebook, mostly via “like” buttons for both posts and related promotional pages on the social network. This is important, because your Facebook activity reveals even more about you than cookie tracking could ever hope.
By analysing the things you like on Facebook, one researcher at Cambridge University can know you better than your parents or friends do. Dr Michal Kosinski developed an algorithm that correlates the things you have liked on Facebook – posts, pages, comments – with Big Five or OCEAN personality traits. The accuracy of this model is astounding. According to Kosinski, as per a feature in the Huffington Post:
“With a mere 10 ‘likes’ the model could appraise a person’s character better than an average co-worker. With 70, it could ‘know’ a subject better than a friend; with 150 likes, better than their parents. With 300 likes, Kosinski’s machine could predict a subject’s behavior better than their partner. With even more likes it could exceed what a person thinks they know about themselves.”
Such analysis of the things you like on Facebook helps pull together a psychographic picture of you: how you think, what your tastes are, what bothers you, with whom you talk, and essentially how you will react to a message. This data can be acquired through Facebook apps and quizzes that request access to your account information – and can be bolstered through all the other personality tests and surveys you might complete in the course of your life online.
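To make the principle concrete, here is a deliberately simplified toy model – not Kosinski’s actual algorithm – in which each like carries a hypothetical weight for one trait, and a user’s predicted score is the average weight of their likes:

```python
# Toy illustration of the like-to-trait idea (NOT Kosinski's model):
# hypothetical learned weights for a single OCEAN trait, extraversion.
like_weights = {
    "skydiving": 0.9,
    "parties": 0.8,
    "chess": 0.2,
    "poetry": 0.3,
}

def predict_trait(user_likes, weights):
    """Average the trait weights of the likes we have weights for."""
    scored = [weights[like] for like in user_likes if like in weights]
    return sum(scored) / len(scored) if scored else None

print(predict_trait(["skydiving", "parties"], like_weights))  # ≈ 0.85
```

The real model works over tens of thousands of likes and all five traits at once, which is why accuracy climbs so steeply with the number of likes observed.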
There has been some debate as to how genuine Cambridge Analytica’s claims are – but the point is not so much whether these techniques were used in the 2016 presidential campaign; it is that this is where things are headed regardless. The fact is that there is a wealth of information collected about each and every one of us when we engage online. This data can paint a very accurate picture of who you are, which can in turn be used to segment you into ‘target audiences’ and to feed you manipulative content aimed at provoking an intended response.
For example, voters in the 2016 U.S. presidential election might be broken out into the following broader audience groups:
- The Supporters – This group consists of people who already support the candidate and as such are the most likely to become propagandists in a participatory propaganda model. In Trump’s case, this would include the 681,864 people following Eagle Rising, the 249,720 behind Citizens for Trump, and the 29,121 who like Wake Up & Reclaim America. Content delivered to this group will aim to encourage their active support in spreading a propaganda message. The Supporters will be called on to help recruit the next group.
- The Winnables – these people are possible supporters who are likely to be swayed with the proper message delivered by the right person at the appropriate time. The Winnables might not be on board at the outset, but via their connections to The Supporters or sentiments on key issues they can be persuaded. People are more likely to believe those familiar to them or those they view as influential: 20% of those surveyed by the Pew Research Center in 2016 said they had modified their stance on a social or political issue because of material they saw on social media, and 17% said the same of their views on a political candidate.
- The Unlikelys – this group holds fundamentally opposing views to the propagandist and is never likely to be swayed; however, its members are not yet fervently behind the opposing camp. In a political campaign, the aim with this group is to discourage them from supporting the opponent. No vote at all is better than a vote for the other candidate.
- The Forgetables – this group is not likely to change their mind. In the 2016 election example, these are the die-hard Clinton supporters. Nothing Trump could say would ever persuade this group to switch sides. The tactic for mitigating this group is to drown out their attempts to propagandize The Unlikelys.
Of course, each segment can be further divided for hyper-targeted messaging campaigns, but this is a quick generalisation. With knowledge of who the audiences are and what makes them tick – it’s on to step two.
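The four-way segmentation above can be sketched as a simple decision rule. The support score, thresholds, and flag here are hypothetical illustrations, not drawn from any real campaign data:

```python
# Minimal sketch of the Supporters/Winnables/Unlikelys/Forgetables
# split, using a hypothetical support score in [-1, 1]
# (-1 = die-hard opponent, +1 = committed supporter).
def segment(support_score, backs_opponent_strongly):
    if support_score >= 0.5:
        return "Supporter"   # recruit as active propagandist
    if support_score >= 0.0:
        return "Winnable"    # persuade via trusted connections
    if not backs_opponent_strongly:
        return "Unlikely"    # discourage from voting at all
    return "Forgetable"      # drown out their counter-messaging

print(segment(0.7, False))  # → Supporter
```

In practice, of course, campaigns derive such scores from the behavioural and psychographic data described above, and split each segment far more finely.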
2. Develop Inflammatory Content
With audiences segmented and psychographics mapped, the next step is to develop inflammatory content that erodes faith in the opponent and manipulates audience cognitive biases.
Such content need not only be produced by the campaign itself. Indeed, spreading and amplifying content created by supporters is sometimes more effective as it helps create an appearance of grassroots support for the campaign. It is this mix of content originating from the official campaign as well as that by supporters that contributes to a participatory propaganda model.
The Trump campaign benefited from at least three types of content: fake news; memes; and data leaks.
Fake news isn’t a new phenomenon. Disinformation, “inaccurate or manipulated content that is spread intentionally”, has been used in conflict for centuries. In 480 BC, for example, the Athenian Themistocles beat the Persian Xerxes with disinformation – tricking his opponent into believing Greek recruits were deserting. Disinformation has continued to be used in conflict, including in the Falklands War when BBC airwaves were commandeered to broadcast a fake radio station aimed at demoralizing Argentine troops. And if Trump’s recent presidential campaign is any indication, disinformation remains a common tactic in politics aimed at discrediting an opponent.
What is new is the ease with which disinformation can now be published and spread. With tools, such as WordPress, an online content management system, it really takes little effort to create a website capable of publishing whatever the creator wishes – and social media enables the spread of that content like never before. Indeed, the rate at which disinformation is being spread online prompted a report on the subject by Facebook in April 2017 along with disclosure about what the social network aims to do about it.
One form of disinformation, fake news, has been well documented in the context of the 2016 U.S. presidential election. As Facebook defines it in its report, fake news consists of “articles that purport to be factual, but which contain intentional misstatements of fact with the intention to arouse passions, attract viewership, or deceive.”
Fake news is a global problem. Lies spread faster online than the truth. Conspiracy theories, often a feature of fake news, reduce complex issues to “binary opposition, simplifying – and misrepresenting – the political space.” And as Facebook noted in its report, governments and non-state actors alike are spreading disinformation online.
A person’s degree of partisanship is directly correlated with their likelihood of believing conspiracy theories or fake news. And news shared by trusted opinion leaders on Facebook influences audience perspectives. In this context, it is alarming that Trump himself shared fake news, which may help account for one study’s finding that known false news stories favouring Trump were shared 30 million times on Facebook in the three months leading up to the election, compared with 8 million shares for stories favouring Clinton.
Beyond Trump, his far-right supporters used Facebook pages to push fake news. According to some analysis “fake news” outperformed “real news” on Facebook during the election. Fake news sites were found by Jonathan Albright in hyperlink analysis to be choking out mainstream media in online networks. And in a more recent study by the Oxford Internet Institute looking at Twitter posts shared in Michigan during the election – 46.5% of “content that is presented as news and information about politics and the election is of an untrustworthy provenance” compared to 25.9% coming from professional news organisations.
The links shared to the three Trump-supporting Facebook pages reviewed for this study were mostly non-mainstream media. On average, link posts comprised 53.22% of updates made by the pro-Trump pages. Eagle Rising shared more links than the other two (83.25% of posts), with nearly half of those links (45.4%) pointing to the page’s own website eaglerising.com, which contains coverage speculating on connections between Clinton, terrorists and Nazis, for example, and the Clinton campaign’s alleged use of psychological warfare (which in turn points back to another site shared by these pages called ipatriot.com).
Facebook Posts by Type
After Breitbart, the most shared domain to Citizens for Trump was gatewaypundit.com, a blog that has posted many questionable articles on Hillary Clinton, including that she secretly called for Trump’s assassination, had suffered a brain seizure, and that she had a gum and immune disorder. During the period between 7 October and 7 November 2016, Citizens for Trump shared 13 Gateway Pundit articles, accounting for 4.32% of all link posts, including one speculating on Clinton’s health that enjoyed 321 shares on Facebook. Wake Up & Reclaim America also shared 14 Gateway Pundit articles, including one suggesting Clinton was involved in having Supreme Court Justice Scalia assassinated.
Nearly 80% of Americans received information about the election via a news source. The media is one of the main sources of information to help voters make informed decisions – fake news (even the satirical variety) increases feelings of inefficacy, alienation, and cynicism – sentiments that drive people towards populism.
Fake news sites, beyond sowing doubt, are also useful tools for tracking visitors, as explored earlier through Jonathan Albright’s research.
If fake online news is the new disinformation, memes are the Digital Age equivalent to propaganda posters.
A “unit of cultural transmission, or unit of imitation”, memes are often humorous phrases, images or videos that are copied or adapted with slight variations and then shared online.
During the 2016 election, Facebook groups sprang up dedicated to sharing “dank memes” supporting all sides and Palmer Luckey, a controversial Silicon Valley tech entrepreneur, funded a “meme factory” to support Trump. So-called “meme battalions” created visual content that “relentlessly drew attention to the tawdriest and most sensational accusations against Clinton, forcing mainstream media outlets to address topics – like conspiracy theories about Clinton’s health – that they would otherwise ignore.”
Memes reduce the public policy debate to shallow sound bites and ridicule stripped of contextualized understanding of available political choices. This contributes to ‘media endarkenment’ reducing complex political issues to simplified entertainment and misinformation, which is perfectly suited to populist rhetoric.
Memes are also reminiscent of past propaganda posters that aimed to demonize the enemy. In the examples below, both sets use some form of humour to convey their message, and both attack the enemy. The propaganda posters of the past, however, were restricted in time and space, and it is reasonable to assume that the propagandist who created them could be identified through basic content and contextual analysis. With memes, however, this becomes much trickier. Obviously, in the examples below, the aim of the Clinton memes is to discourage people from voting for Hillary Clinton, but anyone could have made these – and many more participated in their spread.
Memes account for a considerable number of posts on community Facebook pages such as Wake Up & Reclaim America, which openly supported Trump during the campaign. In an analysis of 1330 posts made by Wake Up & Reclaim America in the month leading up to the 8 November 2016 election, nearly half were image posts – and 65% of those photos had been shared by the page from posts made by other Facebook users or pages.
Memes make for ideal inflammatory content in the participatory propaganda model. Not only are visual posts more likely to be shared online, memes can help foster the appearance of grassroots support. Given how challenging it is to trace a meme back to its creator, campaigns can easily generate such visual content and upload it via fake accounts. Unsuspecting supporters will readily share memes on social networks, encouraging participation in a very low-impact manner. Average internet users can also easily create memes to support a campaign, thus participating in the spread of propaganda more actively. In short, memes offer variety in co-opting supporters in participatory propaganda.
Transparency is key in the Digital Age. Information has a way of coming out. With mobile devices and constant access to the Internet, the concept of public space has forever changed. Everything can be captured and shared quickly – without consent. Even things thought to be secret – particularly if communicated or saved digitally – are no longer protected, as hackers can and will access such information and share it.
In a data breach, sensitive or protected data is accessed by an unauthorized party. This can occur when an insider decides to leak confidential information, or through deliberate hacking of digital systems, often via social engineering attacks.
In terms of the 2016 election, Clinton was dogged by several hacks and leaks, giving Trump a political advantage. These included the hacking of her campaign chairman John Podesta’s emails, and the leaking of remarks she had made about Bernie Sanders supporters in the past. A data hack was also what led to the discovery of the private email server Clinton used while serving as Secretary of State, the FBI investigations of which hampered her campaign.
Such hacks and leaks were certainly discussed online. In analysing Facebook posts made between 7 October and 7 November 2016, all of the pro-Trump pages assessed made mention of “Wikileaks”, a non-profit that aims to “open governments”, which in that time frame had shared more of the leaked Podesta emails to its website. Across the three pro-Trump pages analysed, 65 posts mentioned “Wikileaks” during the month leading up to the 8 November election, accounting on average for 2.75% of all posts made during that period. Both the conservative- and liberal-leaning media outlets analysed made mention of “Wikileaks” in this timeframe too: the seven right-leaning pages mentioned “Wikileaks” 131 times, accounting on average for 2.72% of all posts, whereas the left-leaning pages referenced it 47 times, or in just 0.46% of all posts. The pages for InfoWars, Sean Hannity, and Wake Up & Reclaim America referenced “Wikileaks” more than the others, accounting for 44% of all mentions found.
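The mention tallies above come from keyword counting over collected posts, a process that can be sketched in a few lines. The page names and post texts here are invented stand-ins for the actual Netvizz export:

```python
# Sketch of a keyword-mention tally over collected Facebook posts.
# Hypothetical pages and post texts, not the real dataset.
posts_by_page = {
    "Page A": ["New Wikileaks drop!", "Rally tonight", "Wikileaks again"],
    "Page B": ["Debate recap", "Polls update"],
}

def mention_rate(posts_by_page, keyword):
    """Per page: (number of posts mentioning keyword, share of posts)."""
    rates = {}
    for page, posts in posts_by_page.items():
        hits = sum(keyword.lower() in post.lower() for post in posts)
        rates[page] = (hits, hits / len(posts))
    return rates

print(mention_rate(posts_by_page, "wikileaks"))
```

A real content analysis would also normalise spelling variants (“WikiLeaks”, “wiki leaks”) before counting; this sketch only lower-cases.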
While it can be argued that leaks increase transparency and hold governments to account, there is an adverse effect as well. Populists use hacks and leaks to deliberately erode trust in the government, decreasing faith in the established order and fuelling anti-elite cynicism. This tactic worked well against Hillary Clinton, who, as a former First Lady and long-time politician, could easily be taken as the embodiment of the established order.
Of course, sometimes data hacks or leaks aren’t really needed to turn someone’s words against them. This ad and other manifestations of it were used to dissuade African-American voters from supporting Clinton and were delivered to that demographic via Facebook’s targeted advertising platform.
The sensationalism that surrounds a data leak in the media can often distract from other important questions, such as who is behind it and what are their motives? This is not to say that the substance of a leak is unimportant; however, understanding the context is equally imperative.
With inflammatory content created to manipulate the target audience in hand, it’s time for step three:
3. Inject Inflammatory Content into Echo Chambers
The online world is a crowded market space. It isn’t enough to simply create content that resonates with an audience; it must be delivered directly to them. In a Digital Age, propagandists can reach you through the Facebook pages you follow, your social media feeds and networks, trending topics on Google, and traditional media. Step Three then is to inject this deliberately provocative content into echo chambers identified through audience analysis. The key here is to have a desired actionable outcome from the content – whether that be to share it, sign up for a mailing list, or troll the comments section of news sites.
An online echo chamber is a digital space where content reflecting a specific point of view reverberates, exposing those within it to only that one prevailing perspective. Digital technologies enable the creation of echo chambers or filter bubbles. In fact, it only takes a matter of days to become part of a filter bubble, as two German journalists discovered in a recent online experiment. Once inside an echo chamber, a user is fed content fitting pre-existing views and preferences, such as political party affiliation. Echo chambers are created in part by algorithms that sort information, but more so by the choices individuals make about content consumption.
Echo chambers identified during the 2016 election were strengthened by a growing animosity between political camps, as well as a lack of media trusted by both Republicans and Democrats, which hindered information exchange across party lines. In a recent survey conducted by the KIND Foundation, 54% of respondents admitted that their social media feeds “mostly reflect worldviews similar to their own”, with only 5% saying they see opposing perspectives.
Echo chambers, once identified, can be injected with persuasive information that conforms to existing beliefs held by followers, encouraging the spread of that content, turning those inside the filter bubble into propagandists. Moreover, as a group of researchers at Yale University recently found, “political echo chambers not only isolate one from opposing views, but also help to create incubation chambers for blatantly false (but highly salient and politicized) fake news stories.”
Echo chambers facilitate the spread of conspiracy theories, and those supporting Trump shared fake news during the election, with some hyper-partisan right-wing Facebook pages feeding followers 38% fake content. A section of the anonymous message board, 4Chan, (called “/pol” for “politically incorrect”), as well as The_Donald, a subreddit on the social news aggregation platform Reddit, were both found to be channels for pushing memes supporting Trump and attacking Clinton.
As noted earlier, the three pro-Trump Facebook pages shared more alternative media sources than mainstream links in the month leading up to the 2016 election. Of those shared links pointing to the conservative- and liberal-leaning pages also analysed, most were from either Fox or Breitbart. Eagle Rising shared links to none of the 14 media outlets analysed; the 1143 links it posted between 7 October and 7 November 2016 pointed to just 14 websites, including its own eaglerising.com.
A manual categorization of pages based on names and content reveals that the three pro-Trump page networks are decidedly part of right-leaning echo chambers. Nearly all (94.1%) of the Citizens for Trump network are right-leaning, pro-Trump pages, while 82.7% of those within the Eagle Rising network are. As the Wake Up & Reclaim America network contained over 5,000 pages, a sample of 1,000 pages was manually categorized, representing 18.8% of the total. While 67.8% of these were right-leaning pro-Trump pages, most other pages covered topics reflected in Trump’s campaign rhetoric, such as pro-Christian, anti-Muslim, pro-military, pro-police, anti-immigration, pro-life views and biker groups. If these topics are combined, the rate of pages within the Wake Up & Reclaim America network that reflect views shared by Trump supporters is 95.7%. Given that only two pages were found to express counter views – across all three page networks – it is safe to say these networks comprise a filter bubble of sorts.
Such echo chambers further populism by polarizing the electorate into an “us” and “them”, and reduce the ability of voters to discern relevant and factual information from bad, enabling politicians to play on confirmation bias, while also arousing suspicions about the opposition.
The polarization of American society can be demonstrated in a Facebook post found during exploratory analysis of posts made by the page Wake Up & Reclaim America.
After priming echo chambers with manipulative content, the next step is to amplify the noise.
4. Manipulate Feed and Search Algorithms
Algorithms are important in the information environment. Search returns have been found to sway voter decisions. Regardless of who controls the returns, algorithms also enable echo chamber development, which polarizes the electorate.
Algorithms also had a role in the 2016 elections. Google search autocompletes and returns favoured Trump, spreading false information with a far-right bias. Fake news supporting Trump trended on Facebook through algorithms. And Trump was more searched than Clinton on Google. The more Trump was searched, the higher content about him ranked in subsequent search returns – a competitive advantage when first page search returns garner 92% of all click through traffic.
Google Search algorithms can be gamed in at least two ways:
Hyperlinking and Seeding of content
Posting content, such as fake news, on multiple websites and linking back and forth between sources helps boost content in Google search returns, and if nothing else, can bury opposing information from appearing in the first pages of returns.
In a study by Jonathan Albright, alt-right affiliated sites were found to be choking out mainstream media. Albright looked at 117 sites that had been publicly connected to the alt-right by verification sites such as Snopes, Fake News Watch, Real or Satire, and Media Bias Fact Check. Albright then crawled those websites to collect hyperlinks, finding 80,587 hyperlink connections. Using Gephi, he then visualized this network to discover that these websites were effectively dominating major news outlets.
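The crawl step in this kind of hyperlink analysis – extracting outbound links from each page to build a directed network – can be sketched with the Python standard library alone. The HTML and domain names below are invented, and a real crawler would also fetch pages over the network and normalise URLs:

```python
from html.parser import HTMLParser

# Minimal sketch of hyperlink extraction from a (static, invented)
# HTML page; repeated over many sites, the collected (source, target)
# pairs form the directed network visualised in Gephi.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = ('<p>See <a href="http://site-b.example/post">this</a> and '
        '<a href="http://site-c.example/">that</a>.</p>')
print(extract_links(page))
```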
Drawing from posts shared to the three pro-Trump pages in the lead-up to the election, a simple Google search of article titles sheds some light on how such networks function. In one example, Eagle Rising shared an article from the blog The Blacksphere (theblacksphere.net) entitled “Hillary Clinton: Calls Blacks Professional Never Do Wells”. This post garnered 157 shares on Facebook.
A Google search using the article’s title as exact terms returns the original post, as well as several nearly exact reprints on other sites, some linking back to The Blacksphere article. A search for The Blacksphere URL returns 734 results, including posts from rightwingnews.com, teapartytribune.com, and thegatewaypundit.com. Some of these links were posted by other users in comment sections and online forums, and Sharescount suggests the URL was shared 12.5K times across social networks. The article was also picked up by online trend aggregators like Trendolizer, indicating the efforts to spread this content had some impact. Indeed, the first page of results for another search (made in a separate web browser logged into a different Google account) for the key words “Is Hillary Clinton a racist?” contained no posts refuting the idea. (This experiment was then repeated in a different country, on another internet service provider, and on a new computer, with similar results.)
Through a mixture of reposting content across multiple sites, linking back, adding such articles to comment sections on other media sites and message boards, and encouraging social shares, propagandists are able to flood news and search feeds with the same message, driving the opposition’s counter message down in rankings. Through the active sharing of such content by supporters, both on social media and in other forums, the propaganda model remains participatory. This approach helps to account for the upwards of 40% of directed organic traffic to the right-wing sites Albright analysed.
Lobby groups, governments, and businesses are among the many actors using astroturfing and bots to distort the information space for strategic purposes. Posting fake comments and reviews aims to harness the cognitive bias of “social proof”. Bots had a “small but strategic role” in Brexit Twitter chatter, with 1% of accounts generating one third of all messaging on the topic, and bots have also been identified in Venezuelan political discourse.
Astroturfing is the use of fake online accounts or other means to make a message appear to come from another source, helping foster the illusion of grassroots support. A botnet is a network of internet-connected devices controlled by a single operator who uses them to execute tasks, such as sharing a specific post on Twitter. Botnets can manipulate algorithms: Twitter bots gamed Google’s algorithm for displaying “real time news” into promoting disinformation during a 2010 senate election in Massachusetts.
During the 2016 election, pro-Trump Twitter bots dominated discussion about the U.S. election 5 to 1 over pro-Clinton messaging, and “strategically colonized pro-Clinton hashtags,” according to OII research. Bots also accounted for nearly one-fifth of online discussion about the election, negatively affecting political discourse by drowning out opposing views. This domination of online discourse helps explain Trump’s success in Google search rankings.
The use of astroturfing and bots to create the illusion of grassroots support is a particularly negative phenomenon in democratic societies where the vox populi provides legitimacy to decision-making by elected representatives.
Beyond botnets and hyperlink seeding, Trump supporters were also effective at encouraging regular people to become propagandists too – which leads us to Step 5:
5. Mobilize followers to action.
Once inside an echo chamber, consuming content that manipulates known cognitive biases, you are more likely to become active in supporting a candidate or a cause. Campaigns will provide followers with simple actionable steps along with provocative content to help turn unsuspecting users into propagandists – which is what makes this new model participatory. Actions might include: telling people to share content; co-opting or borrowing influencer accounts to share content; or encouraging trolling activity to stifle debate.
New research has found that anyone can become a troll under the right conditions, “behaving in a deceptive, destructive, or disruptive manner” online. And nothing quite brings out the troll in a person like a political conviction. To many, Trump is a troll, but he was also supported by a legion of online trolls during the election, spreading disinformation and attacking Clinton supporters online. The Trump campaign and his supporters certainly mobilized followers.
In his research, Jonathan Albright noted that many of the far-right sites analysed featured persistent email enrolment pop-up windows. Email is one of the most overlooked tools in a propagandist’s toolkit; looking back to the 2008 presidential election, email played a major role in galvanizing support for Barack Obama.
Other existing online communities were also tapped to support Trump. Researcher Gilad Lotan found a group called the United States Freedom Army, which believes the left is engaging the right in a civil war. The United States Freedom Army offered its members a monthly directive on actions to take on Twitter and elsewhere to spread its content and support for Trump.
The United States Freedom Army has also been known to ask people with Twitter accounts of more than 20K followers to either actively engage or lend their accounts to the campaign, as demonstrated in this LinkedIn post calling for support.
A case study in mobilization: pro-Trump groups
The three pro-Trump pages all attempted to mobilize their audiences. Citizens for Trump and Eagle Rising, however, were arguably more successful than Wake Up & Reclaim America, as demonstrated through the average rates of follower shares on Facebook posts.
Average Shares on Posts
All three pages encouraged followers to vote for Trump.
Citizens for Trump and Eagle Rising, however, also asked followers to share and spread messages, which might account for the higher rate of shares on their posts.
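The share-rate comparison above boils down to simple arithmetic over each page’s posts. A minimal Python sketch, using invented post and share counts (the real figures come from the Facebook data pulls, not these placeholders):

```python
# Hypothetical per-page post data -- the counts below are illustrative
# placeholders, not the study's actual Facebook figures.
pages = {
    "Citizens for Trump":        {"posts": 500, "total_shares": 60000},
    "Eagle Rising":              {"posts": 450, "total_shares": 49500},
    "Wake Up & Reclaim America": {"posts": 400, "total_shares": 12000},
}

def average_shares(page_stats):
    """Average number of shares per post for one page."""
    return page_stats["total_shares"] / page_stats["posts"]

# Rank pages by how widely their followers spread each post.
for name, stats in sorted(pages.items(), key=lambda kv: -average_shares(kv[1])):
    print(f"{name}: {average_shares(stats):.1f} average shares per post")
```

With real Netvizz exports, the same calculation would simply run over each page’s actual post records.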
Depending on your own filter bubble, the size of pro-Trump networks might come as a surprise. To some media pundits, Trump rode to the White House on a wave of fringe support – but that assessment would be a mistake, as analysis of the pro-Trump Facebook Page Like networks shows.
Each of the pro-Trump pages’ Facebook Page Like networks was added to a single visualisation using Gephi, amounting to a total of 5,416 nodes with 100,208 edges between them. To put that into perspective, similar data pulls were made on two media page groups. The three pro-Trump pages had 16.3 times more nodes and 55.86 times more edges than the liberal-leaning media group, and 6.45 times more nodes and 13.24 times more edges than the conservative-leaning media group.
Looking at the three pro-Trump pages separately, each network contains a considerable percentage of pages that have self-categorized on Facebook as “Community”, but also “Public Figure”, “Politician” and some form of “News/Media”.
The pro-Trump network was then analysed using Gephi. This included running ForceAtlas2, a force-directed layout algorithm, to transform the network into a map. Additional statistical analysis was conducted using Modularity, which helps identify the various communities within a network, marked in the data visualization below by colours. The pro-Trump network wasn’t just bigger; it was also more closely integrated, with an Average Weighted Degree of 18.502, compared to 9.01 for the conservative-leaning media group and 5.404 for the liberal-leaning one (the higher the number, the greater the average number of edges that touch a node in the network).
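For readers curious about the statistic, an average weighted degree like the one Gephi reports can be reproduced from a raw edge list. A minimal sketch in plain Python, using an invented handful of Page Like edges treated as undirected, not the study’s data:

```python
from collections import defaultdict

# Hypothetical Page Like edges: (source, target, weight).
# These are placeholders, not the actual network pulled for the study.
edges = [
    ("Page A", "Page B", 3),
    ("Page A", "Page C", 1),
    ("Page B", "Page C", 2),
    ("Page C", "Page D", 4),
]

def average_weighted_degree(edge_list):
    """Sum of edge weights touching each node, averaged over all nodes."""
    weighted_degree = defaultdict(float)
    for src, dst, weight in edge_list:
        weighted_degree[src] += weight  # each edge contributes its weight
        weighted_degree[dst] += weight  # to both endpoints (undirected view)
    return sum(weighted_degree.values()) / len(weighted_degree)

print(average_weighted_degree(edges))  # prints 5.0 for this toy network
```

A higher value means the average node sits on more (or heavier) connections – which is why the figure of 18.502 for the pro-Trump network indicates tighter integration than the media groups.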
Pages liking each other demonstrates a possible channel for the spread of information, but on its own this does not constitute proof. To investigate further, Netvizz was used to pull all posts made by each page from 7 October to 7 November 2016, the month before the election. These posts were analysed using Excel to count, for example, the mentions of specific terms (such as Clinton, Trump, and Wikileaks), how many posts were shared from other accounts, and which web domains were shared to the page. The same investigative process was then applied to the two media page groups.
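The term-counting step described above is straightforward to reproduce outside Excel. A minimal Python sketch, using invented placeholder posts rather than the actual Netvizz export:

```python
from collections import Counter

# Placeholder post texts -- a real run would load the Netvizz export instead.
posts = [
    "Wikileaks drops more Clinton emails",
    "Trump rally draws huge crowd",
    "Clinton and Trump debate tonight",
]

TERMS = ("clinton", "trump", "wikileaks")

def count_mentions(post_texts, terms=TERMS):
    """Count how many posts mention each term (case-insensitive)."""
    counts = Counter()
    for text in post_texts:
        lowered = text.lower()
        for term in terms:
            if term in lowered:
                counts[term] += 1
    return counts

print(count_mentions(posts))
```

The same pass can tally shared domains or reposted accounts by swapping the term list for the relevant post fields.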
Around one third of the posts made by Wake Up & Reclaim America (34.1%) and Citizens for Trump (28.7%) were shares from other Facebook accounts or pages, indicating community-like behaviour on these two pages.
Some pages, such as Occupy Libtards 5 and The Deplorables, enjoyed repeated shares to Wake Up & Reclaim America. The Deplorables Facebook group has 472,297 members (as of 18 April 2017) and takes its name from a comment made by Hillary Clinton during the election about Trump supporters.
More recent posts to pages and groups such as Wake Up & Reclaim America and The Deplorables also suggest that these communities are already primed to support Trump, not to mention willing to take action. In fact, mobilizing them would take very little, if this post is any example.
These pro-Trump pages are not operating in isolation. Of note as bigger nodes in the pro-Trump Facebook Page Like network visualisation are Fox News, Sean Hannity, The Blaze, and Glenn Beck (see the darker orange community in the upper left of the network) – not to mention the NRA Institute for Legislative Action and The Heritage Foundation.
Beyond the official political campaign Facebook pages, hundreds if not thousands of other pages pumped content supporting Trump to sympathetic users of that social network. Indeed, within the Wake Up & Reclaim America Facebook Page Like network, 207 page names contain the word “Trump” – many more that are pro-Trump do not, making them much more difficult to track. Together these Facebook pages support each other with reciprocal Page Likes and sharing of posts, while also mobilizing users to not just spread the message but also support Trump. In so doing, these online communities are also tapping into bigger organisations, such as media outlets, lobby groups, and think tanks – hinting at a much more systemic participatory propaganda effort.
This interaction with established organisations, in particular media, brings us to the next step. With an identified and co-opted target audience, bolstered by botnets and a network of websites boosting manipulative content in major online feeds and search returns, the next step is to translate this into traditional media coverage:
6. Win Media Coverage: Be A Trend; Stage a Scandal; or Commune with the News
Media play a critical role in furthering populist agendas; after all, “the media are a key element in the construction of public understanding.” Rates of media coverage of populist politicians correlate with their levels of popular support. And Trump was consistently mentioned more on television, online, and on social media.
While distorting the information space is important, traditional media still play a critical role in informing and swaying the masses. Fortunately for most propagandists, it isn’t too challenging to win media coverage.
Be a trend
One opportunity is to translate online activity into news. Given that some 46% of journalists use social media to either source a story or verify information, it is possible to use the momentum of online engagement to win coverage. Indeed, some news stories are simply about what topics are trending on Twitter. A Google news search for the exact terms “Trending on Twitter” on 27 April 2017 returned 371,000 results, with 14,000 published within the past week.
Stage a Scandal
A scandal will also attract media attention, and media-savvy populist politicians are particularly adept at staging them. Through a “right-wing populist perpetuum mobile”, explains academic Ruth Wodak, populists stage scandals to gain media attention, provoke the opposition into attacking, then distort the ensuing debate to position themselves as victims of a rigged system in which freedom of speech is no longer tolerated. Such scandals tend to centre on situations that can be interpreted in multiple ways.
Drawing from the Trump experience, one example of the campaign using the “right-wing populist perpetuum mobile” came when the presidential candidate shared an image on Twitter created by a campaign supporter. In the picture, adapted from Clinton’s own campaign material, the supporter had added a symbol very similar to the Star of David. There was public outcry that the use of this particular symbol carried anti-Semitic undertones. The Trump campaign’s response was that the media and others had it all wrong: the symbol was, in fact, a Sheriff’s star. The entire episode became yet another example to Trump supporters that the liberal media is biased against him. The event also demonstrates the participatory propaganda model in action, drawing as it did on supposed user-generated content, thus indicating grassroots support.
Such scandals helped keep Trump in the media. Trump enjoyed more media mentions, both on TV and online than the other candidates. Indeed, by the start of the primary election campaign in early 2016, Trump had been enjoying “more nightly news coverage than the entire Democratic field combined.”
Commune With The News
The relationship between media, politicians, and online communities, however, would also appear to be highly symbiotic.
The Columbia Journalism Review identified a “right wing media network anchored around Breitbart” in an analysis of more than 1.25 million stories posted online from 1 April 2015 to 8 November 2016. This “distinct and insulated media system” used social media to spread a “hyper-partisan perspective”, but also “strongly influenced the broader media agenda, in particular coverage of Hillary Clinton.”
Similar results emerged from analysing the liberal- and conservative-leaning media outlets.
The liberal-leaning media group, visualised below, comprised seven almost entirely independent communities. The visualisation uses Gephi’s stronger gravity function to keep the communities closer together for ease of viewing; in reality, they are not so closely linked. What’s more, the Facebook pages tend to be grouped into ‘ego networks’, meaning any given media outlet tends only to like pages related to itself, such as its own TV shows or journalists.
The conservative-leaning media group is quite different. The massive Infowars community dominates the visualisation, represented below by the large yellow section, running into the accompanying Alex Jones network in blue. While nodes connect the Infowars monolith to Fox, the key connector page is Judge Andrew Napolitano. This is interesting in itself: in past analysis of media Facebook Page Like networks, Fox stood out from outlets such as the BBC for connecting to personalities, both its own journalists and U.S. politicians, suggesting that some media outlets aren’t just covering the news but engaging directly with the subjects making it. This form of engagement could be considered alarming, if the notion of impartial news is accepted as crucial to a functioning democracy.
When these two media groups are combined with the pro-Trump network (see the map below), the liberal-leaning outlets become islands unto themselves almost entirely disconnected (the blue communities at the bottom left), while the conservative-leaning media are absorbed into the overall community, and as noted above, in some cases becoming influential nodes.
Another key difference between the networks is the breakdown of page categories. In the liberal-leaning media network, the liked pages tend to be related to news media, such as “Media/News Company” or “Journalist”.
In the conservative-leaning media network, however, there is no clearly dominant category for pages appearing within it. “Journalist” garners a mere 2.74% to the left’s 14.08%. “Public figure” and “Movie” each represent around 12% of the page categories within the conservative-leaning media network. Similar to the pro-Trump page network, though, 9.88% of the conservative-leaning media network pages are labelled “Community.” And therein might lie the key difference in how liberal- and conservative-leaning media operate.
Only half of the Facebook pages in the liberal-leaning media network allow users to post to them, whereas 77% of those in the conservative-leaning media network and 70% of those in the pro-Trump network enable followers to engage this way. Arguably, pages such as InfoWars and Breitbart are community builders, meaning they don’t just push content to their audiences in a one-sided affair, as do, say, CNN or Anderson Cooper 360, which do not allow users to post to their pages.
Indeed, the conservative-leaning media network contains 2.53 times as many nodes as the liberal-leaning group, with 4.22 times the number of edges and 3.26 times the number of strong connections. Likewise, the conservative-leaning network enjoys a higher average weighted degree (9.01) than the liberal-leaning one (5.40), meaning a greater average number of edges touch each node in the network. In short, the conservative-leaning media network is more of an ecosystem: it stretches beyond news-outlet borders, with outlets blending into each other and into pages beyond just media and journalists – into communities.
The media landscape is changing in a Digital Age. While traditional coverage continues to play an important role in terms of exposure for a politician, there is clearly more happening between some newer outlets and their audiences than simple media consumption, whether it be in tracking audiences through cookies, supporting echo chamber development, spreading inflammatory content, or mobilizing followers to action.
In feeding back into the media, be it through coverage or engagement, the participatory propaganda model has come to its final step.
7. Rinse and Repeat.
It is important to note that participatory propaganda is a cyclical model – once steps 1-6 are complete, the next and final step is to start all over again, feeding the machine, tweaking with every new audience insight gained.
The digital environment enables real-time monitoring that propagandists of the past never enjoyed. Platforms such as Bottlenose, Cision, Crimson Hexagon, and other custom solutions help brands monitor and assess online conversations. Such ongoing monitoring is crucial for tracking the effectiveness of outreach efforts over time, while the findings from hyper-targeted audience analysis provide much-needed baselines and parameters for continued listening.
Constant media monitoring and evaluation becomes the basis for a feedback loop. Based on regular review, strategies and messaging are tweaked to increase effectiveness and to encourage audience participation in propaganda efforts.
Thank You… and Propagandise About This Topic
Yes, that was a long read. Thank you for staying with it. My research on the subject of participatory propaganda is at its early stages. If you believe this is a topic that needs deeper understanding and broader awareness among voters, please like, share, and comment on this post. Yes, I am asking you to be a propagandist in my own participatory model – but that support will convince a publisher that this topic is worthy of the time and effort for turning it into a book. I am also interested in the constructive feedback loop – feel free to reach out.
This research is also available in a full-length talk format – please reach out if you are interested in including it in your conference or event.