Are We Prepared for Propaganda in a Digital Age?

This is the dawning of the age of persuasion, and most people are simply not prepared. For most of us, the internet is an always-on source of information and entertainment – but as we become more immersed in an interconnected world, our risk of manipulation increases.

While we might like to believe Canada’s style of politics is more moderate and less polarised than that of our American cousins, we are just as digitally connected. Canada’s internet penetration rate stands at 85.8%, yet nearly half of the total Canadian population finished high school before the web was even invented.

Raising awareness through education – fostering critical thinking skills and an understanding of how easily humans are manipulated by information – would not only improve overall digital media literacy but also help ensure our political process remains healthy and balanced.

Take Donald Trump’s victory. While details are still emerging, his campaign illustrates the scope of what can be done to persuade people in a digital age. To date it is known that the campaign: used data collected via Facebook personality quizzes to create highly targeted and persuasive campaign material, which in turn was delivered through the social network’s equally targeted advertising services; identified online echo chambers and manipulated them into spreading fake news; used Twitter bots to dominate online discourse and game algorithms so that pro-Trump content trended on social networks; discouraged dissent and reduced public policy debate to shallow sound bites and entertainment using memes and trolls; and benefitted from data leaks discrediting Hillary Clinton.

In isolation, many of these tactics might not seem overly threatening, but each subtly shapes an individual’s perception of the reality around them. Together they drive an increasing polarisation of society and a decreasing faith in existing political systems.

Participatory Propaganda

Content that aims to persuade is often crafted to play on personality traits or emotions, and is seldom fair or balanced. Given how quickly content is consumed and discarded online, most people do not allow themselves the mental processing time required to critically assess information, and instead react to it unconsciously and emotionally. This renders them extremely vulnerable to manipulation.

Few average internet users understand how information is sorted and presented to them. Algorithms used by internet giants such as Facebook and Google make accessing content easier. Content filtering in the form of search results, however, has been found to sway voter decisions. Algorithms also contribute, along with user preferences, to the development of echo chambers online.

Once inside an echo chamber, a user is fed content that fits pre-existing views, such as political party affiliation, which reinforces beliefs and erodes voters’ ability to distinguish relevant, factual information from false or misleading material. Echo chambers are ideal vehicles for spreading persuasive content, encouraging those who believe it to engage with it and spread it further, becoming proxy propagandists. Such distortion of the information space polarises the electorate, which in turn leads to “unfriending” those who hold opposing views (reinforcing the echo chamber effect), rising hostilities, and trolling behaviour as people take sides.
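The filtering mechanism behind an echo chamber can be illustrated with a short sketch. This is a deliberately simplified, hypothetical scoring scheme – not any platform’s actual ranking algorithm – but it shows how ranking content by overlap with a user’s past interests naturally pushes opposing views out of sight:

```python
# A minimal, hypothetical sketch of preference-based feed ranking.
# Posts whose tags overlap a user's interests score higher, so a feed
# cutoff gradually removes opposing views entirely.

def rank_feed(posts, user_interests):
    """Sort posts by how many tags they share with the user's interests."""
    def score(post):
        return len(set(post["tags"]) & user_interests)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"title": "Party A rally draws crowds", "tags": {"party_a", "politics"}},
    {"title": "Party B policy analysis",    "tags": {"party_b", "politics"}},
    {"title": "Party A endorsement",        "tags": {"party_a"}},
]

# A user who only ever engages with Party A content...
feed = rank_feed(posts, user_interests={"party_a"})
# ...sees Party A items ranked first; a two-item feed cutoff
# drops the Party B perspective altogether.
top = [p["title"] for p in feed[:2]]
```

Each pass through such a loop narrows the feed a little more: the content shown shapes the next round of engagement, which in turn shapes the next round of ranking.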

New research has found that anyone can become a troll under the right conditions – and nothing quite brings out the troll in a person like political conviction. Aggressive and provocative behaviour online, however, discourages rational discourse and deters people from participating in politics online. Trolls demoralise their targets and polarise the perspectives of others, affecting how related content is perceived.

A botnet, a network of interconnected devices or online accounts controlled by a single operator, is another method used to create the illusion of grassroots support. It is particularly damaging in democratic societies, where the vox populi lends legitimacy to decision-making by elected representatives. Manufacturing the illusion of broader support can lead to initiatives that are not actually what the people want or need.
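The arithmetic of this illusion is worth making explicit. The numbers below are purely illustrative, not drawn from any real campaign, but they show how a modest botnet can turn an evenly split genuine conversation into an apparent landslide:

```python
# Illustrative numbers only: how bot traffic inflates apparent support.
# Genuine opinion is split 50/50; each bot account reposts the same
# pro-X message many times.

def apparent_support(genuine_posts, bot_accounts, posts_per_bot):
    """Fraction of visible posts that appear pro-X after bot amplification."""
    pro = genuine_posts // 2 + bot_accounts * posts_per_bot
    total = genuine_posts + bot_accounts * posts_per_bot
    return pro / total

# 1,000 genuine posts split evenly, plus 50 bots posting 20 times each:
# 1,500 of the 2,000 visible posts now look pro-X.
share = apparent_support(1000, 50, 20)
```

Fifty automated accounts – a trivial investment – are enough here to shift the visible conversation from 50% to 75% in favour of one side, which is precisely the distortion that makes the vox populi unreliable as a signal.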

Fake news is extremely problematic for democracy. Nearly 80% of Americans received information about the election via a news source. The media is one of the main sources of information helping voters make informed decisions – if the media is not trustworthy, or distorts understanding, it is directly detrimental to a healthy political debate. Indeed, fake news – even if satirical – increases feelings of inefficacy, alienation, and cynicism, sentiments that drive people towards populism.

While it can be argued that information leaks increase transparency and hold governments to account, they also have an adverse effect. Some politicians have used hacks and leaks to deliberately erode trust in government, decreasing faith in the established order and fuelling anti-elite cynicism – again rendering voters susceptible to populist politics.

Taken together, these techniques and others distort our information space. Executed in conjunction with more traditional public relations stunts aimed at dominating media coverage – as Trump consistently did – such political campaigns can easily persuade targeted populations. (And make no mistake, these techniques are and will be used across the political spectrum.) Without measures to strengthen critical thinking among voters, such persuasive campaigning will only become more effective as people immerse themselves further in the new information space. As of 2015, 21% of American survey respondents indicated they were online “almost constantly”. By the end of the first quarter of 2016, the average American was consuming 10 hours and 39 minutes of media across devices each day. These rates are only expected to soar.

In the meantime, Canadians should treat the recent example south of the border as a cautionary tale: just because online content reflects your pre-existing beliefs doesn’t make it true. Think twice before leaping for the share or comment button – it’s quite likely the content magically appearing in your social network feed was created to provoke you into action.

About the Author

La Generalista is the online identity of Alicia Wanless – a researcher and practitioner of strategic communications for social change in a Digital Age. Alicia is the director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace. With a growing international multi-stakeholder community, the Partnership aims to foster evidence-based policymaking to counter threats within the information environment. Wanless is currently a PhD Researcher at King’s College London exploring how the information environment can be studied in similar ways to the physical environment. She is also a pre-doctoral fellow at Stanford University’s Center for International Security and Cooperation, and was a tech advisor to Aspen Institute’s Commission on Information Disorder. Her work has been featured in Lawfare, The National Interest, Foreign Policy, and CBC.