This piece draws from a talk delivered at the Canadian Public Relations Society’s annual conference in 2019. When I was first asked to speak, ethics was not at the forefront of my mind; the choice of other keynote speakers, however, made me think otherwise. In particular, the inclusion of a co-founder of AggregateIQ, the Canadian online advertising firm embroiled in the Cambridge Analytica scandal, complete with a mention in his bio that the firm had been “featured in ‘Brexit’, an HBO film on the UK’s EU Referendum,” heightened my concerns about the direction of strategic communications in a Digital Age. The following thoughts on ethics and information are what emerged in response:

It is undoubtedly an exciting time to work in strategic communications. A technology-driven Information Age has ushered in unimaginable speed, reach and precision in how messages can be distributed. Never before have strategic communicators had the means to persuade that they have now. Today, I would like to stress an important point that, in my view, has not yet received sufficient attention: with this great power comes an enormous responsibility, of a magnitude we do not yet fully comprehend.

While the advent of print, radio and television was ground-breaking, each invention was ultimately a foray into the unknown. Sure, the inventors could estimate a significant increase in content distribution, in relative terms, and were likely hoping for increased buy-in from their target audiences. But how much did they really know about the full effects their inventions would have on audiences and society in general? Pioneers in these fields were experimenting. Unlike with radio and television, with the web strategic communicators could build on centuries of experience with various media (the written word, images, video), as well as numerous studies in the fields of psychology, sociology, political science, neurology, media studies, advertising and more – many addressing the impact various forms of media have on individual, group, national or international audiences, both cognitively and behaviourally.

Indeed, psychology as a scientific discipline was only developing when mass media emerged (Malone, 2009). Strategic communicators knew very little about how humans process information. Public relations pioneer Edward Bernays began applying the work of his uncle, psychoanalyst Sigmund Freud, to communications by the 1920s, appealing to a human sense of self to promote products (Ewen, 1996). In one famous case, Bernays positioned cigarettes as “torches of freedom”, connecting them to women’s liberation by paying women to smoke while participating in New York’s 1929 Easter Sunday Parade (O’Connor & Owen, 2019). Yet the field of cognitive psychology, which studies how the brain processes information, such as in decision-making, only arose in the middle of the 20th century (Neisser, 1967).

The desire to know more about what motivates people, what triggers their attention, reactions and purchasing decisions, has also led to a steady improvement in research and data collection methods: from the use of focus groups and surveys to, with advances in technology, online polls, psychographic profiling through social media and so on. As the 20th century progressed, strategic communicators acquired an increasingly elaborate, detailed and intimate picture of their target audiences.

Thus, three major themes stand out that distinguish strategic communications in an Information Age from other periods of technological advancement:

  1. Changes in information communications technologies have altered the way people connect, making mass communication no longer unidirectional from sender to receiver but enabling individual engagement, while also increasing the speed and reach of such messaging;
  2. Advances in cognitive and social psychology have given rise to new fields of study such as behavioural economics. Strategic communicators now know much more about how humans make decisions and what types of information, presented when and by whom, will optimally influence a target audience;
  3. Given the widespread use of information communication technologies, abilities to collect and make sense of data on individuals have significantly increased, enabling behavioural advertising, whereby data collected about online behaviour, shopping habits and other information are combined to create a picture of how a person thinks and what might motivate them to action.

In research I conducted with Michael Berk in 2017, we identified a communications model in political campaigning that draws from these three shifts, called participatory propaganda, which has been featured in Forbes and the upcoming Sage Handbook on Propaganda. Looking at the Trump campaign and his supporters’ efforts, we found a seven-step process, which we subsequently discovered, in part, at work again in Canada and the U.K. in 2017 and in Ontario in 2018. Put simply, the participatory propaganda model consists of:

  1. Conducting target audience analysis using behavioural advertising to understand what motivates a specific group of people; then
  2. Developing and deploying provocative content, such as fake or questionable news, memes and data leaks, that provokes the audience into engaging with and sharing it;
  3. Pumping that content into echo chambers and communities where the target audience already receives information;
  4. Boosting this content through heavily automated posting to manipulate online search and newsfeed results;
  5. Encouraging followers to take action by taking polls, signing petitions, sharing content, lending their accounts to influencers for posts, and trolling dissenters;
  6. Winning media coverage by trending online (as achieved by step 4), staging scandals as a means of controlling news cycles, and fostering symbiotic relationships with sympathetic and alternative news outlets; and ultimately,
  7. Assessing and adapting efforts through a constant monitoring and evaluation of activities, repeating the model’s steps anew over and over again.

What makes this model so interesting is how the audience is co-opted and becomes a part of the propaganda effort, ultimately accepting, adapting and spreading a message. With the internet and social media, the traditional separation between ‘the propagandist’ and ‘target audience’ is rapidly blurring, with the latter beginning to play a more significant role in spreading propagandistic content and influencing others through personal networks – a more dangerous development, since people are more likely to believe those familiar to them (Garrett & Weeks, 2013) or those they view as influential (Turcotte et al, 2015).

This participatory form of propaganda is more nuanced than traditional interpretations of propaganda in that modern technologies allow propagandists not just to push a message but to get audience “buy-in” through content that triggers engagement, including taking up and spreading the message through audience members’ own networks. Through this engagement, propagandists can amplify their message and, by obfuscating its provenance, increase receptivity to it among wider populations.

Participatory propaganda offers the ability to truly dominate the information space through volume of messaging, delivered through a mix of real people and automated accounts, effectively making it difficult to discern where fake ends and authenticity begins.

From a strategic communications perspective, all of this is very exciting: never before, the communications professional in me must admit, have the means for persuading people been so truly awesome. But for the concerned citizen (which I also am), increasingly subjected to unconstrained manipulation, these advances are equally frightening.

Just think of what such targeted influence can do in the hands of unscrupulous political machinators or marketeers. As public awareness of the provenance of information encountered online remains low, people’s worldviews are being influenced in unimaginable ways. Nearly half of the Canadian population graduated high school before the web was invented. Unless they are actively working in fields related to the information environment, chances are most of these people will have little to no idea how information is presented and distorted online.

This unprecedented opportunity to persuade also comes at a time when general trust in media (49%), government (46%), and business (49%) is not high in Canada, hovering around the 50% mark. This, coupled with creeping polarisation and sensationalism around foreign activity in the information space, creates a perfect storm – and strategic communicators are at the heart of it.

Information is fundamental to democracy, which is a system that derives its legitimacy from the notion that voters are making informed decisions of their own free will. Human cognition, in terms of the ability of citizens to make informed choices, is thus a form of critical infrastructure in a democratic system. The use of persuasion ultimately calls into question a target audience’s ability to be reasonably and fairly informed – for instance, where is the line between unacceptable manipulation in communications and acceptable political discourse? The use of behavioural advertising by Cambridge Analytica to support Brexit has led some commentators to suggest democracy has been hijacked, a claim which does little to reinvigorate waning faith in the established order.

We as a society are faced with a major challenge. There is a heightened ability to target and persuade audiences, with few lines in the sand around its application, while general understanding of what the information environment is and how it can be manipulated remains very weak. At its simplest, information is that which is processed by someone or something to form an understanding of the world. The information environment comprises the physical infrastructure, computing capacities (such as software and algorithms) and cognition through which information is used to form understanding. Within this information environment is a messy throng of actors attempting to influence others, including:

  • Foreign actors (public officials, state-backed media, proxies, etc.);
  • Domestic political actors (politicians, third-party interest groups, etc.);
  • Activists and ideologues, often part of transnational issues-based groups;
  • Trolls weighing in for the lulz;
  • Financial opportunists (merchandise sellers, spammers, fake news purveyors, etc.); and
  • Professional communicators (PR firms, lobbyists, advertisers, etc.).

Other players, including media and major tech companies, also affect and control different aspects of the information environment, but ultimately the actors above aim to make use of journalists and web platforms to compete for attention and influence audiences. These influence operators are not working in isolation. Sometimes they compete with each other, sometimes they cooperate, and at times a poor general understanding of these actors leads those covering such topics to draw incorrect conclusions, which are then passed on to readers as misinformation.

For example, financially motivated opportunists capitalise on partisanship and patriotism, sometimes to sell cheap wares, sometimes to drive traffic to dodgy news websites and make money on advertising clicks. When content produced by these actors appears to support a cause, country or politician, it can sometimes seem as though there is cooperation with those they help when there might not be.

Activists – including journalists, academics and politicians – pushing a particular cause can sometimes have their message amplified by foreign interests. We have seen this in research around critics of the Syrian emergency first-responder group the White Helmets, whereby legitimate domestic voices in the U.K. push narratives and reports that Russian officials amplify and share at the UN level. Interests might align but that does not necessarily mean there is cooperation.

Domestic political actors and media are all too happy to take up “leaked” material if it is in their interest; take the U.S. as an example. Emails taken from the Democratic National Committee, which U.S. intelligence agencies attributed to Russia, were widely used by Trump, via Wikileaks, to discredit Hillary Clinton (Briant & Wanless, 2018).

While much coverage of influence activities focuses on single actors, the space is far more complex and the people behind it diverse. All is not always as it seems.

Much of this activity aimed at persuading happens in a grey space, with few lines around what is acceptable. Experimentation might seem like fair game, until its public discovery says otherwise. The North Face and Leo Burnett Tailor Made are certainly feeling the sting after boasting how they replaced Wikipedia images of hard-to-reach places with photos featuring the former’s brand in a bid to “hack” Google search returns. The tactic broke Wikipedia’s community guidelines and, once noticed, resulted in the removal of all images and accounts tied to the stunt. The North Face has since had to apologize publicly. It is too soon to say how this stunt will play out for Leo Burnett Tailor Made, but other communications firms have not been so lucky.

There are severe consequences for engaging in untoward activity. Bell Pottinger went bankrupt after it was accused of using fake Twitter accounts to stoke racial tensions in South Africa in a bid to divert attention from its client, the Gupta family. Cambridge Analytica met a similar fate, after using Facebook data collected in breach of the social network’s Community Standards to target voters in a pro-Brexit campaign, but also for its campaign in Nigeria, backed by “a Nigerian oil billionaire who wanted to fund a covert campaign” to smear then presidential candidate Muhammadu Buhari.

In the absence of norms of behaviour many actors are behaving as if anything goes. Simply put, we lack ethics for a Digital Age. Ethics are moral principles governing behaviour or how an activity is conducted. Ethics can be adopted voluntarily, by a person, group or company, but also in part through legal means such as regulations.

At a time when audiences can be hyper-targeted, what regulations exist to protect citizens? In terms of lines in the sand around the use of behavioural advertising and persuasion, I could find no mention in the Canadian Code of Advertising Standards or the Canadian Public Relations Society’s Code of Professional Standards. The Personal Information Protection and Electronic Documents Act only requires those collecting personal data to obtain consent and inform the consumer why that information is being aggregated. The vast majority (90%) of people do not read such terms of service, and even those who do have been found not to understand that this data might be shared with third parties.

With behavioural advertising and the combined use of psychology and big data to target audiences, forms of persuasion have changed. Advertising, which is the focus of the Canadian Ad Standards and covered in the Canada Elections Act, is only paid promotion, leaving out other forms of communication enabled by the web, such as social media posts or websites that might not incur costs in related content production. And while legal changes now restrict third-party advertising in the lead-up to a federal election in Canada, some well-funded outfits have spent years building up a following that they can continue to reach for free through their web presence. Ontario Proud, launched by a former Conservative staffer in 2016, has 430K Facebook page followers; on the left, North99, launched in turn by a former Kathleen Wynne staffer in 2017, has 90K. Other interest-based initiatives are following suit, such as the union-backed Engage Canada, which counters Andrew Scheer. These actors might aim to influence the upcoming election in Canada, but their efforts are ongoing, in a perpetual campaign mode.

What is striking about many of these Facebook pages is that they self-identify as “community based”, positioning themselves as organic when in fact they are slick operations backed by political expertise – a tad misleading for unsuspecting citizens, particularly those with low levels of digital literacy. These “community” pages enjoy far higher rates of engagement than official political party pages, suggesting they seem more credible to audiences. This is not to say that there is something wrong with these pages and their activity, only that we as a society need to discuss whether this is acceptable – and that discussion really has not happened yet. It is clear that there are major gaps in codified ethics around persuasion in an Information Age.

The absence of clear ethical guidelines in a Digital Age should not be taken as an invitation to simply manipulate as one sees fit. In many jurisdictions, governments and legislators, prompted by egregious cases of public manipulation online, have begun to scrutinize the behaviour of tech giants and companies employing manipulative methods to promote their goals. Indeed, the consequences are not isolated to the companies that engage in such practices: each time such influence activity is uncovered, it ripples across societies, undermining popular trust in established institutions, business relations and democracy. All manipulation threatens to undermine faith in information, and thus the democratic systems dependent on it. Every time you, a strategic communicator, shape the information environment with the aim of influencing a target audience, you are taking on a great and growing responsibility.

So, what is to be done? I have some notoriety for painting a bleak picture of the Information Age, mostly because, to date at least, I have observed an ongoing erosion of trust in our institutions and principles that is not counterbalanced by the construction of new rules adequate to this Age. However, as we are on this road together, I did promise the conference organisers to leave you with practical ideas about how to shape the new reality that is emerging – and I do believe the future we will end up with is being created at this very moment. So, this is what you can do:

  1. Update Codes of Ethics for a Digital Age. Whether these are guidelines for your organisation or industry association, figure out where the lines in the sand are around the use of behavioural advertising and influence. If they are not clearly articulated, someone will go too far, and likely soon. Do not wait until it is too late. Winning back trust is difficult; take a page from the social networks.
  2. Just because you can does not mean you should. This should become a commandment for every strategic communicator. Operate as if whatever you plan to do will become public knowledge, and likely sooner than you think. Assess communication strategies for how such disclosure would affect your organisation or client, because in a Digital Age everything comes out eventually. There are consequences for manipulating people. They might not yet be legal ramifications, but bankruptcy and loss of reputation are in many ways just as disastrous.
  3. Practice what you preach – and go with trust. Others might go dirty for short-term gains, but it is only a matter of time until they are outed. A longer-term approach is to build trust. One way to do that, in an Age full of manipulation and decreasing trust, is through transparency. If you are serious about Corporate Social Responsibility, apply your tradecraft to raising awareness for what it means to live in an Information Age. Help Canadians become digitally literate. If you help them trust in information, they will also trust you more when you have something else to communicate. 

The original title of this talk suggested we were in an information war – we are not. Given the definitions around warfare, that simply is not the case. We are, however, in a real struggle for our future survival as democratic societies, because the rights of individuals to make informed, balanced and unhindered decisions are under assault from many sides. I want to be clear: we should not look for enemies or assign blame. This struggle is a somewhat natural process. As technologies and science evolve, we discover more about our environments and ways of shaping them to our benefit; humans have done this throughout history. The extreme pace of this evolution, and our inability to provide it with the moral and ethical grounding that can then be codified into law, is our greatest problem and struggle at the moment. And as strategic communicators, you are the front line in this struggle. What you choose to do will ultimately shape the future of democracy, whether you like it or not. I, for one, hope you choose wisely.

Header Propaganda Image

The background comes from a First World War propaganda poster “Remember Belgium: buy bonds: fourth liberty loan” and evokes images of the claim that German soldiers were raping Belgian women, which in turn was used as motivation in Great Britain for entering the conflict.

Inline Citations

Malone, J.C. (2009). Psychology: Pythagoras to Present. Cambridge, Massachusetts: The MIT Press.

Ewen, S. (1996). PR!: A Social History of Spin. New York: Basic Books.

O’Connor, C., and J. Owen. (2019). The Misinformation Age: How False Beliefs Spread. New Haven: Yale University Press, (Kindle Edition), Loc 1485

Neisser, U. (1967). Cognitive Psychology. Englewood Cliffs, NJ: Prentice Hall.

Mustafaraj, E., and P. T. Metaxas. (2010). From obscurity to prominence in minutes: Political speech and real-time search. In: Web Science Conference, 26-27 April 2010, Raleigh, NC, USA. [online]. Available at [Accessed 25 January 2017]

Garrett, R. K., and B. E. Weeks. (2013). “The promise and peril of real-time corrections to political misperceptions.” In: Proceedings of the 2013 Conference on Computer Supported Cooperative Work, pp. 1047-1058. ACM.

Turcotte, J., C. York, J. Irving, R. M. Scholl, and R. J. Pingree. (2015). “News recommendations from social media opinion leaders: Effects on media trust and information seeking.” Journal of Computer-Mediated Communication 20, no. 5, 520-535

Briant, E., and A. Wanless. (2018). “A Digital Ménage à Trois: Strategic Leaks, Propaganda, and Journalism.” In: Corneliu Bjola and James Pamment, eds. Countering Online Propaganda and Extremism: The Dark Side of Digital Diplomacy. Routledge, pp. 44-65.

About the Author

La Generalista is the online identity of Alicia Wanless – a researcher and practitioner of strategic communications for social change in a Digital Age. Alicia is the director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace. With a growing international multi-stakeholder community, the Partnership aims to foster evidence-based policymaking to counter threats within the information environment. Wanless is currently a PhD Researcher at King’s College London exploring how the information environment can be studied in similar ways to the physical environment. She is also a pre-doctoral fellow at Stanford University’s Center for International Security and Cooperation, and was a tech advisor to Aspen Institute’s Commission on Information Disorder. Her work has been featured in Lawfare, The National Interest, Foreign Policy, and CBC.
