At the 2018 RightsCon conference in Toronto, we had the opportunity to test an experimental session format that originally set out to answer whether disinformation or persuasion was the problem in the wake of the Cambridge Analytica revelations. As the session took shape, however, it quickly became apparent that many other issues needed addressing before the original question could be answered. During an interactive session that aimed to aggregate the expertise found at RightsCon, participants, led by four Question Masters, collaborated intensely for an hour to explore complex issues around the manipulation of the information environment.
The aim of the exercise was to create a mapping of the information environment and how it is manipulated to shape perspectives.
In this context, the information environment was taken to include: the physical infrastructure; the data, such as content, software and algorithms; and the cognitive or human perspectives. Participants had three ways to contribute: directly through the Question Masters; by adding sticky notes to the chart; or by answering on Twitter. In total, 65 people participated in the session, including the Question Masters:
- David O’Brien, Senior Researcher at Harvard’s Berkman Klein Center;
- Brittan Heller, Director of Technology and Society at the Anti-Defamation League;
- Matt Chessen, Senior Technology Policy Advisor at the U.S. Department of State; and
- Chris Tenove, Post-Doctoral Fellow at the University of British Columbia.
The session was co-chaired by Alicia Wanless and Michael Berk, CEO of Alton Corp. What follows are the aggregated notes as contributed by participants at the session.
The session was framed by four key questions. Participants would move as groups between stations where the Question Masters would engage them. A select number of participants were told to move in the opposite direction to help mix group dynamics as the session progressed.
What are some of the techniques used to manipulate the information environment to shape public perception?
Techniques for Manipulating the Information Environment
Who are some of the specific actors that use these techniques to manipulate the information environment?
Given the list of actors whom participants think engage in information environment shaping, the better question might have been: “Who is not attempting to do this?” Topping the list were political actors, including parties, super PACs (in the US) and other ideologues, followed by governments, militaries, and other agencies, specifically covert ones.
The range of individuals named was diverse, including: naïve people; jilted lovers; lulz seekers; celebrities; researchers; journalists; activists; and conspiracy theorists.
Participants also noted that extremist elements engage in shaping the information environment.
By far the largest group of listed actors shaping the information environment was businesses and organizations. These included PR agencies, companies and their leaders (who are sometimes politically motivated), advertisers, media outlets, and social media platforms. Industry associations, foundations, think tanks and NGOs were listed separately, but also noted for their attempts to shape the information environment.
A recurring theme of concern throughout the session was the role of artificial intelligence and the use of autonomous bots – both were listed as having potential for shaping the information environment.
What techniques are most often used?
The list of techniques of high concern for manipulating the information environment was decidedly longer than the lists of medium and low concern.
Topping the list of high concerns were those related to Deep Fakes. Participants were particularly concerned with faked and forged videos, audio files and imagery, although faked scholarly works were also mentioned. In a similar vein, astroturfing and the use of faked social media profiles remained a serious concern for participants.
The use of behavioural and highly targeted advertising raised many alarms. Concerns here included: the surveillance resulting from personalized services (e.g. Alexa, Uber); microtargeting influencers; the ability to target disinformation to audiences; the application of eye tracking and wearables to manipulate audiences; social credit scores; and the use of cognitive psychology to persuade people.
Trolling and targeted attacks were also high concerns for participants. This included techniques such as trolling on Tinder, brute force attacks and hacking, doxing to stifle dissent, and “troll armies on 4Chan”.
The use of artificial intelligence and the role of algorithms were raised in this working group as a high concern. The worries here ranged from a lack of accountability for how such technology works to AI-enabled content generation fed through private messengers. Participants also wondered about the use of AI to divide and discriminate against target populations, and what would happen if AI and big data were combined to create and target groups based on psychometric profiling. There were also concerns that AI could be used to manipulate mood or sentiment, as well as its potential application for silencing dissent through targeted censorship.
The use of virtual and augmented reality to shape the information environment was of medium concern to participants; worries ranged from using such platforms to plan violence to manipulating feeds to increase individual isolation.
While digital advertising was a high concern for participants, more traditional ad forms continued to cause worry. Advertorial columns, outdoor billboards, the use of drones to deliver leaflets, and the use of green screens at public events to create dynamic and responsive ads were all noted as concerns.
Cyber attacks were also noted as being of medium concern and included hacking, phishing and surveillance. The accessing of webcams, hacking of VPNs to record data, hijacking of Amazon Web Services, DNS poisoning, and using the internet of things to distribute propaganda were all noted as tactics of concern.
The techniques noted as of lower concern in shaping the information environment tended to be more creative and diverse, such as using AI-driven teddy bears to influence children’s political beliefs. Several participants raised issues related to biohacking, both in terms of manipulating health devices and hacking microchips injected into human bodies. The use of AI-backed chatbots to send messages and make phone calls was also mentioned, as well as the issuing of fake alerts via SMS.
Who is most at risk of manipulation in the information environment?
While participants grouped specific at-risk people, it is important to note that everyone is at risk from the manipulation of the information environment. The reasons for this are myriad, but one core common factor participants noted was a human susceptibility to, and ignorance of, confirmation bias, which leaves people with a tendency to accept what they already believe. People who belong to homogenous groups, where their views are seldom challenged, were also deemed more at risk of information shaping. The fact that the evolution of technology far outstrips the evolution of human brains (and ethics) to cope with the changed information environment was also noted as a concern and a reason why everyone is at risk. Add to this that education programs to help people cope have not caught up, and next to none target adults, who might be the most in need of such support.
The complex nature of technologies coupled with an emphasis on using social proof to provide people with content based on the preferences of their friends and family also makes everyone susceptible to information shaping. People tend to more readily accept what their friends believe and share. This is aggravated by a declining trust in media.
More specifically, participants identified four groups of people who are more at risk than others: The Unprepared; The Disenfranchised; The Overwhelmed; and The Targeted.
The Unprepared can broadly be described as low-information people who are neither digitally nor media literate. This group has received little to no education to prepare them for the information age. This could simply be a function of education or age: having had little exposure to information communication technologies growing up or in later life. The Unprepared tend to be newer to the internet and are less likely to check claims made online. So long as the content consumed fits their pre-existing beliefs, they are likely to accept it and share it (if they know how). There was wide consensus among participants that The Unprepared tend to be older and retired, and given the trust in media they may have been raised with, they are less critical of content.
The Disenfranchised tend to feel left behind or pushed out of society, sentiments often resulting from increased globalization. While categorized here as a group, representation is diverse and included: ideologues, radicalized people, those under economic pressure, citizens in repressive societies, individuals who feel socially and culturally alienated, and communities in conflict. The Disenfranchised were believed to be vulnerable to information shaping because, with so few options in life, they are more likely to believe false promises and have a greater need to feel belonging. Participants also noted that The Disenfranchised would be more likely to feel disempowered, and thus be susceptible to having perceived grievances manipulated by propagandists. As people living under social isolation and economic pressure tend to be significantly stressed, they also lack the time to be judicious consumers of information.
The Overwhelmed are susceptible by virtue of being constantly plugged into the information environment. Participants felt that this group might include teenagers or others who lack critical thinking and media literacy, as well as busy people who simply don’t have the energy or interest to think twice about the information they are consuming. While many of the people in this at-risk group are digital natives, they have not fostered a deeper understanding of how the information environment works and is shaped. The Overwhelmed are also more likely to consume low-quality sources, a sort of “junk food” information diet.
The Targeted are a high-risk group of those who are specifically attacked online and include groups vulnerable on the basis of gender, sexual orientation, ethnicity, religion and activism. The Targeted are most at risk of information shaping, as it is used to silence and abuse them, taking away their voice and agency, while also being used to vilify this group’s members. Participants felt that information shaping would also be used to discriminate against these people.
What does it all mean?
Unfortunately, without careful intervention to mitigate how the information environment is manipulated, most participants felt the future was bleak.
Distrust in media also concerned participants, with worries ranging from media consolidation to the hijacking of trusted media sources. Biased media was raised by several participants as a growing concern in the shaping of the information environment.
Participants continued to see the use of social networking platforms as a core tactic for shaping the information environment. The pushing of disinformation and provocative content through social networks, the use of bots to manipulate news and search feeds, and a growing use of direct messenger platforms were all noted tactics of high concern.
Further polarization and radicalization of societies was widely predicted, with a rise in hatred, mutual mistrust, erosion of faith in the established order, civil discord and ultimately a breakdown of democratic systems. This, in turn, would lead to increased violence and targeted attacks, with growing rates of misogyny, discrimination, and abuse online.
Many participants feared an authoritarian response from governments, which would include the introduction of new laws and regulations to clamp down on the flow of information, instead of a much needed coming to grips with the use of tech-fuelled persuasion to shape the information environment.
Cover Image: British anti-Nazi poster from the Imperial War Museum “Posters of Conflict” Exhibition and permanent collection.