Originally published by The Strategy Bridge with Michael Berk
The internet, social media, artificial intelligence, and other digital technologies are enabling the manipulation of the information environment for strategic purposes. Global hyper-connectivity and increasing competitiveness have propelled information warfare into prominence as a viable means of achieving strategic objectives—but what its theory and practice mean can vary from military to military and expert to expert. Indeed, a lack of universal definitions, combined with a desire by military thinkers to carve things up into military domains and focus on hardware and technology, has left the role of information in 21st century warfare insufficiently examined. This article, building on a previous one entitled “The Strategic Communication Ricochet,” considers why information is more than a domain, and how military exercises must be adapted accordingly to foster increased resilience to information warfare.
The idea that information can be used in conflict is not new—great military strategists such as Sun Tzu, Machiavelli, and Clausewitz all emphasized using information to outmaneuver an enemy.1 Despite this historic awareness of information in warfare, finding an internationally accepted definition of information warfare has been a challenge. Part of the problem is the word information itself, which can encompass many things. For example, one meaning as provided by the Oxford Dictionary is “data as processed, stored, or transmitted by a computer”—this could entail everything from code to content. This understanding has been reflected in some military materials, such as the U.S. handbook on Inform and Influence Activities, which breaks the information environment out into three dimensions: the physical (e.g., infrastructure), the informational (e.g., data), and the cognitive (e.g., values, beliefs). Such scope, however, means that information is actually integrated into nearly everything a military does—from command and control functions to public affairs. Yet information activities are often treated as something stand-alone, or unrelated to more traditional military domains. Moreover, when it comes to the cognitive dimension of information in particular, there is an emphasis on outgoing communication with little attention to how such content will be twisted and used by adversaries against the sender to negatively affect leadership and decision-making.
What is lacking at the moment, both conceptually and from a strategic perspective, is an understanding of what information has become, and of how every action, from those of the top general at headquarters to those of a private in theatre, carries with it challenges and opportunities for friendly strategic communication as well as adversarial exploitation. The ways information can be shaped demand that each activity during mission planning and corresponding exercises be reimagined for its cascading exploitative potential—which means looking far beyond the basics of public affairs or social media messaging in training scenarios.
Despite the changing information environment, most military doctrine focuses only on outgoing communications, and practitioners conduct military exercises, as Ian Kippen notes in a recent Small Wars Journal article, in a linear fashion, putting greater emphasis on later phases of operational preparedness (such as defend, restore, and transition). Such an approach leaves the most critical first phase (deter), which aims to avoid conflict entirely or to create a more favorable starting position for later fighting, the least practiced. Yet this is precisely the phase in which an adversary's manipulation of the information environment would occur, enabling it to mislead friendly leadership and deny time and opportunities for the preparation and coordination of activities. As it stands, most operational exercises fail to integrate information activities in a meaningful manner into all phases of training, particularly the early planning stages. As a result, while information and strategic communication activities are continuously generated in support of other operations, the necessary understanding of the role of information in modern warfare is not fostered at all levels of the military or of relevant supporting agencies within and outside of government.
NATO notes on its website that “exercises are important tools through which the Alliance tests and validates its concepts, procedures, systems and tactics. More broadly, they enable militaries and civilian organizations deployed in theatres of operation to test capabilities and practice working together efficiently in a demanding crisis situation.” Arguably, in a digital age, militaries and their partners must also do all of this within a complex information environment, and as such, training must adapt to develop appropriate skills beyond specialist units. Ultimately, these findings should bring about several key changes in how militaries structure their training exercises.
VOX POPULI RULES
For starters, the broader public must be seen as more than merely a target audience, whether in the context of a conflict theatre or at home. A more diverse set of actors must be represented in training exercises, beyond the usual military, government, political, and extremist elements. This might include the addition of a new cell that represents a training audience’s domestic public, which could encompass academics, activists, and independent bloggers, as well as ordinary citizens, represented through public opinion polls.
While our Western socio-cultural sensitivities to be kind and respectful of one another are noble, they present a weakness in information warfare that can readily be exploited by an adversary. It follows that the military must conduct information activities in exercises with a gloves-off atmosphere. Having members of the same military train each other can impede the spontaneity and surprises that come from actual warfighting experiences. Militaries should consider externally sourcing communications red teams to avoid kid-glove treatment in training. While discrimination is a concern, the military must consider how key personnel can be attacked and provoked before the exercise takes place. A training exercise might be friendly, but information warfare in a digital age is not. Concerns over how such testing might risk the overall aim of an exercise must be put into context: if personal attacks cause a training audience to fail in a simulated environment, what impact will such attacks have in real conflict scenarios? The belief that training audiences should not fail is misguided—far from boosting morale, false positives do a disservice in preparing personnel for the realities of today’s information environment, both during peace and conflict.
THE INFORMATION ENVIRONMENT IS MORE THAN (SOCIAL) MEDIA
Military planners need to look beyond journalism as the sum total of media. While traditional mass media plays an important role in the information environment, it no longer controls how information is shaped and distributed. Likewise, simply adding fake news or social media activity to a scenario fails to capture the complex interplay that is the digital space.
Often, what appears in news media, particularly in a conflict zone, has emerged via an interconnected system of actors, creating, shaping, and sharing information in a dynamic manner across platforms and with different tools. An instructive example is the counter-response to Syrian emergency first responders, the White Helmets.
In mapping 54 English-language media items referencing the 4 April 2017 attack in Khan Sheikhun, a model emerged that illustrates this point. After the incident, emergency first responders posted accounts of it to social media. Within a few hours, sympathetic western mainstream media picked up the story, blaming the Syrian government and its Russian ally. Within ten hours of the incident, counter-claims emerged, which were reposted across websites and platforms, with official Russian statements offering counter-narratives blaming terrorists by the close of the first day. For the next 24 hours, a battle of narratives played out in media supporting and attacking the White Helmets, with sticky narratives emerging that would be reused repeatedly to discredit the White Helmets and position them as terrorists, fakers, or western stooges for regime change. U.S. politicians jumped into the fray, questioning the origins of the attack and blaming domestic political opponents for the incident. Other third-party actors followed, such as Dilbert cartoonist Scott Adams, who also questioned the incident’s provenance. Within forty-eight hours, Russia’s Deputy Ambassador to the UN claimed at an emergency session of the Security Council that western interests in Syria were “guided by the need to change the regime,” a claim that was picked up by Information Clearing House and then reposted on more than twenty additional websites. Within hours of President Trump’s response of air strikes to the incident, the hashtag #SyriaHoax was launched on Twitter and in a span of nine days was “used in 192,000 tweets – 85% of which originated” in the U.S. This automated and heavy reposting of messaging also helped to distort internet algorithms, shaping what information ranked higher in live search returns and newsfeeds and influencing what average users were exposed to about the situation.
Critical media coverage did not end there. The report “The White Helmets: Fact or Fiction” by independent British journalist Vanessa Beeley was presented by the Russian Federation to the U.N. Security Council on 10 May 2017, while coverage critical of the White Helmets’ claims continued throughout the six-month period following the incident,2 including an article by former weapons inspector Scott Ritter. In October 2017, a grassroots British anti-war group run by several reputable academics among others, Frome Stop War, hosted Beeley and other critics of the White Helmets at an event covering the Syrian conflict, which was posted to YouTube. Thus, by the time The Guardian ran a supportive article about the White Helmets in December 2017, criticizing Beeley’s work, many of the same narratives and sources producing counter-claims following the Khan Sheikhun attack were reused by a mixture of real and fake Twitter and Facebook users to negatively and rapidly dominate social chatter about the piece.
Suffice it to say this is a complex and changing information environment, an environment primed long before an event takes place, consisting of myriad actors and established networks of websites and online communities. While traditional mainstream media still plays an important role, the internet enables other actors to push the boundaries of audience engagement in ways that often blur the distinction between adversarial activities and legitimate domestic actors (who have a democratic right to voice criticism) in a complicated weave of online networks. This dynamic use of the information environment must be better represented in training activities to help militaries prepare for operating in it, particularly given demands for increased transparency—red or blue media coverage and social media activity are simply not enough.
TECHNOLOGY CHANGES: FOCUS ON RESILIENCE, BUT REMEMBER INFORMATION IS CENTRAL
It can also be easy to lose sight of the forest (the scenario) for the trees (technology). Playing catch-up to create simulated social networks does not provide an accurate recreation of the information environment as a whole. While digital technologies are changing at an alarming rate, the creation and maintenance of a system that mimics this environment is likely to fall short of the desired objective until artificial intelligence-driven solutions begin to simulate the inherent human behaviors and interactions with a reliable degree of accuracy. Instead, at the strategic planning or joint task headquarters level, exercise managers should opt for detailed scenarios that mimic the desired levels of interoperability and process familiarity, while developing and honing problem-solving skills related to the information environment. Such an approach would foster longer-term, platform-agnostic decision-making skills and acclimatize trainees to operating in a virtual environment.
To this end, in addition to information activities becoming a fundamental aspect of the planning phase, they must be integrated throughout each activity in the exercise. A decision-tree process can help military planners streamline this, whereby activities are coded and a series of information-based responses follows depending on the action taken by the training audience.
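The decision-tree idea above can be sketched in a few lines of code. This is a minimal illustration only: the activity, the training-audience actions, and the scripted information injects below are all hypothetical examples invented for this sketch, not drawn from any real exercise or doctrine.

```python
# Minimal sketch: each planned exercise activity is coded, and a small
# tree maps the training audience's observed actions to pre-scripted
# information injects. All codes, actions, and injects are hypothetical.

from dataclasses import dataclass, field


@dataclass
class InjectNode:
    """One branch point: the inject to deliver, plus follow-on branches
    keyed by the training audience's next action."""
    inject: str
    branches: dict = field(default_factory=dict)  # action -> InjectNode


def plan_injects(root: InjectNode, actions: list) -> list:
    """Walk the tree along the audience's actions, collecting the
    sequence of information injects the exercise cell should deliver."""
    injects = [root.inject]
    node = root
    for action in actions:
        node = node.branches.get(action)
        if node is None:  # no scripted response for this action
            break
        injects.append(node.inject)
    return injects


# Hypothetical activity "A-01": a convoy movement announced by public affairs.
tree = InjectNode(
    inject="Adversary social accounts amplify the convoy announcement.",
    branches={
        "issue_press_release": InjectNode(
            inject="Doctored imagery contradicting the release circulates.",
            branches={
                "correct_record": InjectNode(
                    inject="Correction is reframed as admission of a cover-up."),
            },
        ),
        "stay_silent": InjectNode(
            inject="Silence is framed as confirmation of wrongdoing."),
    },
)

print(plan_injects(tree, ["issue_press_release", "correct_record"]))
```

The point of the structure is not the code itself but the planning discipline it enforces: for every coded activity, planners must script in advance how the information environment will react to each plausible response by the training audience.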
Ultimately, the best defense in information warfare is resilience—the ability to critically assess a dynamic information environment where everything is not always what it seems, and to manage the identified risks to ensure mission success. In a military context, this could include adapting basic and advanced levels of training to foster a deeper understanding of how information warfare is changing the nature of conflict, and of how every service member’s actions can and will be used against them in a digital age. Greater awareness of the pervasive role of information in all military activities is required, particularly among staff-level leadership and, ideally, political leaders responsible for defense matters. Education must demonstrate the importance of information to that target audience—not just from the perspective of strategic communication practitioners. In particular, a shift in perception must occur, from viewing cyber or information operations as a separate sphere of engagement in a conflict to understanding that societies dependent on information communication technologies are particularly and constantly vulnerable to the threat of information warfare. This realization would transform information operations from a merely enabling capability to a more elevated role in the national arsenal of strategic tools. This is as imperative within a full-spectrum military environment today as it is for wider society.
Michael Berk is a Visiting Research Fellow with the Center for Cyber Security and International Relations, University of Florence, and Principal at Alton Corporation.
Cover Image: Propaganda poster by the House of Seagram (Seagram Distillers) of Montreal, Canada 1942, “as part of its contribution to the National Victory Effort” by artist, Seymour Goff, (Ess-ar-gee).
- Sun Tzu, The Art of War, trans. Lionel Giles (Dover Publications, 1944), p. 49; Niccolò Machiavelli, The Prince (Dover Publications, 1910), p. 8; Carl von Clausewitz, On War, trans. Michael Howard and Peter Paret (Oxford University Press, 1976), p. 13. ↩
- Off-Guardian (8 April 2017). “Is the sarin also a lie?”: http://bit.ly/2nW3XaQ; Scisco Media (11 April 2017). “Alleged Sarin Gas Attack by President Assad is Fake News”: http://bit.ly/2Cb6brW; The Duran (12 April 2017). “White Helmets are lighting cigarettes up after handling “sarin””: http://bit.ly/2G61MsR; Veterans Today (22 April 2017). “Moscow demands OPCW explain how White Helmets emerged unharmed in Syrian sarin attack.”: http://bit.ly/2Eym5lK; 21st Century Wire (12 June 2017). “SYRIA: Sarin Attack Narrative Destroyed by MIT National Security Expert”: http://bit.ly/2ERdW9v; NOW Report (21 September 2017). “United Nations: Syrian Sarin Gas Attack Was ‘Staged'”: http://bit.ly/2Bo9xvi; Russia Insider (1 November 2017). “Russia Draws the Line on Phony Syria “Sarin Gas Attack” at the UN”: http://bit.ly/2ETnMI0; Consortium News (9 November 2017). “Did Al Qaeda Dupe Trump on Syrian Attack?”: http://bit.ly/2hpKDnb. ↩