
PERSPECTIVE: Anatomy of Narrative Warfare and Social Media Ops Since the Last Election

In the months since the 2016 election, countless words have been written about influence, misinformation and the ever-present “fake news.” One of the most fertile social media platforms for this type of activity was, and remains, Twitter, still the “Wild West” of social media.

The Knight Foundation recently published one of the largest publicly available studies to date of “fake news” on Twitter. The 61-page study, meticulously cited, is worth a careful read and worth saving for future reference. “Fake news” is just one aspect of malign influence campaigns, whether that influence originates with foreign adversaries or with unscrupulous domestic players, witting and unwitting. Malign influence is a threat to U.S. national security and, by extension, to our allies as well.

What follows is some basic perspective on where and how this type of analysis fits into the national security landscape. The topic is complex, to say the least. Careful scrutiny of the findings of this excellent piece of work is not only due but required, of national security professionals as well as well-informed citizens from other walks of life. Protecting the nation from malign influence is not exclusively the job of professionals; it is also a responsibility of citizenship. In terms of protecting the homeland, the study provides some of the best large-scale insights available to national security professionals to date.

As you read this article, it will help to have reviewed the study, especially the executive summary, which highlights 13 primary findings. I will also state categorically that mis- and disinformation on social media are made far more effective because they play on the principles of, and directly support, malign narratives.

For the record, influence as a whole, and the role of narrative within it, is my profession, honed over a career in the U.S. Army and practiced now as vice president of Narrative Strategies, a U.S.-based think-and-do tank focused on the role of influence in national security matters.

Narrative is as natural to human beings as breathing. We are meaning-seeking animals and our primary means of meaning-making is narrative. Narrative is the way we create, transmit and, in some cases, negotiate meaning. Without narrative, life would be experienced as an unconnected and overwhelming series of random events. We organize, prioritize, and order our experiences through narratives that we usually inherit. What’s more, we understand not only the world around us but also ourselves through the narratives we live by; our personal narratives inform our personal identities, our tribal/familial narratives inform our tribal/familial identities, and our national narratives inform our national identity.

“Life stories do not simply reflect personality. They are personality, or more accurately, they are important parts of personality, along with other parts, like dispositional traits, goals, and values,” writes Dan McAdams, a professor of psychology at Northwestern University, along with Erika Manczak in a chapter for the APA Handbook of Personality and Social Psychology.

The 13 primary findings from the Knight Foundation study

  1. This study is one of the largest analyses to date on how fake news spread on Twitter both during and after the 2016 election campaign.
  2. Much fake news and disinformation is still being spread on Twitter.
  3. Just a few fake and conspiracy outlets dominated during the election – and nearly all of them continue to dominate today.
  4. Our methods find much more fake and conspiracy news activity on Twitter than several recent high-profile studies – though fake news still receives significantly fewer links than mainstream media sources.
  5. Most accounts spreading fake or conspiracy news in our maps are estimated to be bots or semi-automated accounts.
  6. Our maps show that accounts that spread fake news are extremely densely connected.
  7. Fake news during the election did not just adopt conservative or Republican-leaning frames – though it has become more ostensibly Republican since.
  8. There are structural changes in the role of Russian-aligned clusters of accounts post-election.
  9. Most of the accounts that linked repeatedly to fake and conspiracy news during the election are still active.
  10. A few dozen accounts controlled by Russia’s Internet Research Agency appear in our maps – but hundreds of other accounts were likely more important in spreading fake news.
  11. There is evidence of coordinated campaigns to push fake news stories and other types of disinformation.
  12. Coordinated campaigns seem to opportunistically amplify content they did not create.
  13. One case study suggests that concerted action against noncredible outlets can drastically reduce their audience.

The KF study provides a great deal of insight into the mechanics of Twitter with regard to the spread of mis- and disinformation. In this article, I’d like to offer some limited perspective on the specific threats of this type of warfare to supplement the points made in the study. I say “limited” only because influence is far more complex than a short article can convey. Understood fully, influence is at best a graduate course and, more likely, graduate studies interpreted through a career of implementation.

My definition of an influence campaign, such as the ones we have experienced and continue to experience from Russia and other primary threats: influence done well is a complex and intricate choreography of sustained actions, words and related activities wrapped around a core narrative.

The primary points of the study:

As the study’s first point indicates, this analysis is massive in scope and data-driven. One of the shortcomings of many pieces written about social media influence is that, while presented by remarkable experts, they lack the qualitative and quantitative data required to effectively plan and execute responses. Yes, facts matter, especially facts in context.

One of the most disturbing findings of the study is that 80 percent of the Twitter accounts disseminating “fake news” are still active. Quantity matters for a variety of reasons. First, reaching as much of an audience as possible means a higher probability of triggering the intended behavior. Second, and this is a big deal, the data provides the insights needed to adjust messaging to become more effective. More information (more data) means that the perpetrators can better “tune” the message and learn more about their audience’s psychological vulnerabilities. As a frame of reference, this is why the massive amounts of data in the Cambridge Analytica/Russia fiasco matter.

Data referenced in the third point reveals that the sites hosting and disseminating the most malign messaging still exist. The important point here is that if we were to mount a proactive, defensive effort, the data points us toward the most dangerous offending sites and actors. In military terms, knowing where your adversary is and what they’re doing provides targeting information. That information is of no value if no significant response is implemented. This, above all other points, is the most tragic: the U.S., and to a large extent our allies, have no significant strategy or execution to make use of this targeting insight.

Point No. 5 concerns the automation of malign accounts and exposes a critical vulnerability. It is the equivalent of your adversary being massively armed while your forces are not only few in number but lightly armed. AI and other automation that amplify adversarial efforts mean that unless we employ automation that mitigates adversarial automation, our forces will be overwhelmed and rendered ineffective. If we have targeting information on automation-supported sites, then currently available technology is an effective tool to “level the playing field.” Current social media platforms primarily focus on identifying individual “bad actors” but largely ignore the originating sites responsible for content and automation, an approach often described in print as “whack-a-mole.” A minimal sketch of how automated accounts can be flagged heuristically follows below.
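By way of illustration only, here is a minimal sketch of the kind of heuristic scoring that can flag likely automated accounts. To be clear, this is not the Knight Foundation study’s methodology; the thresholds and field names (tweets_per_day, duplicate_ratio, link_ratio) are assumptions invented for this example.

```python
# Illustrative heuristic for flagging likely automated (bot/semi-automated)
# accounts. NOT the Knight Foundation study's method; thresholds and field
# names are hypothetical, invented for illustration only.

from dataclasses import dataclass

@dataclass
class AccountStats:
    handle: str
    tweets_per_day: float    # average posting rate
    duplicate_ratio: float   # fraction of tweets that are near-duplicates
    link_ratio: float        # fraction of tweets containing a URL

def automation_score(a: AccountStats) -> float:
    """Combine simple signals into a 0..1 score; higher = more bot-like."""
    score = 0.0
    if a.tweets_per_day > 72:    # sustained high-volume posting
        score += 0.4
    if a.duplicate_ratio > 0.5:  # mostly repeated/templated content
        score += 0.3
    if a.link_ratio > 0.8:       # almost every tweet pushes a URL
        score += 0.3
    return score

accounts = [
    AccountStats("news_amplifier_01", tweets_per_day=240,
                 duplicate_ratio=0.9, link_ratio=0.95),
    AccountStats("ordinary_user", tweets_per_day=4,
                 duplicate_ratio=0.05, link_ratio=0.2),
]
for a in accounts:
    flag = "likely automated" if automation_score(a) >= 0.6 else "likely human"
    print(f"{a.handle}: {automation_score(a):.1f} ({flag})")
```

Real classifiers combine many more signals, such as posting cadence, client application and network position, but the principle is the same: automation leaves statistical fingerprints that automation can find.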

Point No. 6 highlights the ultra-dense centrality of specific sites. “Centrality” in social media terms simply means that the “malign actors,” witting or otherwise, are connected to a select few sites responsible for providing content and connectivity to witting and vulnerable users. Once again, the sites identified through centrality analysis provide targeting information if and when we decide to mitigate this type of influence. Yes, for those asking about the “800-pound gorilla in the room,” Russian efforts are central to these types of sites. For critical reference, I highly recommend following Hamilton 68, a German Marshall Fund Alliance for Securing Democracy project that identifies Russian activity on Twitter in near real-time. Of note, the site also connects Russian Twitter activity to the most prominent websites and URLs responsible for disseminating malign influence on social media. When viewing the Hamilton 68 site, you will no doubt recognize many of the most active URLs/domains, since they are prominent U.S. media sites. A toy example of centrality analysis follows below.
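For readers unfamiliar with the term, the sketch below shows what a basic centrality analysis looks like in practice, using the open-source networkx library on an invented retweet/link graph. The account names and edges are hypothetical; real analyses of this kind run over millions of observed events.

```python
# Toy centrality analysis of a sharing/retweet network using networkx.
# Account names and edges are invented for illustration; a real analysis
# would be built from millions of observed retweet/link events.

import networkx as nx

G = nx.DiGraph()
# Edge (a, b) means account `a` retweeted or linked to content from `b`.
edges = [
    ("user1", "hub_site_A"), ("user2", "hub_site_A"), ("user3", "hub_site_A"),
    ("user3", "hub_site_B"), ("user4", "hub_site_B"), ("user5", "hub_site_B"),
    ("hub_site_A", "hub_site_B"), ("user6", "user1"),
]
G.add_edges_from(edges)

# In-degree centrality: which nodes receive the most amplification?
centrality = nx.in_degree_centrality(G)
for node, c in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{node}: {c:.2f}")
```

Even on this toy graph, the two “hub” sites rise to the top, which is exactly how centrality analysis converts a tangle of accounts into a short list of high-value targets.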

The next two points (7 and 8) are not only related but indicative of “learning” by Russian influencers about how to effectively influence their targets. While a great deal of their effort during the 2016 election focused on divisive issues between ideological “sides” in U.S. politics, post-election they have evolved toward a more “right-leaning” posture. This evolution demonstrates that they have determined, through analysis, that there is more “bang for the buck” in these audiences. As its targeting of U.S. audiences has evolved, Russia has learned through success in the U.S. (and, by extension, with UK audiences) that social media/digital influence is worth the investment. Russian activities are now a predominant influence effort in every global arena of value to the Kremlin. To make matters worse, our other primary threats, such as Iran, China, North Korea and violent extremist organizations, have absorbed the lessons Russia learned in the U.S. elections.

Points 10-12 represent one of the most troubling aspects of this study. While Russian accounts are relatively few, they perform in “lockstep” with hundreds of other, often domestic, accounts. There are countless complicated reasons for this, but the most acute is that Russian accounts provide content to linked accounts. Analysts in Russia also track that content and the responses to it so that they can evolve a message to become more effective at accomplishing their objectives. A simplified sketch of how such lockstep behavior can be surfaced follows below.
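One common way analysts surface lockstep behavior is co-sharing analysis: finding pairs of accounts that repeatedly post the same URL within a short window of each other. The sketch below is a minimal, assumed implementation over hypothetical (account, url, timestamp) records; it is not the study’s method, and the window and threshold values are invented.

```python
# Minimal co-sharing ("lockstep") detection: flag account pairs that
# repeatedly post the same URL within a short time window. The records,
# window and threshold are hypothetical; not the study's actual method.

from collections import Counter
from itertools import combinations

WINDOW_SECONDS = 30    # assumed coordination window
MIN_SHARED_URLS = 2    # assumed threshold for flagging a pair

# (account, url, unix_timestamp)
posts = [
    ("acct_A", "http://example.com/story1", 1000),
    ("acct_B", "http://example.com/story1", 1010),
    ("acct_A", "http://example.com/story2", 2000),
    ("acct_B", "http://example.com/story2", 2005),
    ("acct_C", "http://example.com/story2", 9999),  # too late: not counted
]

# Group shares by URL.
by_url: dict[str, list[tuple[str, int]]] = {}
for acct, url, ts in posts:
    by_url.setdefault(url, []).append((acct, ts))

# Count URL co-shares within the window for each account pair.
pair_counts: Counter = Counter()
for url, shares in by_url.items():
    shares.sort(key=lambda s: s[1])
    for (a1, t1), (a2, t2) in combinations(shares, 2):
        if a1 != a2 and abs(t1 - t2) <= WINDOW_SECONDS:
            pair_counts[tuple(sorted((a1, a2)))] += 1

for pair, n in pair_counts.items():
    if n >= MIN_SHARED_URLS:
        print(f"possible coordination: {pair} co-shared {n} URLs")
```

Any one co-share proves nothing; it is the repetition across many URLs, at near-identical times, that separates coordination from coincidence.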

This brief analysis merely offers some personal/professional insight into the “so-what” of this study. I’d also like to offer a few thoughts regarding “why” social media is so powerful when applied by influence professionals.

Why is “fake news” on Twitter and other social media so powerful and what can we do to mitigate its power?

Here’s the crux of the matter: social media is the vehicle that transmits influential content. This raises the question: “What makes the content so influential?”

The answer to this question is complicated – in fact, too much so for a short article like this – but I’ll try to answer as succinctly as possible. Humans are tribal by nature. It’s simply a matter of who we are. Tribes were formed to provide security, sustenance and community. Over a couple hundred thousand years we’ve evolved, but our DNA still prompts us to act reflexively to sustain our own “tribe.” One of the most effective “triggers” for these responses is to threaten, or imply a threat to, “our tribe.” Dividing audiences as a precursor to triggering the threat, implied or otherwise, is a tried-and-true tactic of influencers. Hence the divisive content favored by Russian “bad actors.” These triggers are encoded in our tribal narratives, which is why a full understanding of narrative principles matters to analysis.

Applying narrative principles requires analyzing mountains of data that illuminate the narrative identity of the target audience. In our current environment, those mountains of data are largely supplied by the hundreds of millions of users habitually engaged with social media. As noted earlier, Cambridge Analytica, Russia and other adversaries use this data to their advantage and our disadvantage. Simply put, unlocking the specific narrative identity of an audience is the key to predictably triggering “tribally encoded” responses. Advertisers use much the same techniques to get us to buy their products. Yes, we are being manipulated subconsciously every time we log on to a device.

At Narrative Strategies we believe that what Russia and our other adversaries are waging against us is “narrative warfare,” not information warfare. The truth is, we are not engaged in a war over information but a war over the meaning of information. Narrative is about meaning, not about truth. As the KF study shows, the overwhelming majority of tweets are tied in some form to an underlying narrative. Social media and other digital channels are the vehicles for delivering content that, based on narrative principles, is designed to predictably trigger responses by users.

So… what does this mean to national security professionals and our citizenry?

Defending the homeland in modern times by default means that we must compete in the cognitive realm. Our conflicts these days are far more about influence than traditional warfare. This means that we need a full-spectrum working knowledge of how influence works. Even more important is that we need an information strategy that implements the knowledge of influence. For more on this point, please see this white paper on Homeland Security Today.

As with the final point in the KF executive summary, the situation is not hopeless. There are many actions available to the U.S. government that could go a long way toward mitigating Russian and other adversarial influence. The real question is whether we will employ available knowledge and technology in our defense. Across the U.S. government there are many entities capable of executing pieces of such a strategy. Currently, though, no entity exists that could effectively lead or conduct it.

The bottom line is that no single tactic or effort will effectively mitigate this threat. We in the homeland security community must work together, immediately and effectively. We are literally within days of our midterm election, and we’re as prepared as the defenders of the Alamo against overwhelming odds. Courage and tenacity are great, but they are no substitute for being well-armed in the face of a numerically superior and better-equipped adversary.

I will also note that, outside the homeland security and other national security professional communities, every single citizen has the responsibility to be as well-informed as they are active. Media/digital literacy is a critical component of mitigating outside malign influence. Arming citizens with knowledge is the influence equivalent of a well-armed and trained “national guard.” Honest, well-informed citizens are exponentially more capable of resisting the divisive “us vs. them” narratives disseminated by current threats such as Russia.

Finally, I wish to again thank the Knight Foundation and its colleagues for this extraordinary work. I encourage everyone in the national security arena, whether in cyber, the IC, influence or elsewhere, to read and re-read this study. Strategy is inextricably linked to knowledge. The KF has provided the knowledge. Now it’s up to the U.S. government and our citizenry to apply it.

 

The views expressed here are the writer’s and are not necessarily endorsed by Homeland Security Today, which welcomes a broad range of viewpoints in support of securing our homeland. To submit a piece for consideration, email [email protected]. Our editorial guidelines can be found here.

Paul Cobaugh
Mr. Paul Cobaugh retired from the U.S. Army as a Warrant Officer after a distinguished career in the U.S. Special Operations CT community, primarily focused on mitigating adversarial influence and advancing U.S. objectives by way of influence. Throughout his career he has focused on the centrality of influence in modern conflict, whether from extremist organizations or state actors employing influence against the U.S. and our allies. After his military career he accepted the position of vice president at Narrative Strategies, a U.S.-based think-do tank that specializes in the non-kinetic aspects of conflict. He has also co-authored Soft Power on Hard Problems (Hamilton Publishing, 2017) and Introduction to Narrative Warfare: A Primer and Study Guide (Amazon, 2018).
