Responses to the type of cyber threat that aims to influence populations, to radicalize or to destabilize targets, generally fall into two broad categories. First and most popular are attempts to contain the information (or disinformation). This is what we hear most about: regulation, restriction, and kinetic targeting of adversarial propagandists. The second, but too often overlooked, security measure is to harden the targets themselves. It is the second option that interests me. What would it take to harden the targets of malign influence?
This debate has been going on since at least the fourth century B.C. in Athens. Plato wanted to protect the public from influences that arouse passions rather than reason by physically keeping those influences at bay. Aristotle, on the other hand, suggested that the materials we are tempted to censor ought to be our textbooks – that there are essential things to learn from the very materials that worry us.
Again, I situate myself in the latter camp. I am suggesting that we should do in the cognitive realm what we do in the physical when we are protecting civilian populations from kinetic harm: Arm the sheep.
How do we arm civilians faced with a psychological threat? Some have suggested teaching critical reasoning en masse. Others have suggested methods of tracing sources of (dis)information so audiences can understand the intent of the “messages.” But those suggestions make a critical mistake about the nature of the threat. They assume the threat is disinformation and therefore revert to information security tactics even in the psychological realm. Disinformation itself is not the threat.
Our adversaries are engineering the way information is being received. They are not simply disseminating “alternative facts”; they are providing an alternative way of understanding the facts. They are demonstrating an alternative way to give meaning to the facts.
That is the threat we face. The threat is a manufactured and weaponized narrative frame that undermines audience identity and meaning-making capacities.
Why does disinformation stick even when it has been proven false? Because the disinformation is more meaningful to the audience than the truth. The new narrative does something for the audience. It fills a need. This is more easily accomplished when there is a disconnect between inherited narratives and lived experience.
In the physical realm, while most revolutionary theories tend to view violence as a necessary means to an end, the violence of terrorism transcends its instrumental utility. Terrorists use violence against certain targets in order to terrorize witnesses; the physical victims are collateral to the real psychological goal of terrorizing the living.
The same is true in the psychological realm. The goal of disinformation is not simply to get the audience to believe something that is false. The goal is to affect the mental mapping of the audience by encouraging misidentification. Once that is achieved, the terrorists’ work is done. The audience will carry on, categorizing incoming information according to the new narrative frame and acting in ways that further undermine the values that have held their very identities together. The cycle reinforces the weaponized narrative.
How do we prevent this? By getting clear with ourselves and with the public about the nature of the threat and how we are being targeted. The audience needs to know what their vulnerabilities are, what a narrative attack is, and what it looks and feels like. The very information we want to censor is the textbook.