Monday, November 28, 2022

Understanding Why We Are So Susceptible to Mis/Disinformation

One of the most important things we can do to decrease our susceptibility to mis/disinformation is to be motivated to have an accurate understanding.

The human brain is pretty amazing. While the world’s most advanced supercomputers and artificial intelligence algorithms can surpass human capabilities on a handful of very specific “cognitive” tasks, no technology yet comes close to the brain’s capacity for general-purpose reasoning. Yet although evolution has given us something technology will find difficult to ever equal, our brains do not function flawlessly. From misperceiving something visually to attributing the wrong cause to an event, cognitive errors are commonplace.

Unfortunately, this means we are all potentially susceptible to mis/disinformation and influence campaigns. While many believe training that revolves around cognitive or unconscious bias is a remedy for cognitive error, those who study psychology know that the path toward reducing the likelihood of falling for mis/disinformation is much more complex and nuanced. This path begins by understanding how we reason.

Our brains engage in two types of information processing, known as System 1 and System 2.[1] System 1 thinking is typically quick, automatic, and outside of our awareness (unconscious); it is a cognitive shortcut. System 2 thinking is essentially the opposite: slow, effortful, and deliberate (conscious). System 1 processes include perception (from our five senses), heuristics (mental shortcuts or rules of thumb), and emotions. System 2 processes include judgment (e.g., What do I think of this music?), decision making (e.g., What should I have for lunch?), and other forms of active reasoning.

Both of these systems can influence how we think about and respond to information, but System 1 (unconscious) processes are generally considered harder for us to change, both because they are formed and activated outside our awareness and because debiasing strategies typically have short-lived effects.[2] Since System 1 processes are unconscious, we are often unaware that our thinking could be flawed and are therefore unlikely to attempt to correct it. For example, if a store’s customer service representative holds an unconscious bias against individuals of a certain race, he may be more likely to treat those customers rudely or even unlawfully. Because the bias is unconscious, the representative is likely unaware of it, and certainly unaware that it influenced his negative treatment of customers.

For the reasons mentioned above, we are often better able to avoid the influence of mis/disinformation when using our System 2 processes (e.g., active reasoning). If we are sufficiently motivated to have accurate information on a topic, we can question the validity of sources and ask ourselves whether we should seek out more information. Seeking more information gives us an opportunity to adjust any initial impression we formed about the information presented to us.

Given that we are largely unaware of our unconscious processes and their effects, that those processes are difficult for us to change, and that most such change is short-lived rather than lasting, what can be done to lessen our susceptibility to mis/disinformation?

The bad news first: In most cases, we need to accept that our unconscious processes are working as they were designed to, sometimes leading to better information processing and sometimes not, and use conscious information-processing strategies to compensate for those times when unconscious processes lead us down the wrong path. In many cases, proper preparation and the use of sound reasoning strategies (described in more detail below) will eliminate many of the negative effects of cognitive bias. This is especially true for the average person trying to avoid falling for mis/disinformation.

Now, some good news: A large body of research has focused on reducing cognitive bias, and for some biases the negative influence can be proactively mitigated through bias-specific reduction strategies. That said, researchers have identified over one hundred cognitive biases, so adopting a separate strategy for each one, across all the contexts in which it might arise, is not realistic.

Information designed by someone to get you to think or behave in a certain way is especially tricky from the perspective of an information consumer interested in knowing the truth. For example, the conveyor of mis/disinformation may claim that their information came from a source that many trust. This activates an unconscious association that leads consumers to view the content positively because they view the supposed source positively, increasing the likelihood that they will accept the information as true (a form of cognitive bias called the halo effect). This is why our favorite actors and sports figures appear so frequently in advertising.

The mis/disinformation conveyor might also include a kernel of truth in what is otherwise false or misleading information, making the broader message seem at least plausible. So, when it comes to being a consumer of possible mis/disinformation, or to avoiding influence campaigns, it is especially important to apply active cognitive strategies to improve our ability to reason.

One of the most important things we can do to decrease our susceptibility to mis/disinformation is to be motivated to have an accurate understanding of the information we consume. Research has demonstrated that accuracy motivation can reduce the effects of a number of cognitive biases, including in situations where personal biases come into play (e.g., political contexts). Accuracy motivation increases our tendency to seek out information that can be factually verified and to question or reject information that is factually incorrect, rather than simply accepting whatever confirms our existing beliefs.

Other strategies include developing a basic understanding of research methods (especially concepts like the scientific method, correlation versus causation, and graph/chart design); building numeracy (the basic ability to understand and work with numbers, quantities, statistics, and probability); seeking out multiple points of view; not accepting information at face value; using validated processes to reduce biases and cut out “noise” (e.g., the structured analytic techniques used by intelligence analysts); and enlisting other people to check our assumptions (e.g., through structured debates, red teaming, and critical reviews).

The creation and spread of mis/disinformation is an ongoing concern for all of us. By understanding and addressing the sources, causes, and mitigations of mis/disinformation, we can reduce the amount of inaccurate information that we pass along to our colleagues, friends, and families. Malign influence campaigns, especially those run by foreign organizations or governments, serve only to divide us when our ultimate goal is to be more united.

Read more at NCSC

Homeland Security Today (http://www.hstoday.us)
The Government Technology & Services Coalition's Homeland Security Today (HSToday) is the premier news and information resource for the homeland security community, dedicated to elevating the discussions and insights that can support a safe and secure nation. A non-profit magazine and media platform, HSToday provides readers with the whole story, placing facts and comments in context to inform debate and drive realistic solutions to some of the nation’s most vexing security challenges.
