Four Reasons Why It Will Be Harder to Catch the Next Insider Threat

As insider threat initiatives proliferate across industry and government, professionals are increasingly relied upon to make quick and accurate assessments of people, their intentions, and their surrounding circumstances. Annual Insider Threat Training and Vigilance campaigns[1] promote workforce awareness and encourage employees to report concerns regardless of how certain they may be.

Does more reporting lead to faster and more accurate insider threat detection?

As history shows, accurately judging an employee’s level of trustworthiness is not as straightforward as it seems. Incorrect determinations of trust, whether over- or under-confident, can have significant and long-lasting ramifications for the individual, the organization, and national security. In 2015, Xiaoxing Xi, one of the world’s leading experts on superconducting thin films and interim chairman of the Physics Department at Temple University in Philadelphia, was arrested on federal charges of sharing sensitive technology with China. Despite being cleared of all charges four months before his court date, Xi was left with no savings, enormous legal fees, and a damaged reputation.[2]

Incidents such as these are not hard to imagine. Years of scientific research have shown that our intuition is anything but rational.[3][4][5] When there is no standard by which to identify and differentiate threats from non-threats, we are naturally inclined to respond based on what we believe to be true rather than what is objectively true. Preconceptions, expectations, and emotional tone can easily override critical thinking and bias what we focus on, whom we focus on, and what we think may be occurring. Be it in medicine,[6][7] the courtroom,[8] the military,[9] DNA analysis,[10] or fingerprint analysis,[11] intuitive decisions can lead to negative outcomes.

Although insider threat reporting in the workplace remains minimal,[12] simply collecting more reports will not solve the problem: a system that lacks consistency and relies on personal judgments cannot accurately detect threats or assess risk. As the dragnet for insider threats expands, so does the responsibility of those involved to collect the most accurate information possible. Understanding the limits of our objectivity when judging others, and adopting strategies to counteract them, are arguably two of the most important and overlooked challenges facing security and insider threat professionals today.

This is the first of a three-part series.

More Reporting Isn’t (Necessarily) Better Reporting

It would seem safe to assume that if someone initiates a report, the information is well-informed, accurate, and based on real evidence. This is not always the case.

In October 2014, Xiafen “Sherry” Chen, a hydrologist employed by the National Oceanic and Atmospheric Administration (NOAA), was falsely accused of illegally accessing information about the nation’s dams and covering up a meeting with a high-ranking foreign official. Even after the federal charges were dropped, Chen was removed from her position based on allegations of “untrustworthiness, misrepresentation, misuse of a federal database, and lack of candor” in the workplace. Years later, the administrative judge in the case explained that the false charges and allegations were due to “mishandling of the situation on a number of different levels.”[13]

How do such serious allegations come about, if not from credible evidence? A review of court documents[14] suggests that a number of reasoning errors – such as an overemphasis on negative events, a failure to weigh positive evidence, inattention to source credibility, and a failure to consider more plausible explanations for the behavior – likely contributed to the outcome.

Four Reasons Why We Fall for False Reporting

The human tendency to assign too much weight to first impressions can compromise the accuracy of our observations, our interpretations, and ultimately our reporting about an individual. No matter who we are, we tend to believe information that confirms what we already think, put more stock in people whom we view as similar to ourselves, and base our judgments on information that is easily accessible and imperfect rather than seek out comprehensive and complete evidence.[15]

There are many reasons why insider threats are hard to detect, and some of them stem from faulty reasoning. The first step toward improving detection is challenging our own thinking.

Reason 1: We think we have a good idea about whom to trust and whom not to trust

In an office environment, hundreds of factors affect how we feel about someone and whether we trust them. Trust, however, is subjective. Whether and to what degree we have confidence in a co-worker is based on how likeable, benevolent, honest, and accepted by others he or she appears at a particular moment in time.[16][17][18] As cases of espionage show, the more likeable a person appears, the less likely his or her behavior is to be scrutinized or reported.[19] Conversely, people who are disliked not only garner more negative attention (making them more susceptible to being reported), but are also more likely to have their actions misinterpreted.

The fact that we are quick to trust, when there is no obvious reason not to, leads to biased decision-making. Unless insider threat professionals are made aware of how natural biases can affect observations, some individuals may be reported for reasons unrelated to risk, while others who in fact pose a real risk may be overlooked.

Reason 2: We assume if someone goes to the trouble of reporting, it must be true

Sound investigative principles require that all source information be vetted prior to further use and application. However, given time constraints, minimal training, and our propensity to trust, reports are often accepted at face value.

Unless there is an obvious reason to question it, we often accept the premises that:

  • The directions and criteria, including the specific words and phrases used, are clear, objective, and tied to a conceptual framework.[20]
  • The source understood exactly what information to collect and how to identify it.
  • The source of the report is well-informed, unbiased and did not have a stake in the outcome.
  • The report contains accurate observations of what really happened and is not informed by opinion.
  • Alternative explanations were ruled out before the report was made.
  • The individual is certain that there is something of concern and would not have made the report otherwise.

While this may be true for some types of reports, accepting insider threat reporting at face value increases the likelihood of false accusations (even when the reporter believes them to be true).

Reason 3: We think that human behavior is pretty simple

Even in the most controlled circumstances, forecasting how someone is likely to act at a future time is an incredibly complex process. There are literally thousands of variables to choose from, any of which could theoretically be correlated with an individual’s decisions and actions and produce a vast range of responses, making prediction highly error prone.[21] Someone who appears to be low risk today may encounter unforeseen life circumstances that dramatically increase his or her risk in the future.

Moreover, every person responds to influencing factors differently; even the same person cannot be expected to respond the same way twice. Given the many complexities of human behavior, accurately determining who will remain steadfast over time – based on minimal and subjective reporting – is a significant challenge.

Reason 4: We trust that our common sense will lead to the right conclusion even when the directions are vague

On the face of it, allowing professionals to make common-sense judgments seems rational. However, the more unstructured and discretion-based a system is, the more difficult it is to ensure accountability and transparency. When a task is ill-defined or there is systemic pressure to get results, our tendency toward bias is even more pronounced. Opinion-based systems in the workplace, such as 360-degree feedback programs intended to assess performance and increase productivity, can penalize people who are different, less social, or not well-liked.[22] Vague questions, a lack of specific guidance, and opinion-based judgments lead to misinterpretation and can harm employees and the system in the long run.[23]

For example, one behavior associated with counterintelligence and security problems is “extensive use of copy, facsimile, or computer equipment to reproduce or transmit classified material that may exceed job requirements.”[24] However, what is “extensive” to one person may not be to another, especially if the observer is unaware of the person’s workload and responsibilities.

Vague indicators such as these can generate greater numbers of reports, and more false positives, diverting attention from threats supported by actual evidence.
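
A hypothetical back-of-the-envelope calculation illustrates why. The numbers below are illustrative assumptions, not measured rates from any real program: suppose one employee in a thousand poses a genuine threat, and a vague indicator flags 90 percent of true threats but also 10 percent of innocent employees. The short Python sketch works through the arithmetic:

```python
# Illustrative base-rate sketch; all rates are assumptions chosen for the
# example, not measured figures from any insider threat program.
prevalence = 1 / 1000       # assumed share of employees who pose a genuine threat
sensitivity = 0.90          # assumed share of true threats the indicator flags
false_alarm_rate = 0.10     # assumed share of innocent employees it also flags

workforce = 10_000
true_threats = workforce * prevalence                            # 10 people
true_positives = true_threats * sensitivity                      # 9 reports
false_positives = (workforce - true_threats) * false_alarm_rate  # 999 reports

precision = true_positives / (true_positives + false_positives)
print(f"Total reports: {true_positives + false_positives:.0f}")  # 1008
print(f"Share of reports pointing to a real threat: {precision:.1%}")  # ~0.9%
```

Under these assumed numbers, roughly 99 of every 100 reports would concern an innocent employee. Tightening the definition of an indicator can therefore do more for detection than simply increasing the volume of reports.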

 

[1] The Insider Threat Vigilance Campaign is a CDSE program built on the idea that for any insider threat program to “be fully successful, it must keep the awareness message first-and-foremost in the mind of the workforce.”
[2] Apuzzo, M. (2015, September 11). “U.S. Drops Charges That Professor Shared Technology With China.” The New York Times.
[3] Kahneman, D. (2003). Maps of bounded rationality: Psychology for behavioral economics. The American Economic Review, 93(5), 1449-1475.
[4] Tversky, A., & Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science, 211(4481), 453-458.
[5] Kahneman, D. (2011). Thinking, fast and slow. New York, NY, US: Farrar, Straus and Giroux.
[6] Mamede, S., Schmidt, H. G., Rikers, R. M., et al. (2010). Conscious thought beats deliberation without attention in diagnostic decision-making: at least when you are an expert. Psychological Research, 74, 586-592.
[7] Croskerry, P., Singhal, G., & Mamede, S. (2013). Cognitive debiasing 1: Origins of bias and theory of debiasing. BMJ Quality & Safety, 22(Suppl 2), ii58-ii64.
[8] Guthrie, C., Rachlinski, J. J., & Wistrich, A. J. (2007). Blinking on the bench: How judges decide cases. Cornell Law Review, 93(1).
[9] Bryant, D. J. (2006). Rethinking OODA: Toward a Modern Cognitive Framework of Command Decision Making. Military Psychology, 18(3), 183-206. doi:10.1207/s15327876mp1803_1
[10] Dror, I.E. & Hampikian, G. (2011). Subjectivity and bias in forensic DNA mixture interpretation. Science and Justice, 51(4), 204-208.
[11] Dror, I. E., Champod, C., Langenburg, G., Charlton, D., Hunt, H., & Rosenthal, R. (2011). Cognitive issues in fingerprint analysis: Inter- and intra-expert consistency and the effect of a ‘target’ comparison. Forensic Science International, 208, 10-17.
[12] Nelson, L., Beneda, J., McGrath, S., & Youpa, D. (2019). Enhancing Supervisor Reporting of Behaviors of Concern (PERSEREC-TR-19-03, OPA Report 2019-033). Defense Personnel and Security Research Center, Seaside, CA.
[13] Merit Systems Protection Board, Central Regional Office. Xiafen Chen, Appellant, v. Department of Commerce, Agency. Docket No. CH-0752-17-0028-I-1 (April 23, 2018) (Initial Decision, Chief Administrative Judge Michele Szary Schroeder).
[14] Merit Systems Protection Board, Central Regional Office. Xiafen Chen, Appellant, v. Department of Commerce, Agency. Docket No. CH-0752-17-0028-I-1 (April 23, 2018) (Initial Decision, Chief Administrative Judge Michele Szary Schroeder), p. 4.
[15] Kahneman, D. (2011). Thinking, fast and slow. New York, NY, US: Farrar, Straus and Giroux.
[16] Larzelere, R., & Huston, T. (1980). The Dyadic Trust Scale: Toward Understanding Interpersonal Trust in Close Relationships. Journal of Marriage and Family, 42(3), 595-604.
[17] Nicholson, C. Y., Compeau, L. D., & Sethi, R. (1999, January). The critical role of interpersonal liking in building trust in long term channel relationships. In American Marketing Association Conference Proceedings (Vol. 10, p. 246). American Marketing Association.
[18] Cialdini, R. B. (2001). Harnessing the science of persuasion. Harvard Business Review, 79(9), 72-81.
[19] Jaros, S. L. (2018). A Strategic Plan to Leverage the Social and Behavioral Sciences to Counter the Insider Threat (PERSEREC-TR-18-16, OPA-2018-082). Defense Personnel and Security Research Center, Seaside, CA.
[20] Without a mental model or framework from which to think about the issues, identifying, interpreting, and predicting behaviors that may constitute a risk is a subjective and arbitrary process.
[21] Ouellette, J. A., & Wood, W. (1998). Habit and intention in everyday life: The multiple processes by which past behavior predicts future behavior. Psychological bulletin, 124(1), 54.
[22] Wilkie, D. (March 31, 2016). Are Anonymous Reviews Destructive? Amazon’s system allowing workers to privately comment on one another reportedly led to sabotage attempts. Society for Human Resource Management. Accessed February 19, 2019. https://www.shrm.org/resourcesandtools/hr-topics/employee-relations/pages/360-degree-reviews.aspx
[23] Jackson, E. (2012). The 7 Reasons Why 360 Degree Feedback Programs Fail. Forbes online. Accessed February 19, 2019. https://www.forbes.com/sites/ericjackson/2012/08/17/the-7-reasons-why-360-degree-feedback-programs-fail/#2c86f990279d
[24] Wood, S., Crawford, K. S., & Lang, E. L. (2005). Reporting of counterintelligence and security indicators by supervisors and coworkers (PERS-TR-05-6). Defense Personnel Security Research Center, Monterey, CA.

Judy Philipson, Ph.D.
Dr. Judy Philipson is President of Behavioral Sciences Group LLC. A recognized expert in threat detection, risk assessment, operational tradecraft, and investigative methods, she has over 20 years of experience providing advice, analysis, and training to clients in the Intelligence Community, Department of Defense, Law Enforcement and Homeland Security. From 2009-2014, she served as Social Influence Advisor to the Counterterrorism Center at the Central Intelligence Agency. She is a Senior Associate Fellow at Narrative Strategies, LLC, a consultant to the Insider Threat Management Group, and a Lecturer in Criminology and Criminal Justice at the University of Maryland, College Park. She has a Ph.D. in clinical psychology from Drexel University where she focused on forensic populations and issues relating to deception and risk assessment.