From schools to concerts to malls, shooters have unleashed terror and violence on communities, leaving people in grief and officials searching for ways to intervene before an attack occurs. Often, there are warning signs ahead of these incidents.
The question is: when these warning signs arise, what can be done? Currently, there is no standardized training in deep-dive social media investigation. It takes years for an investigator to develop the tradecraft needed to become good at it, and investigators need algorithmic tools that speed up the process and complement that tradecraft.
Open-Source Intelligence (OSINT) technology is a viable way for trained investigators to do that deeper dive into a person of interest. Many active shooters have an online presence, but investigators must go beyond the typical open web to the deep and dark web and other online forums where disaffected people may be influenced by hate ideologies or conspiracy theories.
Rise in Shootings
In the U.S., FBI data for 2017-2021 reveals an upward trend in active-shooter incidents. In 2021, the FBI designated 61 shootings as active-shooter incidents; in these incidents, 103 people were killed and 140 wounded, excluding the shooters. That figure represents a 52.5 percent increase from 2020 and a 96.8 percent increase from 2017, according to the FBI's Active Shooter Incidents in the United States in 2021 report.
Automation technology presents new opportunities for community safety, specifically in threat detection and violence prevention. Because OSINT tools can search across many social media platforms at once, they can help identify individuals who display early signs of anger, isolation, or grievance before that behavior escalates into violence.
For example, memes, emojis, and other symbols are frequently used on social media to signal fringe mentality, conspiracy theory beliefs, or hate-group allegiance. If law enforcement and the public better understood what these symbols or acronyms represent and how they are used to wield influence, there would be a tremendous opportunity for intervention. OSINT can also be used to identify sentiment shifts in individuals whose behavior may be cause for concern.
However, few businesses, police departments, school districts, or other prime stakeholders have the resources or time needed to comb through all available sources and analyze the results. Although there is clearly no “silver bullet” for stopping violent behavior, here are three steps to help organizations be more proactive:
- Education: Get to Know What You Don’t Know
Looking at the backyard through the kitchen window gives you only a limited view. Law enforcement, school districts, government agencies, and even parents need to widen their scope to better understand how influence works before interdiction can occur. Everyone should familiarize themselves with the lingo and slang associated with various groups or movements, and with the online forums or cesspools where hate and fringe ideologies are discussed. Various organizations maintain online hate symbol databases and tools designed to help law enforcement, educators, and members of the public identify potentially hateful images. For instance, how did an innocent comic strip of Pepe the Frog become a symbol of hate? If this symbol is on a child’s backpack, a parent who is aware of its darker meanings can investigate whether the child is being influenced. OSINT can help identify sentiment shifts: if a young person who has been posting kitten pictures starts posting images from ISIS or pictures of assault rifles, parents and educators can intervene before things escalate.
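The sentiment-shift idea above can be illustrated with a toy sketch. Everything here is invented for illustration: the term lists, the scoring, and the thresholds are placeholders, and a real OSINT platform would use trained models over far richer signals than bare keywords.

```python
from collections import deque

# Hypothetical lexicons for this sketch only; real systems use trained models.
NEGATIVE_TERMS = {"attack", "revenge", "hate", "weapon"}
POSITIVE_TERMS = {"kitten", "friends", "music", "school"}

def score_post(text: str) -> int:
    """Crude lexicon score: +1 per positive term, -1 per negative term."""
    words = text.lower().split()
    return sum(w in POSITIVE_TERMS for w in words) - sum(w in NEGATIVE_TERMS for w in words)

def detect_shift(posts: list[str], window: int = 3, threshold: int = -2) -> bool:
    """Flag a shift when the rolling sum of recent post scores drops to the threshold."""
    recent = deque(maxlen=window)  # keep only the most recent `window` scores
    for post in posts:
        recent.append(score_post(post))
        if len(recent) == window and sum(recent) <= threshold:
            return True
    return False

history = [
    "look at my kitten",
    "out with friends tonight",
    "i hate everyone here",
    "planning my revenge",
    "bought a weapon today",
]
print(detect_shift(history))  # prints True: a sustained negative run trips the flag
```

The rolling window matters: a single angry post does not trip the flag, but a sustained run of negative posts, the kind of shift described above, does.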
- Augment Lack of Resources with Automation
Too often the right mechanisms have not been in place to alert law enforcement agencies to people who might be potential threats, and when agencies do zero in on a person of interest, it is often too late. Most agencies and school districts do not have the luxury of consistently monitoring online activity and social media, so they rely on services that are not sufficient. Social media usage is one of the most popular online activities, and in 2021, 82 percent of the U.S. population had a social networking profile, according to Statista. Keeping up with the scope and breadth of online activity across the open, deep, and dark webs as well as online forums is a daunting task. An automated platform powered by artificial intelligence (AI) is essential for law enforcement and school districts to overcome time and resource constraints.
- Adopt Directable AI (backed by filtering and automating mechanisms)
Agencies need toolkits that actively search and bring in unique sources of information. An automated, AI-powered OSINT platform gives analysts and investigators the ability to filter for relevant information, aggregate it, and collate it for the end user. Additionally, these searches can be localized and run constantly in the background to help augment human analysts. Human intelligence is still needed in the mix. Agencies should never rely solely on technology.
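The filter, aggregate, and collate flow described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual platform: the post records, source names, and watchlist terms are all made up, and a real system would pull from live feeds and apply far more sophisticated matching.

```python
from collections import defaultdict

# Invented watchlist for this sketch; a real platform's filters are far richer.
WATCHLIST = {"shoot", "bomb", "manifesto"}

def filter_posts(posts):
    """Keep only posts mentioning a watchlist term (the 'filter' step)."""
    return [p for p in posts if any(t in p["text"].lower() for t in WATCHLIST)]

def aggregate_by_author(posts):
    """Group flagged posts by author across all sources (the 'aggregate' step)."""
    grouped = defaultdict(list)
    for p in posts:
        grouped[p["author"]].append(p)
    return grouped

def collate_report(grouped):
    """Produce one summary line per author for the end user (the 'collate' step)."""
    return [
        f"{author}: {len(hits)} flagged post(s) from {sorted({p['source'] for p in hits})}"
        for author, hits in sorted(grouped.items())
    ]

# Fabricated example records from multiple sources.
posts = [
    {"author": "userA", "source": "forum", "text": "I will shoot up the theater"},
    {"author": "userA", "source": "twitter", "text": "read my manifesto"},
    {"author": "userB", "source": "forum", "text": "nice weather today"},
]
for line in collate_report(aggregate_by_author(filter_posts(posts))):
    print(line)  # prints: userA: 2 flagged post(s) from ['forum', 'twitter']
```

In this sketch the three functions run once, but the same pipeline could be scheduled to run continuously in the background, which is the point made above about augmenting, not replacing, human analysts.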
OSINT in Action
Investigators used OSINT to track down a person of interest when the Joker movie was released in 2019. An agency received a threat from a fictitious Twitter account in which a man wearing a Joker mask threatened to shoot up theaters. After investigators identified the suspect, they found the threat was a hoax.
Education and access to the right tools are crucial components for early detection of potential threats. OSINT can equip agencies with the knowledge and technology needed to stem the rising tide of mass shootings.
The views expressed here are the writer’s and are not necessarily endorsed by Homeland Security Today, which welcomes a broad range of viewpoints in support of securing our homeland. To submit a piece for consideration, email editor @ hstoday.us.