ISIS roared onto the social media landscape not with a gruesome beheading, but with a fluffy kitten. The beheadings would be infamously broadcast by the terror group, leading to scattershot social media bans followed by new accounts created by terrorists and their loyalists to replace and augment the ones that had been removed. But a social media censor searching for terror-related keywords or avatars of the ISIS flag may have been thrown for a loop by images of jihadists cuddling with calicos produced by the now-defunct Islamic State of Cat account on Twitter, which in 2014 began posting retooled kitty memes such as felines cooing about their “mewjahid” as they lapped up milk.
These tweets of jihadists and their furry friends changed the game of public-facing extremist communications. For every photo of a stoning or beheading released by the terror group’s social media army, there were pics of caliphate ice cream shops or carnival games. The terror group churned out videos, e-books, magazines, apps, and more in a multimedia push to draw foreign fighters to their so-called state while attempting to engender broader sympathy, or at least understanding of their goals, among targeted segments of the global community. To get that propaganda stew to their intended audience, they needed fingers at the keyboards in the form of a global army doing little more than retweeting, sharing videos, liking posts, and using all social media and messaging platforms for outreach and networking. Not just sharing official media, but creating their own propaganda. Not just using a couple of favored platforms, but injecting content across all spaces. And infusing such a deluge of propaganda into the online ecosystem that gray-hat hackers eventually had to take it upon themselves to try to control the wave by taking down accounts and content faster than the social media companies could.
The ISIS cats were eventually deplatformed, but their symbolism continues in social media campaigns across a range of ideologies that strive to humanize the extremists. When one ISIS guide crowed about the pickles in the caliphate being better than Walmart’s, they weren’t out to recruit dill connoisseurs but were attempting to make the space in which their hate and horror unfolded relatable, humanized, and inviting. When ISIS social media accounts churned out pulse-pounding, slickly produced propaganda videos that could have come from the mind of a gamer or director of action flicks, the purpose was to make the terror group look cool to recruiting targets in this online space. Other extremists such as the neo-Nazi Atomwaffen Division followed with their own training camp videos that reflect this style.
As social media platforms are asked to self-police – with questions about when content crosses the line and some users favoring a free-for-all on these private platforms regardless of the extremist content or operation – it’s critical to understand how social media is used not just as a venting space for unpopular or hateful opinions but as a tool for extremists to grow their numbers, further violent acts, inspire lone actors or cells to violence, and even information-share with extremists of unrelated ideologies.
1. Recruitment into groups or movements
At its core, extremist activity on social media is an advertisement. Extremist movements need strength in numbers and the strength of their adherents’ convictions. Social media posts are intended to sell people on their ideology. Whether recruited into a terrorist group, recruited into associated movements, recruited into a media army that spreads this ideology beyond the control of censors, or recruited into a legion of sympathizers, anyone reached by extremist propaganda ranging from tweets to memes to videos has been targeted for recruitment. Some on the verge of an attack may use social media to advertise their actions, but even that is an advertisement for their cause and, ultimately, for future recruitment in the hopes that others of like mind follow in their footsteps.
The construct of social media has enabled groups to target particularly vulnerable individuals for recruitment. In 2016, as the #OpISIS movement was in full swing with gray-hat hackers taking down as many ISIS accounts and websites as they could, one hacktivist collaborative told me that ISIS recruiters hiding behind kitten or baby avatars on Twitter appeared to be specifically seeking non-Muslim American girls around the ages of 13 to 15. They would befriend what appeared to be a lonely young teen, cultivate an online friendship, isolate them from friends and family, then try to recruit the girls to conduct attacks in their hometowns or travel to wed ISIS fighters. The hackers said they were receiving new direct messages each week from frightened girls who tried to back out of the relationships only to come under threats; in deep by that point, they had no idea where to turn for help. Recruiters took advantage of their anonymity while exploiting the personal details that were accessible through the teens’ use of social media.
2. Retention of followers
The first step in ideological recruitment can come from anonymous or well-known social media figures who may never personally intend to commit a physical attack. For extremism to flourish – including pushing extremist actors lured into a movement to commit violence – it needs to retain and nourish its followers. To do this, extremist movements have to regularly connect with adherents. To keep followers constantly rallied to the service of the ideological cause, to maintain their sense of outrage over current events, and to build an even more fervent desire to act in furtherance of that cause, purveyors of extremist thought need to keep pushing the movement’s beliefs and deeds. “The physical battle can be lost even before it starts,” ISIS Khorasan warned in their English-language online magazine calling for “social media warfare” this spring, if people who might support them “are defeated or at the least trapped in the battle for the hearts and minds.”
Extremist movements also use public-facing social media to demonstrate resiliency. When accounts are suspended for disseminating extremist content, users quickly pop back up with new accounts. When platforms become inhospitable for extremist groups, they migrate to other platforms seen as more tolerant or ignorant of their activity. These migrations are often done loudly while touting the group’s resilience and attempting to stoke anger among their followers or sympathetic onlookers by claiming they are victims of heavy-handed censorship, or even directly threatening social media companies as seen in memes from both neo-Nazis and ISIS supporters depicting the beheading of Facebook’s Mark Zuckerberg.
3. Extremist networking with the potential to form cells
Social media is social for extremists as well, giving those with extremist views and those who wish to accelerate those views into action an anonymous space in which to connect, network, and develop relationships that could remain online or extend to the real world. Much of the investigation into the storming of the U.S. Capitol, for example, has centered on pre-Jan. 6 discussion, encouragement, or coordination among participants on social media, whether individuals or groups were planning to be there for a demonstration or to try to forcefully stop the certification of the presidential election.
A superseding indictment filed in August against a group composed of former and active-duty military along with members of the now-defunct Iron March neo-Nazi online forum alleges that they discussed attacking the power grid both “for the purpose of creating general chaos and to provide cover and ease of escape in those areas in which they planned to undertake assassinations and other desired operations to further their goal of creating a white ethno-state.” The Justice Department said two of the defendants “met through the forum and expanded their group using an encrypted messaging application as an alternate means of communication outside of the forum”; what began online expanded into live-fire training near Boise that was filmed for a propaganda video.
The accused also “discussed using homemade thermite to burn through and destroy power transformers” and “researched, discussed and critically reviewed at length a previous attack on the power grid by an unknown group,” the indictment states. “That group used assault-style rifles in an attempt to explode a power substation.” That attack is not named but can be assumed to be the 2013 attack in which multiple gunmen opened fire on the Pacific Gas and Electric Company’s Metcalf Transmission Substation south of San Jose, Calif., causing more than $15 million in damage to 17 transformers; the perpetrators have not been caught and the incident is commonly referenced in online extremist discussions about critical-infrastructure vulnerabilities. Links to video of the Metcalf attack are regularly shared on social media.
4. Recruiting sympathetic supporters
The most powerful propaganda may be intended not to woo a cell of recruits but to assemble an army of sympathizers who might engage in social media battle for the movement or its ideological core. This sort of recruitment can be harder to spot in social media postings that are not necessarily in violation of a platform’s rules but implicitly or explicitly argue in favor of an extremist movement’s core beliefs even if not promoting violence, engage in whataboutism to justify or attempt to lessen the gravity of the movement’s threats or violent acts, harass perceived foes of the movement, or simply look the other way.
All extremist movements use current events to try to make their movement more palatable, hoping that even if a viewer or reader doesn’t identify with the movement they may be less likely to raise the alarm or may even tacitly agree with the group’s motives if not their tactics. This can be seen in the leafleting campaigns in which KKK chapters have targeted neighborhoods with recruitment fliers that use phrases such as “leave our history and heroes alone” and “preserve our culture” while trying to gain new followers off current events such as the removal of Confederate statues or renaming of landmarks.
5. Creating fertile ground for extremist development
Much of the debate over the level of censorship on social media platforms centers on whether offensive or hate speech is free speech, and whether a private company should have to host that speech. Hate speech does more than offend, though. Speech that singles out a demographic as less worthy of respect, protection, or even life, speech that accelerates animosity toward a certain group, and the circulation of canards with the intent of whipping up hate all contribute to an environment in which extremism grows. The ADL’s Online Hate Index discovered 27,400 antisemitic tweets on Twitter and 1,980 antisemitic comments on Reddit in an analysis of English-language posts from Aug. 18-25, 2021, with a potential reach of 130 million users on Twitter alone. Two months later, more than 70 percent of that content was still on the platforms.
Consider this type of content to be discoverable – and even more easily so – in the same vein as terror guides that shepherd an extremist from recruitment to attack. The accused Buffalo shooter credits the words as well as the deeds of Christchurch killer Brenton Tarrant in his manifesto, stating that studying Tarrant “started my real research into the problems with immigration and foreigners in our White lands.”
6. Desensitization to extremist violence, actions, or language
Violent extremism needs people to look the other way at every stage, from the formulation of a plot to its execution. Depending on the social media platform, threats or similarly disturbing behavior may be removed or get a user suspended, or be largely ignored or even cheered on by other users. Or the user base may be so large and the extremist’s audience seemingly so limited that other users never report the threats or behavior, and platform staff or law enforcement authorities never see them in time.
The inundation of extremist content on social media platforms can desensitize users and even moderators to its existence at a time when user reports are critical, given the sheer volume of material that companies must first detect and then moderate. A half hour before the Buffalo supermarket attack, at least 15 other people joined the Discord chat of shooting suspect Payton Gendron in which the planning of the attack was discussed; the livestream of the attack was then broadcast on Twitch. Uvalde school shooter Salvador Ramos left a trail of Instagram chats and Yubo messages. To be motivated to see something and say something, a bystander first must feel that words or actions are a red flag.
7. Incitement to violence
It may not seem like an illustration depicting Santa standing next to a crate of dynamite in Times Square would have much real-world impact when it comes to getting an extremist to pull a trigger, yet Everitt Aaron Jameson, who plotted a 2017 Christmas attack on San Francisco’s Pier 39, “liked” and “loved” this propaganda image from ISIS supporters on Facebook before his arrest. Much of the extremist propaganda circulated on social media consists of wildly pitched incitement – however generalized or specific – in posts, memes, videos, magazines, etc., with the hope that something sticks. Some posts or propaganda materials encourage would-be attackers to use specific weapons or tactics, to pick soft or symbolic times to strike, to choose certain locations, or to target certain people.
There is a window after attacks when the incitement on social media frequently peaks as extremists scramble to encourage copycats. But once this rush hour has subsided a bit, the incitement content distributed online remains there for use by future violent extremists. The Buffalo shooter said in his manifesto, for example, that he searched for and found Tarrant’s video online long after the mosque attacks after learning about the killer in an online forum. Al-Qaeda in the Arabian Peninsula, which publishes the English-language Inspire magazine that has enjoyed staying power as an online practical terror how-to resource, has produced a series of supplemental guides studying select attacks to assess what was done well by the attacker and what could have been done to inflict more harm – with the goal of inciting followers to try the methods and modifications when planning and committing their own attacks. “There is nothing easier in America than obtaining weapons, and therefore do not begin carrying out operations involving stabbing with knives and running over with cars until you search for these weapons and use them in your operation,” the terror group said in discussing gun acquisition after the 2021 mass shooting at a Boulder, Colo., grocery store.
8. Attack discussion before or after
Overt plotting can occur on social media as would-be attackers talk out their plans. Part of the copycat danger after an attack stems from discussion of the incident on social media. Users share attack images and video – after the Buffalo shooting, these included stills of the first victim being shot in the head outside the store – and news stories along with screenshots of previous social media posts attributed to the killer. Online extremists cross a line from discussing the details of an attack to circulating the information as a tribute to the terrorist. “Individuals in online forums that routinely promulgate domestic violent extremist and conspiracy theory-related content have praised the May 2022 mass shooting at an elementary school in Uvalde, Texas, and encouraged copycat attacks,” the Department of Homeland Security noted in the latest National Terrorism Advisory System Bulletin.
Social media gives extremists a forum with a like-minded sounding board to discuss attacks and, like the AQAP guides, posit how they believe it could have been done “better”: different targets, different timing, different strategy or weapons. Particularly in attacks where the killers have used message boards and social media to convey their plans or livestream their attack, gruesome play-by-play discussions ensue. As the hostage standoff at the Congregation Beth Israel synagogue in Colleyville, Texas, unfolded on Jan. 15, users on the white nationalist Stormfront message board posted comments including “I’m disappointed that its [sic] not a Christian” and a suggestion that “attacks on houses of worship are not nearly as effective as attacks on ‘holocaust’ museums, monuments, and memorials could be.”
9. Live broadcast of extremist violence
Strict censorship of specific extremist content tends to last only a little longer than public shock or outrage over the associated violent acts. Even as platforms were quickly ripping down posted copies of the Buffalo mass shooter’s manifesto in the days after the attack, it was easy to find a copy of the Christchurch shooter’s manifesto online – cited as the inspiration and source material for the Buffalo shooter, portions of Tarrant’s rant were copied into the longer Buffalo manifesto. Once a livestream of an attack or accompanying documents or photos get pushed out via social media, they’re impossible to truly get back into the bottle. And the sheer volume of this content online reflects how overwhelming it can be for moderators to even get a clear picture of what is out there, never mind get to the removal stage: Even as social media platforms made an effort to crack down harder on ISIS content in recent years, al-Qaeda content has often flown under the radar and remained online for lengthy periods of time.
The livestreaming of terror attacks has not only been an especially brutal way for violent extremists to spread their undiluted message and gain notoriety in real time, but the self-broadcasts have also shown staying power beyond the live event as a tool of incitement and recruitment. By the time a social media company is alerted to an attack livestream and takes it down, it has already been recorded by someone – perhaps by one of the online communities paying rapt attention to the unfolding event or cheering on the terrorist – and will be disseminated to inspire copycats, attempt to demonstrate the strength of an ideological movement, or further terrorize the community targeted in the attack. All groups have taken note of the deadly domino effect of an attack livestream. In its special Inspire edition “Praise & Guide: Colorado Attack,” AQAP reviewed the mass shooting at the King Soopers market in Boulder and encouraged those who would emulate such an attack to “give out a media message” before or during the attack – such as livestreaming the attack, contacting media directly, or posting on social media using one’s real name – “as it multiplies the results” by inspiring others.
In the online manifesto attributed to the Buffalo shooter, the writer said he started “browsing 4chan in May 2020 after extreme boredom” and digested racist replacement theories “through infographics, shitposts, and memes,” adding that it was at 4chan’s /pol/ that he first saw a GIF of Tarrant’s attack. He said he then located and watched the full livestream and read Tarrant’s manifesto, then “found other fighters, like Patrick Crucius [sic], Anders Breivek [sic], Dylann Roof, and John Earnest.” The writer said he felt “awakened” and decided he “would follow Tarrant’s lead and the attacks of so many others like him.”
10. Promoting group or individual actions
“Private (Recruit) Minassian Infantry 00010, wishing to speak to Sgt 4chan please. C23249161,” self-professed incel killer Alek Minassian posted on Facebook just before the 2018 Toronto van attack. “The Incel Rebellion has already begun! We will overthrow all the Chads and Stacys! All hail the Supreme Gentleman Elliot Rodger!”
Despite the seemingly surreptitious nature of extremism, if these movements had their way they would own the lead headline every day. Coverage is expected to bring them attention, legitimacy by following through on threats with deeds, sympathizers, new recruits, and vocal opponents whom they also try to leverage to bring new followers aboard (for example, by claiming that their free speech or way of life is under threat). Where traditional media coverage of their threats or deeds falls short, extremists try to make up for it with their own public relations efforts – and social media is critical in this endeavor. ISIS, for example, releases its weekly al-Naba newsletter with full-length stories while the group’s Amaq news agency publishes announcements framed as breaking news in nuggets tailored for sharing on social media. “In this age, social media warfare holds the utmost importance as the medias and social media personalities are enchanting the eyes of the people,” ISIS-K said. “Fighting in this field needs to be done in order to incite the believers.”
White supremacist groups have used social media to not just post online propaganda but to spread photos and video of their efforts to distribute propaganda offline. When they tack up racist and antisemitic fliers and post stickers across college campuses or drop a banner from a freeway overpass, social media is used as a vehicle to promote their actions to potential recruits as well as foes. According to ADL’s Center on Extremism update in March, this sort of white supremacist propaganda distribution “remained at historic levels across the United States in 2021, with a total 4,851 cases of racist, antisemitic and other hateful messages reported” – about 13 incidents per day.
11. Attack training
Social media helps spread the D.I.Y. terror training that turns a lone actor into an acute threat. Extremists have embraced this sort of distance learning: Foreign terrorist groups want a loyalist to be trained and execute an attack on familiar turf without the need to leave home soil and potentially arouse the suspicion of authorities in the training and planning phases. Domestic extremists, though some have sought tactical training with others close to home, similarly understand that home and an internet connection can shield their attack preparation better than a camp. And through social media and online repositories, training materials from one ideology are available for the benefit of any other terrorist movement.
The materials that make up the wealth of this extremist how-to library – magazines, e-books, manifestos, videos, lectures, photo essays, propaganda posters, and more – are for the most part readily accessible online. And they are accessible – and can be useful – to extremists across ideological lines: the white supremacist can get a bomb recipe from al-Qaeda, while a lone jihadist can pick up pointers from a neo-Nazi training video. We are in an era of open-source terror, where tutorials offer everything from tips on target selection and how to increase body counts to specific step-by-step instructions for constructing a range of explosive devices.
12. Learning from other extremist movements and cross-movement instigation
There is already a huge and growing library of terrorists’ lessons to share about online outreach, organizing, propaganda, tactics, and lone attacks, and that body of information helps extremist movements help each other – even if they might never admit it. It didn’t matter to ISIS that the 2017 Las Vegas shooter probably never cracked open a Quran and, according to investigators, had no known religious or political affiliations while going to “great lengths to keep his thoughts private.” What they did care about was that he committed the deadliest mass shooting in American history and used a novel approach from a high sniper perch to bypass on-the-ground security at an open-air music festival. ISIS claimed the shooter as their own for as long as they did because they wanted their own followers to learn from and emulate what they saw as a great success.
FBI Director Christopher Wray recently told Congress that authorities are increasingly “seeing people with this kind of weird hodgepodge blend of ideologies” among lone actors who may be radicalized to violence. “The old-school world of kind of people with some purity of radical ideology then turning to violence is often giving way to people who have kind of a jumble of mixed-up ideas. And, you know, we’ve seen cases where somebody one month is saying they’re an ISIS supporter, and then the next month they say they’re a white supremacist.” Al-Qaeda was notably explicit about its offer of information-sharing when it told “the raiders of the Congress” and similar groups in a video that “they will find what they need in the Inspire magazine issued by the mujahideen in the Arabian Peninsula.” The inaugural Inspire magazine issue that included the “Make a Bomb in the Kitchen of Your Mom” article included the pressure-cooker bomb recipe used in the 2013 Boston Marathon bombings. Step-by-step instructions on how to create a range of other explosive devices were included in subsequent issues.
13. Meme acceleration
Making posters has been such an important part of ISIS’ media jihad that a supporter even made a poster about the importance of making posters. This visual medium is employed by various extremist groups and distributed via social media and other online forums. With some photoshop, shocking imagery, soundbites or catchphrases, branding, or messaging, these posters and memes are easily inserted into social media posts and can be tailored to fan flames of extremism or incite viewers to violence.
They also reflect how tightly interwoven the themes are across Islamist, white supremacist, accelerationist, and eco-fascist propaganda posters and memes: using current events to stoke grievances and ultimately recruitment; vowing ideological dominion; pressing conspiracy theories in order to accelerate slides toward extremism; using action-film-style imagery of training or operations; highlighting past attacks conducted by any type of group to show extremists how much suffering they, too, can inflict; invoking anti-government and revenge themes; promoting weapons and tactics; threatening social media companies over deplatforming; and displaying antisemitism or misogyny along with hatred toward minority communities, the LGBTQ community, or religious groups.
14. Spreading disinformation
ISIS saw the power in disinformation when the group claimed the Vegas shooter as their own long enough for recruitment and copycat incitement purposes. Through multiple official and affiliated media channels, including their Amaq News Agency and the group’s weekly al-Naba newsletter that is distributed via platforms such as Telegram, file-sharing sites, and social media, ISIS persisted for months in their claim that Stephen Paddock had recently converted to Islam, went by “Abu Abdul Barr al-Amriki,” and “carried out the attack in response to calls for targeting coalition countries.” By the time investigators discovered he had no known fidelity to any religion or ideological movement, ISIS had already achieved their disinformation mission of planting doubt in the minds of those who were convinced that one would have to have some extremist motive to commit such an attack while simultaneously rallying their base to the possibilities of whom they could recruit and how they could emulate the massacre.
Extremists deploy fake news as targeted information warfare, hoping that the disinformation will fuel suspicion, division, and the kind of outrage that could push an extremist from bloviating online to violent action in real life. They want disinformation to help build their ranks of sympathizers and active adherents. They count on disinformation to try to paint their extremist movement as innocuous, beneficial, or even necessary. And, increasingly, extremists of all persuasions rank the importance of this ideological warfare as high as physical attacks. ISIS Khorasan recently declared dissemination of disinformation a “duty” for jihadists in order to control the narrative, demonstrate strength, and spread fear. “If we can shake the chain of the enemy and divide them that is part of the war policy to divide them and defeat them,” the terror group wrote. “…Spreading the rumors is therefore a duty upon the Muslim armies to cause fragmentation of the enemy because that disunity will demoralise them significantly.”
15. Creating and perpetuating conspiracy theories
Conspiracy theories can be the slow or swift current that takes extremism from online ranting to violent action. On social media, these theories largely flourish unchecked save for occasional counter-notices from the social media company that juxtapose facts with the lie. While that may promote rational thought in the passerby who is not yet married to the conspiracy theory, those already immersed in that world dismiss attempts to correct mistruths as proof of feared corporate or government control. Those who act on conspiracy theories entrench the theory in social media even more, as after the 2016 “Pizzagate” attack on Comet Ping Pong in D.C., when online conspiracy theorists either declared the gunman a “crisis actor” in a “false flag” operation or rallied support for what they saw as his wrongful imprisonment. Social media helps conspiracy theories breed even more conspiracy theories. And a person may become convinced that they need to take violent action in response to the conspiracy theory.
The conspiracy theory used to the deadliest effect recently is the “great replacement” theory, which claims there is a Jewish-led plot to replace white people. “HIAS likes to bring invaders in that kill our people,” alleged shooter Robert Bowers posted on Gab the morning of the 2018 attack on the Pittsburgh Tree of Life synagogue, referring to the Hebrew Immigrant Aid Society that assists refugee communities. “I can’t sit by and watch my people get slaughtered. Screw your optics, I’m going in.” Bowers was active on the site in the weeks before the shooting, reposting and commenting on related topics. Minutes before the El Paso Walmart attack in 2019, the Justice Department said, alleged shooter Patrick Crusius posted a manifesto on 8chan that declared, “This attack is a response to the Hispanic invasion of Texas. They are the instigators, not me. I am simply defending my country from cultural and ethnic replacement brought on by the invasion.” Tarrant and Gendron also leaned on this theory. In these four attacks, 95 people were killed.