The rise of social media has allowed for easier dissemination of conspiracy theories, according to a recent report released by the Global Network on Extremism & Technology (GNET).
Author Daniel Allington elaborates on how the world's largest social networking platforms, such as Facebook, Twitter and YouTube, have played a role in radicalization and terrorism by making misinformation and "politically corrosive" media readily accessible. Not only are conspiracy theories easier to find, but there is now evidence linking content on these popular online platforms to violent and extremist acts.
The report investigates whether unfiltered and unconstrained misinformation creates an environment in which individuals feel empowered to pursue violent or extremist action in service of their political agenda. Allington found that this is in fact the case. Highlighting not only the Jan. 6, 2021, attack on the U.S. Capitol by insurrectionists but also earlier accounts of domestic terrorism fueled by extremists, he identifies a common thread in the radicalization process: online conspiracy theories. Such theories have historically played a key role in acts of terrorism, genocide, and radicalization.
Allington cites research finding that those who identify with the extreme end of either political "wing" are more likely to believe in conspiracy theories. These audiences are reached through online sharing platforms such as Facebook groups that unite individuals with similar interests. The report highlights the October 2018 attempted attack by Cesar Sayoc, who mailed homemade pipe bombs to members of the Democratic Party and their supporters. Sayoc's social media activity included membership in many right-wing groups on Twitter and Facebook, where members shared conspiracy memes that featured some of his eventual targets.
The report also outlines steps social media platforms can take to stop the spread of misinformation. Allington suggested that for platforms to be seen as more valuable and reputable, they need to ensure they partner with content providers who are not spreading misinformation.
Allington addressed QAnon, a far-right conspiracy theory movement, and the dissemination of its content on social media in 2020: "One investigation found that copies of QAnon videos removed from Facebook and Twitter remained in circulation on those same platforms, and a cross‐platform study of items of COVID‐19 related content identified as misinformation by fact‐checking organizations found that no action was taken with regard to 59% of such items on Twitter, 27% of such items on YouTube and 24% of such items on Facebook, despite platform policies which suggested that action would be taken promptly," he wrote, adding that extremist content has slipped through many loopholes to avoid censorship or regulation.
The report suggests "a cultural change is required in terms of how social networking and media sharing platforms understand their role," and that changes may be needed in how these platforms are designed and operated, since users are often matched with others or shown content based on their interests.
Acts of domestic terrorism such as the insurrection at the U.S. Capitol last month are only amplified by the ease of spreading conspiracy theories with a single retweet or share. The report concludes that social media platforms can make meaningful change by choosing to disseminate only reputable information, thus stemming the spill of conspiracy theories into media sources.
The full report is available on the GNET website. The Global Network on Extremism & Technology is an academic research initiative based in the United Kingdom and backed by the Global Internet Forum to Counter Terrorism (GIFCT).