In the past five years, terrorism from the extreme right has surged by 320%. In 2019, high-profile attacks in New Zealand, the US, Germany and Norway were committed by individuals with little or no connection to extremist organisations or proscribed terrorist groups. Instead, evidence suggests that these individuals were connected to loose extreme-right networks operating largely online. This points to a shift towards a post-organisational paradigm in which online connection to extremist culture and ideology may be as important in inspiring violence as connections to “on the ground” groups.
In understanding this post-organisational landscape, it is essential to analyse the online ecosystems which provide a permissive space where violent and terrorist activity can be explicitly endorsed. The encrypted messaging platform Telegram – originally established to provide users living in authoritarian states with a means of secure communication – has become one of these spaces, and is now an important platform for white supremacist actors seeking to propagate violent and extremist messages.
Telegram has gained considerable attention as one of the key communication tools used by jihadist groups such as ISIS. A 2019 study by Georgetown University’s Program on Extremism uncovered 636 pro-Islamic State Telegram channels containing English-language propaganda. The company has since taken action against ISIS, including a major operation with the European Union Agency for Law Enforcement Cooperation (Europol) in late 2019 to take down networks of ISIS-related channels.
Telegram has limited content-moderation policies, banning only the promotion of violence on public channels and the sharing of illegal pornographic material. As a result of these narrow policies, Telegram has become an important platform on which white supremacist actors can gain momentum. This mobilisation covers a spectrum of activity, ranging from general ideological discussions to the promotion of political violence, the glorification of terrorist attacks, and even the sharing of guides which help individuals prepare for violence.
To better understand the scale and nature of this mobilisation and the risks posed by these communities, ISD has been monitoring 208 Telegram groups which promote white supremacist ideology, focused mainly on the US context, and analysing more than a million posts.
This research highlights how, through its limited content-moderation policies, Telegram has become a safe space for white supremacists to share and discuss a range of explicit extremist material. Furthermore, it shows that many of these Telegram communities have become permissive environments where overt calls for violence and support for terrorism are widespread. Much of the content we identified appears to breach Telegram’s Terms of Service, which prohibit the promotion of violence, suggesting that the platform’s current enforcement of its policies is ineffective.
Given the egregious nature of the violent and pro-terrorist content identified in this study, ISD believes that an “early warning system” facilitating the semi-automated identification of high-risk content should be trialled, in order to detect and mitigate calls for violence arising from these channels.
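To make the concept concrete, the triage step of such an early warning system could be sketched as below. This is a minimal illustration, not ISD’s actual methodology: it assumes a simple weighted-keyword approach, and the terms, weights and review threshold are hypothetical placeholders. A deployed system would use far more sophisticated classification, and a human analyst would always review flagged posts before any action is taken.

```python
# Hypothetical sketch of semi-automated triage for an "early warning
# system": posts are scored against an illustrative lexicon of
# violence-related terms, and high-scoring posts are surfaced for
# human review. All terms, weights and the threshold are placeholders.

RISK_TERMS = {
    "attack": 2,
    "kill": 3,
    "target": 1,
}

REVIEW_THRESHOLD = 3  # placeholder cut-off for escalation to an analyst


def risk_score(post: str) -> int:
    """Sum the weights of any risk terms appearing in the post."""
    words = post.lower().split()
    return sum(weight for term, weight in RISK_TERMS.items() if term in words)


def flag_for_review(posts: list[str]) -> list[str]:
    """Return the posts whose score meets the human-review threshold."""
    return [post for post in posts if risk_score(post) >= REVIEW_THRESHOLD]
```

The key design point is that the system is semi-automated: scoring only prioritises content for attention, while the judgement about whether a post constitutes a genuine call for violence remains with a human reviewer.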