Friday, April 19, 2024

Hate and Harassment Drive One-in-Four Moderator Actions on Minecraft Servers, Study Finds

The presence of slurs previously only affiliated with white nationalism and hate groups suggests the normalization of extreme language in gaming spaces.

One in four moderation actions across three private servers of the popular video game Minecraft were in response to online hate and harassment, according to a study published today by the ADL (Anti-Defamation League) Center for Technology and Society, in collaboration with Take This, GamerSafer, and the Middlebury Institute’s Center on Terrorism, Extremism, and Counterterrorism.

“As with many online games, we’ve found that large numbers of Minecraft users experience hateful speech and harassment while using the platform,” said Jonathan Greenblatt, ADL CEO. “From this snapshot, it is clear that Minecraft and the gaming industry broadly must do more to ensure their online spaces have robust community guidelines and that they provide researchers access to more data and information on its servers.”

The study found that of all content that elicited a moderator response (including a ban, warning, mute, or kick from the server), 16 percent involved harassment and 10 percent involved identity-based hate, together accounting for roughly one in four moderation actions.

Over the course of the study, ADL found that:

  • Many in-game offenders are repeat offenders. Almost one-fifth of offending users had multiple actions taken against them during the data collection period.
  • Hateful messages are more common in public chats than in private chats. Messages containing identity-based hate were 21 percent more common in public chats than in private chats.
  • Servers with in-depth community guidelines were associated with more positive social spaces. Of the three servers reviewed, the server with the most extensive community guidelines and highest ratio of moderators to players had the lowest frequency of sexually explicit, hateful, and severely toxic behavior between users, suggesting the positive impact of robust guidelines.
  • Temporary bans appear to be an effective tool for curbing bad behavior. Early evidence suggests that temporary bans are more effective than muting at reducing the rate of offending behavior by the moderated player.
  • Hateful rhetoric is common in gaming spaces. The presence of slurs previously only affiliated with white nationalism and hate groups suggests the normalization of extreme language in gaming spaces.

A survey published by ADL last year found that extremist messaging remains a concern in online games: one in 10 young gamers and 8 percent of adult gamers were exposed to white supremacist ideologies in online multiplayer games.

To better address hate and harassment across its platform, ADL recommends that Minecraft take the following actions:

  • Invest in content moderation efforts and robust community guidelines. Active, effective human moderation and robust community guidelines are critical to reducing sexually explicit, hateful, and severely toxic behavior in gaming spaces: the server with the most moderation staff and the most extensive guidelines had the fewest incidents of these behaviors. Industry leaders need to continue investing in moderator training to better understand and respond to toxic behaviors.
  • Increase researcher access to data. Without providing researchers access to unfiltered data, the games industry cannot identify or address the challenges of hateful, harassing and toxic behavior.
  • Conduct additional research on content moderation and complementary tools and techniques. Moderator intervention seems to reduce harmful behavior in the short term and at the individual level, but it remains unclear whether this effect holds over time and across an entire server. Future research should focus on determining the long-term and aggregate effects of moderator intervention.
  • Standardize reporting categories. To better understand the frequency and nature of hate in online spaces, we recommend an industry-wide standardization of moderation reporting, including defined categories and violating offenses with clear descriptions; a rough sketch of what such a standardized record might look like follows this list. Standardization would help facilitate future research, particularly with regard to documenting how moderation actions change user behavior over time. ADL’s Disruption and Harms in Online Gaming Framework could be used as the foundation for this effort.
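
To illustrate the recommendation above, here is a minimal sketch of what a standardized moderation record could look like. The category names, action types, and fields are illustrative assumptions only; they are not taken from the study or from ADL's Disruption and Harms in Online Gaming Framework.

    # Illustrative sketch only: the categories and fields below are assumptions,
    # not definitions drawn from the ADL study or its framework.
    from dataclasses import dataclass
    from datetime import datetime
    from enum import Enum

    class ViolationCategory(Enum):
        HARASSMENT = "harassment"
        IDENTITY_BASED_HATE = "identity_based_hate"
        SEXUALLY_EXPLICIT = "sexually_explicit"
        SEVERE_TOXICITY = "severe_toxicity"
        OTHER = "other"

    class ModeratorAction(Enum):
        WARNING = "warning"
        MUTE = "mute"
        KICK = "kick"
        TEMPORARY_BAN = "temporary_ban"
        PERMANENT_BAN = "permanent_ban"

    @dataclass
    class ModerationReport:
        """One standardized record of a single moderation action on a server."""
        server_id: str
        offender_id: str             # pseudonymized player identifier
        category: ViolationCategory  # the defined violation category
        action: ModeratorAction      # the action the moderator took
        timestamp: datetime
        public_chat: bool            # message appeared in public rather than private chat
        repeat_offense: bool         # player had a prior moderation action
        description: str             # moderator's free-text description of the offense

A shared record format along these lines would let researchers compare moderation data across servers and track, for example, whether a temporary ban is followed by fewer repeat offenses.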

Building on ADL’s century of experience building a world without hate, the Center for Technology and Society (CTS) serves as a resource to tech platforms and develops proactive solutions to fight hate both online and offline. CTS works at the intersection of technology and civil rights through education, research and advocacy.

Read more at ADL

Homeland Security Today
The Government Technology & Services Coalition's Homeland Security Today (HSToday) is the premier news and information resource for the homeland security community, dedicated to elevating the discussions and insights that can support a safe and secure nation. A non-profit magazine and media platform, HSToday provides readers with the whole story, placing facts and comments in context to inform debate and drive realistic solutions to some of the nation’s most vexing security challenges.
http://www.hstoday.us
