Washington D.C. – Friday, April 19, 2024

Games Industry Struggles to Moderate Content, Prevent Toxicity In Online Gaming, ADL Report Finds

The 2022 survey found that hate continues unabated: 77 percent of adults who play online multiplayer games experienced severe harassment, including physical threats, stalking and sustained harassment.

As the games industry fails to confront rising toxicity in online gaming, ADL (the Anti-Defamation League) today released a new report revealing the structural obstacles that prevent the industry from moderating content effectively. The research identifies shortcomings in policies and in the use of trust and safety teams, and outlines key steps companies can take to stem the spread of toxicity, such as focusing on the small number of repeat offenders and applying content moderation as a principle across the entire production process.

“The scale of harassment, antisemitism, racism and misogyny on gaming platforms – where so many young people seek entertainment and community – is unacceptable,” said Jonathan Greenblatt, ADL CEO and National Director. “Recent changes at social media companies show us the catastrophic results when the tech industry abdicates its moderation responsibilities. Gaming companies must learn from those mistakes and seriously step up their commitments to creating safe platforms.”

ADL Center for Technology and Society researchers found several persistent issues, including:

  • Trust and safety teams struggle with limited capacity, small budgets and unmanageable workloads. They are caught in a catch-22: For company executives to invest in content moderation, there must be data to document the problem. But trust and safety teams don’t have the resources they need to collect such data.
  • Game companies have inconsistent priorities when it comes to content moderation.
  • Focusing moderation on individuals results in treating those who make occasional mistakes (the vast majority of players) the same as repeat offenders who have nefarious intent.

To address these challenges, ADL recommends that tech companies:

  • Create a separate enforcement category to curb the behavior of repeat offenders, who are disproportionately and persistently toxic. Game companies should identify clusters of users who repeatedly exhibit bad behavior instead of trying to catch and punish every rule-breaking individual.
  • Try player reform strategies before resorting to punitive measures like banning, warning players that their actions will have consequences.
  • Implement consistent feedback to not only warn against toxic behavior, but also to support and reward good behavior.
  • Build community resilience. Positive content moderation tools work. Use social engineering strategies such as endorsement systems to incentivize positive play.
  • Invest more resources in understaffed and overwhelmed trust and safety teams.
  • Make content moderation a priority in the creation and design of a game – this should be central from a game’s conception to its discontinuation.
  • Avoid jargon to ensure players are actually aware of the rules. Researchers found documents titled “Code of Conduct” are more likely to be read than those titled “Terms of Use.”

“While most users come to online games for entertainment and community, they’re far too often exposed to hate and extremism. Companies must implement consistent moderation policies and fully empower trust and safety teams,” said Yael Eisenstat, ADL Vice President, Center for Technology and Society. “Despite economic headwinds, we urge the industry to see content moderation as a vital business decision to more sustainably make their products less toxic for all users.”

Since 2019, ADL has conducted an annual survey of hate, harassment and extremism in online multiplayer games. The 2022 survey found that hate continues unabated: 77 percent of adult players experienced severe harassment, including physical threats, stalking and sustained harassment.

For this latest report, researchers analyzed terms of use for 12 games representing eight genres and nine game companies. Purposive sampling was used to select five participants who had at least five years of experience in trust and safety and content moderation, especially in online games.

Read more at ADL

Homeland Security Today
The Government Technology & Services Coalition's Homeland Security Today (HSToday) is the premier news and information resource for the homeland security community, dedicated to elevating the discussions and insights that can support a safe and secure nation. A non-profit magazine and media platform, HSToday provides readers with the whole story, placing facts and comments in context to inform debate and drive realistic solutions to some of the nation’s most vexing security challenges.
http://www.hstoday.us
