Facebook Groups play a key role on the Facebook platform. Groups are essentially discussion forums organized around virtually any subject, allowing individuals to pursue and discuss shared interests. Groups can be public (posts are visible to all Facebook users) or private (individuals must be approved by a group administrator to join and see posts).
Despite public announcements of efforts to curb hateful speech and misinformation across its platform, Facebook still hosts many spaces in which this content propagates and thrives. While some of this problematic activity occurs in small groups, a number of problematic Facebook groups have grown to significant membership sizes, yet Facebook still allows them to exist.
This is not by accident. Groups are largely self-moderated spaces overseen by specific users with administrator or moderator rights. Facebook states that groups can be banned for repeatedly reposting items that have been flagged as false news, but this system can be circumvented. Although group members can flag posts that break Facebook’s guidelines to the platform, some moderators actively warn their members not to report problematic content to Facebook. For example, the rules for the group “#RedneckIII%”, a nearly 2,000-member group associated with the militia movement (discussed more below), say “NO REPORTING If you dont like a post keep scrolling. Simple as that. Reporting to facebook is an automatic boot. You can contact an admin and we will check out your claim.”[sic] In many other instances, racist, homophobic, and other hateful posts are not deemed offensive enough by Facebook’s standards to warrant intervention. Facebook has a history of overlooking this kind of behavior, arguing that some hateful content does not violate its Community Standards.