
Facebook’s community standards on hate speech have long been a subject of contention. The company’s unclear guidelines and inconsistent enforcement of its hate-speech policy have led to the Declaration of Independence being flagged, and to users reporting that their accounts of being called racial slurs were removed. Meanwhile, Holocaust denial is protected, and users say they’ve flagged anti-black images and cartoons on the social media site that were never taken down.

Which is what makes a British journalist’s undercover experience at Facebook so intriguing.


As Business Insider reports, a journalist for the British network Channel 4 went undercover as a Facebook moderator for a documentary on the social network.

The reporter says they found multiple instances where offensive images containing child abuse, racism and violence were deliberately left on the site.

Among these was a meme of a little girl with her head being held underwater. Underneath the image of the girl is the caption: “When your daughter’s first crush is a little negro boy.”


The undercover reporter was working for CPL Resources, a content-moderation contractor based in Dublin that was tasked with enforcing Facebook’s standards by ignoring, deleting or marking flagged content as disturbing.

Business Insider reports that CPL employees told the reporter to “ignore” the image because “it implies a lot, but to reach the actual violation, you have to jump through a lot of hoops to get there.”

In response to the allegation, Facebook told Channel 4 that the image of the drowning girl did meet its criteria for banned hate speech, and that it was “reviewing what went wrong to prevent it happening again.”


As a Washington Post article from last summer noted, Facebook has long shied away from being a gatekeeper for speech, defending itself by saying it was a tech company, not a media company.

This has left many Facebook users wondering about the network’s seemingly arbitrary rules for problematic posts.

From the Post:

In Facebook’s guidelines for moderators, obtained by ProPublica in June and affirmed by the social network, the rules protect broad classes of people but not subgroups. Posts criticizing white or black people would be prohibited, while posts attacking white or black children, or radicalized Muslim suspects, may be allowed to stay up because the company sees “children” and “radicalized Muslims” as subgroups.


In the Post’s story, one black woman said Facebook took down a post in which she shared a harrowing encounter with a white stranger hurling racial slurs at her young children.

The documentary, Inside Facebook: Secrets of the Social Network, also claims a CPL Resources staffer told the undercover reporter that violent content was left on the site because “if you start censoring too much, then people lose interest in the platform.”

“It’s all about making money at the end of the day,” another staffer said.