
Facebook’s secret content moderation policies are just as controversial as you’d expect

Updated Nov 22nd, 2019 4:32AM EST

Facebook has a huge content moderation problem on its hands, according to a new report that reveals the secret rules and guidelines governing the social network’s content moderation efforts. Facebook has come under fire for the role it played in the spread of fake news during the US presidential election, the kind of viral reports believed to have tilted the vote in Trump’s favor. Facebook is already taking measures to curb the spread of fake news, and it’s also fighting the spread of revenge porn, the company recently revealed. But this new report focuses on the complex set of rules the company uses to censor content posted on Facebook. Just as you might expect, these secret internal rules and regulations are somewhat controversial.

A Guardian investigation reveals for the first time how Facebook officially deals with topics related to “violence, hate speech, terrorism, pornography, racism, and self-harm.”

Facebook wants to control potentially disturbing content in a manner that doesn’t hinder the right to free expression and doesn’t censor access to news and information users may want to see, regardless of whether some people find it offensive or inappropriate. The Guardian says that many moderators have concerns about the inconsistency and peculiar nature of some of the policies they have to apply, especially those covering sexual content, which are “the most complex and confusing.”


The report lists various examples taken from Facebook’s secret docs.

For example, remarks such as “Someone shoot Trump” should be deleted because the president is part of a protected category. But comments like “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat,” or “fuck off and die” are not regarded as threats.

Facebook’s policies would also allow content that depicts non-sexual physical abuse and bullying of children, animal cruelty, videos of abortion that do not contain nudity, and people live-streaming attempts at self-harm. But for each of these types of content, Facebook has rules in place that may limit its spread. Here are some examples:

Videos of violent deaths are disturbing but can help create awareness. For videos, we think minors need protection and adults need a choice. We mark as ‘disturbing’ videos of the violent deaths of humans.

We do not action photos of child abuse. We mark as disturbing videos of child abuse. We remove imagery of child abuse if shared with sadism and celebration.

We allow photos and videos documenting animal abuse for awareness, but may add viewer protections to some content that is perceived as extremely disturbing by the audience.

Generally, imagery of animal abuse can be shared on the site. Some extremely disturbing imagery may be marked as disturbing.

These rules do not make Facebook evil. They simply show that the company faces a real problem in censoring content while simultaneously trying to protect free speech. Another consideration is that the more time people spend on Facebook, the more money the company makes, regardless of what users share. However, this report does indicate that Facebook is willing to take steps to improve the quality of content and prevent disturbing posts and imagery from hitting your news feed, even if it doesn’t always seem like Facebook is doing enough.

The Guardian’s full story is available at this link, and it’s definitely worth a read.

Chris Smith, Senior Writer
