This is what Facebook won’t let you post

Published Apr 24th, 2018 9:06AM EDT
Facebook Community Standards
Image: AP/REX/Shutterstock


Although Facebook is the largest social network in the world, it often operates as a nearly opaque box. There’s little outside oversight of how Facebook manages its communities, and the content policy that dictates when a post has crossed the line seems to be applied inconsistently, at best.

But if there’s one silver lining to the scandal currently engulfing Facebook, it’s that we’re finally learning more about how the company operates. This morning, Facebook took the unprecedented step of publishing its internal moderation policy for all the world to see, which more than anything else raises the question: why didn’t this happen sooner?

The Community Standards run 27 pages and are broken into sections covering some of the biggest problems Facebook faces: violence and criminal behavior, user safety, ‘objectionable content,’ integrity and authenticity, copyrighted material, and content-related requests. Facebook says that the standards are “designed to be comprehensive – content that might not be considered hate speech may still be removed for breaching our Bullying Policies,” and that they were drawn up with “input from our community and from experts in fields such as technology and public safety.”

“We have people in 11 offices around the world, including subject matter experts on issues such as hate speech, child safety and terrorism. Many of us have worked on the issues of expression and safety long before coming to Facebook,” wrote Monika Bickert, VP of Global Product Management, in a post that accompanied the release. “I worked on everything from child safety to counter terrorism during my years as a criminal prosecutor, and other team members include a former rape crisis counselor, an academic who has spent her career studying hate organizations, a human rights lawyer, and a teacher.”

Many of the guidelines for what content is and isn’t allowed shouldn’t come as a surprise to any Facebook user. As you’d expect from any major community-based organization, things like terrorist activity, mass or serial murder, human trafficking, and organized violence or criminal activity aren’t allowed on the platform. However, the line doesn’t stop at what’s illegal: Facebook also prohibits “attempts by individuals, manufacturers and retailers to purchase, sell or trade non-medical drugs, pharmaceutical drugs and marijuana,” even in places where marijuana has been decriminalized or legalized.

“Objectionable content,” which on Facebook mostly means nudity or sexual content, is an equally tricky subject. “We restrict the display of nudity or sexual activity because some people in our community may be sensitive to this type of content,” the policy says, though it notes that an exemption may apply if content “is posted for educational, humorous or satirical purposes.”

Hate speech is another category of “objectionable content” that Facebook has struggled with, particularly in drawing the line between hate speech and free expression. The content policies take a notably hard line, defining hate speech as “a direct attack on people based on what we call protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity and serious disability or disease.” Facebook then classifies anything that meets those criteria as a Tier 1, Tier 2, or Tier 3 attack, with Tier 1 covering the most severe content, such as “violent speech or support for death/disease/harm,” while Tier 3 covers “calls to exclude or segregate a person or group of people based on the above-listed characteristics” that don’t use particularly strong language.

In addition to publishing its full guidelines for what is and isn’t allowed on the platform, Facebook has also unveiled an appeals process for anyone who has had content removed due to a violation. Given that Facebook’s moderation decisions have been final in the past, this is a major improvement for transparency. To begin with, appeals will be limited to posts that were removed for nudity or sexual activity, hate speech, or graphic violence. If your content is removed for one of those reasons, you’ll be notified and given the option to initiate a review. Reviews will always be done by a person, usually within 24 hours.

Chris Mills, News Editor

Chris Mills has been a news editor and writer for over 15 years, starting at Future Publishing, Gawker Media, and then BGR. He studied at McGill University in Quebec, Canada.