Facebook Leak Reveals How Company Decides Whether To Remove Content


An extensive leak of Facebook’s internal content moderation guidelines has revealed the criteria the company uses to decide whether to remove violent, hateful, and offensive material from its site. The company’s moderation team is reportedly “overwhelmed” with work.

The documents were obtained by the Guardian and comprise over 100 internal training manuals designed to educate moderators on how to treat sensitive content. Topics discussed within the files include hate speech, terrorism, violence, racism, pornography, and even cannibalism.

The so-called “Facebook Files” reveal for the first time Facebook’s internal stance on how different kinds of content should be moderated. They portray the company’s approach as lacking cohesion and occasionally chaotic: insiders described the rules as “inconsistent” and “peculiar,” and said moderators rarely have enough time to apply them properly.

With millions of reports to review each week, Facebook’s moderators reportedly have mere seconds to spend on each piece of content. Employees who spoke to the Guardian said they have “just 10 seconds” to make decisions on revenge porn material, far too little time to grasp the complexities of a case or carefully weigh its wider context.

The strange inconsistencies in Facebook’s guidelines are evident across the different content sections covered by the documents. Comments like “Someone shoot Trump” are banned because they represent a “credible threat” to a head of state. In contrast, “f*** off and die” or specific descriptions of how to “snap a b****’s neck” are acceptable because they apparently lack credibility.

[Image by BrianAJackson/Thinkstock]

Facebook allows images of animal abuse to be shared on its platform unless the material is “extremely upsetting,” in which case it should be marked as “disturbing.” Non-sexual physical abuse of children is permitted as long as it is not sadistic or celebratory. The company will also tolerate videos of violent deaths because they can raise awareness of issues such as mental health problems.

Some moderators told the Guardian that Facebook has “grown too big” and “cannot keep control of its content.” With over 2 billion users, the company has been forced to realize it doesn’t have enough employees to keep tabs on every piece of content.

Echoing CEO Mark Zuckerberg’s comments earlier this year, Monika Bickert, Facebook’s head of global policy management, told the newspaper that the company struggles internally to agree on where to draw the line. Facebook has to represent every user on its platform, covering a diverse range of cultures and communities worldwide. This is why it sometimes allows the sharing of material that may be offensive in some regions.

“We have a really diverse global community and people are going to have very different ideas about what is OK to share,” Bickert said. “No matter where you draw the line there are always going to be some grey areas.”

She added that Facebook feels “very accountable” for its content and is continuing to invest in technologies to protect its users.

[Image by OcusFocus/Thinkstock]

The documents reveal Facebook’s ongoing efforts to keep control of the controversial content published on its site every day. So far, it has resisted calls from governments and campaigners seeking to impose external regulation. The documents and insider reports suggest, though, that the company is reaching the limits of its capacity.

The leak has attracted the attention of a range of organizations. There have been calls from all quarters for the company to increase its transparency and talk more openly about the procedures it follows. Some have suggested the company should officially release its moderator guidelines, allowing users to see behind the curtain and understand the content they’re exposed to online.

“Facebook’s decisions about what is and isn’t acceptable have huge implications for free speech,” digital rights campaign group the Open Rights Group told BBC News. “These leaks show that making these decisions is complex and fraught with difficulty.”

Facebook recently hired 3,000 more community moderators to help police its platform. However, the 10-second decision time reported by its existing employees suggests these new workers won’t do much to solve the problem. As it faces criticism from regulators, free speech groups, and safety organizations, Facebook’s current approach may not last much longer.

[Featured Image by KatarzynaBialasiewicz/Thinkstock]
