It’s alleged on a regular basis: Facebook is deleting pages, memes, photos, or other content to serve a political agenda. How likely is that, really? To answer the question, one has to consider how Facebook works and what censoring speech would actually be worth to the company.
Facebook is accused of deleting many types of content: liberal posts, conservative posts, breastfeeding photos, images of soldiers, religious images, and images relating to minority rights. The truth is, Facebook does delete all of the above from time to time, but it’s unrealistic to attribute the deletions to any agenda.
Memes and political posts often carry lines to the effect of “Facebook keeps deleting this photo!” in order to provoke an emotional response and encourage sharing. The tactic is so common that humor pages have started making satirical versions, such as the one below.
Snopes actually had to officially declare the image satire, because it, too, was being shared by people who believed Facebook really would delete it.
Currently, Facebook is being widely accused of inappropriately deleting anti-Islam material. Breitbart recently covered the story, describing an incident involving the Facebook page of Ingrid Carlqvist. Carlqvist had shared a video claiming that immigration is creating a rape crisis. The video amassed nearly a million views, and it’s reasonable to assume that not all viewers agreed with it or appreciated its presence on Facebook. It’s also safe to say that some of them reported it to Facebook for removal.
Facebook typically reviews content that is flagged as inappropriate and determines whether the material should remain or be removed. Sometimes a group will coordinate mass reporting in a deliberate effort to get content taken down. Facebook says it does not remove content simply because it receives multiple reports, as long as the content doesn’t violate its standards. However, every review system has a margin of error, and more reports mean more chances for a post to fall into that margin.
One example of mass reporting is described on the Skeptical Raptor blog. Allison Hagood, an author of a book about childhood vaccines, has repeatedly had her images reported to Facebook, and she has repeatedly been blocked from the site for weeks at a time.
It’s probably safe to assume this isn’t Facebook on an anti-vax kick, considering that Mark Zuckerberg made a point of publishing a photo of his own newborn child with the caption, “Doctor’s visit — time for vaccines!”
(To be clear, neither Hagood nor the Skeptical Raptor blog alleged that Facebook was deleting images or blocking her account because of an agenda; it’s simply one clear-cut case in which the site deleted material and blocked a user over content that plainly does not conflict with Zuckerberg’s own stance.)
Another recent report alleged that Facebook was censoring conservative news sites. The Daily Dot published a denial from Facebook. Even the original story, published by Gizmodo, indicated that employees weren’t told to exclude conservative sites, but to choose more neutral sources.
“But we would have to go and find the same story from a more neutral outlet that wasn’t as biased.”
Only a few categories of content violate Facebook’s Community Standards. These include nudity (nipples, buttocks, and genitals, with exceptions for breastfeeding and post-mastectomy photos), hate speech (described below), and graphic violence.
“Facebook removes hate speech, which includes content that directly attacks people based on their:
Sex, gender, or gender identity, or
Serious disabilities or diseases.”
Of course, users who do wish to post content attacking groups of people based on race, religion, or any of these other factors may see the removals as a political difference. But keeping such comments off Facebook helps make the site feel like a comfortable place for users across all of these demographics.
It’s entirely possible for a photo or post reported for any of these reasons to slip through review and remain on Facebook. It’s equally possible for something explicitly permitted (such as breastfeeding photos) to be deleted anyway.
Does that demonstrate an agenda on Facebook’s part, or just algorithms that need more tweaking?
Here’s another factor to take into consideration.
“Our review decisions may occasionally change after receiving additional context about specific posts or after seeing new, violating content appearing on a Page or Facebook Profile.”
Even in the unlikely event that Facebook’s owners decided they hated flags, breastfeeding, vaccines, or Little Timmy, removing content without a good reason (such as that it endangers other users or drives them off the site) would be improbable, because what the company likes most is being a viable business that turns a profit.
Forbes explained it three years ago: on Facebook, you are the product. You aren’t paying to use Facebook, so how does the company turn a profit? You’ve probably seen ads pop up in your Facebook feed. Maybe you were even startled at how, coincidentally, an ad appeared for the very thing you had just posted about.
It wasn’t a coincidence; Facebook targets ads to users based on the content they share. If the social network either drives users away over political differences or blocks content that could help target ads, it’s Facebook that loses.
It’s not just that Facebook says it doesn’t censor content out of political or social bias; it’s that doing so would make no business sense.
[Photo by Stephan Lam/Getty Images]