White Separatists And White Nationalists To Be Banned From Facebook

Social media company found that words matter when it comes to hate speech.

Mark Zuckerberg at a conference.
Justin Sullivan / Getty Images


Facebook announced in a blog post today that it is banning any praise, support, or representation of white nationalism and white separatism.

Motherboard reported last spring that Facebook moderators were instructed to ban users promoting white supremacy, while white nationalism and separatism were still allowed.

Heidi Beirich, head of the Southern Poverty Law Center’s (SPLC) Intelligence Project, told Motherboard last year that “white nationalism is something that people like David Duke [former leader of the Ku Klux Klan] and others came up with to sound less bad.”

Facebook determined that white nationalism and white separatism are just “less bad” ways of saying white supremacy, and that all are concepts that are deeply linked to organized hate groups.

The new policy will go into effect next week.

Ulrick Casseus is a subject matter expert on hate groups on Facebook’s policy team. He told Motherboard that the team repeatedly saw users who claimed they were not racists or white supremacists, only white nationalists, engage in hateful speech and behavior.

“They’re trying to normalize it and based upon what we’ve seen and who we’ve talked to, we determined that this is hateful, and it’s tied to organized hate,” Casseus said.

Mark Zuckerberg gives a speech.
  Justin Sullivan / Getty Images

Becca Lewis studies white supremacy on social media for nonprofit technology research organization Data & Society.

She told NBC that “for years, Facebook has tiptoed around the issue of white supremacy on its website, which has ultimately allowed it to thrive there, mostly unchecked.” She said that Facebook’s announcement today is another step in the right direction, but warned that social media companies have not always made good on maintaining their platforms after initial announcements.

Facebook got much of the blame for the mass dissemination of a first-person video detailing the shooting earlier this month at a New Zealand mosque. The shooter, a self-proclaimed white supremacist, broadcast the carnage as a live-stream on Facebook.


The footage was easily accessible both during and after the attack, as users recorded the video and shared it on YouTube and Twitter.

Facebook Chief Executive Officer Mark Zuckerberg acknowledged the difficulty of policing content from the platform’s 2.7 billion users, according to Bloomberg. Because people are more likely to share posts that provoke an emotional reaction, fake news and extremist content spread as a side effect.

NBC reported that Senator Bob Casey recently introduced legislation that would require the Justice and Commerce departments to study how the internet fuels hate crimes.

Facebook is the parent company of Instagram, which will be implementing the same policy changes.