Facebook on Wednesday permanently banned the U.K. group Britain First from its platform for breaking its rules and inciting hatred against minority groups.
According to Reuters, the far-right group drew widespread attention when President Donald Trump retweeted its anti-Islamic posts, after which Twitter and YouTube took action against it. Facebook's policy director, Simon Milner, had told members of Parliament months earlier that the company was investigating Britain First's activities; that investigation led to the group's expulsion from the platform.
“Facebook said it had taken down Britain First’s Facebook page and those of its leaders, Paul Golding and Jayda Fransen, for repeatedly violating rules designed to stop the incitement of hatred against minority groups.”
Britain First's reach was substantial. As reported by the Telegraph, the group had become one of the largest U.K. political parties on Facebook, with close to 2 million likes on its page.
Once again, Facebook joins Twitter and YouTube in the effort to moderate and prevent the spread of extremist content. A day earlier, YouTube's CEO announced at the South by Southwest (SXSW) conference that the platform would begin adding information cues, linking viewers to additional sources, to videos promoting conspiracy theories.
Several countries are aligning their efforts to fight the spread of hate speech online. As the Reuters report notes, Prime Minister Theresa May, along with the leaders of France and Italy, has urged social media networks to put mechanisms in place to remove extremist content. May welcomed Facebook's move and encouraged other platforms to follow suit.
Facebook said it did not make the decision lightly. The network's rules prohibit content that incites hatred, and Britain First had clearly violated them with its anti-Islam posts.
“We do not do this lightly, but they have repeatedly posted content designed to incite animosity and hatred against minority groups, which disqualifies the pages from our service,” Facebook said in a blog post.
As the world's largest social network, Facebook has long struggled to curb the spread of misinformation and hate speech. The Verge notes that for years the company has failed to deal effectively with extremist content, and argues that while the ban is largely symbolic, it marks only the start of a broader campaign against hate groups.
Finally, an investigation by Wired uncovered a network of Facebook pages that shared the same hateful content as Britain First. Those pages remain online, and as the article suggests, they too should be investigated.