Facebook Is Testing New Ways To Fight Misinformation By Hiring Part-Time Fact-Checkers

The social media giant continues to struggle with the spread of misinformation via its platform.

Facebook CEO Mark Zuckerberg pauses while speaking about the new Facebook News feature at the Paley Center For Media on October 25, 2019 in New York City.
Drew Angerer / Getty Images


Facebook said Tuesday that it’s hiring community reviewers to help shorten the time it takes to identify a false post, as part of a new pilot program that could help the social media network crack down on misinformation.

The new program will have community reviewers act as researchers, gathering additional information that corroborates or contradicts claims made in posts; their findings will then be shared with third-party fact-checkers for official review.

“[I]f there is a post claiming that a celebrity has died and community reviewers don’t find any other sources reporting that news — or see a report that the same celebrity is performing later that day — they can flag that the claim isn’t corroborated. Fact-checkers will then see this information as they review and rate the post,” Facebook product manager Henry Silverman wrote in a blog post.

Facebook partnered with global public opinion and data company YouGov to determine requirements for selecting community reviewers to ensure they represent the “diverse viewpoints” of the platform’s user base. Axios reported that the contractors will be hired through Appen, a company based in Australia that provides data for improving machine learning and artificial intelligence. Facebook’s third-party fact-checkers are approved by Poynter’s International Fact-Checking Network.

The new move highlights how the world’s largest social media platform is responding to sustained criticism that it doesn’t do enough to combat false claims on its site. However, Facebook would not say how many contractors it would need to match the workload.

Facebook has routinely come under fire for not fact-checking posts quickly enough. In May, it left up an altered video of House Speaker Nancy Pelosi that made it appear as if she were drunk — but reduced its spread only after third-party fact-checkers rated the content as false, as previously reported by The Inquisitr. By then, the video had already been viewed millions of times, leading Facebook CEO Mark Zuckerberg to admit that the company could have acted more quickly.

As the 2020 election cycle continues to heat up, Facebook has faced greater scrutiny over a policy that allows politicians and political parties to lie in ads served on the platform. Last month, one of the social media network’s fact-checking partners ended its relationship over the matter, according to NPR.

“When the program started in the Netherlands, this was just after the Trump election in the U.S. And the whole reason why this program was brought up in the first place was about things that could harm democracy or elections,” said Gert-Jaap Hoekman of Nu.nl, a Dutch news site hired by Facebook for political fact-checking.

He noted that “almost immediately” they learned it “was not really the case” that Facebook was interested in preserving democracy through fact-checking politicians.

Facebook-owned Instagram has also been taking more steps to combat misinformation on its platform. On Monday, the company said it would expand its fact-checking program globally.