Facebook’s ‘Frightening’ Trust Ratings Similar To China’s Social Credit System, Per ‘Forbes’


Last month, we learned that Facebook has a trustworthiness score for each user. The score is based on a person’s reputation, so if someone with a high score flags or reports “fake news” or other inappropriate items, the report is forwarded to an employee for review. On the other hand, if someone with a low score does the same thing, their reports are essentially ignored.
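To make the described behavior concrete, here is a minimal sketch of that flag-routing logic in Python. The numeric score, the 0-to-1 scale, and the review threshold are all hypothetical stand-ins; Facebook has not disclosed the actual signals or cutoffs it uses.

```python
from dataclasses import dataclass

REVIEW_THRESHOLD = 0.7  # hypothetical cutoff; the real value is not public


@dataclass
class Report:
    reporter_trust_score: float  # assumed to be in [0, 1]
    flagged_item_id: str


def route_report(report: Report) -> str:
    """Send reports from high-trust users to human review; deprioritize the rest."""
    if report.reporter_trust_score >= REVIEW_THRESHOLD:
        return "forward_to_human_reviewer"
    return "deprioritize"  # effectively ignored, per the article's description


# Example: a high-trust user's flag gets escalated, a low-trust user's does not.
print(route_report(Report(0.9, "post_123")))  # forward_to_human_reviewer
print(route_report(Report(0.2, "post_456")))  # deprioritize
```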

That might sound simple enough, but Forbes questions the role of these scores and even draws comparisons to China’s social credit system. Both are calculated in secretive ways, and users don’t know how their scores are generated or what they are being used for.

Forbes points out that the implications of Facebook user scores are troubling, considering that the company has already toyed with the idea of allowing retailers to access its facial recognition technology and database to track customers in stores. In that application, certain people would be flagged for closer scrutiny by store security.

Moreover, use of Facebook’s scores by militaries and world governments is a real possibility. It is already known that some governments have asked Facebook to reveal lists of people whom its algorithms have deemed to fall within certain parameters. For example, the algorithms can reportedly infer a user’s sexual orientation and even their political views. This is dangerous because people in these groups face severe persecution in certain countries.

One of the main concerns about the score is the issue of underlying bias.

“As with any kind of user ‘trust’ rating, Facebook’s new scoring system is vulnerable to myriad possible biases, especially given the unknowns regarding the full set of signals used to calculate them.”

As with any algorithm, the way it is written can build in significant biases for or against certain groups of people, and because the scoring is fully automated, any bias can produce a huge ripple effect down the line. However, Facebook has declined to allow a third-party expert to review the algorithm for bias, leaving the company to calculate people’s “trustworthiness” on its own terms.

Claire Wardle, director of First Draft at the Harvard Kennedy School, offered this explanation of the score, and of why the algorithm is not being opened up for outside review.

“Not knowing how [Facebook is] judging us is what makes us uncomfortable… But the irony is that they can’t tell us how they are judging us — because if they do, the algorithms that they built will be gamed.”
