Facebook Just Performed A Creepy Experiment On Users… Without Their Knowledge

Nearly 700,000 English-speaking users unwittingly became guinea pigs in a psychological experiment conducted by Facebook, and many of those affected may still be unaware that they were involved.

Data scientists working with Facebook reportedly manipulated what users saw in order to determine if emotions are contagious, Think Progress reports. The experiment took place over the week of January 11-18, 2012, and affected 689,003 Facebook profiles.

A paper published in the Proceedings of the National Academy of Sciences reveals that Facebook used a selective algorithm to control what users saw in their news feeds. Some Facebook profiles saw more positive posts, while others were exposed to more emotionally negative content. Individual Facebook posts weren't directly affected, and could still be viewed from friends' profiles. Scientists then examined users' subsequent posts in an effort to determine whether the tone of what they saw on Facebook affected what they themselves posted.

The experiment was a success, and researchers concluded that positive and negative moods are indeed contagious on Facebook, The New York Daily News reports. Users shown more positive posts tended to post more positive things themselves, while negative posts proved equally contagious on social media.

Facebook's experiment proved that emotions are indeed contagious via social media

While the conclusions are interesting, another aspect of the experiment is disturbing. Users were completely unaware that they were taking part in Facebook’s psychological experiment.

Researchers argue that the experiment is defensible under the company's terms of service, which state in part that user data may be used "for internal operations, including troubleshooting, data analysis, testing, research and service improvement." Because a computer program scanned posts for positive and negative keywords, the researchers themselves never saw the text of any Facebook posts during the experiment. "[I]t was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook," the researchers argued, saying the policy constituted "informed consent for this research."
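The kind of automated keyword scan described above can be illustrated with a minimal sketch. The word lists and scoring below are hypothetical stand-ins for illustration only; the actual dictionaries and methodology used by the researchers are not reproduced here.

```python
# Hypothetical sketch of a keyword-based post classifier.
# These tiny word lists are illustrative, not the researchers' actual dictionary.

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by counting keywords."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("So happy today, what a wonderful morning!"))  # positive
print(classify_post("This is terrible and I hate it."))            # negative
```

A scan like this lets software sort posts by emotional tone without any human ever reading the underlying text, which is the basis of the researchers' privacy argument.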

Facebook isn’t the only social media site to raise red flags for its handling of information recently. As The Inquisitr previously reported, Buzzfeed has been accused of keeping user data gleaned from its popular quizzes, which are often shared on social media networks like Facebook.

Many Facebook users have little understanding of the way their personal data is used by the website, or what other parties may have access to that information. According to a Consumer Reports survey conducted before the experiment was reported, “only 37 percent of Facebook users say they have used the site’s privacy tools to customize how much information apps are allowed to see.”

[Images via The Guardian and NPR]