This weekend, a controversial Facebook study aimed at manipulating the emotions of users drew widespread criticism, and one of the authors of the research addressed the backlash in a Facebook post yesterday.
Adam Kramer, one of the study's three authors, spoke to the methodology and results of the somewhat unsettling experiment, in which the News Feeds of nearly 700,000 users were manipulated to appear overwhelmingly positive or negative depending on which sample group they were sorted into.
Kramer, whose group's research was published in the journal Proceedings of the National Academy of Sciences (PNAS), took to Facebook to explain at length, and to apologize.
He now says that the Facebook study's aim was not clearly articulated, beginning:
“The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends’ negativity might lead people to avoid visiting Facebook. We didn’t clearly state our motivations in the paper.”
After explaining the scope of the research, Kramer defends the experiment overall but apologizes for any emotional unrest it may have caused, admitting that the upset may have outweighed the study's beneficial findings:
“The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
In the comments on Kramer's post, one heavily liked response comes from a poster named Claire Litton. Litton counters that the research did not follow accepted ethical practices, citing earlier controversial studies that were later decried for ethics-related breaches:
“The exact reason that we have ethics review committees is to prevent repeats of Milgram, Stanford, the LSD experiments…experiments on human subjects where informed consent was NOT given and participants were unaware of the existence of the study. Agreeing to Facebook’s TOU is not blanket consent for experimental process; in order to follow guidelines of consent, more information would have been required to the ‘participating’ subjects.”
“Arguing that you signed the TOU so you gave consent for everything is false: it wasn’t informed, and it wasn’t actually consent.”
The Facebook study’s abstract is available over at PNAS.