Right now, it seems everyone is talking about a controversial Facebook study that, without user consent, secretly enrolled Facebookers into a social media hellscape of emotional manipulation and an intentional distortion of their perception of reality.
So Facebook is gaslighting you and everybody you know. What’s the big deal? Didn’t we sign up for this in the terms and conditions, after all? Suck it up, whiners!
We learned late this week that the social networking giant had been engaging in a strange and unprecedented study of its users' emotions. There's a lot to unpack, given that Facebook holds unparalleled social data and a frightening body of knowledge about how to manipulate its users' emotions, and one study that flies in the face of research ethics is probably just the tip of the iceberg.
A lot of debate has since ensued about whether Facebook's TOS covers such use of data. Be very aware: it does not. Yes, Facebook can use your data, but it's generally accepted that individuals subjected to a research project that will affect them in any way must give informed consent. That did not occur here.
In this recently revealed study, Facebook was found to have intentionally manipulated users' emotions, showing them a stream of either neutral-to-positive or neutral-to-negative content to see whether the deliberate skew affected their state of mind.
However, there's reason to believe that's not even close to the full extent of it; this immediately sprang to mind:
What's messed up is that, according to the study, it sort of worked. What's even more messed up is that the study entirely bypassed accepted conventions for obtaining consent from subjects, as we indicated above.
Not just a few users, either: hundreds of thousands. Hundreds of thousands of Facebook users were deliberately shown a mixture of content meant to influence their emotional states, and while that's pretty scary when you consider the suicidal, those in troubled marriages, and so on, it's not even close to the amount of experimentation Facebook has probably done on your life, behaviors, and relationships.
Even scarier? I don't work at Facebook, but I'm pretty sure this has been happening on a larger scale, for similar reasons, in a non-study format for a long time.
And why do I suspect that? Well, it’s complicated.
Given my work for The Inquisitr, I am on Facebook a fair bit, as it's a fantastic tool for content and news. I've noticed, to start, that certain contacts or acquaintances with whom I do not get along well appear more in my feed via mutual friends. Anecdotally, I've always suspected this was an algorithm meant to exploit people's conflicts and friend possessiveness to spur interaction: it's pretty hard to walk away from an internet fight, isn't it?
It's also fairly minor and unprovable. So what if we get pulled into a stupid Facebook fight? It's not a big deal, most of the time. But if you look closely at your feed, you may notice similar potential manipulation.
For instance, I have a great aunt whom I love and to whom I don't often speak because of our locations. I don't see her comments or interactions, but when she “likes” or comments on a political story with which I disagree, on a page I have not “liked,” it is pulled into my feed. It's like Facebook wants me to snark at my great aunt. Not cool, Facebook.
My own long-term suspicions of Facebook's gaslighting campaign run a bit deeper, however. Another instance that seemed very unsettling to me occurred about a year and a half ago, when I traveled to a convention with a male friend. We hadn't interacted on Facebook about where we were going or what we were doing, but Facebook seems to take the liberty of detecting phones near one another: I've had a person I barely know immediately appear in my suggested friends list after we were in the same public place, which is creepy.
But this time, whenever we separated during the trip, Facebook appeared to be doing its level best to take the interactions my friend was having with other female users and push them to the top of my feed, which weirded me out. “Your friend likes this picture, your friend says Sarah’s dress is pretty.” It could just be proximity, but it seemed to be a specific sort of content, which was unnerving, like Facebook naturally assumed this sort of interaction would upset me.
However, my largest suspicion that the behavior now being criticized in the Facebook study backlash was so prevalent and common goes back to another odd thing the News Feed did to me more than a year ago. I am one of those people who keeps their personal and relationship details largely off Facebook — specifically not because I think it’s bad to do, but as a content creator, I feel my friends have limited interest in my love life. Who cares if I received a gift or went on a date? I’ll tell them I liked the restaurant, but digital PDA isn’t my bag, mainly because let’s face it — other people’s personal lives are as boring as all get out.
As such, I don't use Facebook's relationship functions. But at the time, last spring, I'd had a pretty big disagreement with my boyfriend. Believing I'd frustrated him into an inevitable parting, I deleted him from my friends list, not in retaliation, but to spare myself the anticipated upset.
Bear in mind, we’d never used the relationship function, we’d barely interacted on Facebook, and he lives in another country, so our phones wouldn’t often be detected near one another. Overtly and through given information, there was no way to suspect this would create a Facebook ripple.
Which is why, when I opened my laptop to this, I was really creeped out:
I’ve never seen Facebook do this before, and have not seen it since. Pinned to the very top of my feed, all night long, above all stories, was this ad. For a sleeping medication. Despite having avoided logging my relationship on Facebook, Facebook not only knew that I was having problems, it knew they were sleep disrupting problems.
It was here to help… by selling me a sleep aid. All night long. Part of me wondered, knowing my shopping habits, would it disappear if I went to CVS? Would it deduce from my location at a pharmacy that I’d responded to the ad, would it pull data from my CVS card or debit card, and would this immediately be considered successful marketing? The suggestion didn’t let up, or move, leading me to believe Facebook is definitely aware of heightened emotional states even if you don’t specifically share that information.
As it turns out, we reconciled immediately, but the Facebook study seemed to confirm that suspicion: that Facebook does use things it has learned about us, things we never told it, to influence what we buy and how we interact, even if that comes at the expense of our emotional well-being.
The plural of anecdote is not data, and these personal experiences in no way prove that Facebook is gaslighting its user base regularly. However, I can't help but wonder, should this be a genuine practice, how many deeply depressed people are affected, how many marriages didn't survive, how many employees were depicted in a distorted manner to co-workers or bosses.
A good analogy for what you give Facebook versus what it collects from you might be an interrogation room. If you've ever watched the excellent video embedded above on talking to police, it touches on how police interrogations manipulate citizens. You think what you write and sign is your statement, but Facebook has the jailhouse phone calls, every comment you made within earshot of police, and video of you in the room.
Facebook is that sneaky cop, that “turned off” tape recorder, that CCTV. Facebook allows you to believe that you control what you share, when in the aggregate, it’s taking its picture of you from your phone, your GPS, your spending habits, and your relationships. Facebook is the ultimate trick question.
The study itself is being questioned, as Facebook's manipulation of users' experiences was not done with the informed consent that is standard in research of that nature. But as with much of our privacy concern about Facebook, this outrage seems to miss the forest for the trees.
One Facebook study carried out with admittedly sucky ethics is definitely worrisome, but the real question is how deeply Facebook is using your life's data (purchases, comings and goings, health, political leanings, and about 100 other things) to alter how you see the world and your life. That doesn't mean you should quit Facebook (given the scope of the metadata, quitting is fairly pointless), but being aware that your perception of things through the lens of Facebook can't be trusted is a good first step.