Facebook Manipulated 689,003 Users' Emotions For 'Creepy' Secret Experiment

Facebook Gets Really, Creepy
File photo dated 19/10/12 of a general view of the Facebook home page on a laptop screen as the two websites Britain cannot live without are the BBC and social networking site Facebook, new research has found.
Dave Thompson/PA Wire

Facebook did something really creepy, and you may never even know if it affected you.

The site has been playing mind games on users and, handily for the social media giant, there is no need to get experiment participants to sign any pesky consent forms as they’ve already agreed to the site’s data use policy.

So they are free to manipulate their one billion users however they choose.

In a move that probably has George Orwell spinning in his grave, Facebook secretly manipulated the news feeds of about 700,000 users, testing whether it could make them angry or happy.

The Huffington Post UK spoke to a researcher specialising in Internet infrastructure and public policy about the creepy new Facebook revelations.

Christian Sandvig, an Associate Professor of Communication Studies and Information at the University of Michigan, told HuffPost UK that what Facebook is doing is "not an ethical research design."

"There is a big difference between our expectations for academic social science and our expectations for Facebook. And that difference is reasonable," he said.

"We are right to expect that psychologists are not secretly experimenting on us via our Facebook feed, but I certainly expect Facebook to experiment on its users for its own gain."

He added Facebook's intentions are "confusing" as it is not clear whether this should be judged as "academic social science or as a self-interested private company that acts on its own behalf. It muddles the debate," he said.

The study on "emotional contagion" has unsurprisingly prompted fury and horror on social media sites (ironically, of course, on Facebook).

But it was all perfectly legal according to Facebook's rules, despite questions being asked about whether it was at all ethical to make hundreds of thousands of unknowing users happier or more depressed than usual.

“*Probably* nobody was driven to suicide,” tweeted Mr Sandvig, adding a “#jokingnotjoking” hashtag.

Facebook’s data use policy says users' information will be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement,” making all users potential experiment subjects.

For one week in 2012, Facebook played God, tampering with the algorithm that places posts into users' news feeds to study how this affected their mood, filtering out either positive posts or negative posts.

So, if there was a week in January 2012 where you were only seeing photos of either dead dogs or frolicking puppies, you may have been part of the study.

Mr Sandvig said that although sites like Facebook "exist in order to manipulate their users' attention," the site's data use policy does not give it free rein to do whatever it wants.

"If Facebook slows down some of my traffic to improve network performance, that's a different scenario than if they do so to see if they can make me sad. Facebook's defenders emphasise that the company manipulates us all the time - and I agree - but that's really beside the point."

"Clearly the future will be creepier unless something is done," he warned.

Even Susan Fiske, the Princeton University psychology professor who edited the study, felt uneasy about it.

She told The Atlantic: "I was concerned until I queried the authors and they said their local institutional review board had approved it -- and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time."

"All. The. Time." How comforting.

The researchers, led by data scientist Adam Kramer, cheerfully noted that, based on the experiment, they had found emotions to be contagious.

“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,” according to the paper published by the Facebook research team in the Proceedings of the National Academy of Sciences (PNAS).

“These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

In a statement sent to Forbes, Facebook focused on privacy and data use rather than the ethics of emotional manipulation.

“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account,” says a Facebook spokesperson.

“We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow."

But users have responded with outrage.

Facebook employs a group of data scientists to study user activity and publish their findings, often pegged to events like Valentine's Day and national elections. But until now, the research has mostly fallen into the category of "observational studies" -- that is, research that involves someone poring over existing data and trying to draw conclusions from it.

The News Feed manipulation, though, is a different beast. It's an experiment, in which scientists create the data by tweaking one variable to see if it affects another. That's what's disconcerting: The "things" being manipulated in this case are people on Facebook -- i.e., basically everyone with an Internet connection.
