One of the Facebook researchers who conducted a secret study on users by manipulating their news feeds has apologised - but defended the experiment.
The social network was branded "creepy" for allowing researchers to modify the news stories shown on 700,000 users' accounts.
The government-sponsored study was designed to investigate whether emotions are "contagious" online, and whether seeing more positive or negative stories encouraged users to spread that emotion through the network.
But Facebook was criticised for not asking users for their permission to take part in advance.
“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,” according to the paper published by the Facebook research team in the journal PNAS.
But after heavy criticism, data scientist Adam Kramer, who led the study, took to (where else?) Facebook to respond.
"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," he said.
"We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper."
- Only a very small percentage of stories were affected, for 0.04% of users
- Nobody's posts were "hidden"; they simply did not appear on some loads of the News Feed
- "Hidden" stories resurfaced on subsequent log-ins
"Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.
"While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper."
In an official statement Facebook said:
"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person's Facebook account.
"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible.
"A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow.
"There is no unnecessary collection of people's data in connection with these research initiatives and all data is stored securely."