Facebook Mood Manipulation Is Bad for Us All

Facebook is coming under fire following reports that in 2012 it used user behaviour data to try to influence the mood of people while they were using the social network.

The research found that by altering what was served in people's newsfeeds, Facebook's team could change the sentiment of the posts those people went on to publish.

Fill the feed with negativity and you post something sad; fill it with positivity and everything is light and rainbows.

Manipulation of mood via media is nothing new; newspapers have been attempting to affect the way people behave for years. The difference is that newspapers operated through mass communication, whereas Facebook delivers content one to one.

The worrying aspect of the latest revelations is not that Facebook tried to do it. You'd almost expect them to conduct this type of research with the wealth of data at their disposal. No, the concerning element is that they undertook the study without telling people they were participating.

We've always known that Facebook uses our data to serve us adverts or promote different types of sponsored content in our newsfeeds.

That they would censor content or try to influence ordinary users into acting or feeling one way or another is worrying.

What they did was legal; they have the right to use your data as they wish - it says so in the terms and conditions for using the service, after all.

What they did, however, was not ethical. By testing out a theory on unsuspecting users, they were deliberately interfering with the choices those users made, whether in terms of what status updates they published or what content they interacted with.

That doesn't sit right with me.

I believe the Internet can change the way we behave for good, and that the likes of Facebook should operate with complete transparency when running experimental research to prove one idea or another.

To do otherwise is disingenuous and an abuse of the trust that users place in them. It totally disrespects the user base - if this had happened on a forum or in a chat room, the community would be up in arms.

That only the tech media have shown any kind of opinion towards it suggests that people are either not surprised, don't mind, or are so apathetic towards Facebook that they feel no need to care about how they may or may not be being manipulated.

Facebook was one of the powerhouses behind the evolution of social media, and we should be thankful for that. But it doesn't mean that they have the right to mistreat their users.

It would not surprise me if more stories of this type were to break in the coming months; I hope Facebook are paying attention and will change their operating practices accordingly.
