Facebook Voting Manipulation? It's Not Just Emotions, It's Democracy Too

Facebook Can Also Influence Whether You're Likely To Vote
Facebook CEO Mark Zuckerberg gestures while delivering the keynote address at the f8 Facebook Developer Conference Wednesday, April 30, 2014, in San Francisco. (AP Photo/Ben Margot)

Facebook can alter your mood based on what news stories it shows you.

Apparently it can also manipulate whether you're likely to vote.

Outrage mounted on Monday over the revelation that researchers working for the social network deliberately altered the news feeds of some 700,000 users to test how emotions spread through the site.

Reactions have ranged from mildly creeped out to full-on fury, with thousands of users pledging to close their accounts.

The researcher involved in the study has apologised, saying the research benefits "may not have justified" the anxiety it caused:

"I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

But a new report in the New Statesman has linked the study to another, earlier research project with arguably even more sinister implications.

The report points out that in 2010 Facebook made several small adjustments to banners reminding US users to vote.

The study itself is not new, and it was not entirely ignored at the time. Two groups of 600,000 users were studied and compared, and Facebook found that up to 340,000 extra votes were cast as a result of its messages, with tens of thousands of those attributable to how it manipulated the banners.

The report says Facebook checked this by comparing private status updates with public voting rolls. The New Statesman describes the exercise as "a massive secret political experiment on the creepy-totalitarian side of interesting".

That said, the experiment also found that a person's close friends were a far greater influence on whether they voted than advertising messages.

However, as Brett Dixon, director of the digital marketing agency DPOM, said in a statement, the experiment is another worrying sign for those concerned that Facebook has amassed a disturbing amount of power over its audience.

"Despite Facebook's insistence that this was merely an academic experiment, it sails perilously close to the illegal world of subliminal advertising. There's a reason this insidious form of manipulation is banned - it is an abuse of people's freedom to choose.

"Whether it appeals to the head or the heart, all advertising seeks to influence people's mood. But there's a big difference between influence and control."
