Is Distrust of Facebook Contagious?

Over the course of a week in January 2012, Facebook deliberately manipulated the news feeds of almost 700,000 users in order to compare the effects of positive and negative news. In June 2014, Instagram disabled the account of Courtney Adamo after she posted an innocent photograph of her eighteen-month-old daughter showing her bellybutton, saying it violated its rules. And research by psychologist Robert Epstein, released in May, claimed that Google could, simply by adjusting its search algorithms, influence the outcome of an election by an average of over 12% - easily enough to swing the vote in marginal contests.

What do these three things have in common? All three are within the power and scope of Facebook and Google (Facebook owns Instagram). All three caused public consternation and, in the case of the first two, a popular backlash (it should be emphasised that the third was conceptual rather than actual - the researchers were testing a theory).

When Facebook, Google and other new media behemoths do something that causes public angst or anger, they generally apologise and often shift their position.

But they are under no obligation to change their behaviour. These behemoths are public companies and, as such, are run as autocracies, not democracies. They may perform a positive public role, but only insofar as it suits their aims and continues to support their business model.

As Rebecca MacKinnon wrote of the two big social networks in Consent of the Networked: 'both Google Plus and Facebook share a Hobbesian approach to governance in which people agree to relinquish a certain amount of freedom to a benevolent sovereign who in turn provides security and other services'.

We know this but often appear to be in denial about it. Yet as we come to rely on these behemoths more and more, we need to remind ourselves that a benevolent sovereign is still a sovereign, and may not always act benevolently.

We need to be especially conscious of our reliance on these digital sovereigns to perform a civic function. According to the 2014 Reuters Institute Digital News Report, Facebook is 'by far the most important network for news everywhere'. Google, Bing and Yahoo together account for between a third and a half of people's pathways to news. News is broken on Twitter rather than in the mainstream media.

We may rely on these digital sovereigns, but we have little control over how they perform this civic function, how they choose to evolve it, or when they stop providing it. Google has just announced that it will shut down Orkut, one of its social networking platforms, on September 30th.

In the same way, these organisations can choose what content their users are allowed to publish. Breastfeeding photographs fell foul of Facebook's rules and were removed (the ban was quietly dropped last month). Videos of beheadings were initially allowed on Facebook, then banned, then allowed again. They are now permitted as long as they are posted in 'the right context' (it is hard to imagine what the 'right context' is for a beheading).

Local campaigns that might previously have been led by a local newspaper are now often organised through Facebook. Whether they aim to save a library, stop a bypass, or find a missing person, these are, by most people's definition, civic campaigns. Facebook is not obliged to enable people to run them, and is within its rights, under its terms of use, to censor them - algorithmically or manually.

We should not fool ourselves into believing that algorithms are somehow neutral. Algorithms are like recipes. If you change an ingredient in the recipe, you change the dish. Tweak an algorithm and suddenly, invisibly, the results you receive will change (for good examples see The Filter Bubble).
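To make the recipe analogy concrete, here is a minimal, entirely hypothetical sketch of a feed-ranking function; the weights, fields and example posts are invented for illustration and have nothing to do with Facebook's real algorithm. Changing one 'ingredient' - a single weight - quietly changes which story a reader sees first.

```python
# Hypothetical illustration only: a toy news-feed ranker. None of these
# weights, fields or posts reflect any real platform's algorithm.

def score(post, weights):
    """Combine a few engagement signals into a single ranking score."""
    return (weights["likes"] * post["likes"]
            + weights["shares"] * post["shares"]
            + weights["recency"] * post["recency"])

posts = [
    {"title": "Library closure protest", "likes": 120, "shares": 40, "recency": 0.2},
    {"title": "Celebrity gossip", "likes": 300, "shares": 10, "recency": 0.9},
]

original = {"likes": 1.0, "shares": 1.0, "recency": 100.0}
tweaked = {"likes": 1.0, "shares": 10.0, "recency": 100.0}  # one 'ingredient' changed

for weights in (original, tweaked):
    top = max(posts, key=lambda p: score(p, weights))
    print(top["title"])  # the top story flips, invisibly to the reader
```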

Algorithms can be as influential in defining an editorial agenda as a newspaper editor. In its experiment, Facebook chose to adjust its algorithm to censor specific news updates on the basis of key words. How is this different from a newspaper editor deciding not to publish a news story because of the effect it may have on the reader? Or on the advertiser? The chief difference is that people know news stories are chosen subjectively, while many believe that algorithmic results are objective.
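As a rough sketch of how keyword-based filtering can work in principle - the word list and logic below are invented, not taken from the published study - a few lines are enough to withhold posts without the reader ever knowing:

```python
# Hypothetical sketch of keyword-based feed filtering; the flagged words are
# invented and are not drawn from Facebook's experiment or any real system.

NEGATIVE_WORDS = {"sad", "angry", "awful"}

def suppress_negative(posts):
    """Return the feed with any post containing a flagged word removed."""
    return [p for p in posts
            if not NEGATIVE_WORDS & set(p["text"].lower().split())]

feed = [
    {"text": "Feeling sad about the library closing"},
    {"text": "Great day at the park"},
]

print([p["text"] for p in suppress_negative(feed)])
# Only the upbeat post survives; nothing signals that anything was removed.
```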

That belief in objectivity may account for why so many people appeared to be shocked by Facebook's psychological research experiment, in which its users were the guinea pigs. But this was not the first such experiment, nor will it be the last.

But whereas one can debate with an editor, or question editorial decisions, it is very hard to see how one can do the same with an algorithm - especially since these algorithms are closely guarded secrets, the equivalent of the Coke formula.

Revelations about Facebook's experiments are waking us up to the fact that these services are not neutral, nor are they simply there for the public good. Yet they are now an integral part of our lives - not just our social lives but our civic lives too. Our influence, however, over what they do, what services they provide, or how they use our personal information is tiny. If you think firing a peashooter at an elephant is ineffectual, try firing emails at Facebook HQ in Menlo Park, California.
