The Slippery Slope of Internet Censorship

This week, the Advisory Council to Google on the Right to be Forgotten will hold the first of its seven European public consultations to gather views on the Court of Justice of the European Union's decision in May that an individual has the right to request that any EU-based internet search engine remove links to information about them.

The council will examine the tension between the basic principles of privacy and the public's right to know - the main concern with this controversial ruling.

So what does this 'right to be forgotten' ruling mean in practice? Someone whose personal information appears in search results may now fill in a simple online form to request that Google, which was the subject of the original ruling, or any other search engine remove specific links from its results. Legality is not the primary issue here - the information may be perfectly lawful and accurate. Nor does the person requesting removal have to show any harm or prejudice resulting from the information being available through a search. If Google decides that the information is indeed "inadequate, irrelevant or no longer relevant", the link will be removed. The information will still exist - you just won't be able to find it via that search engine.

The implications of the Court's judgment for free expression are profoundly worrying.

Search engines like Google are now the main gateway to information for almost all of us, so deleting a search result is effectively a form of censorship. When a handful of search engines handle almost all searches, the service these giants provide is a form of public resource, although this is not to argue that government regulation is automatically the answer. Nevertheless, it is surely inappropriate to put one of the world's most powerful private companies in the position of determining, on its own, what is or isn't legitimate expression and what information people can have access to. Moreover, the public position of most such 'intermediaries' is that they do not want to play this role either.

Already, it has been reported that a number of the requests received by Google relate to news articles about politicians, other public figures and people convicted of serious offences. According to one UK newspaper, almost a third of the requests related to accusations of fraud, around 12 per cent to child pornography arrests, and 20 per cent to violent or serious crimes.

Google has admitted to being flooded with requests since the ruling three months ago. The same UK newspaper wrote that the search engine had reportedly received 12,000 requests - an average of seven requests per second. Given the obvious practical difficulty of individually assessing each request, when the total could run into millions over a short period, we are concerned that it will be quicker and easier simply to remove material without proper examination, and without any obvious avenue of redress for those whose content has been removed.

So could this lead to vast swathes of perfectly legitimate and accurate information, which the public has a right to know, simply disappearing from the internet because it is inconvenient or embarrassing?

At present, this is an EU-only ruling. But given the global nature of the internet, what about the obligations of other search engines, or of those not established in the EU? Could this judgment be applied to the non-European search platforms of Google and others? What if data protection authorities want to take on Google for refusing to remove personal data from google.com, which, they say, defeats the purpose of the judgment? At the moment, it is possible to get around the ruling simply by doing your searches via a search engine that is not based in Europe.

We welcome the opportunity provided by these consultations to discuss the importance of balancing the rights to freedom of expression and privacy, because this 'right to be forgotten' should not become a 'super right' that trumps all others. At a minimum, we would call for a right of notification for content providers, so that they at least have the chance to challenge unfair removals. We also believe that individuals who want links about them removed should go to court, and that the courts should carry out the balancing of privacy against freedom of expression that they are best placed to do.

Search engines must not be the censors of the internet.