Innocent Until Proven Guilty - Not When It Comes To Your Face


This weekend thousands of people will take to the streets of Notting Hill in West London to celebrate Carnival.

Amidst the colour, music, food and celebration, the Metropolitan Police will be deploying hundreds of officers to ensure everyone has a safe, friendly and crime-free time.

But it won't just be Bobbies on the Beat monitoring people's behaviour. The police will be using cameras to scan the faces of people in the crowds in order to weed out known troublemakers.

These cameras aren't just regular CCTV; they are armed with automatic facial recognition software.

Facial recognition sounds fairly benign. Right now you are probably visualising a policeman watching hundreds of television screens of CCTV images, looking out for known criminals lingering in the crowd. Well, that might have been the case in the past, but with this technology the process is automated by an intelligent computer programme.

By creating a biometric map of a person's face, this information can be fed into the facial recognition system, enabling the cameras to scan multiple faces at a time and alert an officer only when a match has been made. The system, we are told, does not save the faces that do not match, so those of us who are innocent passers-by have nothing to worry about.
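To make the mechanics concrete, here is a minimal sketch in Python of how such watch-list matching works in principle. It is an illustration under assumptions, not the Met's actual system: the embed() function is a hypothetical stand-in for a real face-embedding model, and the 0.8 similarity threshold is invented for the example.

```python
# A minimal, illustrative sketch of live watch-list matching.
# embed() is a hypothetical stand-in for a trained face-embedding
# model; the 0.8 threshold is an assumed value for illustration.
import numpy as np

def embed(face_image: np.ndarray) -> np.ndarray:
    """Reduce a face image to a fixed-length biometric template.
    Real systems use a trained neural network; here we simply flatten
    and normalise the pixels so the maths below is runnable."""
    vec = face_image.flatten().astype(float)
    return vec / np.linalg.norm(vec)

def match_against_watchlist(face_image, watchlist, threshold=0.8):
    """Return the best watch-list identity above the threshold, else None.
    Faces that do not match are simply discarded, not stored."""
    probe = embed(face_image)
    best_name, best_score = None, threshold
    for name, template in watchlist.items():
        # Cosine similarity: both templates are unit vectors.
        score = float(np.dot(probe, template))
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Usage: scan every face detected in a frame; alert only on a match.
rng = np.random.default_rng(0)
suspect_face = rng.random((32, 32))
watchlist = {"known_suspect": embed(suspect_face)}
frame_faces = [rng.random((32, 32)) for _ in range(4)] + [suspect_face]
for face in frame_faces:
    hit = match_against_watchlist(face, watchlist)
    if hit:
        print(f"ALERT: possible match for {hit}")
```

The design point this sketch makes is the one the police rely on: only faces scoring above the threshold trigger an alert, and everything else is thrown away in memory. Whether real deployments behave that way is precisely what the rest of this article questions.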

What a good idea, you may think. In an era of heightened security, who doesn't want a system which automatically picks bad people out of crowds, preventing them from doing harm? Right?

Except the promises that only known criminals or people on terror watch lists are being sought out aren't quite what they seem.

The Home Office are currently investing millions of pounds in building a two-pronged facial recognition approach to policing.

One is the system being used at Notting Hill: a live scan of faces run against a watch list of known criminals.

The other is a system used to identify people from still images, taken from CCTV for example, by comparing them against custody images held by the force. Both systems are currently being used outside of the law: there has been no public debate about the use of facial biometric recognition software, no debate in Parliament, and no proposed legislation.

This is particularly worrying when you consider that back in 2012 the High Court ruled that the retention of custody images of innocent people is unlawful.

The Judges insisted the Government quickly address the problem, but it took the Home Office almost five years to publish their review, and when it finally arrived at the start of this year, the response changed nothing.

So the retention of custody images continues, despite the fact that being taken into custody does not make a person guilty of a crime. Not only are these images retained, but increasingly police forces are turning them into facial biometrics, mostly without the person's knowledge.

The number of images and biometrics being held is far from small. Sky News revealed earlier this week that 20 million images now sit on the Police National Database, over 16 million of which have been converted into biometrics.

According to the former Biometrics Commissioner Alastair MacGregor QC, in his 2013/2014 annual review, "hundreds of thousands" of these images are of innocent people.

This is profoundly worrying. A facial biometric is as unique to a person as their fingerprints or DNA. Picking a person out of a crowd using fingerprints is pretty tricky, but doing it using a facial biometric is far easier.

Because biometrics are so sensitive in terms of a person's human rights and personal privacy, the law was changed back in 2012 so that an innocent person's DNA and fingerprints must be automatically deleted by the police. That law does not currently apply to custody images, and the Home Office have said they don't want that to change.

The Home Office believe that a person's facial image is "generally less intrusive" than fingerprints and DNA because "people's faces are on public display all of the time".

Furthermore, they say police IT systems are not "generally designed to automatically weed, review or destroy images, or to differentiate between convicted and un-convicted individuals", and that it would be too costly for police forces to upgrade their systems.

Too costly? Yet the Home Office are currently offering £5 million in contracts to companies who can help them build facial biometric recognition systems. Why that money isn't being spent on protecting the innocent is far from clear.

That question is one of many Big Brother Watch would like to see answered by the Home Secretary.

We are so concerned by the lack of protection for innocent people's custody images that we have launched the FaceOff campaign calling for:

An end to the retention of innocent people's custody images.

An end to the creation of innocent people's facial biometrics.

The Government's yet-to-be-published Biometric Strategy to legislate for custody images and facial biometrics to be given the same protections as fingerprints and DNA.

If you want to offer your support visit www.bigbrotherwatch.org.uk/all-campaigns/face-off/ and sign the petition, write to your MP and share the campaign on social media.

Big Brother Watch are committed to keeping people safe. Those who are innocent should not be subject to surveillance techniques; those who have done nothing wrong should absolutely have nothing to fear.
