Passport System's 'Racist' Reason For Rejecting This Student's Photo Is Terrible

'Subject's eyes are closed.'

Richard Lee, a New Zealand man of Asian descent, was surprised to find earlier this month that his passport photo had been rejected.

The facial recognition software had mistakenly registered his eyes as shut.

While Lee’s eyes were clearly open, the software run by the New Zealand government informed the student he would have to submit another photo.

“No hard feelings on my part, I’ve always had very small eyes and facial recognition technology is relatively new and unsophisticated,” Lee told Reuters.

“It was a robot, no hard feelings. I got my passport renewed in the end.”

The incident was first publicised on the Facebook page of Australian DJs Mashd N Kutcher.

When Lee contacted the New Zealand passport office, he was told he would have to get more photos taken because “there were shadows in the eyes or uneven lighting”, he told Mashable.

An Internal Affairs spokesperson told Reuters that up to 20% of passport photos submitted online are rejected for various reasons: “The most common error is a subject’s eyes being closed and that was the generic error message sent in this case.”

But the error prompted claims that the system is racist. Mashable’s Johnny Lieu wrote: “Racism is bad enough with humans, but when computers start laying into you? That just sucks.”

Facial recognition systems typically work by analysing the characteristics of someone’s face, measuring the distances between features such as the eyes, nose, mouth and jaw.
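As a rough illustration of that idea (not the actual software New Zealand uses), the snippet below computes a simple “faceprint” from hypothetical, hard-coded landmark coordinates; real systems detect these landmarks automatically from the image.

```python
import math

# Hypothetical 2D landmark positions (in pixels) for one face.
# In a real system these would come from a landmark detector.
landmarks = {
    "left_eye": (120, 150),
    "right_eye": (200, 150),
    "nose": (160, 200),
    "mouth": (160, 250),
    "jaw": (160, 310),
}

def distance(a, b):
    """Euclidean distance between two named landmarks."""
    return math.dist(landmarks[a], landmarks[b])

# A toy "faceprint": distances between key feature pairs,
# which can then be compared against a stored template.
pairs = [("left_eye", "right_eye"), ("left_eye", "nose"),
         ("nose", "mouth"), ("mouth", "jaw")]
faceprint = {f"{a}-{b}": round(distance(a, b), 1) for a, b in pairs}
print(faceprint)
```

Errors like the one Lee hit arise a step earlier, when the system misjudges the landmarks themselves, for example deciding the eyelid contours are too close together and flagging the eyes as closed.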

This isn’t the first time an algorithm has been accused of being racist. Earlier this year ProPublica revealed that American police were using risk-assessment software that was biased against black people when predicting future criminal behaviour.
