Passport Office’s online system slammed for ‘digital racism’ as it doesn’t recognise some black people’s faces

A BRIT who tried to renew his passport online was stunned when the automated system mistook his lips for an open mouth.

Joshua Bada used a high-quality photo booth image to apply for his passport, noting down a digital code from the booth and entering it on gov.uk.

The facial detection system on the site informs people when it thinks the photo uploaded may not meet strict requirements, which include a plain expression and a closed mouth.

The 28-year-old from west London told PA that it was not the first time technology had issues with the size of his lips.

He explained: "When I saw it, I was a bit annoyed but it didn't surprise me because it's a problem that I have faced on Snapchat with the filters, where it hasn't quite recognised my mouth, obviously because of my complexion and just the way my features are.

"After I posted it online, friends started getting in contact with me, saying it's funny but it shouldn't be happening."

When asked by the system if he wanted to submit the photo anyway, Mr Bada was forced to explain why in a comment box, writing: "My mouth is closed, I just have big lips."

The incident is not an isolated case.

In April, a black woman shared a post on Twitter of similar struggles.

Cat Hallam, an educational technologist from Staffordshire, stressed that she does not believe it amounts to racism, but thinks it is a result of algorithmic bias.

She became frustrated after the system told her it looked like her eyes were closed and that it could not find the outline of her head.

Cat told PA: "The first time I tried uploading it and it didn't accept it.

"So perhaps the background wasn't right. I opened my eyes wider, I closed my mouth more, I pushed my hair back and did various things, changed clothes as well – I tried an alternative camera."

Cat said she begrudged paying extra to have an image taken in a photo booth when free smartphone photos work for others.

She said: "How many other individuals are probably either spending money unnecessarily or having to go through the process on numerous occasions of a system that really should be able to factor in a broad range of ethnicities?"

Cat proceeded with one of the images and received her passport without any further problems.

When she posted about the issue on Twitter at the time, the Passport Office tweeted back to her, saying it was sorry the photo upload service hadn't "worked as it should".

The Home Office responded to PA, saying: "We are determined to make the experience of uploading a digital photograph as simple as possible, and will continue working to improve this process for all of our customers.

"In the vast majority of cases where a photo does not pass our automated check, customers may override the outcome and submit the photo as part of their application.

"The photo checker is a customer aide that is designed to check a photograph meets the internationally agreed standards for passports."

Noel Sharkey, professor of artificial intelligence and robotics at the University of Sheffield, believes a lack of diversity in the workplace and an unrepresentative sample of black people are among the reasons why the error may have happened.

He explained: "We know that [automated systems] has problems with gender as well, it has a real problem with women too generally, and if you're a black woman you're screwed, it's really bad, it's not fit for purpose and I think it's time that people started recognising that.

"People have been struggling for a solution for this in all sorts of algorithmic bias, not just face recognition, but algorithmic bias in decisions for mortgages, loans, and everything else and it's still happening."

Dr Saurabh Johri, an AI and data science specialist at Babylon, said AI is only as good as the data it uses, and that it has been known for some time that these systems are prone to exacerbating biases in the data.

He added: "In this instance, it's possible that there is bias in the training data but equally it could simply be a wider limitation of the technology."

The Race Equality Foundation said it believes the system was not tested properly to see if it would actually work for black or ethnic minority people, calling it "technological or digital racism".

Samir Jeraj, the charity's policy and practice officer, commented: "Presumably there was a process behind developing this type of technology which did not address issues of race and ethnicity, and as a result it disadvantages black and minority ethnic people."


A similar issue affected Apple's iPhone X in the past, with Chinese users claiming the device couldn't tell them apart.

And a study last year revealed that robots can become "racist and sexist" all on their own.

This week, the Home Office warned Brits to renew their passports before Brexit if they are due to expire soon.
