"they just gave me the same copied answer"
BAME Uber Eats couriers have alleged that the company’s “racist” facial recognition software is costing them their livelihoods because it fails to recognise their faces.
Fourteen Uber Eats couriers claim they were threatened with termination, had accounts frozen or were permanently dismissed after selfies they took failed the company’s “Real-Time ID Check”.
Another was fired after the selfie function refused to work.
According to trade unions, the issue has affected many more Uber Eats couriers across the UK.
Despite having completed thousands of deliveries with 100% satisfaction ratings, some workers were suddenly removed from the platform.
They say it was an automated process without any right to appeal.
Many pleaded for their jobs back but they received the same response, saying their dismissal was “permanent and final” and that Uber hoped they would “understand the reason for the decision”.
One Sheffield-based courier regularly worked 16 hours a day, six or seven days a week.
However, in October 2020, he was blocked from accessing his account after a selfie check.
He told WIRED: “I was trying to message them all day but they just gave me the same copied answer every time.
“I only earned money from Uber Eats. I used that money to pay for my bills, my rent, my car insurance, my food, my phone, everything.”
He was forced to borrow £1,000 to pay his bills that month.
Uber reversed its decision after he contacted the Independent Workers’ Union of Great Britain (IWGB), which sent the company a letter threatening to go public.
Uber says couriers can choose between AI and human verification when they submit their real-time selfies.
However, couriers allege that if they choose the latter, no one overrides the mistakes the software can make.
Another worker logged into the app after shaving his beard in preparation for a job interview.
After submitting a selfie, the app said the photo was not of him and asked him to provide information about the person who was replacing him within 24 hours or risk termination.
He said: “I tried going through the official routes but all it showed was options to provide information on my substitute.
“There was no way to say they had made a mistake.”
Although a call from a journalist to Uber’s press team stopped his account from being permanently closed, others were not as lucky.
Other couriers were accused of illegally subcontracting their shifts to another person.
While their status as independent contractors technically allows them to subcontract their work, there have been concerns about work going to people who have not passed background checks or do not have the legal right to work.
As a result, in April 2020 Uber added a facial recognition step that runs when couriers open the app.
Couriers and drivers are required to take a selfie to prove that they are the ones logged on.
The check uses Microsoft facial recognition software, which has failed to identify darker-skinned faces.
In 2018, a similar version of the software used by Uber was found to have a failure rate of 20.8% for darker-skinned female faces.
For darker-skinned male faces the figure was 6%, while for white men it was 0%.
A study involving 189 different facial recognition systems found that all of them performed worse when identifying BAME faces, sometimes by a factor of 10 or even 100.
Professor Peter Fussey, of the University of Essex, said:
“There is no facial recognition software that performs equally across different ethnicities.
“That technology doesn’t exist.
“If you’re bringing in that technology into an already unequal environment it just exacerbates those conditions and amplifies racial inequality.”
An Uber spokesperson stated that the company needed the software to protect against “potential fraud”.
Any decision to remove workers from the platform “always involves a manual human review” and “anyone who is removed can contact us to appeal the decision”.
Alex Marshall, president of the IWGB, said:
“It’s what these workers fear most about heading to work.
“And we’re definitely seeing more members of the BAME community being affected. It’s definitely indirect racism.”
He explained that drivers are resorting to food banks and shelters because they cannot afford to pay their rent and bills after losing the right to work on the app.
The IWGB says a lack of legal framework protecting gig economy couriers from dismissal makes a court case difficult, but it is battling to reverse dismissals on a case-by-case basis and pushing for an Early Day Motion to be filed in parliament.
The Uber Eats couriers stated that neither they nor anyone they know had ever subcontracted shifts because it would not make sense to do so.
Mr Marshall added: “Most of our members who’ve been fired for substitution are working 60 hours a week and still can’t make ends meet.
“When are they meant to be able to lend their account to someone else?”
One reason illicit subcontracting has become such an issue is regulation.
As part of its deal to reinstate Uber’s private-hire taxi licence to operate in London, Transport for London (TfL) pressured the company to crack down on illegal subcontracting to protect people from unlicensed drivers.