Oct 8, 2021

Legal action over alleged Uber facial verification bias

Two unions are taking legal action against Uber, alleging that the software used to verify drivers' identities is racially biased and that the firm has unfairly dismissed drivers. The system is designed to stop drivers from sharing accounts, in part because drivers must pass criminal record checks to obtain a private hire licence. Uber told the BBC it protected the "safety and security of everyone who uses the Uber app by helping ensure the correct driver is behind the wheel". Uber says drivers can choose whether their selfie is checked by Microsoft software or by humans. In June the App Drivers and Couriers Union (ADCU) launched legal action over what it alleges was the unfair dismissal of a driver and an Uber Eats courier after the company's facial recognition system failed to identify them.
