Feb 4, 2019
Amazon: Facial recognition bias claims are 'misleading'
Amazon has defended its facial-recognition tool, Rekognition, against claims of racial and gender bias, following a study published by the Massachusetts Institute of Technology.

The researchers compared tools from five companies, including Microsoft and IBM. While none was 100% accurate, the study found that Amazon's Rekognition performed the worst at recognising women with darker skin, with an error rate of 31% when identifying the gender of images of dark-skinned women.

Clients of Rekognition include a company that provides tools for US law enforcement, a genealogy service and a Japanese newspaper, according to the Amazon Web Services website.

"The main message is to check all systems that analyse human faces for any kind of bias. If you sell one system that has been shown to have bias on human faces, it is doubtful your other face-based products are also completely bias free," said Joy Buolamwini, the MIT researcher who led the study.