Addressing two weak points in currently available face recognition technologies, Microsoft has updated its facial recognition tools so that they can identify people with darker skin tones better than before.
With the new improvements, Microsoft was able to reduce the tools' error rates for men and women with darker skin by up to 20 times.
"For all women, the error rates were reduced by nine times. Overall, with these improvements, they were able to significantly reduce accuracy differences across the demographics," Microsoft said in a blog post written by John Roach late on Tuesday.
Currently, facial recognition tools tend to perform best on men with lighter skin and worst on women with darker skin.
"That improvement addresses recent concerns that commercially available facial recognition technologies more accurately recognised gender of people with lighter skin tones than darker skin tones, and that they performed best on males with lighter skin and worst on females with darker skin," Roach wrote.
The higher error rates on females with darker skin highlight an industrywide challenge: Artificial Intelligence (AI) technologies are only as good as the data used to train them.
If a facial recognition system is to perform well across all people, the training dataset needs to represent a diversity of skin tones as well as factors such as hairstyle, jewellery and eyewear.
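This kind of gap is typically surfaced through disaggregated evaluation, where error rates are computed separately for each demographic subgroup rather than averaged over the whole test set. The following is a minimal illustrative sketch of that idea (not Microsoft's internal tooling), assuming a test set in which each record carries a subgroup tag, a true label and a predicted label:

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the misclassification rate separately for each subgroup.

    `records` is an iterable of (group, true_label, predicted_label) tuples,
    e.g. ("darker-skin female", "female", "male").
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, true_label, predicted in records:
        totals[group] += 1
        if predicted != true_label:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical example: a classifier that looks accurate overall can still
# show large gaps between subgroups, which per-group rates make visible.
sample = [
    ("lighter-skin male", "male", "male"),
    ("lighter-skin male", "male", "male"),
    ("darker-skin female", "female", "male"),
    ("darker-skin female", "female", "female"),
]
print(error_rates_by_group(sample))
# {'lighter-skin male': 0.0, 'darker-skin female': 0.5}
```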
The team responsible for facial recognition technology at Microsoft, which is available to customers as the Face API via Azure Cognitive Services, worked with experts on bias and fairness across the company to improve a system called the gender classifier, focusing specifically on getting better results for all skin tones.
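For reference, the Face API mentioned above is exposed as a REST detection endpoint. A minimal sketch of a request, assuming a placeholder region, subscription key and image URL (and noting that the set of supported attributes may have changed since this article was written), might look like this:

```python
import requests

# Placeholders: substitute your own region, key and image URL.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"
SUBSCRIPTION_KEY = "<your-face-api-key>"

response = requests.post(
    ENDPOINT,
    params={"returnFaceAttributes": "gender,age"},
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/json",
    },
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

# The service returns one entry per detected face, with the requested
# attributes (here gender and age) under "faceAttributes".
for face in response.json():
    print(face["faceAttributes"])
```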
"We had conversations about different ways to detect bias and operationalise fairness. We talked about data collection efforts to diversify the training data. We talked about different strategies to internally test our systems before we deploy them," said Hanna Wallach, Senior Researcher in Microsoft's New York research lab.
Wallach and her colleagues provided "a more nuanced understanding of bias," said Cornelia Carapcea, a Principal Programme Manager on the Cognitive Services team, and helped her team create a more robust dataset "that held us accountable across skin tones."
--IANS