Facial Recognition Is Accurate, if You’re a White Guy
In her newly published paper, which will be presented at a conference this month, Ms. Buolamwini studied the performance of three leading face recognition systems — by Microsoft, IBM
and Megvii of China — by classifying how well they could guess the gender of people with different skin tones.
Ms. Gebru is a scientist at Microsoft Research, working on its Fairness, Accountability, Transparency and Ethics in A.I. group.
Megvii, whose Face++ software is widely used for identification in online payment
and ride-sharing services in China, did not reply to several requests for comment, Ms. Buolamwini said.
But the darker the skin, the more errors arise — up to nearly 35 percent for images of darker-skinned women, according to a new study
that breaks fresh ground by measuring how the technology works on people of different races and genders.
To test the commercial systems, Ms. Buolamwini built a data set of 1,270 faces, using
faces of lawmakers from countries with a high percentage of women in office.
In 2015, for example, Google had to apologize after its image-recognition photo app initially labeled African Americans as “gorillas.”
Sorelle Friedler, a computer scientist at Haverford College and a reviewing editor on Ms. Buolamwini’s research paper, said experts had long suspected
that facial recognition software performed differently on different populations.