In February 2023, Porcha Woodruff, a woman from Detroit, was falsely arrested for robbery and carjacking after facial recognition software identified her as the perpetrator.
Artificial intelligence is known to be unreliable at recognizing faces, especially those of women with darker skin, such as Ms. Woodruff. In an audit of facial analysis systems from Face++, Microsoft, IBM, Amazon, and Kairos, Inioluwa Deborah Raji and Joy Buolamwini found average error rates of 15.29% for darker-skinned women, 0.96% for darker-skinned men, 3.44% for lighter-skinned women, and 0.15% for lighter-skinned men. The reason for this disparity is that the training datasets consist predominantly of images of white men.
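Benchmark composition matters as much as the error rates themselves. Here is a minimal Python sketch using the per-group error rates from the audit above; the two benchmark compositions are hypothetical assumptions for illustration, not figures from the study. It shows how an aggregate accuracy number can look reassuring while hiding a large gap for darker-skinned women.

```python
# Per-group average error rates (percent) as reported by Raji and Buolamwini.
error_rates = {
    "darker-skinned women": 15.29,
    "darker-skinned men": 0.96,
    "lighter-skinned women": 3.44,
    "lighter-skinned men": 0.15,
}

def overall_error(shares):
    """Weighted average error for a benchmark with the given group shares."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9
    return sum(error_rates[g] * share for g, share in shares.items())

# A benchmark dominated by lighter-skinned men (hypothetical composition):
skewed = {
    "darker-skinned women": 0.05,
    "darker-skinned men": 0.10,
    "lighter-skinned women": 0.25,
    "lighter-skinned men": 0.60,
}

# A balanced benchmark with equal representation of all four groups:
balanced = {g: 0.25 for g in error_rates}

print(f"skewed benchmark:   {overall_error(skewed):.2f}% overall error")    # 1.81%
print(f"balanced benchmark: {overall_error(balanced):.2f}% overall error")  # 4.96%
```

On the skewed benchmark the system appears to err less than 2% of the time, even though it fails darker-skinned women at over 15%; only the balanced benchmark surfaces the problem. The same arithmetic applies in reverse to training data: a dataset that skews toward white men rewards a model for accuracy on white men.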
Additionally, Project Green Light, a Detroit surveillance program, uses facial recognition to screen camera footage. The monitoring stations are placed mostly in Black neighborhoods, as this map shows.
While accuracy and camera placement are fairly obvious factors, another, less intuitive one exists: camera settings. Default camera settings are not usually optimized to capture photos of people with darker skin tones. This produces low-quality photos for the AI to work from, whether it is identifying someone or learning from a training dataset, making accurate recognition much more difficult.
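One way to see the exposure problem is through tonal quantization. The sketch below assumes an 8-bit sensor (256 brightness levels) and treats underexposure as a simple scaling of pixel values; the exposure fractions are illustrative assumptions, not measurements from any particular camera.

```python
# Sketch: why default exposure settings can starve darker faces of detail.
# An 8-bit sensor records 256 tonal levels. When a region of the frame is
# rendered at a fraction of full exposure, its pixels occupy only the bottom
# slice of that range, so fewer distinct levels remain to describe features.
# The exposure fractions below are assumptions chosen for illustration.

def distinct_levels(exposure):
    """Number of distinct 8-bit values an underexposed region can take."""
    return len({int(v * exposure) for v in range(256)})

for exposure in (1.0, 0.5, 0.25, 0.125):
    print(f"exposure {exposure:>5}: {distinct_levels(exposure):>3} tonal levels")
# exposure   1.0: 256 tonal levels
# exposure   0.5: 128 tonal levels
# exposure  0.25:  64 tonal levels
# exposure 0.125:  32 tonal levels
```

Halving the exposure halves the number of usable tonal levels, and the texture and contrast that a recognition model relies on degrade with them. The damage compounds: underexposed faces are harder to match at identification time and contribute lower-quality examples to training sets.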