ETD-HUB

Technology: Facial Recognition Bias

Created: 9 months, 2 weeks ago by Charlie
Categories: AI Bias, Machine Vision, Social

Domain: Computer Vision/Biometrics
Description: Joy Buolamwini (MIT) and Timnit Gebru's Gender Shades project exposed severe intersectional bias in commercial facial recognition systems, demonstrating error rates of up to 34.7% for darker-skinned women compared with 0.8% for lighter-skinned men.

Ethical Challenges:

  • Intersectional Bias: Error rates varied dramatically, from 0.8% for lighter-skinned men to as high as 34.7% for darker-skinned women
  • Data Representation: Benchmark datasets skewed heavily toward light skin, with up to 83.5% white individuals
  • Civil Rights Impact: Use in surveillance and law enforcement applications
  • Platform Governance: Led to policy changes at major tech companies
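
The intersectional analysis above can be sketched in code: rather than reporting one aggregate accuracy, an audit in the spirit of Gender Shades disaggregates error rates by subgroup (e.g. skin tone × gender) and compares them. The data and group labels below are purely illustrative, not figures from the study.

```python
# Minimal sketch of an intersectional error-rate audit.
# Each record carries a subgroup key plus the true and predicted labels;
# we compute the misclassification rate separately for each subgroup.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_label, predicted_label)."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical classifier outputs, for illustration only.
records = [
    (("lighter", "male"), "male", "male"),
    (("lighter", "male"), "male", "male"),
    (("darker", "female"), "female", "male"),    # misclassified
    (("darker", "female"), "female", "female"),
]
rates = error_rates_by_group(records)
# In this toy sample the darker-skinned female subgroup has a 50% error
# rate while the lighter-skinned male subgroup has 0% -- the kind of gap
# an aggregate accuracy number would hide.
```

Comparing the resulting per-group rates (rather than a single overall score) is what surfaces disparities like the 0.8% vs. 34.7% gap reported above.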

Public Datasets:


2 Questions

Inherent Bias in Image Dataset Labelling

Asked: 9 months, 2 weeks ago

By: Deleuze (Citizen)

208 Views

Is it legal to be interviewed by an AI without consent?

Asked: 4 months ago

By: Allf (AI Ethics Expert)

71 Views