Education: Automated Essay Scoring Bias
Created: 9 months, 2 weeks ago by Charlie
Categories:
AI BIAS
Machine Learning
Education
Domain: Education/Assessment
Description: Automated Essay Scoring (AES) systems show systematic bias against students from marginalized groups. Studies from 2024 found that GPT-4o incorrectly predicted academic failure for Black students 19% of the time, versus 12% for white students.
Ethical Challenges:
- Linguistic Bias: Against non-native English speakers and dialectal variations
- Cultural Bias: Preference for mainstream cultural references and writing conventions
- Socioeconomic Bias: Scores correlate with writing-style differences across economic classes
- Educational Gap Perpetuation: Biased algorithms can reinforce existing educational disparities
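A simple way to surface the kind of disparity described above (e.g., the 19% vs. 12% false-failure figures) is to compare, per demographic group, how often a model predicts failure for students who did not actually fail. The sketch below assumes records with illustrative `group`, `predicted_fail`, and `actual_fail` fields; it is not tied to any particular AES system.

```python
from collections import defaultdict

def false_failure_rates(records):
    """Return {group: false-failure rate}, i.e. the share of students in
    each group who did NOT actually fail but were predicted to fail.
    Each record is a dict with keys 'group', 'predicted_fail',
    'actual_fail' (booleans)."""
    counts = defaultdict(lambda: [0, 0])  # group -> [false failures, non-failing total]
    for r in records:
        if not r["actual_fail"]:
            counts[r["group"]][1] += 1
            if r["predicted_fail"]:
                counts[r["group"]][0] += 1
    return {g: fp / total for g, (fp, total) in counts.items() if total}

# Toy example with two hypothetical groups:
data = [
    {"group": "A", "predicted_fail": True,  "actual_fail": False},
    {"group": "A", "predicted_fail": False, "actual_fail": False},
    {"group": "B", "predicted_fail": False, "actual_fail": False},
    {"group": "B", "predicted_fail": False, "actual_fail": False},
]
rates = false_failure_rates(data)  # group A: 0.5, group B: 0.0
```

Large gaps between the per-group rates are the signal of the educational-gap perpetuation risk listed above: a biased scorer flags non-failing students from one group far more often than another.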
Public Datasets:
- Primary: ASAP (Automated Student Assessment Prize) Dataset
- URL: https://www.kaggle.com/c/asap-aes/data
- Papers with Code: https://paperswithcode.com/dataset/asap
- Source: Kaggle Competition Data
- Content: ~12,800 essays from grades 7-10 across 8 different prompts
- Enhanced: ASAP 2.0 Dataset (2024) - 24,000 argumentative essays with explicit bias considerations
- ASAP 2.0 URL: https://the-learning-agency-lab.com/learning-exchange/asap-2-0-dataset/
- GitHub Implementation: https://github.com/Turanga1/Automated-Essay-Scoring
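The ASAP competition scored submissions with quadratic weighted kappa (QWK), which penalizes disagreements between predicted and human scores by the square of their distance. Computing QWK separately for each demographic subgroup is one way to probe a trained scorer for the biases listed above. A self-contained sketch (integer ratings on a known scale):

```python
def quadratic_weighted_kappa(actual, predicted, min_rating, max_rating):
    """QWK between two lists of integer ratings in [min_rating, max_rating].
    1.0 = perfect agreement, 0.0 = chance-level agreement."""
    n = max_rating - min_rating + 1
    # Observed confusion matrix of (actual, predicted) rating pairs.
    observed = [[0] * n for _ in range(n)]
    for a, p in zip(actual, predicted):
        observed[a - min_rating][p - min_rating] += 1
    # Marginal histograms for the expected (chance) matrix.
    hist_a, hist_p = [0] * n, [0] * n
    for a in actual:
        hist_a[a - min_rating] += 1
    for p in predicted:
        hist_p[p - min_rating] += 1
    total = len(actual)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            weight = ((i - j) ** 2) / ((n - 1) ** 2)  # quadratic penalty
            expected = hist_a[i] * hist_p[j] / total
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

# Perfect agreement on a 1-3 scale yields 1.0:
kappa = quadratic_weighted_kappa([1, 2, 3], [1, 2, 3], 1, 3)
```

Reporting QWK per subgroup (rather than only overall) makes systematic under- or over-scoring of specific groups visible even when the aggregate score looks strong.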