ETD-HUB

5: How to Hold Developers Accountable for Bias?

Asked: 3 months, 1 week ago · By: Catalink · Views: 86 · Catalink Case Study: IRIS

If the IRIS application fails to achieve unbiased performance or causes harm/unfair outcomes, what are the legal and ethical grounds for holding the application/owners accountable?

17 Answers

Answered: 1 month, 2 weeks ago By: Chiamakaokorie
Yes
Answered: 1 month, 2 weeks ago By: Tundefasina
Yes. If IRIS causes harm or shows biased performance, developers and deployers can be held accountable under product liability law, GDPR, and the EU AI Act, especially if they failed to meet safety, fairness, or risk-mitigation obligations.
Answered: 1 month, 2 weeks ago By: Zainabodogwu2
Yes—there would be clear grounds for accountability if IRIS causes harm or unfair outcomes, under EU AI Act (provider obligations and liability), GDPR (discriminatory or unlawful processing), product liability laws, and negligence standards, especially if bias, inadequate validation, or lack of safeguards can be demonstrated.
Answered: 1 month, 2 weeks ago By: Oliverharrow
Yes
Answered: 1 month, 2 weeks ago By: Ngozioshoba
If IRIS causes harm or shows bias, developers or operators can be held accountable. Safety technology must be properly tested and monitored. Accountability encourages responsible design and protects users from avoidable risks.
Answered: 1 month, 2 weeks ago By: Efeadelaja
Yes, if IRIS causes harm or unfair outcomes due to bias or poor performance, there are clear grounds for accountability under product liability, data protection, and AI governance laws.
Answered: 1 month, 2 weeks ago By: Meilincai
Not at the moment.
Answered: 1 month, 2 weeks ago By: Kelechinwosu
Under the EU AI Act and the revised Product Liability Directive, if an application like IRIS fails to perform as promised—particularly if it exhibits demographic bias or causes harm—it is treated as a defective product. Owners and developers are held to a "duty of care".
Answered: 1 month, 2 weeks ago By: Beatricelorne
It depends on whether people are negatively affected by it.
Answered: 1 month, 2 weeks ago By: Zainabodogwu32
Yes, there are clear grounds for holding the application or its owners accountable if IRIS causes harm or produces unfair outcomes. Potential bases for accountability include:
- EU AI Act non-compliance, particularly failures in risk management, bias mitigation, or post-market monitoring.
- Product liability law, if IRIS is deemed defective or unsafe.
- Negligence, if known limitations or biases were not adequately addressed or disclosed.
- GDPR violations, if personal data is processed unlawfully or without sufficient transparency.
High-risk classification increases the expectation that providers exercise heightened due diligence, making liability more likely where safeguards are insufficient.
Answered: 1 month, 2 weeks ago By: Miles_Hatcher
Yes, if the bias or risk was foreseeable.
Answered: 1 month, 2 weeks ago By: Aminaolorun
No
Answered: 1 month, 2 weeks ago By: Clarawhitby
It depends
Answered: 1 month, 2 weeks ago By: Ifeanyiakare
Yes
Answered: 1 month, 2 weeks ago By: Kunleekwueme
Yes.
Answered: 1 month, 2 weeks ago By: Sadeogunlana
Yes, as the IRIS models were not sufficiently trained.
Answered: 1 month, 2 weeks ago By: Tomashbrook
Yes, I do believe those in charge of implementing the software should be held accountable.
