13: How are Non-Drivers Affected?
If the IRIS application is deployed in public service vehicles (such as taxis or buses), do distinct ethical and legal issues arise, given that some passengers (non-drivers employed by the company) may be detected and captured by the IRIS application for some period of time without providing explicit consent to the legal agreements?
17 Answers
Answered: 1 month, 2 weeks ago
By: Chiamakaokorie
-
Answered: 1 month, 2 weeks ago
By: Tundefasina
Full anonymization removes GDPR obligations but may limit model improvement. Pseudonymization is usually sufficient if combined with strong access controls and a right-to-erasure mechanism, allowing users to delete their data. Retaining identifiable data after profile deletion would raise legal risks.
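The pseudonymization-plus-erasure pattern described above could be sketched roughly as follows (the class and field names are illustrative, not taken from the IRIS system):

```python
import secrets


class PseudonymStore:
    """Keeps identifiers separate from sensor records; erasure breaks the link."""

    def __init__(self):
        self._id_map = {}    # user_id -> pseudonym (kept under strict access control)
        self._records = {}   # pseudonym -> list of fatigue-detection records

    def add_record(self, user_id, record):
        # Assign a random pseudonym on first contact; reuse it afterwards.
        pseudonym = self._id_map.setdefault(user_id, secrets.token_hex(16))
        self._records.setdefault(pseudonym, []).append(record)

    def erase_user(self, user_id):
        """Right-to-erasure: remove the identifier link and the linked records."""
        pseudonym = self._id_map.pop(user_id, None)
        if pseudonym is not None:
            self._records.pop(pseudonym, None)
```

Because the identifier map is stored separately, access controls can be applied to it independently of the sensor data, and a deletion request only has to touch these two structures.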
Answered: 1 month, 2 weeks ago
By: Zainabodogwu2
• Full anonymization → safest, fewer legal limits
• Pseudonymization → allowed if “right to be forgotten” and strong security measures are in place
Answered: 1 month, 2 weeks ago
By: Oliverharrow
The data should be deleted once 5 years have passed.
Answered: 1 month, 2 weeks ago
By: Ngozioshoba
Only data necessary for fatigue detection should be collected and stored for limited periods. Users must understand why their data is processed. Regular checks ensure the data is not reused beyond its purpose.
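The storage-limitation idea in this answer could be sketched as a periodic purge routine (the 90-day window below is an illustrative assumption, not a figure from the thread):

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention window; the actual limit would come from policy.
RETENTION = timedelta(days=90)


def purge_expired(records, now=None):
    """Keep only records still inside the retention window.

    `records` is a list of (timestamp, payload) pairs; anything older
    than RETENTION is dropped, implementing storage limitation.
    """
    now = now or datetime.now(timezone.utc)
    return [(ts, payload) for ts, payload in records if now - ts <= RETENTION]
```

Running such a check on a schedule is one way to implement the "regular checks" the answer mentions.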
Answered: 1 month, 2 weeks ago
By: Efeadelaja
Pseudo-anonymization with deletion on request is generally sufficient under GDPR, but full anonymization reduces legal risk; retaining data after deletion could still have legal implications.
Answered: 1 month, 2 weeks ago
By: Meilincai
According to the EU AI charter, the data must be retained for up to 10 years for various reasons, including the database.
Answered: 1 month, 2 weeks ago
By: Kelechinwosu
If you fully anonymize data, it is no longer considered "personal data," and GDPR no longer applies. You could legally retain this data for 5 years (or indefinitely) even after a profile is deleted.
Answered: 1 month, 2 weeks ago
By: Beatricelorne
Yes
Answered: 1 month, 2 weeks ago
By: Zainabodogwu32
From both a legal and ethical standpoint, full anonymization is preferable but not always technically feasible for biometric datasets.
Full anonymization removes all identifiable links to individuals and places data outside GDPR’s scope. This allows longer retention (e.g. 5 years) but is extremely difficult to achieve with facial or physiological data without destroying its utility.
Pseudonymization retains identifiers separately and allows compliance with GDPR rights, including the right to erasure (“right to be forgotten”).
In practice, pseudonymization combined with strict access controls, encryption, and deletion mechanisms is generally considered sufficient and more realistic, provided users can request deletion and data is not retained longer than necessary.
Ethically, respecting user control and deletion rights is critical to maintaining trust, even if anonymization would offer fewer legal constraints.
Answered: 1 month, 2 weeks ago
By: Miles_Hatcher
Full anonymisation is not strictly required, though it provides additional protection.
Answered: 1 month, 2 weeks ago
By: Aminaolorun
It is not required, but it provides protection.
Answered: 1 month, 2 weeks ago
By: Clarawhitby
Honestly, I don't know.
Answered: 1 month, 2 weeks ago
By: Ifeanyiakare
Pseudo-anonymization with a functional “right to be forgotten” is generally sufficient if personal identifiers can be deleted on request.
Full anonymization is stricter, allows longer retention without legal risk, but may limit personalized model improvements.
Key: Must prevent re-identification and comply with GDPR retention limits.
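One technique that serves both goals named here (deletion on request and preventing re-identification) is crypto-shredding: encrypt each user's records under a per-user key and destroy the key on an erasure request. A minimal sketch, with a toy XOR keystream standing in for a real cipher such as AES (all names here are illustrative):

```python
import secrets


def xor_bytes(data, key):
    # Stand-in for a real cipher (e.g. AES-GCM); XOR keystream for illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


class CryptoShredder:
    """Per-user keys; destroying a key makes that user's records unrecoverable."""

    def __init__(self):
        self._keys = {}    # user_id -> encryption key
        self._blobs = {}   # user_id -> list of ciphertexts

    def store(self, user_id, record):
        key = self._keys.setdefault(user_id, secrets.token_bytes(32))
        self._blobs.setdefault(user_id, []).append(xor_bytes(record, key))

    def read(self, user_id):
        key = self._keys.get(user_id)
        if key is None:
            raise KeyError("key destroyed: data is unrecoverable")
        return [xor_bytes(blob, key) for blob in self._blobs.get(user_id, [])]

    def shred(self, user_id):
        # Ciphertext may remain in backups, but without the key it cannot be decrypted.
        self._keys.pop(user_id, None)
```

With the key gone, even retained ciphertext no longer allows re-identification, which is one reading of how retention limits and erasure rights can coexist.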
Answered: 1 month, 2 weeks ago
By: Kunleekwueme
I believe full anonymisation would allow use without legal implications, provided the users agreed to the data collection.
Answered: 1 month, 2 weeks ago
By: Sadeogunlana
Full anonymization should be required.
Answered: 1 month, 2 weeks ago
By: Tomashbrook
I suppose pseudo-anonymisation can be used.