"ICO Warns of "Immature" Biometric Tech"

The UK’s data protection regulator has warned organizations using or developing “emotion analysis” technology to act responsibly or risk facing a formal investigation. The Information Commissioner’s Office (ICO) issued a statement recently, warning that immature algorithms unable to detect emotional cues accurately could raise the risk of systemic bias, inaccuracy, and discrimination, and pose data protection challenges. The ICO stated that emotion analysis technology can monitor a user’s gaze, sentiment, facial movements, gait, heartbeat, facial expression, and even skin moisture to achieve various ends, such as health monitoring at work or registering students for exams. The regulator warned that emotion analysis is even riskier than biometric data processing for identity verification. Deputy Commissioner Stephen Bonner stated that the biometrics and emotion AI market might never reach maturity and, in the meantime, presents data protection risks. Bonner noted that while opportunities are present, the risks are currently greater. The ICO is concerned that inaccurate analysis of such data could result in incorrect assumptions and judgments about a person and lead to discrimination. The ICO noted that the only sustainable biometric deployments will be those that are fully functional, accountable, and backed by science. The regulator stated that, as it stands, it has yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and that it currently has more general questions about proportionality, fairness, and transparency in this area.


Infosecurity reports: "ICO Warns of "Immature" Biometric Tech"
