TY - JOUR
T1 - Emotion recognition technologies and dignity in AI-based surveillance capitalism
AU - Arnold, Bruce Baer
AU - Bonython, Wendy Elizabeth
AU - Rooney, Tess
PY - 2025/7/29
N2 - Businesses, governments and other entities are increasingly presented with AI-based ‘emotion recognition’ biometric systems, promoted as tools offering robust insights into the honesty, comprehension or health support needs of individuals, particularly students and employees. Australian universities may consider adopting this technology as they expand their AI engagement in learning/assessment platforms and student support systems. Automated emotion recognition systems pose legal and human rights challenges arising from their potential to be used deterministically; their potential lack of reproducibility, replicability and validity; and their susceptibility to bias, notwithstanding their possible utility. Further, they rely on non-consensual or co-opted participation of individuals whose dignity is eroded by consequent reduction from persons to data subjects. This article evaluates such systems through a dignitarian human rights lens, highlighting the need for a precautionary approach.
DO - 10.69970/gjlhd.v12i2.1273
M3 - Article
SN - 2203-3114
VL - 12
SP - 28
EP - 44
JO - Griffith Journal of Law & Human Dignity
IS - 2
ER -