Concept

Algorithmic Surveillance and Bias

Price notes that software using artificial intelligence algorithms to surveil students, including proctoring software such as Honorlock that claims to detect academic dishonesty from students' hand and eye movements while they take a test, has been found to systematically disadvantage users of color, because the tools are often not trained on users with a range of skin tones, and disabled users, for instance by flagging fidgeting or stimming as "cheating."


Updated 2025-11-10

Tags

Disability Studies

Culture as a Sociological Issue

Social Science

Empirical Science

Science

Sociology