The narrative is compelling: AI systems that can detect your emotional state from facial expressions, vocal tone, body language, or physiological signals.
Whether deployed by governments or private companies, social scoring systems create perverse incentives and erode human autonomy.
High risk: AI systems that could significantly affect legal rights or safety require extensive documentation, testing, and human oversight.


