Big brother, an algorithm 🫥

Schools are employing dubious AI-powered software that accuses teenagers of wanting to harm themselves and sends the police to their homes, often with chaotic and traumatic results.
An algorithm analyzes students' language for evidence that they want to harm themselves.

Unsurprisingly, the software can get it wrong, woefully misinterpreting what students are actually trying to say.

A 17-year-old in Neosho, Missouri, for instance, was woken up by police in the middle of the night after the software flagged something she had written.
