LAION

Over 1,000 images of child sexual abuse have been discovered inside the largest dataset used to train image-generating AI, shocking everyone except the people who have been warning about exactly this for years.

The dataset was created by LAION, the non-profit organization behind the massive image datasets used by generative AI systems like Stable Diffusion. Following a report from researchers at Stanford University, 404 Media reported that LAION confirmed the presence of child sexual abuse material (CSAM) in the dataset, known as LAION-5B, and took it offline.
