Short list of anti-AI tools
Glaze
Glaze is a system designed to protect human artists by disrupting style mimicry. At a high level, Glaze analyzes the AI models that train on human art and, using machine learning algorithms, computes a minimal set of changes to an artwork such that the image appears unchanged to human eyes but looks like a dramatically different art style to AI models.

Nightshade
Nightshade is a tool that turns any image into a poisoned data sample that is unsuitable for model training.

ArtShield
ArtShield embeds a well-camouflaged watermark into your images that helps prevent AI models from training on your data. The watermark mimics the one that models such as Stable Diffusion embed in their own generated images to avoid training on data they have produced themselves, so watermarked images are skipped as presumed model output.

Anti-DreamBooth
Anti-DreamBooth adds subtle noise perturbation to each of a user's images before publishing, in order to degrade the generation quality of any DreamBooth model trained on the perturbed images.
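The tools above share a common mechanism: a small, bounded perturbation that barely changes the image for humans but shifts its representation for a model. As a toy illustration only (not any of these tools' actual algorithms), here is a minimal FGSM-style sketch in NumPy. The linear "style embedder" `W`, the decoy `target` embedding, and the budget `eps` are all invented stand-ins for the deep feature extractors real tools attack:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear "style embedder"; real tools use deep networks.
W = rng.normal(size=(8, 64))

def embed(x):
    # Map a flattened image to toy "style features".
    return W @ x

image = rng.uniform(0.0, 1.0, size=64)   # a tiny "image" as a flat vector
eps = 8 / 255                            # per-pixel perturbation budget

# Decoy style: push the image's embedding toward a different image's embedding.
target = embed(rng.uniform(0.0, 1.0, size=64))

# Gradient of the squared embedding distance ||embed(x) - target||^2 at x=image.
grad = 2 * W.T @ (embed(image) - target)

# One signed gradient step, clipped back to valid pixel range.
cloaked = np.clip(image - eps * np.sign(grad), 0.0, 1.0)

# Pixels move by at most eps, but the style embedding moves toward the decoy.
print(np.max(np.abs(cloaked - image)) <= eps + 1e-9)   # True
print(np.linalg.norm(embed(cloaked) - target)
      < np.linalg.norm(embed(image) - target))          # True
```

The design point this sketch shares with the real tools is the imperceptibility constraint: the perturbation is bounded per pixel (here, a single step of size eps), while the objective is measured in the model's feature space rather than in pixel space.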