Data poisoning software from the University of Chicago that corrupts images that might be used to train AI models. Launched in 2023, Nightshade lets artists "poison" their images with alterations that are not visually detectable. If a poisoned image is scraped without permission (as countless images are) and used to train an AI model, the pixel alterations cause the resulting AI system to generate undesirable results: a flower might wind up as part of a horse, a car might take on dog features, and so on. See AI scraping.
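Nightshade's actual perturbations are produced by an optimization procedure that is not reproduced here. The following is only a minimal, hypothetical sketch of the underlying idea of an imperceptible pixel alteration, assuming NumPy and Pillow are installed; the random noise, the epsilon bound, and the file names are illustrative and would not poison a real model.

```python
import numpy as np
from PIL import Image

def perturb_image(path_in: str, path_out: str, epsilon: int = 3) -> None:
    """Add a small, visually imperceptible perturbation to an image.

    Toy illustration only: real poisoning tools such as Nightshade
    optimize the perturbation so that a model trained on the image
    learns incorrect feature associations. Random noise does not.
    """
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)

    # Bound each per-channel change by +/- epsilon so the edit stays
    # below the threshold of human perception.
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape)

    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Hypothetical file names for illustration.
perturb_image("flower.jpg", "flower_poisoned.png")
```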