[Image: AI-generated art in the style of Aharon Kahana]

New anti-AI tool ‘poisons’ generative models to protect artwork from unauthorized robo-Rembrandts

A new tool from researchers at the University of Chicago promises to protect artwork from being hoovered up by AI models and used for training without permission, by "poisoning" the image data itself.

Known as Nightshade, the tool tweaks digital image data in ways that are claimed to be invisible to the human eye but cause all kinds of borkage for generative models that train on the images, such as DALL-E, Midjourney, and Stable Diffusion.
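The article doesn't detail how Nightshade actually computes its tweaks, but the core mechanic, changing each pixel by an amount too small to notice, can be sketched. The snippet below is purely illustrative and is not Nightshade's algorithm: it uses random noise bounded to ±8 per channel as a crude stand-in for Nightshade's carefully optimized perturbation, and the file names and bound are made-up assumptions.

```python
# Illustrative sketch only -- NOT Nightshade's actual method. Demonstrates
# applying a small, bounded perturbation to an image so it looks unchanged
# to a human viewer. A real poisoning tool optimizes the perturbation
# against a model's feature extractor rather than using random noise.
import numpy as np
from PIL import Image

EPSILON = 8  # assumed max per-channel change on a 0-255 scale

def perturb(path_in: str, path_out: str, seed: int = 0) -> None:
    """Save a copy of the image with each channel shifted by at most EPSILON."""
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed)
    delta = rng.integers(-EPSILON, EPSILON + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + delta, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

# Hypothetical file names for illustration.
perturb("artwork.png", "artwork_shaded.png")
```

The point of the bound is the trade-off the researchers describe: the change must stay small enough to be invisible to buyers and fans, yet systematic enough that a model training on many such images learns the wrong associations.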

www.pcgamer.com