A new tool called Nightshade has been developed to disrupt AI image generation models. It works by subtly altering image data so that AI models trained on the poisoned images produce distorted or nonsensical outputs. The method aims to protect artists' work from being used without consent in AI training datasets.
Summary written by gemini-2.5-flash-lite from 1 source.