The fight over data used to train AI models has taken a literal turn toward poison.
![](https://static.wixstatic.com/media/669e65_21b2d1f5f1c5467d972974fb95af5c00~mv2.png/v1/fill/w_707,h_449,al_c,q_85,enc_auto/669e65_21b2d1f5f1c5467d972974fb95af5c00~mv2.png)
A new tool called Nightshade lets artists apply it to their creative work so that any training data built from that art is corrupted, or poisoned. Over time, it can damage future versions of image-generating AI platforms such as DALL-E, Stable Diffusion, and Midjourney, degrading their ability to create usable images.
Nightshade makes invisible changes to the pixels of a piece of digital art. When that work is ingested into a model's training set, the "poison" exploits a vulnerability in how the model learns, confusing it so that it no longer reads an image of a car as a car and may, for example, produce a cow instead.
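To make the idea concrete, here is a minimal sketch of what a feature-space poisoning perturbation can look like in principle. This is not Nightshade's actual implementation: the encoder, the placeholder images, the epsilon budget, and the optimization loop are all illustrative assumptions. The general pattern is to nudge an image's pixels, within an imperceptibly small budget, so that its internal representation drifts toward a different concept (a "car" image that a model's feature extractor sees as something cow-like).

```python
import torch
import torch.nn as nn

# Minimal sketch of feature-space data poisoning, in the spirit of
# Nightshade. NOT the real tool: encoder, images, and budget below
# are placeholders chosen only to illustrate the mechanism.

# Stand-in image encoder. A real attack would target the feature
# extractor actually used by the generative model being poisoned.
encoder = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
encoder.eval()

# Placeholder tensors: `original` is the artwork to protect (say, a car),
# `target` depicts the concept we want the model to confuse it with (a cow).
original = torch.rand(1, 3, 64, 64)
target = torch.rand(1, 3, 64, 64)

epsilon = 8 / 255  # per-pixel perturbation budget (assumed value)
delta = torch.zeros_like(original, requires_grad=True)
optimizer = torch.optim.Adam([delta], lr=1e-2)

with torch.no_grad():
    target_features = encoder(target)

for step in range(200):
    optimizer.zero_grad()
    poisoned = (original + delta).clamp(0, 1)
    # Pull the poisoned image's features toward the target concept
    # while the pixel-level change stays visually imperceptible.
    loss = nn.functional.mse_loss(encoder(poisoned), target_features)
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        delta.clamp_(-epsilon, epsilon)  # keep the change invisible

poisoned_image = (original + delta).detach().clamp(0, 1)
```

The key design point is the epsilon constraint: the perturbation is small enough that a human sees the same artwork, but large enough in feature space to mislabel the concept once the image is used as training data.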