This Free Tool Wants to ‘Poison AI Models’ So Human Artists Can Fight Back

Even before the SAG-AFTRA strike that put Hollywood on pause for months, artists were largely opposed to AI models “stealing” their work. Now, researchers aim to give them a tool to fight back against their art being used to train AI models like the popular Midjourney.

How? Enter Nightshade, a free tool designed to ‘poison’ AI models so that they can no longer reliably generate new art.

Developed by a team of computer scientists at the University of Chicago, Nightshade lets artists subtly modify their artwork so that its pixels confuse AI programs and trick them into seeing something else.

In theory, if enough artists used Nightshade, the volume of ‘corrupted’ visuals would be enough to render an image-generating AI useless. The same researchers previously created a similar anti-AI-art tool called Glaze, which has been downloaded more than 2 million times.

For those who don’t work in the AI field or are unfamiliar with the nitty-gritty of this technology, one Redditor explained how Nightshade works quite nicely:

“AI models learn what to do by analyzing pictures much more closely than the human eye can. Training a ‘model’ means looking at many source images pixel by pixel. People then use those models, through a program, to generate new images. There are many models trained on different images in different ways, and they interact with image-generation software in different ways.

Nightshade exploits this pixel-by-pixel analysis. It alters a source image in such a way that it looks identical to the human eye but different to an AI, because of how AIs analyze pixels. For example, even though a picture might look like it was painted in the style of Picasso, Nightshade may alter it to appear to an AI as a modern digital image.

The result is that when you pass text instructions to image-generation software, you might say something like ‘in the style of Picasso.’ Well, if that model was trained on such a poisoned image, it will skew towards outputting a modern digital image. Or, for another example, it might change common subjects: a beautiful woman is a commonly generated image, so an image ‘shaded’ by Nightshade might poison a model so that a prompt requesting a woman outputs a man instead.

The potent part is that images generated through this process carry the same poisoning (or so they claim), so the poison spreads in a sense. If a popular model is trained on an image poisoned by Nightshade, the impact might not be noticed immediately. But if users generate many images with that popular model, upload and share them, and other models then train on those generated images, the poison spreads through them.”
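Nightshade’s actual optimization is more sophisticated than anything that fits in a blog post, but the core mechanic the Redditor describes, nudging an image’s machine-readable features toward a different concept while keeping every pixel change too small for a human to notice, can be sketched roughly. The snippet below is an illustrative approximation only, not Nightshade’s code: it assumes PyTorch, a hypothetical image feature `encoder`, and a `target_features` vector standing in for the embedding of the decoy concept (say, “modern digital image”).

```python
import torch

def poison_image(image, encoder, target_features,
                 epsilon=4 / 255, steps=50, lr=0.01):
    """Illustrative sketch of concept poisoning (NOT Nightshade's
    actual algorithm): shift `image`'s features toward the decoy
    `target_features` while keeping every pixel within `epsilon`
    of the original, so the change stays invisible to humans."""
    original = image.clone()                       # (C, H, W), values in [0, 1]
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        optimizer.zero_grad()
        perturbed = (original + delta).clamp(0, 1)
        features = encoder(perturbed.unsqueeze(0))  # assumed batched encoder
        # Pull the perturbed image's features toward the "wrong" concept.
        loss = torch.nn.functional.mse_loss(features, target_features)
        loss.backward()
        optimizer.step()
        # Keep the edit imperceptible: bound each pixel's total shift.
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)

    return (original + delta).clamp(0, 1).detach()
```

The `epsilon` bound is what makes the poison stealthy: no pixel moves more than a tiny fraction of its range, yet the accumulated shift is enough to move the image’s features toward the decoy concept that a model training on it would then learn.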

Will Nightshade have what it takes to actually put a brake on the proliferation of AI image generators? Only time will tell, although if more human artists were to use it, it certainly wouldn’t hurt their cause.
