Secret Weapon Against Unauthorized AI Use


A University of Chicago team has devised a tool called Nightshade to help artists safeguard their creative works from unauthorized use in AI training.

  • AI companies have been scraping artists’ copyrighted material and personal information without consent.
  • Nightshade subtly alters the pixels in digital art, causing models trained on it to produce unpredictable and chaotic outputs.

A team of researchers at the University of Chicago has developed a tool called Nightshade to protect creative works from unauthorized use in AI training.

AI companies such as OpenAI, Meta, Google, and Stability AI are currently facing a wave of legal challenges. A number of artists claim that their copyrighted material and personal information have been scraped without permission. Nightshade aims to tip the power balance back in favor of the creators: artists can “poison” their digital art so that it corrupts any AI model trained on it.

Nightshade is designed to disrupt the training data used by image-generating AI models such as DALL-E, Midjourney, and Stable Diffusion. To do so, it subtly alters the pixels in digital art in a way that is virtually imperceptible to the human eye. When this “poisoned” art is included in AI training datasets, models learn distorted associations and produce unpredictable, chaotic outputs, turning dogs into cats, cars into cows, and so on.
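To get a feel for what “subtly altering pixels” means, here is a minimal, hypothetical sketch in Python (using NumPy and Pillow): it adds a small, bounded random perturbation to an image so the change stays invisible to the human eye. It is not Nightshade’s actual method, which optimizes the perturbation against a target model so a concept is mislearned; the file names and the epsilon value are assumptions for illustration only.

```python
# Illustrative sketch only: adds a small, bounded pixel perturbation to an image.
# This is NOT Nightshade's actual algorithm (which optimizes perturbations against
# a specific model); it only shows that changes of a few intensity levels per pixel
# are imperceptible to humans. File names and epsilon are placeholder assumptions.
import numpy as np
from PIL import Image

def perturb_image(path_in: str, path_out: str, epsilon: int = 3, seed: int = 0) -> None:
    """Add a random perturbation of at most `epsilon` intensity levels per channel."""
    rng = np.random.default_rng(seed)
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)

    # Uniform noise in [-epsilon, +epsilon]; a real poisoning tool would instead
    # optimize this perturbation so a model misassociates the image's concept.
    noise = rng.integers(-epsilon, epsilon + 1, size=img.shape, dtype=np.int16)

    poisoned = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

perturb_image("artwork.png", "artwork_perturbed.png")
```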

The same team behind Nightshade also developed Glaze, a tool that lets artists “mask” their personal style, making it difficult for AI systems to interpret the work accurately. Nightshade will be integrated into Glaze, giving artists the option to use the data-poisoning tool or not.

The researchers plan to make Nightshade open source, allowing others to adapt it and build their own versions. The more widely the tool is used, the more powerful it becomes, since AI training datasets can consist of billions of images.

Up to this point, artists could only opt out of the scraping, and even that doesn’t guarantee their works’ safety. But this? This is quite a unique approach to an admittedly bizarre situation.

The artists’ position and their efforts to protect their creative works are completely understandable. Because here’s the thing: if you mess with my property without my permission and the booby trap goes off, that’s on you. The first thing we learn as toddlers is not to touch what isn’t ours, so why are AI companies acting like they’re above that?

Critics, however, have raised concerns about the potential for malicious use of data-poisoning techniques. But to inflict significant damage on larger AI models, attackers would need thousands of poisoned samples.
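For a sense of scale, here is a back-of-envelope calculation (the numbers are assumptions for illustration, not figures from the researchers): even a few thousand poisoned images make up a vanishingly small fraction of a dataset with billions of entries.

```python
# Back-of-envelope only, with assumed numbers: a few thousand poisoned images
# are a tiny fraction of a billions-scale training set.
poisoned_samples = 5_000          # assumed attack size
dataset_size = 5_000_000_000      # assumed dataset size (order of billions of images)

fraction = poisoned_samples / dataset_size
print(f"Poisoned fraction: {fraction:.8%}")   # -> 0.00010000%
```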

Nightshade has the potential to push AI companies into respecting artists’ rights. And no one can fault artists for protecting their intellectual property.
