

Glaze and Nightshade
Glaze and Nightshade are free tools developed by researchers at the University of Chicago to protect digital artwork from being scraped and used to train generative AI models.
Cost / License
- Free
- Proprietary
Platforms
- Mac
- Windows
Properties
- Privacy focused
Alternatives
- Fawkes
- Mist V2
- PhotoGuard
Glaze and Nightshade information
What is Glaze and Nightshade?
Glaze is a tool developed by researchers at the University of Chicago to protect artists from having their work unknowingly scraped and used to train generative AI models. It works by applying subtle, algorithmically generated perturbations to an image that are invisible to the human eye but disrupt the way AI models interpret and learn from it. When Glazed images are fed into AI training pipelines, the model is misled into learning incorrect representations of the artist’s style. This undermines the ability of tools like Stable Diffusion or Midjourney to reproduce the original style when prompted with phrases like “in the style of [artist name].”
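At a conceptual level (this is an illustrative sketch only, not Glaze's published algorithm or code), the technique resembles a feature-space adversarial perturbation: a small, bounded change is optimized so that a feature extractor reads the image as belonging to a different, decoy style, while the pixels stay visually almost unchanged. In the PyTorch sketch below, the VGG16 feature extractor, the MSE feature loss, and all hyperparameters (eps, steps, lr) are assumptions chosen for illustration.

import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained CNN used as a stand-in "style encoder" (an assumption; Glaze's
# actual models and losses are not reproduced here).
encoder = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features.to(device).eval()
for p in encoder.parameters():
    p.requires_grad_(False)

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def features(x: torch.Tensor) -> torch.Tensor:
    """Flattened deep features standing in for a 'style' representation."""
    return encoder(x).flatten(1)

def cloak(artwork_path: str, decoy_style_path: str,
          eps: float = 0.03, steps: int = 100, lr: float = 0.005) -> torch.Tensor:
    """Return a perturbed copy of the artwork whose features move toward a
    decoy style, while every pixel changes by at most +/- eps."""
    art = preprocess(Image.open(artwork_path).convert("RGB")).unsqueeze(0).to(device)
    decoy = preprocess(Image.open(decoy_style_path).convert("RGB")).unsqueeze(0).to(device)

    with torch.no_grad():
        decoy_feats = features(decoy)

    delta = torch.zeros_like(art, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        cloaked = (art + delta).clamp(0, 1)
        # Pull the cloaked image's features toward the decoy style.
        loss = torch.nn.functional.mse_loss(features(cloaked), decoy_feats)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)   # keep the change visually subtle

    return (art + delta).clamp(0, 1).detach()

A caller would pass the original artwork plus any image in a clearly different style, then save the returned tensor as the version that gets posted publicly.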
Nightshade is an extension of this concept, developed by the same team in response to the accelerating arms race between protective techniques and AI model training. Where Glaze defends an individual artist's style, Nightshade goes on the offensive: it applies perturbations that turn images into "poison" training samples. The images still look unchanged to human viewers, but a model that ingests enough of them learns distorted associations between prompts and visual concepts, degrading its ability to generate those concepts correctly. The perturbations are optimized against the way current generative architectures process style and visual patterns, which makes them resilient and difficult to filter out or reverse-engineer.
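As a rough illustration of the poisoning idea, rather than Nightshade's actual method or code, a poisoned training sample keeps its original caption while its pixels are nudged toward a different concept; make_poisoned_sample and its clipped pixel-space blend below are hypothetical stand-ins for the optimized perturbation a real attack would compute against a model's encoder.

from dataclasses import dataclass
import torch

@dataclass
class TrainingSample:
    image: torch.Tensor   # pixel tensor in [0, 1]; both images below share this shape
    caption: str          # text paired with the image during training

def make_poisoned_sample(image: torch.Tensor, true_caption: str,
                         decoy_image: torch.Tensor, eps: float = 0.03) -> TrainingSample:
    # Hypothetical helper: a real attack would optimize the perturbation
    # against a model's encoder; a clipped pixel-space nudge toward the
    # decoy concept stands in for that optimization here.
    delta = (decoy_image - image).clamp(-eps, eps)
    return TrainingSample(image=(image + delta).clamp(0, 1), caption=true_caption)

A model trained on many such pairs gradually ties the caption's concept to the decoy's imagery, which is what degrades its output for that concept.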



