How to Protect Your Art from AI (Glaze and Nightshade Overview)

29 Jan 2024 · 14:04

TL;DR: In this video, the presenter discusses the challenges artists face as AI systems copy their art styles. To combat this, the University of Chicago's Glaze team has developed two tools: Glaze and Nightshade. Glaze subtly alters artwork to prevent style mimicry, making AI perceive the art differently without significantly changing its appearance to humans. Nightshade, on the other hand, more drastically changes the content as perceived by AI, potentially tricking it into misidentifying the subject of the artwork. The presenter explains that for effective protection, widespread adoption of these tools is necessary to disrupt AI models. Both tools require a powerful GPU for efficient processing, with Nightshade being particularly demanding; a web-based version of Glaze exists for those without the necessary hardware. The presenter encourages viewers to use these tools to protect their art from unauthorized AI usage.


  • 🎨 Protecting Artwork: The University of Chicago's Glaze team has released tools to protect artists' work from AI copying and style mimicry.
  • 🛠️ Two Tools: Glaze and Nightshade are the tools released, each serving a slightly different purpose in safeguarding artwork.
  • 🤖 AI Disruption: Glaze distorts the artwork to prevent AI from copying the style, making it look compressed and slightly different.
  • 🔍 Content Trickery: Nightshade changes how AI perceives the content, making it see something entirely different from what humans see.
  • 📈 Intensity Levels: Both tools offer adjustable intensity levels to control the degree of distortion and protection.
  • 💻 GPU Requirement: For efficient use, an Nvidia GPU is recommended due to the computational demands of the tools.
  • ⏱️ Processing Time: High-intensity rendering can take up to an hour on a powerful PC, much longer without a suitable GPU.
  • 🌐 Web Alternative: Web Glaze is an online alternative for those without the necessary hardware, though it's currently invite-only.
  • πŸ“ User Guide: Nightshade mentions the need for an Nvidia GPU and at least 4GB of gddr5 memory, along with the latest Nvidia drivers and Cuda toolkit.
  • πŸ‘₯ Community Effort: The effectiveness of these tools relies on widespread adoption by the artist community to disrupt AI models effectively.
  • πŸ“Έ Before and After: The video shows before and after examples of artwork protected by Glaze and Nightshade, highlighting the subtle yet significant differences.

Q & A

  • What are Glaze and NightShade, and how do they protect artwork from AI?

    -Glaze and NightShade are tools developed by the University of Chicago to protect artists' works from being copied by AI. Glaze modifies the style of an artwork by adding small distortions, making it harder for AI to replicate the style accurately. NightShade goes further by altering the content recognition of images, causing AI to misidentify the subjects in the artwork.

  • How do Glaze and NightShade differ in their approach to protecting artwork?

    -Glaze focuses on altering the style of an artwork to prevent style mimicry by AI, adding subtle artifacts that distort the style. NightShade, on the other hand, changes how AI perceives the content of images, leading it to recognize incorrect subjects, thus providing a stronger layer of protection against AI replication.

  • What is the purpose of the intensity settings in Glaze and NightShade?

    -The intensity settings in Glaze and NightShade control how much the artwork is altered. Lower settings result in minimal changes, preserving the artwork's appearance while still offering some protection. Higher settings increase the distortions, significantly altering the artwork's style or content recognition for better protection against AI misuse.
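Conceptually, protections of this kind add a small perturbation to the image whose per-pixel magnitude is capped, and the intensity slider corresponds to that cap. Below is a minimal NumPy sketch of an L∞-bounded perturbation; it illustrates the general idea only, and the function name and toy data are mine, not Glaze's actual algorithm:

```python
import numpy as np

def apply_bounded_perturbation(image, perturbation, intensity):
    """Clip a perturbation to +/- intensity and add it to the image.

    `intensity` plays the role of the tool's intensity slider: a larger
    value permits bigger per-pixel changes (stronger protection, but more
    visible artifacts). Illustrative only -- not Glaze's real method.
    """
    bounded = np.clip(perturbation, -intensity, intensity)
    # Keep pixel values in the valid [0, 1] range after perturbing
    protected = np.clip(image + bounded, 0.0, 1.0)
    return protected

# Toy example: a 4x4 grayscale "image" with values in [0, 1]
rng = np.random.default_rng(0)
image = rng.random((4, 4))
noise = rng.normal(scale=0.5, size=(4, 4))

low = apply_bounded_perturbation(image, noise, intensity=0.02)
high = apply_bounded_perturbation(image, noise, intensity=0.10)

# The per-pixel change never exceeds the chosen intensity
print(np.max(np.abs(low - image)))   # <= 0.02
print(np.max(np.abs(high - image)))  # <= 0.10
```

At low intensity the protected image is nearly indistinguishable from the original; raising the cap makes the artifacts more visible, which mirrors the trade-off the tools expose.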

  • Can you describe an example of how NightShade alters AI's content recognition?

    -NightShade can trick AI into misidentifying images. For instance, an image of a cow in a field might be altered so that AI perceives it as a leather purse in the grass. This significant misidentification disrupts AI's ability to correctly categorize and replicate the image's content.

  • What are the hardware requirements for running Glaze and NightShade effectively?

    -To run Glaze and NightShade effectively, a powerful Nvidia GPU is recommended. This is because these tools utilize intensive computational processes that are optimized for Nvidia's architecture, specifically requiring a minimum of 4 GB of GDDR5 memory.
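As a quick pre-flight check before committing to a long render, you could compare your card's total VRAM against the stated 4 GB minimum; on systems with Nvidia drivers installed, the total can be read from `nvidia-smi`. A small sketch (the 4 GB threshold comes from the requirements above; the function names are mine):

```python
import subprocess

MIN_VRAM_MIB = 4096  # 4 GB minimum stated in Nightshade's requirements

def meets_vram_requirement(total_mib, minimum_mib=MIN_VRAM_MIB):
    """Return True if the GPU's total memory meets the minimum."""
    return total_mib >= minimum_mib

def query_nvidia_vram_mib():
    """Read total VRAM (MiB) via nvidia-smi; returns None if unavailable."""
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        return int(out.stdout.strip().splitlines()[0])
    except (OSError, subprocess.CalledProcessError, ValueError):
        return None  # no Nvidia driver / no GPU present

print(meets_vram_requirement(8192))  # True: an 8 GB card is fine
print(meets_vram_requirement(2048))  # False: 2 GB falls short
```

On machines without an Nvidia GPU, `query_nvidia_vram_mib()` simply returns `None`, which matches the video's point that such users should expect much slower processing or should look at Web Glaze instead.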

  • What is the significance of the web version of Glaze, known as WebGlaze?

    -WebGlaze offers a solution for users who do not have the powerful hardware needed to run Glaze or NightShade locally. It's a web-based version that processes images in the cloud, making it accessible to more users by removing the high computational barrier.

  • How do Glaze and NightShade contribute to the fight against art theft by AI?

    -By corrupting the data AI models train on, Glaze and NightShade disrupt the ability of these models to accurately replicate or steal art styles and content. This collective use of distorted images can degrade the quality of AI-generated art, making it less likely for AI to effectively copy and use original art without permission.

  • What are the potential downsides of using high-intensity settings in Glaze and NightShade?

    -Using high-intensity settings in Glaze and NightShade can significantly distort the artwork, making it visually unappealing to human viewers. Artists therefore need to weigh aesthetic integrity against the level of protection.

  • How does NightShade differ from other AI-disruption tools?

    -Unlike other tools that may only obscure style, NightShade uniquely alters content recognition, causing AI to identify entirely wrong subjects within an artwork. This makes it particularly effective at protecting the entire concept of the artwork, not just the style.

  • What is the community aspect of fighting AI art theft as suggested by the use of NightShade?

    -The effectiveness of NightShade increases when more artists use it to poison the datasets AI models train on. If widespread, it can lead to significant disruptions in AI's ability to accurately generate art, forcing model developers to address these distortions, and potentially leading to better ethical practices in AI training.



🎨 Introduction to Artwork Protection Tools: Glaze and Nightshade

Tanil introduces the challenge of protecting artwork in the age of AI, when art theft has become increasingly common. He highlights two tools developed by the University of Chicago's Glaze team, named 'Glaze' and 'Nightshade', designed to prevent style mimicry and content misinterpretation by AI. Tanil explains that Glaze introduces subtle changes in shading that result in minor visible artifacts, which can vary in intensity. These artifacts aim to confuse AI without drastically altering how humans perceive the art. He shows examples of the effects on basic geometric shapes and a photorealistic painting, and discusses how these tools could help protect artists' unique styles from being replicated by AI systems.


πŸ›‘οΈ Advantages and Practical Usage of Nightshade for Greater Art Protection

Tanil emphasizes the significance of using Nightshade alongside Glaze for optimal protection of artwork from AI theft. He explains that while Glaze targets art style protection, Nightshade alters how AI interprets the content of images, which could lead to significant mismatches in AI recognition, such as mistaking a cat for a motorcycle. Tanil discusses the importance of widespread adoption of Nightshade among artists to effectively corrupt AI models that use stolen artworks. He also shows the practical aspects of using these tools, including their interfaces and settings options, and highlights the challenges artists might face in terms of artifact visibility depending on the artwork's characteristics.


🖥️ Technical Requirements and Alternatives for Running Glaze and Nightshade

Tanil addresses the technical requirements necessary to run Glaze and Nightshade efficiently, notably the need for a robust GPU, preferably from Nvidia, due to software compatibility and performance issues with other hardware. He details the installation process, including the CUDA toolkit for Nvidia users, and provides solutions for those lacking the optimal hardware, such as running the tools for extended periods or using the cloud-based service, Web Glaze. Tanil mentions his personal attempts to gain access to Web Glaze and underscores the communal effort required to combat AI-driven art theft by corrupting training datasets, using humor to suggest the potential chaos in AI misidentifications if widespread usage of Nightshade is achieved.



💡Artwork Protection

Artwork protection refers to the measures taken to safeguard an artist's original creations from unauthorized use, copying, or theft. In the context of the video, this is particularly relevant as AI technology has made it easier for art styles to be replicated without consent. The video discusses tools like Glaze and Nightshade that help protect an artist's work by altering it in a way that confuses AI algorithms.


💡AI

AI, or artificial intelligence, is the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions. In the video, AI is discussed as a potential threat to original art, as it can be used to copy or mimic an artist's unique style. The tools Glaze and Nightshade are introduced as a countermeasure to prevent AI from accurately replicating an artist's work.


💡Glaze

Glaze is a tool developed by the University of Chicago that alters an artist's work by making subtle changes, which are imperceptible to the human eye but can significantly disrupt AI's ability to recognize and copy the art style. It is used as a protective measure against style mimicry by AI, as demonstrated in the video with examples of how it applies a 'glazed' effect to artwork.


💡Nightshade

Nightshade is another tool mentioned in the video, similar to Glaze, but with a different approach. It not only alters the artwork to prevent style recognition by AI but also changes the content in a way that is perceptible to AI but not to human eyes. This can lead AI to misinterpret the content, thereby protecting the original artwork from being learned and replicated by AI systems.

💡Style Mimicry

Style mimicry is the act of replicating the unique style of an artist's work without their permission. The video discusses how AI can be used for style mimicry, which is a concern for artists who want to protect their original styles. Tools like Glaze and Nightshade are presented as solutions to disrupt this process.


💡Artifacts

In the context of the video, artifacts refer to the small, imperceptible changes or distortions added to an artwork by the Glaze and Nightshade tools. These artifacts serve as a protective layer that confuses AI algorithms, preventing them from accurately copying or recognizing the original art style.

💡Content Alteration

Content alteration is a technique used by the Nightshade tool to protect artwork. It involves changing the content of the artwork in a way that is indiscernible to humans but can significantly alter the AI's interpretation. For example, an AI might perceive a cow in a field as a leather purse lying on the grass, thus preventing the AI from correctly recognizing the original content.

💡Intensity Levels

Intensity levels in the video refer to the degree of alteration applied to the artwork by the Glaze and Nightshade tools. Users can adjust the intensity to control the extent of the protective artifacts added to the artwork. Higher intensity levels generally provide better protection but can also result in more noticeable changes to the artwork.


💡GPU

A GPU, or graphics processing unit, is a type of computer hardware that accelerates the creation of images, video, and animations. The video mentions that using a GPU, specifically an Nvidia GPU, can significantly speed up the process of applying Glaze or Nightshade to artwork. Without a suitable GPU, the process can be much slower.

💡Web Glaze

Web Glaze, as mentioned in the video, is an online version of the Glaze tool that operates over the internet rather than being installed locally on a user's computer. It is currently invite-only and is designed for users who may not have the necessary hardware to run the Glaze or Nightshade tools on their own systems.

💡Disrupting AI Models

Disrupting AI models is a strategy discussed in the video where artists use tools like Nightshade to alter their artwork in a way that confuses AI algorithms. If AI models are trained on such altered artwork, they can become corrupted, leading to incorrect outputs and thus protecting the original artwork from being effectively copied or learned by AI.
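The poisoning idea can be illustrated with a toy model: if enough training samples labeled "cow" actually carry purse-like features, the model's learned notion of "cow" drifts toward purses. Below is a deliberately simplified nearest-centroid sketch of that mechanism; the function names and two-number "features" are invented for illustration, and this is not how real generative models are trained:

```python
import numpy as np

def train_centroids(samples, labels):
    """'Train' by averaging feature vectors per label (toy stand-in for a model)."""
    centroids = {}
    for label in set(labels):
        pts = [s for s, l in zip(samples, labels) if l == label]
        centroids[label] = np.mean(pts, axis=0)
    return centroids

def predict(centroids, x):
    """Classify x by its nearest label centroid."""
    return min(centroids, key=lambda label: np.linalg.norm(centroids[label] - x))

cow_feat = np.array([1.0, 0.0])    # pretend features of a cow image
purse_feat = np.array([0.0, 1.0])  # pretend features of a purse image

# Clean training: "cow" samples genuinely look like cows
clean = train_centroids([cow_feat] * 10 + [purse_feat] * 10,
                        ["cow"] * 10 + ["purse"] * 10)
print(predict(clean, cow_feat))  # "cow"

# Poisoned training: 8 of the 10 "cow" samples secretly carry purse features
poisoned = train_centroids([cow_feat] * 2 + [purse_feat] * 8 + [purse_feat] * 10,
                           ["cow"] * 10 + ["purse"] * 10)
print(poisoned["cow"])  # [0.2 0.8] -- the "cow" concept now looks mostly purse-like
```

The clean model associates "cow" with cow features, but after poisoning, the centroid it has learned for "cow" is dominated by purse features, so anything it produces for "cow" will lean toward purses. This mirrors the video's point that the effect only kicks in once enough poisoned samples enter the training data.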


The Glaze and NightShade tools from the University of Chicago help protect artwork from AI theft.

Glaze disrupts style mimicry by making small changes to artwork, making it harder for AI to copy the art style.

NightShade alters the content perception of AI, showing it something different than what human eyes see.

Using NightShade can trick AI into misinterpreting the content, such as perceiving a drawing of a cow as a leather purse.

The effectiveness of Glaze and NightShade relies on widespread adoption among artists to disrupt AI models significantly.

Glaze and NightShade introduce artifacts to artwork, with the intensity level adjustable for more or less protection.

The Glaze tool prevents the copying of an artist's style, while still allowing AI to recognize the subjects within the artwork.

NightShade is particularly useful as it can completely mislead AI, causing it to categorize artwork incorrectly.

The Glaze and NightShade tools are designed to be user-friendly with a simple interface for applying the protective effects.

For optimal performance and faster processing times, using an Nvidia GPU with at least 4 GB of GDDR5 memory is recommended.

Artists without an Nvidia GPU can still use Glaze and NightShade, but processing times may be significantly longer.

Web Glaze is an alternative for those without the necessary hardware, offering online access to the Glaze tool.

The Web Glaze platform is currently invite-only and can be accessed by reaching out to the Glaze project on social media.

The video demonstrates the minimal visual impact on artwork when using Glaze and NightShade, even at high intensity settings.

The artifacts introduced by Glaze and NightShade are more noticeable in lighter artwork.

The presenter suggests that widespread use of NightShade could corrupt AI models, forcing them to misidentify objects.

The video provides a step-by-step guide on how to use Glaze and NightShade, including system requirements and processing times.

The Glaze and NightShade tools are part of a broader effort to protect artists' works from unauthorized use by AI technologies.