Can artists protect their work from AI? – BBC News

BBC News
18 Jun 2023 · 06:15

TLDR: The rapid advancement of AI art, exemplified by sales reaching over $400,000 at auctions like Christie's, poses significant copyright issues for artists. AI models, using data scraped from the web, learn to mimic specific styles without artists' consent. Carla Ortiz, a concept artist affected by this, joined a lawsuit against major AI firms. The University of Chicago's 'Glaze' technology proposes a solution by altering artwork in subtle ways to confuse AI without affecting human perception. Despite these efforts, the ongoing legal and ethical debates highlight the challenges of balancing innovation with artists' rights.

Takeaways

  • 🎨 The AI art market has leapt forward in recent years; in 2018, an AI artwork sold at a Christie's auction for over four hundred thousand dollars.
  • 🚀 Image generators such as DALL·E and Stable Diffusion let almost anyone create new artwork in seconds.
  • 🤖 These generative models learn to mimic styles, even those of specific artists, through a training process in which they analyze millions or even billions of images scraped from the web, along with text describing those images.
  • 🚫 Many artists never consented to their work being used in such image generators.
  • 👩‍🎨 Concept artist Carla Ortiz discovered her work had been included in an AI image dataset without permission, which she felt was a violation.
  • 📄 Carla and a group of artists filed a class action lawsuit against Stability AI and other AI image generator companies.
  • ⛔ Carla decided to remove as much of her work from the internet as possible, to keep it from being scraped into image datasets without her consent.
  • 🔍 Professor Ben Zhao and his lab at the University of Chicago developed a solution called 'Glaze', which exploits the large gap between how humans and machine learning models perceive images.
  • 🖼️ Artwork processed with Glaze looks almost unchanged to the human eye but is altered significantly for machines, preventing machine learning models from accurately mimicking the artist's style.
  • 🤔 Critics argue that AI art generators take inspiration by studying and learning from other works, much as human artists do, and that what they generate is not a simple copy.
  • 📋 Some artists are willing to let their work be used by AI image generators, but believe the process should be opt-in rather than opt-out.
  • 🔧 Although Adobe's new image generator Firefly claims to have been trained only on images from its stock library, Adobe contributors say they never explicitly agreed to that use.
  • 🕵️‍♂️ People are already trying to break Glaze, though Carla believes it can buy artists time until regulation and public awareness catch up.
  • ⏳ Carla hopes Glaze will not be the only such tool, and expects more to emerge to protect artists from non-consensual use of their work.

Q & A

  • What is the recent development in AI art that has raised concerns among artists?

    -AI art has made significant advancements, with AI-generated artworks selling for high prices at auction. The concern is that AI models behind generators like DALL·E and Stable Diffusion can mimic the styles of specific artists without their consent, having ingested millions of images from the web.

  • How does an AI image generator learn to mimic the styles of artists?

    -AI image generators learn to mimic styles through a process called training, where they ingest millions or even billions of images scraped from websites, combined with text describing the images, to create a dataset that allows them to generate images from a simple text prompt.

  • What action did Carla Ortiz and other artists take in response to their art being used in AI image datasets without consent?

    -Carla Ortiz and a group of other artists filed a class action lawsuit against Stability AI and other AI image generators to address the unauthorized use of their art in AI datasets.

  • What is the solution proposed by Professor Ben Zhao and his lab to protect artists' work from being used in AI image generators?

    -Professor Ben Zhao and his lab at the University of Chicago have developed a solution called 'Glaze'. Glaze uses the differences in how humans and machine learning models perceive visual images to make subtle changes to the artwork that are almost imperceptible to humans but significantly alter how a machine sees it.

  • How does the 'Glaze' technology help artists protect their work online?

    -Glaze allows artists to make their work publicly available online while preventing AI models from accurately learning and mimicking their style. The technology adds imperceptible changes to the images that confuse AI models, causing them to learn an incorrect style.

  • What is the counter-argument from critics regarding AI art generators and the use of existing artwork?

    -Critics argue that AI art generators are taking inspiration in a similar way to how humans do, by studying and learning from other pieces. They assert that the AI is not creating copies but is learning and evolving, which is a natural part of the creative process.

  • How are companies like Stability AI responding to the concerns raised by artists?

    -Stability AI has stated that their new generators will be opt-out, meaning artists will have the choice to exclude their work from being used in the AI datasets. Adobe, with its new image generator Firefly, has also trained its models only on images from its stock library, although contributors have expressed concerns about the clarity of their agreements.

  • What is the potential impact of 'Glaze' on the future of AI art and artists' rights?

    -Glaze could potentially buy artists some time to protect their work from being misused by AI art generators. It also raises awareness and could spur the development of more tools to protect artists' rights, as well as the regulation and public understanding needed to ensure that AI tools are developed ethically and with artists' consent.

  • What are the current efforts to break the 'Glaze' technology?

    -People on the internet are already attempting to break Glaze and get around its protective measures. While it may not be a permanent solution, the hope is that it provides temporary relief and encourages the development of more robust tools.

  • Why is it important for the public and regulators to be involved in the development of AI art generators?

    -Public and regulatory involvement is crucial to ensure that AI art generators are developed responsibly, with respect for artists' rights and intellectual property. It helps to create a balance between technological advancement and ethical considerations.

  • What is the role of informed consent in the use of artists' work for AI training?

    -Informed consent is essential to respect the rights of artists. It ensures that artists are aware of how their work is being used, have the opportunity to agree to its use, and can set boundaries on what uses are acceptable. This is a key issue in the debate over AI art generators.

  • How can artists protect their work from being scraped into AI image datasets without their consent?

    -Artists can take measures such as removing their work from the internet or using technologies like 'glaze' to alter their images in a way that prevents AI models from accurately learning their style. Additionally, they can advocate for clearer regulations and agreements regarding the use of their work in AI applications.

Outlines

00:00

🎨 AI Art Controversy and the Glaze Solution

The first paragraph discusses the recent advancements in AI art, highlighting the sale of an AI artwork at a Christie's auction for a substantial sum. It raises concerns about the use of artists' work in AI image generators without their consent. The paragraph introduces Carla Ortiz, a concept artist whose work was used in AI datasets without permission. The narrative then shifts to Professor Ben Zhao's development of 'Glaze', a tool that subtly alters images to prevent AI from accurately mimicking an artist's style. The effectiveness of Glaze is demonstrated, and the paragraph concludes with the ongoing debate about AI's use of existing artwork and the need for regulation.

05:01

🛡️ The Future of Artistic Protection and Regulation

The second paragraph focuses on the potential of tools like glaze to provide temporary relief for artists facing AI art theft. It emphasizes the hope that such tools can buy time for the development of regulations and public awareness. The paragraph also touches on the necessity of recognizing the value of artists' work, which has been used without their consent to train AI models. It concludes by stating that while AI art is likely here to stay, the involvement of regulators, artists, and an informed public is crucial to ensure these tools are developed ethically and responsibly.

Keywords

AI art

AI art refers to the creation of artwork using artificial intelligence, particularly through image generators that can mimic styles of various artists. In the video, it is highlighted that AI art has made significant advancements, with one piece selling for over four hundred thousand dollars at Christie's auction in 2018. The theme revolves around the ethical and legal implications of AI art, especially concerning the consent of the original artists whose styles are being replicated.

Image generators

Image generators are AI tools that can produce new images based on text prompts. They are trained on vast datasets of images and associated text descriptions, enabling them to mimic styles and create new pieces of art. The video discusses the controversy surrounding these tools, as they often use artists' work without their consent, leading to legal and ethical concerns.

Training (in AI)

Training in the context of AI refers to the process where AI models learn from a dataset. For image generators, this involves ingesting millions or billions of images and text descriptions to understand and replicate various styles. The video points out the issue of artists' work being used in these training datasets without their permission.

Art theft

Art theft, as discussed in the video, is the unauthorized use of an artist's work, which is a significant concern with AI image generators. The video mentions that artists like Carla Ortiz have had their work scraped into AI datasets without their consent, which they consider a form of art theft.

Class action lawsuit

A class action lawsuit is a type of legal action in which a group of people with similar claims against another party come together to sue as one. In the video, Carla Ortiz and other artists have filed a class action lawsuit against AI image generators like Stability AI for using their work without consent.

Glaze

Glaze is a solution developed by Professor Ben Zhao and his lab at the University of Chicago. It is designed to protect artists' work from being used by AI image generators. Glaze makes subtle changes to images that are imperceptible to humans but significantly alter how a machine learning model interprets the image, thus preventing the model from accurately learning the artist's style.
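The core idea behind such cloaking tools can be sketched in a few lines: perturb every pixel of an image, but cap the change so tightly that the eye cannot notice it while a model's feature extractor still can. The sketch below is only a toy illustration of an L∞-bounded perturbation, not Glaze's actual algorithm (which optimizes the perturbation against a model's style features); the function name and the random "style-shifting" direction are hypothetical stand-ins.

```python
import numpy as np

def cloak(image: np.ndarray, perturbation: np.ndarray, eps: float = 0.03) -> np.ndarray:
    """Apply a perturbation to an image, clipped so that no pixel moves
    by more than `eps` (on a 0..1 scale) -- small enough to be nearly
    invisible to a human viewer."""
    delta = np.clip(perturbation, -eps, eps)   # enforce the per-pixel budget
    return np.clip(image + delta, 0.0, 1.0)    # stay inside the valid pixel range

# Toy demo: random noise stands in for the carefully optimized
# style-shifting direction a real cloaking tool would compute.
rng = np.random.default_rng(0)
art = rng.random((64, 64, 3))                  # stand-in for an artwork
cloaked = cloak(art, rng.normal(0.0, 1.0, art.shape))

# Every pixel differs from the original by at most eps.
print(float(np.max(np.abs(cloaked - art))))
```

The interesting (and hard) part, which this sketch omits, is choosing the perturbation direction: a real tool searches for the small change that maximally shifts the image's representation inside a machine learning model, which is why the result fools the model but not the viewer.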

Opt-in and opt-out

Opt-in and opt-out refer to the consent mechanisms for using services or data. An opt-in process requires explicit consent from individuals before their data can be used, while an opt-out process allows data use by default and individuals can choose to withdraw their consent. The video discusses the debate over whether artists should have to opt-in for their work to be used in AI training, with some AI companies adopting opt-out models.

Adobe Firefly

Adobe Firefly is a new image generator developed by Adobe. It is mentioned in the video as an example of an AI tool that has been trained only on images from Adobe's stock library. However, even in this case, there are concerns about whether the original contributors to the stock library explicitly agreed to their work being used for AI training.

Regulation

Regulation refers to the rules and oversight that govern certain activities or industries. In the context of the video, regulation is discussed as a necessary component to address the ethical and legal issues surrounding AI art and the use of artists' work without consent. The hope is that regulation will help protect artists' rights while allowing AI technology to progress.

Public awareness

Public awareness is the level of knowledge and understanding that the general public has about a particular issue. The video emphasizes the importance of public awareness in recognizing the implications of AI art on artists' rights and the need for consent. It suggests that an informed public can contribute to the development of appropriate regulations and ethical standards.

Machine learning models

Machine learning models are algorithms that enable computers to learn and make predictions or decisions without being explicitly programmed. In the context of AI art, these models are trained on large datasets to mimic artistic styles. The video discusses how these models can be 'fooled' by the Glaze technique, which changes images in a way that is unnoticeable to humans but significantly affects how the model interprets the style.

Highlights

AI art has recently seen significant advancements, with one piece selling for over $400,000 at Christie's in 2018.

Image generators like DALL·E and Stable Diffusion allow anyone to create new art in seconds by mimicking the styles of specific artists.

These AI models are trained on millions of images and text descriptions, scraped from various websites without artists' consent.

Artists are now facing the issue of their art being used in AI image generators without their permission.

Carla Ortiz, a concept artist from San Francisco, discovered her art was used in an AI image data set without her consent.

Carla Ortiz and other artists filed a class action lawsuit against AI image generators like Stability AI.

To avoid unauthorized use, Carla Ortiz removed her work from the internet.

Researchers at the University of Chicago, led by Professor Ben Zhao, developed a solution called 'Glaze' to protect artists' work from AI.

Glaze works by making imperceptible changes to images that significantly alter how a machine sees them.

When an artist's work is glazed, AI models attempting to learn from it will fail to replicate the correct style.

Glaze is designed to buy artists time until regulations and public awareness can catch up with the technology.

Critics argue that AI art generators are taking inspiration in the same way humans do by studying and learning from other pieces.

Some artists are willing to let their work be used by AI image generators, but they prefer an opt-in process rather than opt-out.

Stability AI announced that their new generators will be opt-out, and Adobe's Firefly has only been trained on its stock library images.

Adobe contributors have expressed concerns that the usage of their images for training AI was not explicitly agreed upon.

People are already attempting to bypass Glaze, indicating the ongoing challenge of protecting artists' work in the digital age.

The video emphasizes the importance of regulation, artist input, and an informed public to ensure ethical development and use of AI tools.