Can artists protect their work from AI? – BBC News
TLDR
AI art has made significant advances, with AI-generated works selling for hundreds of thousands of dollars at auction. However, this progress has raised concerns about artists' rights, as AI models learn to mimic styles by ingesting millions of images from the web without consent. Concept artist Carla Ortiz discovered her art was used in an AI dataset and, along with other artists, filed a lawsuit against AI image generators. To combat this, Professor Ben Zhao and his team at the University of Chicago developed a tool called 'Glaze', which subtly alters images to prevent AI from accurately learning an artist's style. Despite this promising solution, critics argue that AI generators are simply taking inspiration, as humans do, and companies are contesting the lawsuit. Some artists are open to their work being used with AI, but they prefer an opt-in approach. As AI art becomes more prevalent, regulation and public awareness are paramount to ensure these tools are developed ethically and with the consent of the artists involved.
Takeaways
- 🖼️ AI art generators like DALL-E and Stable Diffusion can produce art by mimicking styles from a dataset of images, often without artists' consent.
- 👩🎨 Carla Ortiz, a concept artist, found her work included in AI datasets without her permission, highlighting a widespread issue for many artists.
- 🔍 The models learn to create art by training on massive datasets containing millions of images scraped from the web, sometimes leading to unauthorized use.
- 🚫 Artists are taking defensive measures; Carla Ortiz, for example, removed her work from the internet to prevent further misuse in AI training datasets.
- ⚖️ A group of artists, including Carla Ortiz, filed a class-action lawsuit against AI image generator companies like Stability AI for using their work without consent.
- 🛡️ 'Glaze' is a technology developed to alter images slightly so they mislead AI training while remaining visually similar to the human eye.
- 🧑🎨 Critics argue that AI art generators mimic the natural human process of learning from existing art, asserting that these aren't direct copies.
- 🔄 Companies like Stability AI are moving towards 'opt-out' models for their AI generators, whereas Adobe ensures its Firefly generator only uses images from its stock library.
- 📡 Despite efforts like Glaze, people are already finding ways around these protections, suggesting that it may not be a permanent solution.
- 🌐 The issue requires regulation and public awareness to ensure the ethical use of AI in art, with artists consenting to and participating in the development of such technologies.
Q & A
What significant sale involving AI-generated art is mentioned in the script?
-The script mentions a significant sale of an AI-generated artwork that sold for over four hundred thousand dollars at a Christie's auction in 2018.
What technologies are cited as enabling almost anyone to create new art in seconds?
-Technologies such as DALL-E and Stable Diffusion are cited as enabling almost anyone to create new art in seconds.
What is the main ethical concern mentioned in the script regarding AI art generators?
-The main ethical concern mentioned is that many artists did not give their consent for their artwork to be used in training datasets for AI art generators.
Who is Carla Ortiz and how has AI technology impacted her?
-Carla Ortiz is a concept artist in San Francisco who discovered that her art had been used in an AI image dataset without her permission, leading her to remove her work from the internet to avoid further unauthorized use.
What legal action did Carla Ortiz and other artists take against AI image generators?
-Carla Ortiz and a group of other artists filed a class action lawsuit against Stability AI and other AI image generators.
What is 'Glaze' and how does it protect artists' works from AI exploitation?
-Glaze is a technology developed by Professor Ben Zhao and his lab at the University of Chicago. It alters images in subtle ways that are nearly imperceptible to humans but significantly distort how AI models perceive the images, thus preventing accurate replication of the artist's style.
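The script describes Glaze only at a high level and does not detail its actual algorithm. Purely as an illustration of the general idea it relies on, a perturbation that is tiny in pixel space but large in a model's feature space, here is a hedged Python/PyTorch sketch. The function name `cloak_image`, the use of an ImageNet ResNet-18 as a stand-in feature extractor, and all parameter values are assumptions for illustration, not Glaze's implementation.

```python
# Illustrative sketch only: a generic feature-space perturbation in the spirit of
# "small change to humans, large change to a model". This is NOT the Glaze
# algorithm; the model choice, loss, and parameters are hypothetical stand-ins.
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

def cloak_image(art_path, decoy_path, steps=100, epsilon=0.03, lr=0.01):
    """Nudge the artwork's features toward a decoy image's features while keeping
    every pixel change within +/- epsilon, so the edit stays subtle to humans."""
    # Stand-in feature extractor: ImageNet ResNet-18 with its classifier removed.
    extractor = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    extractor.fc = torch.nn.Identity()
    extractor.eval()
    for p in extractor.parameters():
        p.requires_grad_(False)

    def load(path):
        img = Image.open(path).convert("RGB").resize((224, 224))
        return TF.to_tensor(img).unsqueeze(0)  # shape (1, 3, 224, 224), values in [0, 1]

    original = load(art_path)
    with torch.no_grad():
        decoy_features = extractor(load(decoy_path))

    delta = torch.zeros_like(original, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        perturbed = (original + delta).clamp(0.0, 1.0)
        # Pull the perturbed image's features toward the decoy's features.
        loss = torch.nn.functional.mse_loss(extractor(perturbed), decoy_features)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)  # keep the change visually imperceptible

    return (original + delta).detach().clamp(0.0, 1.0).squeeze(0)
```

In this sketch the decoy image supplies the features the cloaked artwork is nudged toward; as the script notes, people are already finding ways around protections like Glaze, so any perturbation of this kind is a stopgap rather than a guarantee.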
How do AI art critics justify the use of artists' work without explicit consent?
-Critics argue that AI art generators draw inspiration much as humans do when studying and learning from other artworks, suggesting that the outputs are not direct copies but rather inspired creations.
What changes have companies like Stability AI and Adobe made in response to concerns over the use of artists' works?
-In response to concerns, Stability AI has stated that their new generators will be opt-out, while Adobe claims its new image generator, Firefly, has only been trained on images from its stock library.
What ongoing challenges do technologies like 'Glaze' face according to the script?
-Technologies like 'Glaze' face ongoing challenges, such as efforts by people to circumvent or 'break' the protections they offer, indicating that these solutions may not be permanently effective against all forms of attack.
What broader impacts does Carla Ortiz hope for with tools like 'Glaze'?
-Carla Ortiz hopes that tools like 'Glaze' will buy artists time for regulation and public awareness to catch up, helping to protect artists' rights and ensure that revolutionary tools are developed in conjunction with those who create the art that inspires them.
Outlines
🎨 AI Art Controversy and the Rise of Glaze
The first paragraph discusses the recent advancements in AI art, highlighting the sale of an AI artwork for over $400,000 at Christie's in 2018. It explains how AI models, such as DALL-E and Stable Diffusion, are trained on millions of images and text descriptions to create new art from text prompts. The issue of artists not giving consent for their work to be used in AI training is raised, with Carla Ortiz, a concept artist from San Francisco, discovering her art was scraped into an AI dataset without her permission. The paragraph also introduces a solution called 'Glaze' developed by Professor Ben Zhao and his team at the University of Chicago, which modifies images in a way that is nearly imperceptible to humans but significantly changes how machines perceive them, thus protecting artists' styles from being replicated by AI.
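The video describes these generators only at a high level. As a hedged illustration of how such a text-to-image model is typically invoked in practice, here is a minimal sketch assuming the open-source Stable Diffusion weights and the Hugging Face diffusers library; the checkpoint name, prompt, and settings are illustrative placeholders, not details from the video.

```python
# Minimal sketch of text-to-image inference with an open Stable Diffusion
# checkpoint via the Hugging Face `diffusers` library. The checkpoint ID,
# prompt, and settings are illustrative assumptions, not from the video.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # hypothetical choice of checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes a GPU is available

prompt = "a castle on a cliff at sunset, digital painting"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("generated.png")
```

This one-prompt interface is what makes it trivial for anyone to request images "in the style of" a named artist, which is precisely the capability the lawsuit and tools like Glaze are responding to.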
🛡️ The Future of Artistic Protection and AI Regulation
The second paragraph focuses on the ongoing debate about AI art generators and the legal actions taken by artists. It mentions a class action lawsuit filed by Carla and other artists against AI image generator companies such as Stability AI. The paragraph also discusses the decision by artists to remove their work from the internet to prevent unauthorized use by AI systems. The potential of 'Glaze' as a temporary solution to protect artists' work online is explored, with the acknowledgment that it may not be infallible. The need for regulation and public awareness to ensure that AI tools are developed ethically and with the consent of the artists whose work inspires them is emphasized.
Keywords
💡AI art
💡Training
💡Image generators
💡Class action lawsuit
💡Glaze
💡Opt-in vs. Opt-out
💡Copyright
💡Concept artist
💡Machine learning models
💡Digital art
Highlights
AI art sold for over $400,000 at Christie's in 2018, highlighting the commercial value and impact of AI-generated artworks.
AI models like DALL-E and Stable Diffusion use large datasets of images, some of which are scraped from the web without artist consent.
Artists are concerned about AI generators using their styles and artwork without permission, raising ethical and legal issues.
Concept artist Carla Ortiz found her artwork scraped into an AI image dataset, leading to her involvement in a class action lawsuit.
Carla Ortiz has designed art for notable projects including Magic: The Gathering and Marvel's Doctor Strange.
In response to unauthorized use of her art, Carla decided to remove her work from the internet to prevent further misuse.
Professor Ben Zhao and his team at the University of Chicago developed 'Glaze', a tool designed to protect artists' work from AI mimicry without noticeably altering how the images appear to human viewers.
Glaze works by making subtle changes to the artwork that significantly mislead AI models while remaining almost imperceptible to humans.
Critics argue that AI art generators mimic the process of human inspiration and learning, not directly copying artworks.
Stability AI says its new generators will be opt-out, while Adobe's Firefly is trained only on its stock library, but concerns about explicit consent remain.
Despite efforts to secure artwork, people are already attempting to bypass protections like Glaze.
The evolving landscape of AI art calls for regulatory input, artist involvement, and public awareness to ensure fair and ethical use.
Carla Ortiz and other artists aim to use tools like Glaze as a temporary measure while pushing for more substantial legal and public solutions.
The controversy underscores the need for an opt-in model for AI image training, respecting artists' rights and contributions.
AI's potential in art continues to grow, making it essential to balance innovation with respect for intellectual property.