Nightshade: A new weapon for artists against AI scraping
A new tool that lets artists protect their work from being used by AI companies without their approval has received a surprising 250,000 downloads in just five days. The tool, called Nightshade, was created by a team of computer science researchers at the University of Chicago and released on January 18, 2024.
Nightshade works by adding imperceptible changes to the pixels of images that can seriously damage image-generating AI models trained on them, for example causing models to render dogs as cats or cars as cows. The tool is intended to protect artists’ intellectual property and creative style from exploitation by AI companies that scrape artworks from the web to train their models.
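The article does not detail Nightshade's actual algorithm, but the general idea behind this family of data-poisoning attacks can be illustrated with a toy sketch: nudge an image's pixels, within a tight imperceptibility budget, so that a feature extractor sees it as something else. Everything below is hypothetical, using a random linear map in place of a real model's deep feature extractor:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: a 16x16 grayscale "artwork" and a random linear
# feature extractor W (a real attack would target a deep network).
image = rng.random((16, 16))
W = rng.standard_normal((8, 16 * 16))  # hypothetical feature map

def features(img):
    return W @ img.ravel()

# Feature vector of an unrelated concept (e.g. "cat" instead of "dog").
target = rng.standard_normal(8)

eps = 4 / 255     # L-infinity budget: keeps the change imperceptible
step = 0.5 / 255  # per-iteration pixel step size
poisoned = image.copy()

# Gradient descent on ||features(poisoned) - target||^2, projected
# back into the eps-ball around the original image after each step.
for _ in range(200):
    grad = 2 * W.T @ (features(poisoned) - target)  # analytic gradient
    poisoned -= step * np.sign(grad.reshape(16, 16))
    poisoned = np.clip(poisoned, image - eps, image + eps)
    poisoned = np.clip(poisoned, 0.0, 1.0)

# Pixels barely move, but the feature representation drifts toward
# the unrelated concept, which is what confuses a model at train time.
print(float(np.abs(poisoned - image).max()))
```

A model trained on many such images would associate the artist's visual style with the wrong concept, which is the effect (dogs looking like cats) the researchers describe.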
The creators of Nightshade say they were motivated by recent lawsuits filed by artists against AI companies such as OpenAI, Meta, Google, and Stability AI, which allegedly used their artworks without permission or compensation. By using Nightshade, artists can increase the cost of training on unlicensed data and make licensing images from their creators a practical alternative.
The demand for Nightshade has been overwhelming, according to Ben Zhao, the leader of the project and a professor of computer science at the University of Chicago. “I expected it to be extremely high enthusiasm. But I still underestimated it…The response is simply beyond anything we imagined,” he wrote in an email to VentureBeat.
Nightshade is not the only tool developed by Zhao’s team to defend artists against AI scraping. They also created Glaze, a tool that lets artists disguise their style by modifying image pixels in subtle ways that are invisible to the human eye but cause machine-learning models to interpret the image as something different from what it shows. Glaze has received 2.2 million downloads since its release in April 2023.
Zhao’s team plans to integrate Nightshade into Glaze and give artists the option of whether to use the data-poisoning tool. They also intend to make Nightshade open source, allowing others to modify it and build their own versions. The more people use it and create variants of it, the more powerful the tool becomes, Zhao says.
Nightshade and Glaze are examples of how artists can fight back against generative AI models that use their work without consent. They also raise ethical questions about the use and misuse of AI training data and the possible consequences of data poisoning for the future of AI research and development.