In our Spring 2023 issue of the magazine, we shared information on a product called Glaze that helps ensure AI scanning technologies cannot easily read the photographs on your website. This was a great first step toward defending artisans' work from AI mimicry.
If you haven't already read about Glaze, this article will help you learn more about the tool. Since deploying the initial software, the researchers have been gathering early feedback from artisans. We even tried an early version of the product ourselves to see how it worked.
While the first software was a step in the right direction, it required a computer with a powerful processor. That is fine if you edit photos while working at your computer, but it doesn't work for the many artisans who capture and upload their images from a mobile phone.
The team that originally produced Glaze has released a new update to the product. “We deployed WebGlaze, a free web service that artists can run on their phone, tablet, or any device with a browser to have their art be glazed on GPU servers we pay for in the Amazon AWS cloud. Like the rest of Glaze, WebGlaze is paid for by research grants to ensure it is free for artists.”
How do you get access to this new product? Go to the Glaze site and request an invitation. Invitations do expire, so act right away once yours arrives!
If Glaze doesn’t stop AI, maybe it’s time for some Nightshade
It's no secret that there has been an explosion of AI products introduced into the market. Many artisans have real concerns that their images are being used to feed AI models. The AI industry is still in its infancy, without regulations or rules enforcing any kind of behavior standards. In the absence of legislation regulating this activity, others are fighting back with technologies that make it harder for AI to scrape and steal designs. This is particularly important because artisans are not being compensated for the very designs that are being fed into most AI algorithms.
The team behind Glaze is back with a new product that should be even more helpful. This new product is called Nightshade, and it is powerful in new ways. Nightshade is intended to disrupt AI models built on unlicensed data. To use a food analogy, it messes with the model's whole batch of flavors. It's like adding hot sauce to a mild, creamy cheese: two things that simply don't belong together.
How does it work? Nightshade takes what looks like a normal image and subtly distorts it, so that an AI tool performing unauthorized scanning receives a very different image from the one a person would see on the website.
“For example, human eyes might see a shaded image of a cow in a green field largely unchanged, but an AI model might see a large leather purse lying in the grass. Trained on a sufficient number of shaded images that include a cow, a model will become increasingly convinced cows have nice brown leathery handles and smooth side pockets with a zipper, and perhaps a lovely brand logo.”
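For readers curious about the mechanics, here is a toy sketch in Python of the general principle: nudging an image's pixels by an amount too small for a person to notice, while still changing the data an AI model reads. This is only an illustration under our own assumptions, not the actual Nightshade code; the file names are made up, and the real Nightshade perturbation is carefully optimized rather than random.

import numpy as np
from PIL import Image

def add_small_perturbation(path_in, path_out, strength=3):
    # Load the photo as an array of pixel values (0-255 per color channel).
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)

    # Random noise limited to +/- `strength`, far too faint for the eye to notice.
    # The real Nightshade computes a carefully optimized perturbation instead,
    # which is what steers an AI model toward "seeing" something else entirely.
    noise = np.random.randint(-strength, strength + 1, size=img.shape)

    # Keep values in the valid 0-255 range and save the "shaded" copy.
    perturbed = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(perturbed).save(path_out)

# Hypothetical file names, echoing the cow-in-a-field example above.
add_small_perturbation("cow_in_field.jpg", "cow_in_field_shaded.jpg")

The takeaway for non-programmers is simply that the change is made to the pixel data itself, so it travels with the image wherever it is copied.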
We will continue to bring you updates as these tools develop, as the team at the University of Chicago plans to create a version of the software that combines both Glaze and Nightshade.