Lauren Leffer: Generative artificial intelligence tools can now instantly produce images from text prompts. It’s neat tech, but it could mean trouble for professional artists.
Rachel Feltman: Yeah, because those AI tools make it really easy to instantly just rip off someone’s style.
Leffer: That’s right, generative AI, which is trained on real people’s work, can end up really hurting the very artists who enable its existence. But some have started fighting back with nifty technical tools of their own.
Feltman: It turns out that the pixel is mightier than the sword. I’m Rachel Feltman, a new member of the Science, Quickly team.

Leffer: And I’m Lauren Leffer, contributing writer at Scientific American.
Feltman: And you’re listening to Scientific American’s Science, Quickly podcast.
[Clip: Show theme music]
Feltman: So I have zero talent as a visual artist myself, but it seems like folks in that field have really been feeling the pressure from generative AI.
Leffer: Absolutely, yeah. I’ve heard from friends who’ve had a harder time securing paid commissions than ever before. You know, people figure they can just whip up an AI-generated image instead of paying an actual human to do the work. Some people even use AI to overtly copy specific artists’ styles. But there’s at least one tiny spot of hope. It’s this small way for artists to take back a scrap of control over their work and digital presence.
Feltman: It’s like a form of self-defense.
Leffer: Right, let’s call it self-defense, but it’s also a little bit of offense.
It’s a pair of free-to-use computer programs called Glaze and Nightshade, developed by a team of University of Chicago computer scientists in collaboration with artists. Both tools add an algorithmic cloak on top of a digital image that changes how AI models interpret the picture but keeps it looking basically unchanged to the human eye.
Feltman: So once you slap one of these filters on your artwork, does that make it effectively off-limits for AI training?