Art & Tech
Artists With Early Access to A.I. Tool Release Blistering Open Letter, Decrying ‘Art Washing’
The artists published a copy of the A.I. tool on Hugging Face, an open-source platform for machine learning and artificial intelligence.
Adam Schrader
A group of artists who received early access to test Sora, an artificial intelligence video generation tool, has leaked its code and released a blistering letter critical of its creator, OpenAI, the company behind ChatGPT. The artists claimed the firm had taken advantage of their unpaid or under-compensated labor to “art wash” its image. OpenAI has suspended access to Sora in response to the claims.
“Artists are not your unpaid R&D,” the artists said in the open letter. “We are not your free bug testers, PR puppets, training data, validation tokens.”
The letter, which opens with the address “Dear A.I. Corporate overlords,” was published on Tuesday with the leaked code for Sora on Hugging Face, an open-source platform for machine learning and artificial intelligence that provides tools for building and deploying A.I. models. The artists claimed the company, valued at $150 billion, failed to compensate them and engaged in practices such as censoring artistic content and gatekeeping access, with only around 300 selected artists allowed to use Sora.
“We received access to Sora with the promise to be early testers, red teamers, and creative partners,” the open letter reads. “However, we believe instead we are being lured into art washing, to tell the world that Sora is a useful tool for artists.”
OpenAI announced in May that it would provide early access to Sora for visual artists, designers, and filmmakers “to gain feedback on how to advance the model to be most helpful for creative professionals.” The platform would also be opened up for “red teaming,” a term borrowed from cybersecurity for practices used to test a product’s safety and security protocols. Though the practice has been widely adopted across the tech industry, many A.I. specialists have recently critiqued it, arguing that opaque rules at A.I. firms reduce transparency and corporate accountability.
The roughly 20 artists who penned the open letter to OpenAI echoed these sentiments. Only a handful of the hundreds of artists who used Sora were chosen through an internal competition to have the films they created screened. Those chosen received minimal compensation that they said “pales in comparison” to the marketing value generated by allowing them to use the tool.
“Furthermore, every output needs to be approved by the OpenAI team before sharing. This early access program appears to be less about creative expression and critique, and more about PR and advertisement,” the artists said in the letter, which has garnered over 500 signatures.
The artists noted that they are not against the use of A.I. technology to make art; they likely would not have been invited into the program by OpenAI if they were. But they said they are concerned with “how the tool is shaping up” ahead of a possible public release.
“We are sharing this to the world in the hopes that OpenAI becomes more open, more artist friendly and supports the arts beyond PR stunts,” the artists said, urging people to use open-source software instead of proprietary platforms like those of OpenAI.
The letter’s authors—among them Jake Elwes, Katie Peyton Hofstadter, and Solimán López—declined to comment further, citing concerns over potential repercussions.
A spokesperson for OpenAI addressed the controversy by email, noting that Sora is still in a research preview to balance its creative applications with the development of safety measures for broader use. They confirmed that hundreds of artists have been involved with the process.
“Participation is voluntary, with no obligation to provide feedback or use the tool. We’ve been excited to offer these artists free access and will continue supporting them through grants, events, and other programs,” the spokesperson said. “We believe A.I. can be a powerful creative tool and are committed to making Sora both useful and safe.”
Artists participating in the testing program have no obligations to the company beyond not sharing confidential details about its development with the public, and there are no explicit usage requirements for them. The company has also made efforts to provide artists with access and funding, and has appointed its first artist in residence, Alexander Reben.
Sora had already faced scrutiny after OpenAI’s former CTO, Mira Murati, avoided answering questions about whether the system was trained using YouTube videos. Murati, who resigned in September along with two senior research executives, told The Wall Street Journal in March that Sora would be available by the end of the year, but that “we will not release anything we don’t feel confident on.”
This article was updated on November 27 at 5:37 a.m. ET.