Every week, Artnet News brings you The Gray Market. The column decodes important stories from the previous week—and offers unparalleled insight into the inner workings of the art industry in the process.
This week, more proof that not all art is created equal…
HEARING TEST
Contemporary artists are no strangers to the idea that, all too often, might makes right in politics. But a recent hearing in the U.S. Senate underscores how much more motivated federal lawmakers are to protect the most popular artists than the ones who most need a shield. The wide gap between how these two groups are treated signals bad things to come for visual artists as technology intrudes further and further into their area of expertise.
Many readers will be happy to learn that, on some level, they can blame Taylor Swift for this insight. On Tuesday, January 24, the Senate Judiciary Committee convened a day-long hearing to investigate the November 2022 meltdown of Live Nation–Ticketmaster’s sale of seats for Swift’s first tour since 2018. The proceedings included a grilling of executives from the concert-promotion and ticket-sales giant, as well as testimony from multiple witnesses aimed at teasing out whether Live Nation–Ticketmaster has been wielding monopoly power to hurt competitors, artists, and the public alike, largely by charging exorbitant prices for substandard service.
The most remarkable aspect of the proceedings was the high level of cooperation on display between the two parties, whose members for once took turns taking swings at the same target. As Democratic senator Richard Blumenthal of Connecticut told Live Nation president and CFO Joe Berchtold during his time in front of the mic: “I want to congratulate and thank you for an absolutely stunning achievement: You have brought together Republicans and Democrats in an absolutely unified cause.”
Meanwhile, American visual artists and their reps are still waiting for either one of the nation’s political parties to defend their rights with a fraction of the same vigor.
It was already discouraging enough that lawmakers at the state and federal levels have shown little interest in establishing even something like the modest resale royalty provisions on unique physical artworks adopted by the U.K. and several European countries. (In 2018, an appeals court all but neutered the 1977 California Resale Royalties Act, the last domestic law to try to do so.) Now, everyday visual artists are watching lawmakers rally to the cause of a megawatt, multimillionaire pop star—even as new technologies chomp away at their own meager creative rights.
Over the holiday break, Shanti Escalante-De Mattei of ARTnews reported on the escalating number of NFT platforms that had been trying to juice trading by rolling back crypto-artists’ right to resale royalties, a right that had been touted for at least a year as evidence that NFTs were an evolutionary step forward for artist equity.
Then, last Thursday, Escalante-De Mattei wrote about why artists’ rising anxieties over A.I. image generators haven’t managed to elbow their way onto international policymakers’ agendas as of late January 2023. Leading the list of reasons are a greater perceived urgency around the biases embedded in “decision-making A.I.” (think: facial-recognition software used in policing) than the copyright weirdness of content-generating A.I.; and lawmakers’ fears of dooming their constituencies to runner-up status in a “global intellectual arms race” against China for A.I. dominance.
Those are both shrewd points. But the Live Nation–Ticketmaster hearing underscores to me that we should add a third to the list: So far, the outcry from visual artists against algorithmic image generators has been conspicuously lacking in star power. I’m increasingly convinced that it would take a Swift-like name entering the fray for lawmakers in the U.S. (if not elsewhere) to call A.I. developers in for questioning with anything resembling the same intensity.
At the same time, I’m becoming increasingly pessimistic about a move like that materializing. It’s true that the backlash against A.I. image generators and their developers has leveled up in real terms lately. In January, a trio of artists filed a class-action lawsuit against the corporate entities behind Midjourney, Stable Diffusion, and the less-known DreamUp engine. The plaintiffs allege that the companies violated copyright and unfair-competition laws by using their work without either permission or compensation as a part of the enormous dataset used to train their respective text-to-image generators.
The names of the artists who brought the suit—Sarah Andersen, Kelly McKernan, and Karla Ortiz—will be unfamiliar to members of the fine-art establishment. They make their living primarily as commercial artists and illustrators. While they’ve achieved enough notoriety within those niches for their names to be used as prompts in algorithmic image generators, they have nothing like the profile of a Taylor Swift among the mainstream public. (Financial stability has been elusive, too; McKernan said in a recent interview that she “barely [makes] rent most months.”) That fact should make no difference in court, but it makes all the difference in the world to lawmakers, particularly here in the U.S., where the line between electoral politics and online clout-chasing was smudged into oblivion years ago.
Last week’s Senate hearing on Live Nation–Ticketmaster illustrates the point. The company’s disastrous November sale of Swift’s upcoming tour tickets left millions of her fans out in the cold, even after many spent six hours or more in a virtual queue hoping to complete a purchase. (Berchtold, the Live Nation president and CFO, “largely pointed to an assault from bots… as the primary problem,” Ben Sisario of the Times wrote, “saying that the bots had even attacked Ticketmaster’s servers.”) The mass fury that followed gave congressmen a golden opportunity to lash out at an old punching bag, as Ticketmaster had been drawing accusations of monopolistic behavior for decades before its 2010 merger with Live Nation.
The political calculus is simple: every one of those disgruntled Swifties is a potential voter or donor, which means that being seen as striking back against their shared enemy could be hugely valuable to the career of any American congressman. No wonder several senators went out of their way to work (frankly, embarrassing) references to Swift’s lyrics into their commentary during the judiciary committee hearing. It was a great reminder that even if your interest in a cause is genuine, your pursuit of justice for said cause can still read as shameless pandering.
This same dynamic creates the central challenge facing anyone hoping for an equitable solution to the A.I. image-generator problem. The issue isn’t just that there is no art star with a fanbase approaching, in size or intensity of feeling, the millions of fans who felt wronged by the botched ticket sale for Swift’s “Eras” tour. It’s also that, so far, the closest equivalents have shown no interest in pursuing a resolution.
My take is that, outside our niche industry, the only artists’ names with enough popular appeal to penetrate Capitol Hill mostly belong to the deceased. Think: Jean-Michel Basquiat, Keith Haring, Andy Warhol. I reached out to the foundation or estate of each of the three to ask about the executors’ level of concern over unauthorized use of each artist’s works by algorithmic image generators. All three entities declined to comment or chose not to respond.
The reality is that it may not be worth the time, energy, or money required for the estates and foundations of these megawatt artists to lobby lawmakers about the harms of A.I. image generators—precisely because they don’t feel the harms as acutely as artists further down the fame hierarchy.
“The bread and butter of the major foundations is selling art,” said Maxwell Anderson, president of the Souls Grown Deep Foundation and Community Partnership. “Mid-level and entry-level artists don’t have that same consistent revenue stream. So when it comes to this topic, I don’t think that artists and their foundations would necessarily roust themselves from their comfort yet.”
Anderson and his organization have a unique perspective on the danger of A.I. image generators. Souls Grown Deep stewards the legacy of Black artists from the American South, a group whose contributions have been simultaneously exploited and undervalued for generations. In one sense, text-to-image generators are literalizing a process that has preyed on Black artists across genres for centuries, one in which their works, styles, and innovations are scooped up, deconstructed, and used as templates to produce derivative works that profit other (primarily white) entrepreneurs instead.
It makes sense, then, that Anderson said he is paying “rapt attention” to the debate around text-to-image generators. In his mind, however, the stakes are “not just purely about the topic of infringement, especially with respect to artists whose earnings power has been historically subjugated due to racism, geographical exclusion from the art market, and other factors.” They are about the urgent need for both the art world and the North American legal system to update their priors on what constitutes fair use of an artist’s original work in the first place, as well as how to enforce a contemporary revision of the concept.
The main problem is scale. Before machine learning and artificial intelligence came to prominence, fair-use violations were rare enough and discrete enough for artists and their agents to be able to identify and take action against them as each arose. Now, Anderson said, the Artists Rights Society (with whom Souls Grown Deep works closely) “can’t offer up a blanket protection or commitment to chase down these infringements individually or singly. There’s too much latitude in datasets of this size,” which swell to billions of images each. The old tools simply aren’t up to the new task.
Anderson gave the example of a recent collaborative project between generative artist Anna Lucia and the Gee’s Bend Quilters, executed through Souls Grown Deep and the Artists Rights Society. Despite all the work done by all parties involved to ensure equitable outcomes, he said, “Along come the training, harvesting, web-scraping sites, and they grab Anna Lucia’s work and off they go.”
The result is that groups of all kinds, from artists and their representatives to judges and lawmakers, are either scrambling to keep A.I. in their field of vision, or not even aware of how far behind it has already left them.
“I don’t think there’s an appreciation yet for the capacity of this technology to subsume creative rights,” Anderson said. “People are still catching up with NFTs. That’s the case for estates, foundations, museums, and others… We’re standing in a river, and the world is going to keep rushing past us if we stand still.”
If there is more motion among the art establishment, it’s happening behind closed doors. To try to gauge the level of alarm over A.I. image generators at the upper echelons of the U.S. art market, I contacted Maureen Bray, the executive director of the Art Dealers Association of America. My questions were whether the algorithms’ unauthorized use of artworks in their datasets had been discussed among ADAA membership and leadership, and whether the group (which has spent hundreds of thousands of dollars lobbying Washington lawmakers since 2018) had asked its advocates on Capitol Hill to raise this issue with elected representatives yet.
Through a spokesperson, Bray wrote the following: “This is a situation we will continue to monitor as it pertains to the interests of our membership here at the ADAA. One key concern that rises to the fore regarding A.I. image generators is the potentially skewed nature of the datasets that they draw from. Much of publicly available art historical imagery is that of white male creators, so if their work is to be overrepresented amongst such datasets and source materials, newly generated images will only recapitulate more of the same images, narratives, and styles (with all of those inherent biases).”
The statement is valid in one sense, and completely beside the point in another. I too want an art history that appropriately credits, studies, and contextualizes artists of all gender identities and ethnicities. But if one of the outcomes of constructing a more progressive art history is that artists of all gender identities and ethnicities get to have their work equally cannibalized without authorization or compensation by Silicon Valley developers, it seems like all that we’ll have done is to raise up these long-marginalized artists so they can be mugged just as easily as the old white guys. The fact that it’s impossible for the culture to do one without also doing the other might be the most revealing aspect of all about the interplay between A.I. and artwork.
Maybe lawmakers would be more help in solving this problem if the starriest visual artists and foundations around viewed it as an existential threat. For now, they don’t seem to see it that way. Their lack of concern over the issue all but guarantees that we won’t see a congressional hearing about A.I.’s endangerment of artists’ rights anytime soon. So if you’re hoping to hear, say, Utah Senator Mike Lee roast the founder of Stability A.I. with a Damien Hirst pun, don’t put the rest of your life on hold waiting.
Instead, keep your eye on the emerging and midlevel artists, the illustrators, the cartoonists, the commercial artists, and the other image-makers with the greatest need to be compensated for unlicensed use of their work. Right now, they appear to be some of the only ones in the game with their eyes on the ball—in large part because they don’t have the luxury to do anything else.
That’s all for this week. ‘Til next time, remember: in the art world, sometimes a double standard is the only standard.