As A.I. systems continue to play an increasingly prominent role in every industry, the cultural sector is looking for guidance on how to strategically integrate new technologies while safely navigating treacherous territory around ethics, data, and copyright. Serpentine’s latest report, Future Art Ecosystems: Art X Public AI (FAE4), brings new insight from experts and opens the floor for productive discourse around best practice.
Serpentine has long established itself as a forerunner in the field of art and advanced technologies with a public program that has featured artists like Ian Cheng, Danielle Brathwaite-Shirley, and Gabriel Massan. Last month, it kicked off 2024, its “year of A.I.,” with a new show by Refik Anadol, and this fall it will open a hotly anticipated exhibition by Holly Herndon and Mat Dryhurst, who are known for creating an A.I. recording system called Spawn that they refer to as an “A.I. baby.”
In 2020, Serpentine launched the FAE series with the aim of promoting public cultural infrastructure to support both artists and arts organizations. Now that the series is in its fourth year, we spoke to three of FAE4’s authors—Kay Watson, Eva Jäger, and Victoria Ivanova—about the new report’s key takeaways.
1. The Cultural Sector Can Shape the Future of A.I. for Public Good
“We see the art and advanced technology ecosystem as something that mediates between society and technology,” explained Ivanova, emphasizing that activity within the cultural sphere has broader significance for society. For this reason, the report contextualizes traditional arts institutions like libraries, museums, and galleries as existing within a larger ecosystem that also includes tech companies like Google DeepMind and Nvidia.
The report also emphasizes the importance of improved technical literacy. The section “Defining Public A.I.” dispels the common myth of A.I. as “a single, monolithic, godlike technology coming from Silicon Valley.” Instead, the report’s first chapter disambiguates the inner workings of A.I. “in a way that is both accurate technologically, but also legible to larger audiences.” This high-level understanding should encourage artists and institutions to play a more active role in wielding their “soft power” to shape the public’s relationship to advanced technologies.
“The idea that public A.I. is something that can’t happen because A.I. has already been created by Silicon Valley is the wrong narrative,” said Jäger. “We’re seeing more and more opportunities for new kinds of public participation, public governance, and public ownership.”
“The publication is advocating for the ecosystem of art and advanced technologies as a valuable site for experimentation and the testing of new ideas and new models,” added Watson. “We have inherent value, like academia or the tech sector, so we need to think about what our impact on the future of technology should be.”
2. Cultural Organizations Should Cooperate, Not Compete
Acknowledging that not every organization has the same mission, capacity, or resources, the report’s authors are advocating for a model where the knowledge acquired by organizations at the forefront of technological developments is shared with the wider cultural sphere. “We have to acknowledge that different actors in an ecosystem are able to do different things,” said Watson.
The objective is a non-competitive, collaborative approach inspired by the idea of “interoperability,” which in the world of tech means the ability of computer systems to exchange information with each other. An example in the case of A.I. would be universal policies and infrastructure for all museums, no matter how large or small, to develop their own datasets. Similarly, A.I. tools that could be widely adopted at an operational level are currently being developed as part of the U.K.’s five-year, £14.5 million ($18 million) research project Towards a National Collection.
3. The Cultural Sector Can Prototype New Models for Governing Data
Many in the artistic community have panicked over the sudden arrival of generative A.I. tools like Midjourney and OpenAI’s DALL-E, many of which were trained on data scraped from the internet without the creators’ consent. So far, the backlash has primarily been fought via a handful of class action lawsuits, but an optimistic spin on the issue suggests that now is the moment for collective bargaining.
“There has been a really interesting thing that happened where I.P. rights, when it comes to A.I., are very much being legislated through the cultural sector,” said Jäger. “We have to take advantage of this moment where people are looking to the cultural sector to set a standard, to self-litigate and decide how we want that economy to work.”
Serpentine’s exhibition with Holly Herndon and Mat Dryhurst this fall will present to the public a possible new model for licensing data to A.I. developers, called the Data Trust. A data trustee would be appointed to represent a group of data subjects, meaning people who have contributed their data, creating “a networked version of I.P. rights.”
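The report and exhibition describe the Data Trust as a governance model rather than a piece of software, but the pooling idea can be loosely sketched in code. In this illustrative Python sketch, all names, the contribution measure, and the pro-rata fee split are our own assumptions, not details from Serpentine or the artists:

```python
from dataclasses import dataclass, field

@dataclass
class DataSubject:
    """A contributor (e.g., an artist) who places work in the trust."""
    name: str
    works_contributed: int  # hypothetical measure of each member's share

@dataclass
class DataTrust:
    """A trustee pools members' data and licenses it collectively."""
    trustee: str
    members: list = field(default_factory=list)

    def join(self, subject: DataSubject) -> None:
        self.members.append(subject)

    def license_fee_shares(self, fee: float) -> dict:
        """Split a licensing fee pro rata by works contributed (an assumed rule)."""
        total = sum(m.works_contributed for m in self.members)
        return {m.name: fee * m.works_contributed / total for m in self.members}

trust = DataTrust(trustee="appointed_trustee")
trust.join(DataSubject("artist_a", 30))
trust.join(DataSubject("artist_b", 10))
print(trust.license_fee_shares(4000.0))  # → {'artist_a': 3000.0, 'artist_b': 1000.0}
```

The point of the sketch is only the aggregation: no individual negotiates alone; the trustee licenses the pooled dataset and value flows back to members collectively, which is the “networked version of I.P. rights” the quote describes.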
The idea, which has been trialed in some other industries like medicine, was inspired by extensive discussions with civic tech organizations, policy makers advising the U.K. government on A.I. regulation, and other tech experts.
“As a public arts institution we are advocating for a data market where lots of artists can pool their interests together and advocate for the value of their work in aggregate,” said Jäger. “It’s going one step beyond the litigation cases that you hear about because even if they win, what’s the economy they’re advocating for?”