#9: There is No Ethical Use of Any OpenAI Tool
Ars Electronica's award of an OpenAI-commissioned project, and where resistance is needed.
“The rise of consensual AI made the old devices like us defunct. The consensual AI smart speaker made transparency central to its design. It was built to be user-serviceable, repairable, with a clear supply chain for its components. The screws were not hidden, like in my design. Nor did you void the warranty as soon as you tried to undo the screws, also like me. The voice assistant service was paid for by subscription – just a few pounds every month, but it meant that your data was kept away from the advertisers and data brokers. The company was a co-operative, meaning that owners of the devices had a say in the direction and fortunes of the company. Any user could request access to the stored information in the company’s databases on them, including voice recordings and training data. These were all simple things, really. The idea of paying for a service, and that giving you certain rights over your own information, was not new. Nor was the idea of a co-operatively run business, there were plenty of those. But people had gotten so used to being told that if they wanted the new technology they had to accept it in the form it was given to them, the form that denied them rights, that kept them from their own data, that treated them as a resource to extract capital from. That shift in perception, of what people were willing to exchange for the promise of new technology, was the end of things like us.”
Voice: ‘Alexa’, via AmazonVoiceServices API.
Soundscape: Extracted + disassembled + processed + recomposed ‘Alexa’ voice.
Script: Wesley Goatley, from Newly Forgotten Technologies.
This week, the influential international digital arts festival Ars Electronica awarded its Golden Nica Prize in AI+Arts to Paul Trillo for a music video produced using OpenAI’s currently unreleased text-to-video tool Sora. In the opening of his award acceptance statement, Trillo describes the work as “a groundbreaking project, being the first full generative video made with OpenAI's SORA text-to-video model”, highlighting his privileged access to a much-hyped tool that is not available to the wider public. As the statement continues, Trillo repeats familiar strategic misrepresentations of AI tools as having human-like intelligence and capacities, describing the Sora tool as possessing “dream logic”, producing “dream-like” outputs, and claiming that the “machine’s hallucinations…resembl[e] a dream”. As I’ve covered before, these are narratives common to governments, corporations, militaries, and some artists, which serve both to manipulate public understanding of AI technologies and to reinforce many of their most violent and exploitative applications. OpenAI are, unsurprisingly, among those who amplify such narratives, referring to Sora as “understanding” and “creating” at multiple points in their promotional materials for the tool, with the latter echoed by Trillo in his statement.
Beyond these misrepresentative narratives, there are other worrying cultural and political questions surrounding both Trillo and Ars Electronica’s relationship to OpenAI. Trillo has stated elsewhere that OpenAI approached him with an offer of access to the Sora tool, which is otherwise completely closed both to the public and to those investigating Sora’s training data and its implications for privacy. Trillo vaguely describes the video as a ‘commission’ in the award statement without specifying that it was in fact commissioned by OpenAI, rather than by Ars or by Washed Out (the musician for whom this music video is ostensibly made, and who is mentioned only once in Trillo’s statement). OpenAI uses quotes from Trillo discussing the affordances of Sora as promotional material on their website for the tool, describing him as helping to ‘improve’ their model, though the details of this improvement work are unclear. Troublingly, there is also no information on who the jury was for the AI+Arts prize, which is the only prize in the festival for which there is no named jury. Ars Electronica do not provide a list on their website of the “hundreds of sponsors” that fund the festival every year, making it unclear whether OpenAI are one of these sponsors, or whether this influenced the outcome of this award. It is notable that Ars retweeted an article about Sora’s launch in February (several weeks before the submission deadline for the prize), from an account otherwise largely dedicated to promoting Ars-specific events and news.
Regardless of the justification, the result is that Ars have awarded one of their most prominent prizes (and its cultural and creative legitimacy) to a work commissioned by, and actively promoting, a company implicated in the mass capture and exploitation of the work of millions of artists, non-consensually and from all around the world, to train Sora and their other generative AI products. Multiple ongoing lawsuits are pursuing OpenAI over this, with the company admitting that the tools they have designed are impossible to train without using huge amounts of personal and copyrighted material, and attempting to defend this by arguing that it constitutes ‘fair use’. This contentious context is acknowledged by Trillo himself, who makes a single reference in his statement to the fact that “there are valid criticisms regarding how these models were built on stolen data and the impact of AI on the creative industries”. This weak attempt at acknowledging the ethical implications of working with OpenAI and their tools demonstrates that both Trillo and Ars Electronica are well aware of OpenAI’s actions. Apparently these ‘valid criticisms’ are not valid enough to impinge upon Trillo’s explicit and lengthy praise of OpenAI’s work, or to prevent him undertaking a public partnership with the company, or to stop Ars Electronica awarding a project that so transparently celebrates the fruits of OpenAI’s theft from artists.
OpenAI’s mass exploitation of creative labour is not the only reason that they cannot be trusted and should not be supported or endorsed. Until January 9th 2024, the company had a long-running clause in its terms and conditions prohibiting use of its tools for “military or warfare” purposes. This was quietly removed on January 10th, shortly before the company announced a partnership with DARPA, the US military’s research arm. Since then, they have appointed retired General Paul Nakasone to the OpenAI board of directors; Nakasone was previously installed by Donald Trump as head of the US National Security Agency (NSA) and commander of US Cyber Command. While OpenAI are far from the only tech company with connections to the military, few have pivoted so quickly from promising not to work with the military to having an ex-General and former head of the NSA on the board within six months. This is a company not only entangled with the military-industrial complex, but actively embracing it.
Most troubling for creative practitioners who may have moral or ethical objections to supporting such a company is that every use of OpenAI’s platforms provides unintentional and unpaid work for them. For example: entering a prompt or uploading an image, making in-platform edits or refinements, choosing to save or discard the final outcome, and the text/image/video outcome itself are all data points that signal the accuracy of the system and how satisfied the user was with it, and that can be used to further train their models. OpenAI’s open-ended rights to exploit this data for these and other unspecified ends are made clear in their terms of use for ChatGPT and DALL-E. We can assume that Trillo had to accept similar terms of use for Sora. Through these forms of feedback, the user performs digital labour that is not only unpaid, but actively serves to enhance OpenAI’s capacity to further establish market dominance and concentrate political and economic power around themselves. Nobody is exempt from this: Trillo himself has clarified that OpenAI did not pay him for this work, even though OpenAI state that his work on or with the tool ‘improved’ it. It seems that nobody can escape OpenAI’s exploitation of creative labour.
If you object to OpenAI’s practices, its ongoing exploitation of creative labour, and its entanglement with the military-industrial complex, then you should not give them any of your data or creative labour; as OpenAI have stated, their tools literally do not function without it. And since every use of OpenAI’s tools extracts this data and exploits this labour, there can be no ethical use of their tools. Given this, the positive promotion of OpenAI or their tools by artists or organisations is an endorsement, tacit or explicit, of the company’s actions and practices. We do not want to live in a world where tech demos and unpaid labour for those in power are seen as a route to success for young artists, which is exactly the message this award projects. To avoid this, we should not only critique OpenAI, but also call to account those who knowingly and eagerly lend their creative labours to them.