Imagine dedicating years to honing your craft, only to find that your expertise might be used—without your consent—to train your potential replacement. That was a concern raised by leaders of the Writers Guild of America (WGA) when the union went on strike over artificial intelligence (AI) and compensation issues for almost 150 days in 2023. The union objected to AI models being trained on its members' creative content and to the potential for AI usage to distort authorship, compensation, and credit. Its members viewed this use of AI as theft, plain and simple.
Screenwriters are not alone in this concern. Creative artists of all stripes have come out almost uniformly against the use of generative AI, raising concerns that range from the ethics and legality of training models on copyrighted work to the very definition and value of art. The Graphic Artists Guild has issued multiple statements of concern about AI image generators. The Authors Guild, along with novelists such as John Grisham and George R. R. Martin, filed a copyright infringement lawsuit against OpenAI in late 2023. When the Screen Actors Guild–American Federation of Television and Radio Artists (SAG-AFTRA) settled its strike against the studios at the end of 2023, the agreement covered the use of digital replicas and synthetic performers.
To gain direct insight into how the entertainment workforce is thinking about AI, RAND hosted a live Q&A with Meredith Stiehm, president of the Writers Guild of America West (WGAW), and Laura Blum-Smith, the guild's senior director of research and public policy.
Current State of AI in the Entertainment Industry
The use of generative AI in the entertainment industry is fairly nascent, at least in the case of scriptwriting. When the WGAW surveyed its members in May 2022, AI wasn't on most writers' radar. But that was six months before ChatGPT (3.5) was released to the public. In the member meetings leading up to the triennial negotiations in spring 2023, however, a growing drumbeat of concern led the guild to make AI a central piece of its proposed contract revisions.
In contrast, the entertainment industry has been using nongenerative AI for years. AI is used in video editing, visual effects, sound selection, and even in the recommendation engines driving the demand for content on streaming platforms. Generative AI is slowly making its way into Hollywood as well.
Areas of Concern: Intellectual Property, Model Training, and Worker Replacement
The primary concern raised in the conversation was whether the use of scripts to train new AI models is ethical and legal.
Some in the RAND audience argued that all creative enterprises are inspired by prior work: She's the Man is an adaptation of Twelfth Night, Clueless was inspired by Emma, and parts of Mr. Robot allude to Fight Club. As Mark Twain — a staunch defender of copyright protections for authors — said, “There is no such thing as a new idea…We simply take a lot of old ideas and put them into a sort of mental kaleidoscope.” Is the human “mental kaleidoscope” distinct from the AI's digital kaleidoscope, and thus deserving of protection?
Reuse has been an integral part of writers' compensation for decades. Stiehm explained to the audience that screenwriters are paid for their original script and then receive additional compensation if a show is re-aired, sold for home video, or relicensed to another distributor or market. Training AI models on creative content could be construed as a similar reuse…except that, as the WGA pointed out, writers aren't cut in on the potential proceeds.
And then there's the classic AI concern: Will it put me out of a job? Currently, though, Stiehm is not concerned for writers, whose creativity and inventiveness are their stock-in-trade. “Writing, like any art form, is based on a lived human experience, an emotion. Our whole job is to react in a human way…AI can't bring lived experience.”
Blum-Smith concurred, focusing more on the use of content in AI model training. “At this point, writers are very skeptical of the technology and [their] position is that [AI-generated content] is based on theft of their work.”
The Strike and Negotiations
When the WGA put AI on the negotiating table, the studios initially refused to discuss any concessions. This impasse, as well as other issues, led to a strike authorization.
Striking isn't a tool the guild uses lightly, Stiehm said. “Five months is a long time” to be out of work. But the strike was generally successful. “There was a lot of fear and what-if, but what we got was quite good,” she said. “We were the first union that had to negotiate our contract and figure out what those protections and stopgap measures should be.”
The guild negotiated a guarantee that an AI-generated script won't be handed to a writer to “rewrite,” a practice that would have paid the writer the lower rewrite fee rather than the higher fee for an original script. For purposes of compensation and credit, AI cannot be an author. The studios must also be transparent about their AI use.
The guild did not, however, win a prohibition on the studios using writers' prior work (scripts written for the studios) to train AI models. That will be for the courts to decide. (So far it is third-party AI companies, rather than entertainment companies, that are doing the mass scraping of copyrighted material.)
As to whether these protections are enough, only time will tell. Blum-Smith noted that this is why the guild renegotiates every three years: it has always needed to keep ahead of technological change (e.g., home video, streaming), and it frequently sets the contract standard for others in the industry.
Lessons for Other Industries
The WGA's actions are helping to shape the AI playbook for workers outside the entertainment industry. Stiehm participated (alongside tech industry leaders) in an AI insight panel organized by U.S. Sen. Chuck Schumer. Her remarks called for legislation to complement the concessions won by the guild and to protect impacted workers in industries without collective bargaining.
Indeed, Blum-Smith attributed the writers' success to the strength and unity of the guild. “This industry is heavily unionized. Workers should use their collective voice to help control how AI is going to be used such that it impacts their work.”
At the RAND event, the guild leaders pointed to proactive monitoring of technological developments as an essential strategy for workers in other industries. Future-proofing its contracts is crucial on the WGA's three-year cycle for negotiations; it might be even more so for industries with longer contracts and thus fewer opportunities to modify terms.
The 2023 WGA strike and contract offer a potentially important case study for how AI can be navigated by workers and industry leaders, even before any threat to jobs is imminent. The WGA's strategy of identifying AI's potential impact on its members and establishing clear guidelines for AI use in the industry could serve as a blueprint for other sectors grappling with the technology's implications. Such a strategy can pave the way for continued—but regulated—incorporation of AI into the workplace.