Hollywood writers strike to keep AI out of their industry
AI doesn't pose a serious threat to screenwriters—at least not yet.
The Writers Guild of America, which represents film and television writers, went on strike on Tuesday after failing to reach a new contract with Hollywood studios. The two sides are at an impasse on a number of issues, including writers’ demands for a larger share of streaming revenue. But the dispute that caught my eye was the writers’ demand for a ban on AI-generated scripts.
Rolling Stone has a great (paywalled) writeup of this aspect of the negotiations. According to a summary obtained by Rolling Stone, WGA is asking studios to agree that “AI can’t write or rewrite literary material,” that AI-generated content “can’t be used as source material,” and that the writers’ work “can’t be used to train AI.” The studios reportedly rejected these demands, offering only that the two sides could meet once a year to “discuss advancements in technology.”
A useful distinction to make here is between material written by AI and material written with AI. The latter is already becoming commonplace in the software industry. Software products like GitHub Copilot allow programmers to describe a function in plain English and have the corresponding code generated automatically. The Copilot-generated version may contain mistakes, but it’s often faster to fix those mistakes than to write a new function from scratch.
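As a rough illustration of that workflow (a sketch of the general shape, not GitHub Copilot’s actual interface or output), a programmer might write just the plain-English comment and function signature below and let the tool propose the body, which they then review before accepting:

```python
# Programmer's plain-English prompt: "Return the median of a list of numbers,
# raising ValueError if the list is empty."
def median(values):
    # The body below is the kind of completion an AI coding assistant might
    # suggest; the programmer still has to verify it handles edge cases.
    if not values:
        raise ValueError("median() arg is an empty sequence")
    ordered = sorted(values)
    mid = len(ordered) // 2
    if len(ordered) % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2
```

The point is the division of labor: the human supplies the intent, the tool supplies a first draft, and the human remains responsible for the result.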
Writing a script might look a lot like this a decade from now. Maybe in 2033, a Hollywood writer will jot down some bullet points describing the script they want—the main characters, the location, the main beats of the story, and so on—and have a large language model write five or ten first drafts. The writer could then choose the draft they like best and refine it from there.
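To make that hypothetical concrete, here is a minimal sketch of an outline-to-drafts loop, assuming access to the 2023-era OpenAI Python client and an API key; the model name, prompt, and outline are all placeholders, and the exact response format varies by library version:

```python
# Sketch of an outline-to-drafts workflow, not a production tool.
import openai  # assumes the 2023-era openai package and an API key in OPENAI_API_KEY

outline = """Main characters: a burned-out detective and a rookie partner.
Location: a flooded coastal city in 2040.
Beats: a theft, a betrayal, a reconciliation on the seawall."""

response = openai.ChatCompletion.create(
    model="gpt-4",   # placeholder model name
    messages=[
        {"role": "system", "content": "You write first-draft screenplay scenes."},
        {"role": "user", "content": f"Write a first-draft opening scene for this outline:\n{outline}"},
    ],
    n=5,             # ask for five independent drafts
    temperature=1.0, # higher temperature gives more varied drafts
)

# The writer, not the model, decides which draft is worth refining.
drafts = [choice.message.content for choice in response.choices]
for i, draft in enumerate(drafts, start=1):
    print(f"--- Draft {i} ---\n{draft}\n")
```

Asking for several completions at once (the n parameter) with a higher temperature is one simple way to get meaningfully different drafts rather than five near-copies.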
Still, I think it will be a long time before studios can dispense with writers altogether and have the AI produce finished scripts.
When I’ve experimented with ChatGPT, I’ve found that it’s perfectly capable of producing a basic news article. The articles it writes just aren’t very interesting: they don’t break news, offer original analysis, or otherwise stand out from the pack. That’s because the hardest part of being a reporter is reporting: going out in the world to find accurate information that hasn’t been reported before. Once you have a good scoop, turning that into a news article is relatively easy.
I expect something similar will be true of AI-written scripts. It won’t be long before a large language model can produce something that meets the basic requirements for a movie script. But Hollywood studios don’t need more mediocre scripts. There are already thousands of amateur writers trying (and mostly failing) to get Hollywood studios to pick up their scripts. WGA members are able to make a living because they write scripts that are significantly better than average. And it’s going to be hard for AI software to produce those.
An important part of a screenwriter’s job is to have their finger on the pulse of popular culture, anticipating what themes, dialogue, and jokes will resonate with audiences. Human writers have a key advantage over AI software here: they spend a lot of time interacting with other human beings, which gives them up-to-the-minute insight into what people are interested in at any given point in time.
Large language models, in contrast, are inherently backward-looking. They can only learn from material that’s already been published, often with a lag of months or even years. That makes it difficult for them to generate material that’s as fresh as the best human-written scripts.
Writer Adam Conover also told Rolling Stone that Hollywood scriptwriters do more than just put words on the page.
“They’re required to understand the actual filming process,” Conover said. Writers “consider the overall budget and think about which scenes are more expensive to make compared with others, communicate with line producers about edits, rewrite scenes if an actor doesn’t like their character, talk to costume designers and people in the prop department to figure out if they can or can’t bring elements of the script to life, and think about the economics of filming locations, among other details.”
Current large language models are nowhere close to being capable of handling all these tasks. So it’s going to take some major advances for AI software to generate professional-quality scripts from scratch.
This means that a contract banning AI-written scripts probably won’t make much practical difference—at least for the next few years. Still, it’s not crazy for the WGA to be pushing for language like this now.
Precisely because AI is unlikely to be a major factor in screenwriting in the next few years, studios may be willing to give ground on this issue relatively easily. And if they do, that will put writers in a stronger bargaining position in future rounds of negotiation when AI really might pose a threat to writers’ jobs.
Thanks to Tim Carmody for tipping me off to the AI dimension of the writers’ strike. His newsletter The Amazon Chronicles is a must-read if you’re interested in the Internet’s biggest retailer.
My understanding from reading about the strike is that the issue is not so much the technology as the money. I don’t think there’s much disagreement over the limitations of LLMs here; the concern is how LLMs could be exploited to lower the studios’ cost base at the writers’ expense, by paying them less for *effectively the same work* through what amounts to a loophole. Vox notes this:
“Second, the WGA says it’s imperative that “source material” can’t be something generated by an AI, either. This is especially important because studios frequently hire writers to adapt source material (like a novel, an article, or other IP) into new work to be produced as TV or films. However, the payment terms, particularly residual payouts, are different for an adaptation than for “literary material.” *It’s very easy to imagine a situation in which a studio uses AI to generate ideas or drafts, claims those ideas are “source material,” and hires a writer to polish it up for a lower rate.* “We believe that is not source material, any more than a Wikipedia article is source material,” says August. “That’s the crux of what we’re negotiating.”
(Source: https://www.vox.com/platform/amp/culture/23700519/writers-strike-ai-2023-wga)
I’m not making a moral judgment on this either way; I’m only saying that I think “follow the money” applies here in understanding the motives of both sides.