The WGA AI proposal needs to get a lot smarter – analysis

AI is evolving so fast that any interpretation the WGA settles on will be obsolete by the time it takes effect.

The explosion of publicly available artificial intelligence services in recent months has accelerated discussions about the limits of their use. As the Writers Guild of America prepares a new set of demands for upcoming studio negotiations, the guild has begun incorporating AI considerations into its proposal. That proposal needs work.

In February, the WGA said it would propose “regulating the use of material made using artificial intelligence or similar technologies.” Last week, the guild clarified that it would require that no AI-generated material be considered “literary material” or “source material.” In short, if a writer were hired to adapt an AI-generated idea, the writer would receive sole credit for the script; neither the AI nor the software company behind it could claim any additional writing credit.

The WGA is right to consider the potential impact of AI tools on its members, but its current proposal rests on a narrow understanding of the technology. The rules currently under discussion address the use of GPT-4 and other text-based generative AI services – just the most visible type of generative AI to have drawn media attention in recent months.

Much more is happening than that. A recent open letter from scientists, researchers, and other technologists called for a six-month pause on the most advanced AI development. Generative AI is evolving so rapidly that by the time the WGA settles on an interpretation, that interpretation will be obsolete. A meaningful strategy for dealing with AI requires proposals that anticipate far more sophisticated AI integrations.

This week, the software startup Runway AI began making its services available to testers. Unlike the offerings from OpenAI, the company behind GPT-4 and the image-generating DALL-E, Runway can translate text prompts into video. (Runway’s tools have already been used to speed up VFX pipelines, including the green-screen work that removed the backgrounds in the rock-universe sequence of Everything Everywhere All at Once.)

While early examples of Runway’s text-to-video output look rudimentary – here’s a river, there’s a cow at a birthday party – the potential for AI to generate actual moving images, and therefore scenes, reaches much deeper into the creative process than words on a page or concept art. As strange as it sounds, we’re not that far from a scenario in which a single pitch meeting could yield rough footage of an entire feature.

Consider an arbitration scenario in which a writer was hired to punch up material from that session. Why would a studio give the writer full credit when the AI produced what amounts to a rough cut? An outcome like that could make executives more inclined to simply build the AI version and skip the union headache, even if the product underperforms.

If the WGA wants to prevent such conflicts, it must negotiate a role in the broader use of AI in the creative process and begin to consider how writers can adapt to an evolving workflow. One possible requirement: if a studio decides it wants to bring AI into the brainstorming process – and many will, if they haven’t already – then that work should be handed to the writer being hired for the job.

In other words, it should be part of the job: the WGA should insist that its writers play an active role in crafting the AI prompts that generate the ideas for a project, since that is where the real creative work of AI collaboration begins. Any idea generated by AI prior to a writer’s involvement should be off the table. To that end, writers need to learn to use text and image prompts: a lot of AI-generated content is pretty bad, but that’s partly because the prompts are mediocre. Hire writers to work with AI, and the results will almost certainly get stronger; these machines work better when they take cues from talented human minds.

Many artists may find these ideas antithetical to their individual processes. Fair enough. There is real danger in the notion that AI could crowd out originality in Hollywood and beyond. As a safeguard, the WGA and other Hollywood unions should consider setting a specific annual cap on AI-generated film and television scripts. That way, studios could indulge the power of AI, but not at the expense of compelling original ideas that used no AI at all. It would also require a level of transparency from studios that the WGA already demands in other areas.

The result would provide a clearer sense of what AI can and cannot do – a balance in the studio slate that positions AI and non-AI projects side by side. That would yield clearer metrics on how AI succeeds in the marketplace, as well as where it falls short, so the union and the studios could grasp the terms of their future negotiations.

At the moment, both sides are dealing only in hypotheticals. If writer-driven, AI-assisted projects prove more commercially viable and otherwise more valuable to a company’s bottom line, everyone – including writers collecting residuals – will benefit from adapting to a technology that shows no signs of slowing down.
