As concerns mount in publishing circles about what AI means for the industry, the Authors Guild has launched an aggressive campaign to ensure that writers are protected from whatever negative impacts the new technology may bring. To that end, the Guild has added four new model clauses concerning AI to its Model Trade Book Contract and Literary Translation Model Contract. The new clauses join one already introduced that prevents the use of books in training generative AI without an author’s permission.
The new clauses require an author’s written consent before their publisher may use AI-generated book translations, audiobook narration, or cover art. These clauses, the Guild argues, can benefit publishers and the publishing industry at large by preserving the high-quality craftsmanship that consumers are used to. The Guild also urges publishers to identify any books that contain a significant amount of AI-generated text. This summer, the Guild will publish AI guidelines for authors and publishers that incorporate each of these conditions.
"The purpose of these demands is to prevent the use of AI to replace human creators," the Guild's statement reads. "The Authors Guild strongly believes that human writing, narration, and translation are vastly superior to their AI mimics. Moreover, as an ethical matter, the Authors Guild opposes relying on these tools to replace human creators, in part because current AI content generators have largely been trained on pre-existing works without consent. The Guild stands in solidarity with human creators in other industries, who like authors, face professional threats from AI-generated content flooding the markets for their work.
The Guild’s statement continues: “Publishers may feel compelled to turn to AI because of competitive pressures to save costs; AI, of course, is cheaper than human labor, but we feel strongly that it is to the whole industry’s benefit to resist replacing human authors, artists, and narrators with generative AI. Book authors should not be required to start with AI-generated text. Writers report that it is easier to start from scratch to produce well-written text that resonates with readers and reflects the authors’ voices than it is to start with AI-generated text. Similarly, the quality of narration matters, with many audiobook listeners choosing titles based on the narration as much as the book’s content. Book cover designs play an important role in attracting readers, and designing a memorable cover that expresses the book’s spirit requires an intimate understanding of the text. The translation of books is a form of literary writing that requires understanding the essence of each phrase. While AI may be able to produce ‘accurate’ translations, it cannot conjure the layers of meaning that a skilled human translator can. We encourage publishers to adopt these clauses and authors and agents to request that they be added to their contracts.”
Guild Survey Highlights Concerns over AI
The release of the new clauses follows a recent survey of Guild members about the concerns and issues authors see arising from AI. The Guild acknowledges that many creators, including authors, are using generative AI tools to improve their creative process, but warned that “unchecked proliferation of these technologies, without adequate guardrails, poses a significant threat to human creators and the future of the arts.”
Unsurprisingly, 90% of the writers who responded to the survey believe that they should be compensated for the use of their work in training AI. Similarly, 86% believe they should be credited for the use of their books in training generative AI.
According to the Guild, more than 1,700 members took part in the survey, which examined how writers are using generative AI or might use it in the future, how they think they will be affected by its widespread use, and how the writing profession might be transformed as a result.
Among the highlights are:
- 23% of writers reported using generative AI as part of their writing process.
- Of writers who reported using generative AI in their writing process, 47% said they use it as a grammar tool, 29% for brainstorming plot ideas and characters, 14% to structure or organize drafts, and 26% in their marketing. Only around 7% of writers who employ generative AI said they use it to generate the text of their work.
- 65% of writers said they support a collective licensing system that would pay authors a fee for use of their works in training AI, while 27% were unsure. Only 9% said they did not support a collective licensing system, with many of those opposed to it not wanting AI to use their works at all.
- 91% of authors surveyed believe readers should know when AI has created all or even portions of a work.
- 94% think the publishing industry should adopt a code of conduct or ethical approach related to AI.
- 69% of authors think their careers are threatened by generative AI.
- 70% of authors believe publishers will begin using AI to generate books in whole or in part, replacing human authors.