It’s been just over a year since ChatGPT was introduced to a public mostly unfamiliar with artificial intelligence. It appeared initially to have no obvious relevance to book publishing. Since then everything has changed—and nothing has changed. Publishers are beginning to dive into the new AI tools, exploring the edges, engaging in tentative chats with ChatGPT. But there’s no evidence of a revolution in the practice of publishing. It’s just too soon.
In September, Publishers Weekly presented a half-day online program covering AI and book publishing titled “Artificial Intelligence: Revolution and Opportunity in Trade Publishing.” Peter Brantley and I cohosted eight panels and keynote sessions. The videos are now available without cost on YouTube. If you weren’t able to join us in September, check them out at youtube.com/user/publisherswkly.
We were fortunate to secure the participation of a range of publishing practitioners and authors (and a lawyer!) who shared details of their early experiments and outcomes in bringing AI into their workflows. The sessions were divided by topic, including editorial, production, and marketing. The examples provided were real-world and hands-on, and they inform many of my thoughts in this article.
AI: A very brief history
Despite all of the commotion around ChatGPT, it’s worth remembering that AI is not new. It has been with us for decades. It just never played the role in our lives that it does now, perhaps a little too visibly. Machine learning and natural language processing (NLP) were among the technologies prominent in previous generations of AI. Some publishers sought to incorporate these into their processes but made little progress.
The current generation of AI, based on large language models (LLMs), was developed mostly over the past decade. ChatGPT appeared suddenly on Nov. 30, 2022. Two months later it had 100 million monthly users, the fastest that any technology has moved into the consumer space. (By comparison, Facebook took over two years to reach 100 million users.)
There are three fundamental reasons for such rapid adoption. First, it’s free. Second, you don’t need to buy a new device to use it. And third, you don’t need any training to access ChatGPT (more on that below). Those same factors also applied to Facebook, so why did ChatGPT spread so much faster?
As Arthur C. Clarke famously noted, “Any sufficiently advanced technology is indistinguishable from magic.” ChatGPT is magic. The experience of “talking” to a machine in everyday language, and then continuing the conversation, is magical. So is the experience of saying, “I want an image of a book in a balloon in a cloud near the sun,” and seeing exactly that image appear seconds later.
GPT-generated images are starting to look similar in style, a little too colorful and fanciful. So I send a second prompt, “now in a style that looks like a 15th-century illustration,” and the image is redrawn in that style.
If I want a video of a book in a balloon in a cloud near the sun, there are more than a dozen tools to choose from, and presto. And a musical soundtrack to go with the video. Well, that too is just like magic.
It’s too late to avoid AI
For authors and publishers who would prefer not to be sullied by AI, the news is bad: you’re using AI today, and you have been using it for years.
AI, in different forms, has already been integrated into most of the software tools and services we use every day. People rely on AI-powered spelling and grammar checking in programs like Microsoft Word and Gmail. Microsoft Word and PowerPoint apply AI to provide writing suggestions, to offer design and layout recommendations, and more. Virtual assistants like Siri and Alexa use natural language processing to understand voice commands and respond to questions. Email services leverage AI to filter messages, detect spam, and send alerts. AI powers customer service chatbots and generates product recommendations based on your purchase history.
And much of this is based on LLMs, such as the one used behind the scenes at ChatGPT. For an author or editor to say “I don’t want AI used on my manuscript” is, broadly speaking, all but impossible to honor, unless both they and their editors work with typewriters and pencils.
The heavyweights
“Which software tools do I use?” is the number-one question that emerged from the September Publishers Weekly AI conference, understandably so. Everyone has heard about ChatGPT—is that all there is? Do you just sign up for ChatGPT, and that’s it?
If only it were so simple. ChatGPT, from OpenAI, is, unquestionably, the number one tool. It’s built on OpenAI’s underlying GPT-3.5, and the more recent GPT-4 (“10 times more advanced” than GPT-3.5). GPT-5 is in the works, with no release date set. When people talk about using AI, they usually mean that they’re playing with ChatGPT. If they’re not paying for it, they’re using version 3.5. If they’re paying ($20 per month), they’ve got GPT-4.
OpenAI has competitors. Some are big names, like Amazon, Google, Meta, and X (formerly Twitter). Apple, curiously, is MIA. Others are smaller newcomers, like Anthropic, Cohere, Inflection, and Stability AI. Not all of them offer conversational chat interfaces to their LLMs. Google has Bard (now powered by Gemini), which, by all accounts, does not come close to the sophistication of ChatGPT, though that could well change next year.
Anthropic
OpenAI competitor Anthropic calls its chat interface Claude. Claude has been appealing to publishers because it can work with large documents, including, of course, manuscripts. Its most recent version can ingest files of up to about 150,000 words. Editors can ask Claude to analyze files for structure and content, to generate summaries, and to build Q&As. It’s powerful stuff.
Microsoft
Microsoft occupies an ambiguous space in this sphere. It has the exclusive third-party license to OpenAI’s underlying technology. But, of course, it has lots of its own technology to layer onto OpenAI’s, and so Microsoft’s offering is a bit of a hodgepodge, and vastly confusing. On top of this, as described above, the Microsoft Office suite (now called Microsoft 365) has included a variety of AI-enabled tools for a while now.
None of it is called ChatGPT per se, though much of it is built on OpenAI’s product. Consumers mostly encounter Microsoft Copilot (which until a month ago was called Bing Chat). Copilot can be accessed directly on Windows computers, or through browsers, or via the Microsoft 365 suite of software, most notably Excel, Outlook, PowerPoint, and Word. Copilot is the chatty interface, though AI tools are starting to emerge from deep within our everyday Microsoft toolset. Have you tried Word’s transcribe feature? It’s powerful and accurate. How about the Rehearse with Coach tool in PowerPoint? Unprecedented. AI is seeping into much of our day-to-day software. (Adobe has, thus far, focused on AI for images. Expect more next year.)
Training for ChatGPT
Anyone can use ChatGPT for free. Just go to chat.openai.com. Accessing the most recent version, GPT-4, costs $20 per month, or is free at copilot.microsoft.com (best accessed via Microsoft’s Edge browser). Opinions vary as to whether GPT-4 is worth it. When it comes to software, I always advise using the latest version. Keep in mind that the underlying language models are trained on data with a cutoff date. OpenAI’s ChatGPT, for example, was only recently updated to include information through April 2023.
When you first access ChatGPT, you find essentially a blank screen and the question, “How can I help you today?”
You can ask questions. Better still, you can upload a 100 MB PDF, though it can only process, analyze, or respond to a portion of the text at a time. You can also upload images, which it can describe, or a scanned page: it can recognize the text.
What has emerged as a gating issue in the successful use of ChatGPT (and other similar software) is learning how to “speak” with it, which in ChatGPT-ese is called writing “prompts,” or “prompting.” (In the latest mobile app version you can literally speak your prompts aloud.) Users have discovered that the more precise and detailed their prompts are, the better the responses they receive from ChatGPT. Further, prompts are not just one-offs. ChatGPT can continue a conversation for quite a while (though not indefinitely), and if you don’t get the answer you’re looking for you can revise and refine your prompts. This takes some getting used to, and has spawned a series of how-tos, in print and online, that train users on how to get the most out of prompting.
Try asking ChatGPT to explain a concept like developmental editing. Then ask it to craft an explanation that a 12-year-old could understand. The results are dramatically different. Amusingly, ChatGPT also seems to respond to emotional pleas. Adding “this is very important to my career” to a prompt will elicit more useful responses. (This is as good a time as any to add a parenthetical: it’s both reassuring and deeply troubling that the top scientists working on language-based AI are unable to explain why things like this occur.)
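For readers curious about what that back-and-forth looks like under the hood, here is a minimal, hypothetical sketch using OpenAI’s Python client library (the developer API rather than the ChatGPT website, and an assumption on my part, since nothing in this article requires writing code). The only point it illustrates is that a “conversation” is simply a running list of messages, and each refinement is appended to everything that came before.

```python
from openai import OpenAI

client = OpenAI()  # assumes the OPENAI_API_KEY environment variable is set

# A conversation is just a growing list of messages; each new prompt is
# appended alongside the model's earlier replies.
messages = [
    {"role": "system", "content": "You are an editorial assistant at a book publisher."},
    {"role": "user", "content": "Explain developmental editing."},
]

first = client.chat.completions.create(model="gpt-4", messages=messages)
messages.append({"role": "assistant", "content": first.choices[0].message.content})

# Refine the prompt, exactly as you would in the ChatGPT window.
messages.append({"role": "user", "content": "Now explain it so a 12-year-old could understand."})
second = client.chat.completions.create(model="gpt-4", messages=messages)
print(second.choices[0].message.content)
```

The same principle applies inside the ChatGPT window: your refinements work only because the earlier exchanges travel along with them.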
Prompting is now at least partially supplanted by a new technology developed by OpenAI, called, confusingly, GPTs. GPTs are packaged prompts, specific tasks encoded as plug-ins to ChatGPT (easily confused with actual ChatGPT plug-ins, which are now being phased out). They’re easy to create—you can ask ChatGPT to help you create a custom GPT. How they differ from standalone prompts is less clear.
At the same time, numerous developers are building applications that offer a front end to ChatGPT without exposing the underlying chat layer, and that move beyond mere chat functionality. Copy.ai and Jasper.ai, for example, provide front ends to multiple LLMs, focused on marketing tasks.
More about the software
There’s a range of AI software that’s specific to writing and publishing. The Publishers Weekly Startup Database now lists more than 100 AI-focused publishing startups. The bulk, by far, are directed toward authors, rather than publishers. Some look to generate whole books, others seek to guide authors on their journeys. Sudowrite remains the most prominent in this class of vendors.
For publishers, the choices are beginning to coalesce around editing and marketing tools. Shimmr is a high-profile player in the latter category. Veristage is building Insight, its “AI publishing assistant,” a task-specific front end to multiple publishing functions.
AI and copyright
The copyright issues are a miasma of complexity and ambiguity. It appears certain that some books still in copyright (not in the public domain) were included in the training of some LLMs. But it’s certainly not the case, as some authors fear, that all of their work was hoovered up into ChatGPT. The laws surrounding copyright obviously did not anticipate the unique challenges that AI brings to the issue, and searching for legal solutions will take time, perhaps years.
Meanwhile, the broader issue is perhaps more about ethics than about law. LLMs are built unreservedly on the writing of others, whether newspaper articles, social media posts, blogs, or books. Wikipedia is a part of most LLMs. The companies peddling software built off those LLMs are valued in the billions of dollars. Should the authors of the content be compensated?
The ethics appear less complex than the legal considerations.
Integrating AI into publishing
There are few things that publishing companies are less comfortable doing than integrating complex digital technologies into their day-to-day operations. That’s understandable. AI, in particular, is causing anxiety for everyone, and not just in publishing. It’s new, it’s mysterious, it’s personalized, it’s powerful. People are threatened by AI for numerous reasons. Changing attitudes takes time. But this is not a great time to be timid with technology.
The impetus must come from the top. The very top. Senior executives need to embrace a vision of AI’s potentially transformative presence and communicate a program to staff across the organization. The program may be little more than “experiment, document your experiments, and share.” That’s a good start.
Publishing companies are handicapped by the hubbub surrounding copyright. Authors are up in arms. A May 2023 Authors Guild survey found that “90% of writers believe that authors should be compensated if their work is used to train generative AI technologies,” and 67% said they “were not sure whether their publishing contracts or platform terms of service include permissions or grant of rights to use their work for any AI-related purposes.” Those uncertain authors are now asking their publishers if AI is being used in the editing or production of their work, and some powerful authors are insisting that it not be. They’re looking for the AI equivalent of a peanut-free bakery.
This is a thorny problem for publishers—if you can’t use AI on the books you’re planning to publish, what can you use it for?
Trying and testing
The Economist recently referenced an organizational tactic for new technology adoption called the “lighthouse approach”: you create a beacon by selecting one high-profile proof of concept that can be implemented quickly and that everyone can relate to.
Ideation is one of ChatGPT’s more impressive talents. Just now I uploaded a fiction manuscript and asked ChatGPT, “What are some ideas for marketing this book of fiction?” It responded with broad, generic suggestions. So I then prompted, “only provide suggestions that are specific to the content of the book—nothing generic in your responses,” and now the ideas were solid, including several that would never have occurred to me. I then prompted, “be more creative in your responses—don’t worry about whether the ideas are practical or easily implemented.” The responses were creative and, for the most part, neither practical nor easily implemented. You get the idea. This also illustrates the importance of what’s sometimes called HITL, or the human in the loop—an acknowledgement that AI is not autonomous but rather an interaction between humans and machines, and that the machine is merely a tool.
Thad McIlroy is an electronic publishing analyst and author, based on the West Coast and at his website, The Future of Publishing. He is a founding partner of Publishing Technology Partners.
This is an excerpt from A Concise Guide to AI for Book Publishers by Thad McIlroy. It will be published in January 2024.