Monday, 9 October 2023

Rembrandt’s banana: AI, authors, and why it matters (Anne Rooney)

Bananas in the style of Rembrandt, img2go AI art generator
I’ve been talking to people a lot recently about the impact of AI on authors and their livelihoods. Addressing the AI dilemma for authors is why I’m standing for re-election to the Management Committee of the Society of Authors. One comment that crops up again and again in discussions is ‘jobs have always been lost to technology. What’s the difference?’ Here’s a stab at an answer (it’s quite long…)


The problem, in case you don’t know already, is that AI programs like ChatGPT “scrape” text and images from the Internet and anywhere else they can get them. They analyse the patterns of their construction (typically, which words are likely to follow any given word) and use the patterns to construct new work in writing or art. AI can be instructed to draw a picture of a banana in the style of Rembrandt or write a Petrarchan sonnet about climate change, for instance. More worryingly, an AI can be told to draw a cat in the style of illustrator Satoshi Kitamura or write a short story about a ghost in the style of Margaret Atwood (these are hypothetical examples). Or it can be instructed to write a series of books about four fairies suitable for 7-9-year-olds, produce the illustrations for a picture book (given precise instructions), write a book of 100 facts about cars, or translate a story into or from French. Or Greek, or Swahili…
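(A quick aside for the technically curious: below is a toy sketch, in Python, of that ‘which words are likely to follow any given word’ idea. It simply counts word pairs in a scrap of text and strings words together from the counts. The real systems are neural networks trained on vast scraped corpora, so this little word-pair table is only an illustration of the statistical principle, not how ChatGPT and its kin actually work.)

```python
# A toy, purely illustrative model of 'which word tends to follow which'.
# Real generators are large neural networks trained on scraped corpora;
# this word-pair (bigram) sketch only shows the statistical idea.
import random
from collections import defaultdict, Counter

def train_bigrams(text):
    """For each word, count which words follow it and how often."""
    words = text.split()
    follows = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1
    return follows

def generate(follows, start, length=10):
    """Build text by repeatedly picking a likely next word."""
    word, output = start, [start]
    for _ in range(length):
        options = follows.get(word)
        if not options:
            break  # no recorded follower for this word
        candidates, counts = zip(*options.items())
        # Pick the next word weighted by how often it followed this one.
        word = random.choices(candidates, weights=counts)[0]
        output.append(word)
    return ' '.join(output)

corpus = 'the cat sat on the mat and the cat slept on the chair'
model = train_bigrams(corpus)
print(generate(model, 'the'))
```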

The first examples might just sound like fun — a cross-time mash-up. The second pair is fairly obviously something that can only be done if the works of these artists have been absorbed, analysed and then emulated, without the consent of the copyright holder. So that’s a problem, as it’s arguably illegal and certainly immoral/unfair. (The argument about legality is going through the US courts right now.) The third set is where the livelihoods of artists are most clearly impacted, but it also impacts consumers. These are all projects that would previously have been given to a living author. (The term ‘author’ includes illustrator and translator.) Even the first example (Rembrandt’s banana) might be depriving an illustrator of a commission — though I wouldn’t have paid an illustrator for art for a blog post.

When you buy a book, you expect the words and pictures to have been created and put together by a human being: a person with feelings, experiences, and reasons for their choices. A person brings an entire hinterland of human experience to their work, which provides a rich context and underpins the choices they make. Here’s a really stupid example. I live in Cambridge, technically within the city boundaries. Yet when I write an author bio, I will often say I live in East Anglia (Cambridge is in East Anglia, so this is true). I don’t want it to be straight and dull — my latest account is ‘near the eel-slathered fens of East Anglia’. If I said I lived in Cambridge, that would set up an association with the university, or perhaps the high tech and bio-tech industries for which the city is now well known. I have links with both of these, but they’re not what I want to stress. They come with their own package of expectations and prejudices (good or bad) that I don’t want to define me or my writing. In creating my public identity, I want to get people to think beyond what they know about Cambridge. I don’t spend much time walking around colleges. I spend more time seeing the huge open skies of the flat land, the wet bogginess (though not really eel-slathered so much these days), the bird pond near the bio-tech park. There is intention and thought here. You might not like or approve of it — that’s fine. If you don’t like how I write my bio, you probably won’t like my books, so I’ve saved you buying something you won’t enjoy, which is great. An AI doesn’t have this type of consideration. It will look up some facts and string them together. And the facts will generally be wrong. I just told it to write a 100-word bio including the titles of two books I’d written. One of the books it picked was actually written by Mary Hoffman. Accuracy is a whole other can of worms…

Back to expectations. You expect a book or picture to reflect human experience, and feelings about the human condition, some of which are common to all of us and some of which are defined by individual lives. If I write a book about dinosaurs (again) the choice of dinosaurs will be influenced by (a) how interesting/unusual they are, (b) how easy it is to find reliable illustrations of them, (c) an attempt to get a fair geographic spread so readers around the world will find some local to them, and (d) which dinosaurs I personally like (look for the Kentrosaurus in every book). An AI doesn’t know what people find interesting; it can only go by which dinosaurs are mentioned most frequently and have the most readily available information (whether or not the information is accurate). That makes for a boring book, much the same as previous books. It’s not what I want to write, and it’s probably not what a dino-keen young reader wants to read. An AI will produce ‘more of the same’ unless very carefully briefed. And if you’re going to brief it that carefully, you might as well just write the book. Especially as you will fact-check it and an AI won’t.

What about ‘jobs have always been lost to technology’? How far back do you want to go? Most people start with the weavers and the Luddite movement of the early 19th century. The weavers objected to the use of machinery to replace hand weaving. They were skilled workers, displaced by machines that could work faster and more consistently (so no idiosyncrasies or unique features in the product). They destroyed machinery and refused to operate it (a lower-skilled job). Factory owners shot them; the government executed them or deported them to hard labour, often in Australia. I’m not expecting publishers to shoot or deport stroppy writers, but perhaps I’m being optimistic.

More recently, most jobs that have been replaced by technology have been lower skilled: usually repetitive (and so easy to mechanise), or dangerous or difficult for people to do. These jobs provided a valuable source of income to the workers who did them, but they were generally not jobs people aspired to and trained for years to do.

Undoubtedly, people in many areas of work have been forced to swap one badly paid and boring job for another, the replacement often being worse. There are fewer checkout assistants but more online shopping pickers, for example. I don’t want to minimise the impact on people forced to change their job in this way. But there are some crucial differences. Most people have not set out to become, say, a cashier in Tesco. They have not practised unpaid for years, or paid for a degree course to do it. They are not losing their dream and purpose in life along with that particular job (though they may be losing many other things, including colleagues and an environment they like, convenience, and so on). The impact on a customer of a car being put together by robots instead of people is zero. The car is the same. Putting a car together doesn’t require imagination (indeed, it would be bad to do it imaginatively), or insight into the human condition, or verified research.

Personally, I don’t think mechanisation should replace people in jobs that they want to do, aspire to do, train to do and bring unique skills to. It can usefully replace tasks within those jobs, though. The important thing is to automate the boring tasks to free people in those jobs to do more interesting and valuable work. That’s AI being employed by the worker rather than by the employer, and to the benefit of both. The calculations for the early space programmes were carried out by people. There has been far more progress in space exploration since the widespread use of computers freed astronomers to do work that uses their expertise rather than treating them as animated calculators. That’s what technology should be for.

Writers aren’t actually very different, but they’re the canary in the coalmine (though canaries are probably glad their job was mechanised). Huge swathes of ‘knowledge work’ are capable of being carried out by AI. Solicitors, many medical doctors, teachers, CEOs, MPs, engineers, programmers… anyone whose work centres on working with a body of established knowledge and picking out the bits that are relevant, matching case to context, is vulnerable. Oddly, they should be more vulnerable than writers, as they cost more to employ and most aren’t supposed to make things up.

As a society, we need to think ahead. What do you do with all the people with these skilled jobs when you’ve replaced them? And how do you motivate children and students if the careers they might aspire to have gone away? For decades the idea of freeing people from dreary and soul-destroying work was sold on the basis of leisure to do things they want to do, such as making music, painting, writing. People take whole degrees to perfect their skills in creative endeavour. On the whole, people who want to make music, paint or write hope for an audience. They want to express their ideas, communicate something about being human — even if that is their interest in planes. And we consume artwork and music and books because we want that connection with other people. It’s the same reasons we want to talk to a human teacher, a human GP, a human architect.

Two groups think they can benefit from using AI to create text and pictures. One is the non-writers who think they can ‘create’ a short story or a novel (or whatever) by just feeding in their ideas, and then get money from their book. The other is publishers (broadly, any organisation that publishes anything) who want to cut back on the small amount of money they pay to authors, illustrators, translators — and copy editors, proofreaders, indexers and everyone else involved but not mentioned here. So shareholders benefit, while creators and society suffer. If we’re going to automate away jobs, they should be jobs people don’t want, freeing them either to do jobs they do want or to live pleasant lives on a universal and sufficient basic income. No one suffers from that. Let’s not sleepwalk into a world where all the good jobs are done by machines and we’re left picking tomatoes and working in Amazon warehouses. Let’s use AI responsibly, for society’s good.

AI can be fun. You can generate images you don’t have the skill to create yourself (like Rembrandt’s bananas), but don’t use it to cut out an illustrator who would have been paid. You could use it as a starting point for text you have to write but don’t want to write. But don’t kid yourself you’ve created something. You’ve only generated it. Creation takes work, imagination, innovation, time, reflection. And humanity. So does being a GP, a defence lawyer, a teacher.

Anne Rooney

Out now: Little Monkey and Tiny Tadpole, illustrated by Caroline Rabei and Qu Lan, 2023



7 comments:

Mary Hoffman said...

One of your two books was written by me? I know we are close but how did AI come up with that one?

Abbeybufo said...

I wouldn't trust anything that shows bananas with two stalk ends, either!

Stroppy Author said...

Mary, I imagine I mentioned (Shakespeare's Ghost) somewhere or other - maybe even on here - and so it's linked it with my name. But to be honest, if we can be attributed any book we've ever mentioned — well, I want my PLR for King Lear!

Penny Dolan said...

Thank you for your precise analysis of all these differences, Anne.

catdownunder said...

Thank you. So far AI does not seem to be able to do my job!

Katherine Roberts said...

I've had a little play around with AI, and it is scary how quickly it can produce something formulaic that might suit a publisher looking for "more of the same, only different". I've also asked ChatGPT to write my bio, and it came up with one book I actually did write (fairly random, as part of a series), plus a whole series of books written by someone else I'd never heard of... so accuracy is clearly a big problem at the moment!

Whether AIs will replace us as writers remains to be seen, but for me the whole experience of seeing an AI churn out text at high speed, generating what is essentially a rough and not very original first draft of something I've instructed it to write, takes away the whole reason I write in the first place... I actually ENJOY writing my own first drafts (even though they take months or sometimes years instead of a few seconds), and AI cannot take that creativity away from me. However, it can take the income those creations might eventually earn, which might mean I can no longer afford to spend my time creating them.
Sadly, however, that is nothing new.

The publishing business has been casting off talented authors for years in search of greater profits. Maybe it is even a blessing that we'll soon have AIs to churn out all the formulaic stuff, leaving the real (human) authors to create those books human readers will surely need more than ever? Before that can happen, though, publishing is going to need to make some serious changes to their business model, so that human authors are properly acknowledged and paid for their time and skill.

Stroppy Author said...

I don't think it is a blessing, Katherine. Many writers learn their trade writing the formulaic stuff, or writing advertising copy, or in journalism, or technical or educational writing. Few writers spring fully formed as highly original and creative novelists/poets/screenwriters. What the competition might do, though, is get us (ie real, human writers) to push the boundaries by trying to write in ways an AI can't.