Phil Kneen has cycled through a few jobs in his time. He’s been a wedding photographer, a photojournalist and, for a spell during the pandemic, a baker. He didn’t like the early morning hours and eventually returned to photojournalism, starting work on a project to document the fishing industry in Britain, where he lives. He’s taken the occasional freelance gig, too.
Last year, he was hired to shoot a photograph for a book cover. “They wanted a black and white photograph of a woman on the metro, looking a bit grungy,” he recalled. Mr. Kneen went to work on the arrangements, even finding a model. About two weeks before the shoot, the client e-mailed to say they no longer needed him. They planned to use an image generated by an artificial intelligence application instead.
“I was obviously a bit angry, because there’s nothing you can do about it,” he said. Mr. Kneen later saw the AI image used by the client. “It was terrible.”
That wasn’t the only time he had a gig lined up, only to be thwarted by AI. Another client, who tapped him to photograph jewellery, opted to use AI to create the different image variations they required. Mr. Kneen had also scouted locations for a band that wanted him to shoot an album cover, a job that would have paid around £4,000, only to be told that they would go with an AI-generated picture instead.
These setbacks were more frustrating than existential for Mr. Kneen. The type of work he likes to do, photojournalism, cannot be replicated by AI. But the odd commercial freelance gig might be harder to find.
Ever since generative AI applications that produce text, pictures and videos exploded in both quality and popularity, some creative professionals have been deeply concerned about their livelihoods. The threat is a double whammy. Companies that build generative AI models train them on huge amounts of content from the internet, typically without the creators’ consent or any payment to them. These models can then be used by anyone with minimal skills to produce writing or images quickly and cheaply, cutting the trained professional out of the equation.
Freelance workers, who in some cases lack union protections or the benefits that employees enjoy, could be more vulnerable to disruption from generative AI. And for some, that disruption is already here, whether in the form of lost opportunities, stiffer competition or lower pay for working with AI applications. Some are having to adapt. What’s hard to determine is how widespread these experiences are, and what the long-term implications will be.
For copywriters, tools such as ChatGPT can be useful for brainstorming, allowing them to do their jobs faster, while still imbuing the work with human ingenuity and creativity. But not every client cares about the human touch; cost and speed are paramount instead.
Tina Monday, who lives in Michigan, had recently been working as an independent contractor for a content agency on a project penning blog posts for law firms to bring in web traffic. Typically, she would earn about four cents a word – at least until she was told to use generative AI. Instead of writing the entire article herself, Ms. Monday would first craft a prompt for the AI application, and then rewrite the output to sound better and ensure it was factually correct. Her pay was cut in half.
“Honestly, it’s a lot more work to make the garbage that comes out of that machine usable,” she said. To even write the prompt, she would have to spend time researching everything she wanted to include in the blog post. “I was basically writing a 100-word summary of the 400-word output,” she said. “That wasn’t worth the pay any more.”
In June, she received an e-mail from her bosses at the agency that acknowledged a decrease in writing and editing assignments in recent months. As a result of the massive impact of AI on the content industry, the e-mail continued, the company could no longer offer work to independent contractors.
Ms. Monday had been working with the company since 2014, and had a steady flow of gigs until generative AI took off. “I have not had all of my bills paid on time every month since 2022,” she said.
Cynthia Heinrichs had a similar experience with a British company called Story Terrace, which produces memoirs. Ms. Heinrichs, who lives in Vancouver, worked on a freelance basis interviewing clients who wanted to document their lives, and would then compose a memoir that could run as long as 20,000 words. “It’s been the best job I’ve ever had,” she said. “Clients didn’t realize how rewarding it would be, not even the product at the end of it, but telling their story to a person and being heard.”
Last year, the company introduced AI. In this new model, writers conduct the interviews, but a generative AI application developed by the company creates an outline based on the transcripts. After feedback from both the client and the writer, the AI application produces the entire manuscript for writers to edit – for half the pay.
The company pitched AI as an opportunity to take on more memoirs, but Ms. Heinrichs had no interest in working with it, especially because she had concerns about the quality of the writing. “There was going to be a huge amount of learning for me just to do one project,” she said. “I thought, ‘Screw this.’ ” She has since gone solo as an independent memoir writer.
Another writer for Story Terrace in Canada said the introduction of AI eliminated the most rewarding, creative and lucrative part of the entire memoir process. A third said Story Terrace’s rates were low to begin with; now that pay has been sliced in half, working with the company is no longer worth it. (The Globe and Mail is not identifying the writers in order to preserve professional relationships.)
The majority of the memoirs produced by Story Terrace are now written with the assistance of AI, according to founder and chief executive officer Rutger Bruining, cutting the time for each project in half. There were quality issues at first, but that was because both the company and the writers needed time to learn a new process, he said. He declined to disclose any specifics about writer fees. “It is lower because it’s significantly less work,” he said. “I didn’t start Story Terrace to help writers. I started Story Terrace to make it accessible for the population to write their life stories. I’ve always been very honest about it. I haven’t been like Uber, saying, ‘We’re here for the drivers.’ ”
Mr. Bruining is open to more automation in the future, too, even for interviews. He doubts that clients would actually enjoy being questioned by an AI bot, but if it’s technologically possible, he would give it a try. “If people love that, and it helps younger generations learn more about their grandparents, we would probably do something with it,” he said.
So far, research is scant on how generative AI is affecting employment and earnings for freelancers, in part because the technology is still nascent. But at least two academic studies provide some insight.
In a study first published last year, researchers at Harvard Business School, London Business School and DIW Berlin examined more than 1.3-million job postings on a large online platform for freelancers, which is not identified in the paper. Postings for gigs that were prone to AI automation, such as writing, declined 21 per cent in the eight months after the introduction of ChatGPT in November, 2022, compared with jobs that required more manual skills. Postings related to image generation, meanwhile, declined by 17 per cent.
“Given the already intense competition for job opportunities in online labor markets, the increased substitutability between freelancer jobs and [generative] AI could further decrease earnings in the short term,” the authors wrote.
A study from Washington University found much the same. The researchers looked at postings on the freelance platform Upwork, and found that those in automation-prone roles experienced a 2 per cent decrease in the number of monthly jobs, and a 5.2 per cent drop in monthly earnings after ChatGPT’s release, compared with freelancers in other positions.
After the release of image generation applications such as Dall-E and Midjourney, freelancers in design-related occupations experienced a similar reduction in work and earnings, according to the study. It didn’t matter how many years of experience the freelancers had, either. The evidence showed that high-quality workers are “disproportionately affected by the release of generative AI tools, especially in the image-focused occupations,” the authors wrote.
The researchers do not hazard any guesses about how these trends could play out. But it is likely that the quality of generative AI will improve, and that could force further disruption. “If I had to make an educated guess, then I would say that freelancers who will not adapt to the new technology are going to experience a negative shock,” said Oren Reshef, an assistant professor of strategy and entrepreneurship at Washington University, and one of the authors of the study.
Adapting can mean a few things. Freelancers could incorporate generative AI into their own workflows, or switch careers to an industry that is less affected by AI, or even to one that is benefiting from it.
It’s hard to know what those industries will be, but in some places, the insidious creep of generative AI is unnerving. Bartol Rendulic has been a freelance concept artist in Toronto for more than 25 years, dreaming up the looks of characters, props, vehicles and settings for film, television and video games. (His credits include Transformers: Rise of the Beasts and Star Trek: Discovery.) On his latest gig, the art director used generative AI to produce two concept images. “That flustered me,” he said. Art direction is more of a managerial role, he noted, and he has serious ethical issues with generative AI. “What they should have done is asked me, or hired another concept artist,” he said.
Mr. Rendulic had been without work for some time before he landed that job, and he was hesitant to raise his concerns. “In retrospect, I wish that I did,” he said.
Even union protections might not be sufficient. The International Alliance of Theatrical Stage Employees represents more than 170,000 behind-the-scenes entertainment workers in the U.S. and Canada, including concept artists. A subset of film and television members primarily in Los Angeles ratified a new contract in the summer that ostensibly provides a bulwark against displacement by AI, but some workers have argued the language is too weak. The contract also allows producers to require that workers use AI. “This is the first time ever that the studio has been able to dictate to us what tools we use,” said Phil Saunders, a Canadian concept artist in Los Angeles who is known for his work on Iron Man, among other Marvel titles.
The contract still leaves workers vulnerable, in his view. “It’s not a question of forcing us to use these tools to be more productive and generate more imagery in the same amount of time, therefore displacing our colleagues,” he said. “It’s about displacing us entirely.”
The industry is currently in a serious slump, providing even more motivation for studios to find ways to keep costs low, including through AI. “When there’s this much benefit in terms of labour cost savings, nothing’s going to stop that flood,” Mr. Saunders said.
One of the notable things about generative AI is how impressive these applications can appear, at least until some time passes and the flaws and limitations become obvious. But because first impressions can be so strong, some people overestimate what these applications can do. James Hattin has experienced that first-hand. He’s the co-founder of a visual effects company called VFX Legion, which has offices in California and British Columbia.
Recently, a client contacted him to make a trailer under a tight deadline, and Mr. Hattin passed along a budget. The client balked. “They said to me, ‘We’re going to go AI on this one, for all the visuals,’ ” Mr. Hattin recalled.
It’s likely the client is going to encounter trouble, he said. The quality of AI-generated video simply isn’t good enough, in his opinion, and AI models often fail to render characters and objects consistently across different scenes. In the end, that client could be in need of serious help. “We have no intention of doing last minute repairs,” Mr. Hattin said.
He doesn’t have any issues with the technology itself, and his company uses a variety of AI tools. “But with image and video generation, that’s not a tool,” he said. “That’s somebody trying to spit out a final product. And it’s not. It’s a neat card trick.”