April 2023. It’s been a couple of months since we published our first blog post on generative AI. And while the discussion in the industry continues unabated, the initial dust is settling somewhat, revealing a few key aspects that I think should guide the news industry in how we relate to and work with this new tech.
Have a sense of constructive urgency. While AI tech is changing rapidly, the knee-jerk reaction to it in parts of the publishing industry remains stubbornly unchanged. I recently received an email from an opinion editor at one of the UK’s national newspapers asking whether I wanted to write an op-ed “on how AI is going to change journalism and put us all out of a job!” I declined. We need to replace this sense of panic with intentional and constructive urgency.
A few weeks ago, the renowned Future Today Institute released its annual Tech Trends report, a mammoth piece of work looking at over 700 trends across 14 different aspects of the world from Climate + Energy to Bioengineering and Space. News + Information is one of them, and in early April the institute ran a webinar on its findings in our sector. AI was a big focus, and Senior Expert Advisor Sam Guzik made the comment that the biggest mistake the publishing industry can make at this juncture is “to assume the future will look exactly like the past.” He encouraged everyone to track developments in AI and foster discussions internally. He also pointed out that partnerships and collaborations can be helpful in getting started, mentioning industry initiatives such as AP’s Local News AI and Journalism AI at the London School of Economics. (The Tech Trends report has a list of “Ones to watch” in News + Information, which features these initiatives and other industry players, including, I’m happy to say, United Robots.)
Understand what generative AI can and can’t do. We’ve heard a lot about how AI built on large language models (LLMs) tends to hallucinate, making up “facts” out of thin air. From a journalism point of view, this tendency becomes particularly troublesome when the tech starts to reference non-existent sources, and even articles that were never actually written, as The Guardian recently documented.
At our workshop during INMA’s Subscription Summit in Stockholm, Elin Stueland from Stavanger Aftenblad in Norway was asked if they had ever tried putting the soccer data (used for their automated reporting with rules-based AI; see below) into ChatGPT. She said they had tested it, but the persistent problem was that it kept adding events that had never happened.
And yet, clearly this type of AI can provide valuable efficiencies if used right. At United Robots, our core tech is built on rules-based AI, where the rules aim to emulate the decision-making of a human expert. We use this type of AI because the texts we generate must adhere to well-defined language patterns and editorial styles, and include the facts that are in the data (and only those!). We are now leveraging large language models to generate variations on text segments, like headlines, in our robots. A next step will be to test using LLMs to speed up the code generation process for our NLG algorithms.
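To make the rules-based approach concrete, here is a minimal, hypothetical sketch in Python of how rules over structured match data can drive templated text. The data model, thresholds and phrasings are invented for illustration only; they are not United Robots’ actual templates or pipeline.

```python
# Illustrative sketch of rules-based NLG over structured match data.
# Field names, thresholds and phrasings are invented for this example;
# they do not reflect United Robots' actual templates or data model.

from dataclasses import dataclass


@dataclass
class MatchResult:
    home_team: str
    away_team: str
    home_goals: int
    away_goals: int


def describe_margin(result: MatchResult) -> str:
    """Rule: choose phrasing based on the goal difference in the data."""
    diff = abs(result.home_goals - result.away_goals)
    if diff == 0:
        return "played to a draw against"
    if diff >= 3:
        return "cruised past"
    return "edged out"


def write_recap(result: MatchResult) -> str:
    """Fill a fixed template using only facts taken from the input data."""
    if result.home_goals >= result.away_goals:
        winner, loser = result.home_team, result.away_team
    else:
        winner, loser = result.away_team, result.home_team
    return (
        f"{winner} {describe_margin(result)} {loser}, "
        f"{result.home_goals}-{result.away_goals}."
    )


if __name__ == "__main__":
    print(write_recap(MatchResult("Viking", "Bryne", 3, 0)))
    # -> "Viking cruised past Bryne, 3-0."
```

Because every phrase is selected by an explicit rule and every number comes straight from the data, this kind of generator cannot invent events that never happened, which is exactly the guarantee a purely generative model struggles to give.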
The challenge for the news industry more widely will be to work large language models into robust, reliable and useful processes. It will be crucial to keep a razor-sharp focus on the use we’re trying to extract from the tech and not get sidetracked by its inherent capabilities. Humans in the loop will be key (a minimal sketch of what such a gate might look like follows below). Which leads me to my final point.
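The sketch assumes an LLM that suggests headline variants and a human editor whose approval is required before anything is published. The function names (`suggest_headlines`, `editor_approves`) are hypothetical stand-ins, not any real newsroom system or vendor API.

```python
# Minimal sketch of a human-in-the-loop gate around LLM-generated copy.
# `suggest_headlines` stands in for any LLM call; the review flow is
# illustrative, not a description of a particular newsroom system.

from typing import Callable


def publish_with_review(
    article_text: str,
    suggest_headlines: Callable[[str], list[str]],
    editor_approves: Callable[[str], bool],
) -> str | None:
    """Return the first headline a human editor explicitly approves."""
    for candidate in suggest_headlines(article_text):
        if editor_approves(candidate):
            return candidate  # human sign-off required before anything ships
    return None  # no approval, nothing is published automatically


if __name__ == "__main__":
    suggestions = lambda _text: [
        "Viking cruise past Bryne",
        "Viking DESTROY Bryne in shock rout!!!",
    ]
    editor = lambda headline: "!" not in headline  # stand-in for a real review step
    print(publish_with_review("match recap text", suggestions, editor))
    # -> "Viking cruise past Bryne"
```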
Differentiate your journalistic offering on what only people can bring. Good journalism is about people: those who produce it and those who consume it. AI does not understand the concept or processes of journalism, which is where the real value of the news industry lies. Professor Charlie Beckett heads up the Journalism AI project at the London School of Economics, and he put it perfectly in a new Practical AI in Local Media report: "AI is going to change how you think about your journalism. If the routine stuff becomes automatable…the onus is very much on what you can add. Can you add empathy, entertainment, insight, expertise, judgement, the human touch, creativity? All those things are going to be at a premium."
AI will affect publishers, journalists and, not least, readers, shaping their behaviour and expectations. Publishers who work proactively and make the right choices in how they use the tech, not least in the context of trust and transparency, will be at a clear advantage.