Artificial intelligence (AI) threatens many occupations, but for journalism, an already precarious field, the peril is immediate and nightmarish. With social media steadily chipping away at news outlets’ earnings from subscriptions and advertising, the AI “reporter” algorithm is spreading fast, striking at the heart of every human reporter’s function.
In a traditional editorial office, people prioritise the news and forge stories that are as comprehensive and easy to understand as possible. Now Google has an app for that: Google News. It tracks the user’s interests among current events, pieces together the latest developments and presents a finished story that reads much as though an experienced human journalist had composed it. If the app misjudges the user’s preferences or provides incomplete information, it happily accepts criticism, then improves.
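The feedback loop described above can be sketched in a few lines: a hypothetical feed ranker that boosts topics the user engages with and demotes those they dismiss. The class name, topic labels and scoring scheme are illustrative assumptions, not Google News internals.

```python
# Minimal sketch of an interest-tracking news feed (illustrative only;
# not Google News's actual algorithm).

class FeedRanker:
    def __init__(self):
        # Learned interest weight per topic; every topic starts neutral.
        self.weights = {}

    def record_feedback(self, topic, liked):
        # Nudge the topic's weight up on a click, down on a dismissal.
        delta = 1.0 if liked else -1.0
        self.weights[topic] = self.weights.get(topic, 0.0) + delta

    def rank(self, stories):
        # stories: list of (topic, headline) pairs, best match first.
        return sorted(stories,
                      key=lambda s: self.weights.get(s[0], 0.0),
                      reverse=True)

ranker = FeedRanker()
ranker.record_feedback("diplomacy", liked=True)
ranker.record_feedback("sport", liked=False)
feed = ranker.rank([("sport", "Cup final tonight"),
                    ("diplomacy", "Summit on the Korean peninsula")])
# The topic the user engaged with rises to the top of the feed.
```

Each round of feedback shifts the weights, which is the sense in which such an app “accepts criticism and improves”: the next ranking reflects the last correction.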
For now at least, the information the algorithm collates still has to originate with humans. It can track down all the pertinent details about what’s happening on the Korean peninsula, for example, and assemble them in an orderly, easy-to-digest form, but those details came from a real person watching Kim Jong-un and Moon Jae-in.
Just the same, AI is already so advanced and capable that it will have a massive impact on the news media. No matter how large and talented an outlet’s pool of human reporters, AI can get more information from more sources more quickly. Anyone with a Wi-Fi connection can already get all the news they want without having to pay for a subscription at the source. It doesn’t take much searching online to find ample, up-to-date information on any given topic. Who needs the New York Times with its paywall? And AI will relieve consumers of even the simple task of searching. It knows what you’re interested in and stacks the stories neatly in your app feed, even sounding a bell when something really interesting has occurred.
There are other concerns about this, of course. Some will argue it opens the door to a new form of thought control, since the algorithm could be crafted to serve devious ends. It might also contribute to cognitive bias, replicating the “echo chamber” experienced on social networks whose algorithms feed the user’s personal ideology and block alternative opinions.
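The “echo chamber” mechanism is simple enough to show in miniature: a personalised filter that only passes stories aligned with the reader’s recorded leaning. The labels and data here are invented for the example; no real network’s algorithm is this crude, but the effect is the same in kind.

```python
# Toy illustration of the echo-chamber effect: a filter that feeds the
# user's ideology and silently drops alternative opinions.

def filter_feed(stories, user_leaning):
    # stories: list of (leaning, headline) pairs; keep only agreeable ones.
    return [headline for leaning, headline in stories
            if leaning == user_leaning]

stories = [("pro", "Policy hailed as a success"),
           ("anti", "Policy criticised by experts"),
           ("pro", "Supporters rally behind policy")]
echo_feed = filter_feed(stories, "pro")
# The dissenting headline never reaches the reader.
```

The danger the paragraph above describes is exactly this: nothing in the output signals that anything was removed.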
No such concerns will protect real, live journalists.
Even if humans are paid to feed data to Google News and similar apps, and even if the apps let users compare information from various sources and so recognise what’s fake, highly experienced and relatively well-paid human editors will struggle to remain relevant in this world. In any case, robotic news coverage doesn’t need many human reporters feeding its mechanism, and those it does need will not be paid much.
A good, solid news story works on three levels: the plain facts, the analysis of those facts, and an edifying, measured opinion about them. AI is excellent at Level 1 and, with its multiple sources, will soon succeed at Level 2. As for Level 3, the commentary, no one should rule out the possibility that AI will get there too.
Published: May 18, 2018