



Can an AI Write My Column?


The impact of generative AI (GenAI) is massive and will only grow. One of the areas most immediately affected is journalism: any job that involves processing and producing the written word could be disrupted by a technology that, at the end of the day, is designed to take language as input and produce convincing output.

Can AI produce credible journalistic output? Let’s try it and see! I asked GPT-4 to write this article for me. Here’s the prompt:

Write a 500-word article in the style of Guy Harrison from Database Trends and Applications on how LLM-based AIs are disrupting journalism, including the threats to jobs, the chance that AIs will create “echo chambers,” propagate false information, or otherwise diminish the role of journalism in a liberal democracy.
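
If you'd like to repeat the experiment, the same prompt can be submitted programmatically. Here's a minimal sketch using the OpenAI Python SDK (this is the v1.x chat completions style, which may differ in other SDK versions, and it assumes your API key is available in the OPENAI_API_KEY environment variable):

from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

prompt = (
    "Write a 500-word article in the style of Guy Harrison from Database Trends "
    "and Applications on how LLM-based AIs are disrupting journalism, including "
    "the threats to jobs, the chance that AIs will create 'echo chambers,' "
    "propagate false information, or otherwise diminish the role of journalism "
    "in a liberal democracy."
)

# Submit the prompt to GPT-4 and print the generated article
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)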

And here’s what GPT-4 produced. I’ll leave it to you to judge whether I’m ready to be replaced by an AI!

Large Language Models Disrupting the Fourth Estate

In the fast-paced world of digital technology, LLM (Language Learning Model) based AIs have been making waves recently, particularly in the field of journalism. In an era where data is king and information is the new currency, AI has been a ground-breaking force, altering the traditional dynamics of journalism, and creating ripples of change that are both intriguing and disconcerting.

The first, and perhaps most palpable, impact of LLM based AIs in journalism is the threat they pose to jobs. Automated journalism, powered by AI, is capable of producing news articles, reports, and even narratives with little to no human intervention. This becomes especially concerning when you consider that AI can churn out these pieces at lightning speed, and with a level of accuracy that matches, if not surpasses, human reporters. For instance, the Associated Press has been using AI to produce corporate earnings reports for years now, a task that was traditionally performed by its reporters.

However, the threat to jobs is just one piece of the puzzle. There’s a larger concern that these AI systems may inadvertently create “echo chambers,” propagating the same information over and over again, thereby limiting the diversity of content. The algorithms that power these AI systems are designed to learn from existing data, and thus, they tend to repeat what they’ve learned. This could lead to a lack of fresh perspectives in news reporting, which is a cornerstone of healthy public discourse in a liberal democracy.

Further, there’s the risk of AI systems propagating false information. While AI can certainly help in fact-checking and debunking fake news, the same technology can also be weaponized to spread misinformation at an unprecedented scale. Deepfakes, AI-generated fake news articles, and other forms of AI-propagated falsehoods are real threats that need to be addressed.

Moreover, the rise of AI in journalism could potentially diminish the role of journalism in a liberal democracy. Journalism is not just about reporting facts; it’s about providing context, analysis, and commentary. It’s about holding power to account and giving voice to the voiceless. These are tasks that require a level of empathy, judgment, and moral reasoning that AI, as of now, is incapable of.

In conclusion, while LLM based AIs have certainly revolutionized journalism, shaking up its traditional structures and workflows, they also pose significant challenges. As we move forward, the key will be to strike a balance, leveraging the power of AI to enhance journalism, while also mitigating its potential risks and pitfalls. It’s a brave new world, and only time will tell how this fascinating interplay between AI and journalism pans out.
