Science journalist Ranga Yogeshwar reflects on the impact of AI-generated narratives and disinformation on society, and the importance of strong, independent media
Ranga Yogeshwar: As journalists, we must be very much aware of how credible our sources are. We have already seen many cases where AI has generated false information. [In journalism,] there are very classical rules: If you have a fact, you triple-crosscheck it. If you don't do that, then you might end up in trouble.
The application portfolio of AI within journalism is much broader and much richer than just producing a news article. We can use AI in a very constructive way for research or for gathering different data and putting it together. AI has been incredibly useful in programming and coding. But there are aspects of journalism, like analyzing a conflict, for example, where we will need journalists in the future. A good journalist brings up new relations, shows new aspects. I think this is very difficult for AI, at least right now.
With the advent of large language models, we see a new quality. For the first time in the history of mankind, language is no longer the big monopoly of humans. Don't forget that language is the basis of everything. Of democracy, economy, jurisdiction. Language is crucial.
We can argue that there is something like a hack of our civilized world. We can imagine in the future that the machines generate new narratives, new religions. Even religion is language: "In the beginning was the word." Now the machine can produce these words.
It can produce entirely new narratives, and this can destabilize known structures. From social networks, we already know that there is, for example, a strong increase in polarization. We see a break-up of democratic communication structures. This is already happening, and it could be amplified by the use of AI if we don't pay attention.
In my talks, I have asked people: Are you prepared to use an AI even if it's not transparent, if you do not understand how it works? Interestingly enough, a majority says yes.
We have to make sure that artificial intelligence does not support bad things. If I ask an AI how I can rob a bank, it is going to give me quite a cool answer which could be quite helpful for a robber. We must make sure that this doesn't happen. A lot of effort has been invested in what is called the alignment problem. Right now, companies such as OpenAI, Google or Microsoft are trying to include a value filter. This is done mainly by human workers who evaluate the statements of an AI in order to train it to filter out bad things. This work is done in countries like Kenya, for example, or Bangladesh.
In a global world, values vary between cultures and times, they are not a constant. A Chinese AI might focus on and allow or forbid other things than a European or a US-American AI. With the advent of artificial intelligence, technology also reflects the underlying cultures.
The tech world has always talked about the big problems it wants to solve: We want to fight hunger, injustice, climate change. I think that these are all ostensible arguments, because if you fact-check what the tech world did in the past, you see very clearly that all these promises are left hanging in the air and have never been realized.
Right now, the use of AI has been primarily driven by business models, investments, and profit – by capitalism. It has been a tool, like many others, to enhance capitalistic structures. And the entire narrative of capitalist structures naturally is not a world of justice, fairness, or equality, but a world of inequality. A world where, for at least the last few centuries, the North has dominated the South and the rich have dominated the poor. Technology is not going to solve our future issues.
And at one point we must question anew what exactly we mean by progress and innovation. At least in the past, innovation was always focused on a rich white minority that produced for another rich white minority. Vast majorities were left out. In the future, we therefore should reshape the core of innovation and set new values, a new purpose.
In social networks, the basic underlying model is a business model, driven by what is called the attention economy. They want to sell us ads at the price of our private data. There has been some interesting analysis on the advent of fake news. For example, analysis by Sinan Aral from MIT found that fake news spreads a stunning six times faster than correct information. We totally underestimated how our appetite for sensational newsfeeds, in combination with commercially driven attention algorithms, results in the spread of false information. Ironically, fake news generates income for the platforms. Many feeds on X (formerly Twitter), YouTube, Facebook etc. are simply wrong.
Credible information is the basis of democracy. It is more than just a product and we now realize the consequences of this misguided development: Polarization, the disintegration of society, mistrust… We are now seeing that our democracies are in a process of dissolving.
A couple of years back with Cambridge Analytica, we saw the impact of Facebook on democratic elections. On my Facebook account I wrote an article and said: Let's ground Facebook. We must demand from these platforms that certain things are not acceptable. Unfortunately, this has not happened.
Slowly, politicians' fear of big tech has become so great that nobody dares to press the [regulation] button. And if you do, the tech industry immediately claims that this is censorship. But if democracy is to survive, we need thoughtful politics and, finally, clear legislation.
There was a big hope that AI could distinguish synthetically generated texts from human-generated ones. By now, we know this is not the case. In a historic sense, the Turing test has been passed: we are not able, at least at first sight, to tell the difference. And this is true not only for words but naturally also for pictures or deepfake videos. This brings us to one essential aspect: credibility. In a couple of years, credibility is going to be the main unique selling point.
AI is self-referential, and the internet is filling up with AI-generated content. It will therefore become very difficult to draw the line between false and correct information. This credibility aspect could be lifesaving for journalism. But this also means that journalism has to change. Right now, it is sometimes an attention business. It has to change into a credibility model that is different from a hallucinating AI world.
Journalism is the very important advocatus diaboli [devil's advocate] which we need in any society. It questions and fact-checks. It has been a very valuable part of our society and of our democratic culture.
But with the advent of AI and digital media, the strength of journalism has vanished because the business models of classical journalism have changed. We have seen and we are seeing a crisis because the entire advertising business has shifted, and journalism does not have an economic basis anymore. Many big publishing houses are dying. This is already very critical.
Our society should protect independent journalism, as it is so important for our democracy. In Germany, we have a public broadcasting system that was founded after World War II as the answer to the propaganda system of the Nazis. This system should be adapted to the changing media world. In a commercially driven environment we need credible and independent information. We could reinvent these public structures, which are independent and are a strong advocatus diaboli within our societies. If we don't care, we will end up in a world where advertising and all other interests, political interests, sort of nudge a society. Where is the voice of the critics? You won't hear it anymore.
It's like a fire brigade. In every town, all the citizens agree that we have to finance a fire brigade. Even if last year there was no fire in town. But we all realize this is something which is important for society. We need much stronger support for journalism, knowing that it is important for us as a society, for us as a civilization.
***
About the author: Regarded as one of the leading independent science journalists in Germany, Ranga Yogeshwar is interested in bringing science to the general public. He has received major media awards for his journalistic contributions and book publications. Yogeshwar graduated from RWTH Aachen University with a diploma degree in physics. In 1987, he joined the German public broadcaster Westdeutscher Rundfunk (WDR) in Cologne, where he oversaw the science department. Since 2008, he has worked as an independent science journalist.
Ranga Yogeshwar's website: https://yogeshwar.de
Interview: Alexandra Spaeth
The interview has been edited for clarity and brevity.