Disinformation is a problem that can't be viewed in isolation and for which there isn't a clear solution. Attempted solutions often have unintended consequences.
Disinformation has the characteristics of a so-called wicked problem, that is, a problem that can't be clearly defined or definitively solved due to its complexity and its innumerable interrelated or interdependent factors.
Design theorists Horst Rittel and Melvin Webber outlined the 10 characteristics of wicked problems, which they contrasted to tame problems, in their much-cited 1973 paper, "Dilemmas in a General Theory of Planning" (see Figure 2).
Their paper defines a tame problem as one for which the goal is clear and unambiguous, and for which it is equally clear whether the problem has been solved.
In contrast, wicked problems don't have a clear and definitive set of solutions. Nor do they have a "stopping rule": we can never know for certain when the problem is solved. In addition, wicked problems are always linked to, or a symptom of, other wicked problems.
As a result, wicked problems have innumerable solutions, and these solutions depend heavily on values and beliefs that can be interpreted differently by diverse interest groups.
Disinformation is a problem, for example, that is a symptom of other wicked problems, such as the formation of anti-Western blocs (Russia, China), the loss of trust in democratic institutions and traditional media, the polarization of society, the dominance of big tech corporations or climate change denial.
Treating wicked problems as if they were tame ones, with clear, definable solutions, is problematic, for example when individual issues are tackled one after another in a traditional manner using linear, isolated methods.
In his 1997 book, "The Logic of Failure: Recognizing and Avoiding Error in Complex Situations," German behavior expert and psychologist Dietrich Dörner describes the devastating consequences that such "repair-service behavior" (as he calls it) can have when trying to solve complex problems.
Firstly, it can lead to the wrong problems being solved or problems being overlooked, he writes.
"Because we don't understand the connections between problems (and don't even know that we don't understand them) … we select the problems we will solve on the basis of irrelevant criteria, such as the obviousness of a problem or our competence to solve it. Another possible consequence of repair-service behavior is a total disregard of failings and malfunctions that may not exist at the moment but that will emerge later."
Secondly, in complex systems, every solution to an individual problem has an effect that may create a new, sometimes worse, problem elsewhere.
This is also true for disinformation. To give a few examples:
The worldwide attention paid to disinformation, combined with a nebulous understanding of the term, has led to it being used in numerous ways to assert various power-political interests.
Promoting third-party fact-checking initiatives, or fact-checking as a genre, can undermine the legitimacy of traditional media; it can also lead journalists to favor these donor-funded initiatives as employers because they pay better salaries than local media houses (see, for example, the discussion by Gwen Lister and Toivo Njabela).
This repair-service behavior is unfortunately common in the fight against disinformation. Experts working in the field are realizing, however, that many stakeholders are attempting to optimize their own pieces of the puzzle without knowing what the whole puzzle should look like in the end.
This is why the European Union, when it updated its Code of Practice in 2022, opted for a comprehensive approach that aims to better coordinate measures.
In a 2022 commentary for the Carnegie Endowment for International Peace, a US-based think tank, Gavin Wilde also identifies this type of "solutionism" as a central problem in the field of disinformation.
Disinformation is "deeply entangled with other systemic issues, where causal relationships are poorly understood, and where interventions to correct one harmful aspect create unwanted ripple effects elsewhere," he writes.
"There is an inherent irony in the fact that the core answers to such problems are usually apparent and relatively simple. This irony lends itself to what scholars call reductive tendency, a process whereby people simplify complex systems into easily digestible parts. This distillation can offer benefits, such as quicker decision making, but is often inaccurate and overlooks the complexities of the problem."
This article is part of Tackling Disinformation: A Learning Guide produced by DW Akademie.
The Learning Guide includes explainers, videos and articles aimed at helping those already working in the field or directly impacted by the issues, such as media professionals, civil society actors, DW Akademie partners and experts.
It offers insights for evaluating media development activities and rethinking approaches to disinformation, alongside practical solutions and expert advice, with a focus on the Global South and Eastern Europe.