2. AGGRAVATING FACTORS FOR THE DISSEMINATION OF DISINFORMATION

2.3 Societal factors: democracy of self-reliance, decline of trust in expertise and authority

Ruxandra Buluc

Abstract

The present chapter aims to uncover the main societal reasons why democratic systems seem to be increasingly contested at present. To this end, we explore the link between disinformation, the evolution or involution of democratic societies, and three types of trust: epistemic, institutional and interpersonal. It is important to examine the causes of shifts in trust relations in democratic societies and the role that disinformation plays in subverting this trust with a view to cancelling democratic processes and demobilizing democratic action. We also propose a list of measures that could be taken to prevent the further corrosion of democratic societies by disinformation and to restore trust in epistemic and institutional authorities. The research is limited in terms of possible solutions to the current crisis in democratic societies caused by disinformation, as the process of legitimizing democratic systems is multifaceted and entails more than the set of measures presented in this chapter.


Main research questions addressed

● What is trust and why is it vital for democratic systems?
● What kinds of trust are subverted by disinformation?
● What can be done to counter the effects of disinformation on the engagement of citizens in democratic societies?

Societal systems in democracies, such as the government, the economy, healthcare, education and the military, rely on specialists’ expertise and citizens’ trust to function well. If knowledge, the basis of expertise, and trust are subverted, then democracies fail. Disinformation compromises the foundation of both knowledge and trust, and, consequently, democratic societies are in danger of falling apart at the seams: constructive dialogue and debate between experts, policy-makers and citizens become impossible because of the distrust of powerful elites, be they epistemic or institutional, that disinformation promotes.

In democratic societies, it is up to the citizens to distinguish between facts and alluring falsehoods, and this cannot be done in the absence of healthy debates, of a shared common understanding of facts, without consensual truth. As Snyder (2018) explains, “authoritarianism arrives not because people say that they want it, but because they lose the ability to distinguish between facts and desires,” because people become subjugated by their emotions.

Democratic systems are based on citizens’ trust that societies will develop, that progress will continue, and that even if elected leaders at some point prove incapable of promoting policies in the citizens’ best interests, they will be replaced with better suited candidates in the next election cycle. Snyder (2018) explains that if this trust in the possibility of change and improvement is compromised, if citizens begin to question the importance of voting and become disengaged, then democracies, and the progress they presuppose, die. Sunstein also states that people need to engage in healthy debates with others who do not hold the same views as they do. “For a healthy democracy, shared public spaces, online or not, are a lot better than echo chambers.” (Sunstein, 2017, 21) Citizens in a democracy must not become entrenched in their beliefs, but remain open to discovery, to learning, to listening, in order to foster progress and community development. Otherwise, they find themselves in groups that simply mirror one another’s views, prisons of their own making.

However, in present-day democratic societies, we are noticing several shifts: growing distrust in democratic institutions and in their ability and willingness to safeguard the citizens’ best interests; growing distrust in knowledge and science and in their commitment to promoting progress and societal development; an increase in the trust that individuals place in their own abilities to discover the truth and to understand the complexities of various fields of human knowledge, from engineering to medicine, regardless of their own specializations; and an increase in the trust citizens place in close(d) communities of like-minded individuals who share their beliefs and do not contradict or challenge them in any way. In the present section, we will explore these evolutions and evaluate various means that have been proposed to counter their polarizing and negativistic effects on democratic societies.

What is trust?

The issue of trust is central in democratic systems. Trust legitimises democratic institutions because the officials are elected by the people in the hope and trust that they will act in the citizens’ best interests to ensure societal progress. However, if this trust is shattered or even simply shaken, the very legitimacy of those institutions is questioned.

Hardin’s (2001) definition of trust is based on encapsulated-interest theory and on the belief in the moral commitment of the trusted. This means that the trusted party proves reliable, keeps the best interests of the trusting persons in mind, and constantly works to align policies with social evolutions. Möllering (2001) defines the process of trust as a mental leap on the part of the trusting party, which presupposes that they accept the unknowable when moving from an interpretation of reality to expectations of change. Consequently, there is a power and/or knowledge imbalance at the core of societal trust, which can be mitigated by an alignment of interests. People do not have access to, and cannot hope to know and master, the vast volumes of specialised knowledge that exist in all fields of human activity. Therefore, they resort to specialists and rely on their expertise and trustworthiness to mitigate the complexities of contemporary societies: they go to doctors for medical problems, they trust economists to regulate economic policies, and they vote for politicians to propose and adopt regulations that safeguard the citizens’ interests and guarantee their welfare.

Moore (2019, 113) emphasises that democracies are based on a delicate and complex balance between trust and distrust, which can be termed the “paradox of democracy”: “we need trust in order to enable effective democratic governance, but we need to implement institutions that suggest a deep distrust of what our legislators [and other officials] will do when offered an opportunity to control the levers of power.” These control mechanisms are of several types:
a) Constitutional - the separation of powers;
b) Popular vigilance - citizens are empowered to oversee the well-functioning of the institutions and to signal their possible malfunctions;
c) Partisan distrust - in a multi-party democratic system, opposing parties keep an eye on one another’s activity and compete to propose the best policies.

Moore (2019, 116) explains that when disinformation targets democratic institutions, the positive dynamic, by which adverse events trigger more public scrutiny of institutional activities and better monitoring of their practices so as to correct possible derailments, may be replaced by a negative dynamic in which distrust engenders only more and more distrust, leading eventually to public withdrawal from democratic processes and a downward spiral into conspiratorial thinking.

It is our contention that understanding trust in a democratic society requires a multi-level approach which needs to consider the following levels: epistemic, institutional and interpersonal. At every level, trust needs to be manifest in the citizens’ belief that those who operate at that level have their best interests at heart. Institutional trust refers to trust in the democratic institutions responsible for promoting individuals’ best interests. Epistemic trust refers to trust in the legitimacy, competence and expertise of authorities in various fields, from science to culture. Interpersonal trust focuses on the trust each individual places in those around them, the people they personally know and interact with in their close community.

Dismantling epistemic trust

Science and democracy are both based on transparency, rationality and trust. If these are contested and undermined, democratic systems lose their footing. Nichols (2017) explains how democracy and science are connected and how they progress together: “expertise and government rely upon each other, especially in a democracy. The technological and economic progress that ensures the well-being of a population requires the division of labor, which in turn leads to the creation of professions. Professionalism encourages experts to do their best in serving their clients, to respect their own boundaries, and to demand their boundaries be respected by others, as part of an overall service to the ultimate client: society itself.” Along the same lines, Nicodemo (2017) states that, if the true and the false cannot be distinguished, distrust in institutions and experts proliferates and the crisis targets the rational understanding of reality.

Researchers Kavanagh and Rich (2018, x-xi) introduce the concept of “truth decay” and define it as “a set of four related trends: 1. increasing disagreement about facts and analytical interpretations of facts and data; 2. a blurring of the line between opinion and fact; 3. the increasing relative volume, and resulting influence, of opinion and personal experience over fact; 4. declining trust in formerly respected sources of factual information.” The direct consequence of truth decay is the annihilation of shared understanding of social reality on which informed and constructive debates can be predicated. If facts that do not conform to the individual’s already held beliefs are refuted and discarded, then knowledge itself is dismissed and progress is impossible to achieve. This trend has been termed in different ways by various researchers: counterknowledge, death of expertise, knowledge resistance.

Counterknowledge is defined as “misinformation packaged to look like fact–packaged so effectively, indeed, that the twenty-first century is facing a pandemic of credulous thinking.” (Thompson, 2008, 8) Despite claiming to be actual knowledge, counterknowledge is not, since it fails empirical validation as it “misrepresents reality (deliberately or otherwise) by presenting non-facts as facts” (Thompson, 2008, 9). That is, the true and the false become indistinguishable and interchangeable.

Nichols (2017) examines in more depth the effects of knowledge rejection, which he terms the death of expertise, and defines as “fundamentally a rejection of science and dispassionate rationality, which are the foundations of modern civilization.” Given that, at present, information is so readily available to any person with a smartphone and an internet connection, there is an endemic confusion between information and knowledge. However, as Nichols points out, the two are not at all similar: knowledge is domain-specific, functional and operational, and it presupposes not only access to information, but also the development of specialised skills. Therefore, not everyone has knowledge in all fields, irrespective of how much information they can access. However, this is difficult to accept because it undermines people’s sense of autonomy and self-reliance, and consequently engenders feelings of rejection and hostility towards institutionalized knowledge.

Strömbäck et al. (2022, 1) refer to a similar trend when analysing knowledge resistance, which they define as “the tendency to resist available evidence, and more specifically empirical evidence”. Glüer & Wikforss (2022, 30) term this rejection a form of irrationality, which seeks to negate the link between the empirical evidence and a claim or conclusion, not on the basis of other, contradictory or refuting empirical evidence, but on motivated reasoning and preexisting beliefs. Thus polarization emerges (Glüer & Wikforss, 2022, 30) and common knowledge and truth become controversial and open to debate.

What all these concepts capture is the decline of reliance on knowledge and expertise in contemporary democratic societies, a rejection of knowledge in favour of personal opinions and emotional beliefs. Truth decay subverts the very essence and promise of democratic systems which is to encourage and foster progress for all individuals.

However, all these trends, which are essentially similar, fail to capture or simply ignore the essence of the scientific method. As Keeley (1999) and O’Connor and Weatherall (2019) point out, it is not that science does not make mistakes or that scientists do not produce erroneous results at times. The essence is that scientists are always vigilant and scrutinize their work, through peer-review, replication, etc., openly admit when they uncover that they were wrong, and constantly attempt to find ways to correct themselves and improve their research.

Ultimately, the reason to rely on scientific knowledge when we make decisions is not that scientists form a priesthood, uttering eternal truths from the mountaintop of rationality. Rather, it is that scientists are usually in the best position to systematically gather and evaluate whatever evidence is available. The views of scientists on issues of public interest—from questions concerning the environment, to the safety and efficacy of drugs and other pharmaceuticals, to the risks associated with new technology—have a special status not because of the authority of the people who hold them, but because the views themselves are informed by the best evidence we have. (O’Connor and Weatherall, 2019, 44)

These scientific endeavours can be thwarted not from within, as scientists, as previously stated, have internal control mechanisms to identify mistakes and correct them, but from without, when science is manipulated to serve particular interests, or is simply dismissed because it does not serve the policy-makers’ interests. Kavanagh and Rich (2018, 26) explain that the first trend promoting truth decay, the increasing disagreement about facts and analytical interpretations of facts and data, affects not only recent research, where the data may still be inconclusive or in need of further verification (e.g., a new possible cure for cancer), but also clearly established and confirmed scientific conclusions (e.g., that vaccines are beneficial, that climate change is real). They notice that there is an “increased divergence between public attitudes and facts and data emerging from scientific research.” In fact, people seem to be rejecting facts and data in favor of personal experiences, personal stories, and opinions. And this rejection fuels a vicious circle, in which people refuse to learn more about scientific findings and thus know less, rely even more heavily on their personal interpretations, and accept opinions as facts, because opinions are easier to comprehend than sometimes very complex scientific facts.

This rejection also favors those whose interests lie in manipulating and dismissing scientific findings. A famous example given by O’Connor and Weatherall (2019) is called “the Tobacco Strategy”. It was first developed by tobacco manufacturers in the 1950s, when physicians first raised the alarm with respect to tobacco-induced ailments, including lung cancer. In essence, the strategy relies on fighting science with more science. The tobacco producers could not deny the fact that tobacco caused serious diseases. However, they could induce doubt, by funding research into other causes for the types of cancer frequently associated with tobacco consumption, and by concluding that research into the effects of tobacco on health was not as definitive as it seemed. Not that it was wrong, but it was not definitive. This strategy was wildly successful, and it led to decades of delays in health regulations regarding smoking. It was then translated successfully to other areas, such as sugar consumption. In essence, the idea is that science is not dismissed, it is merely drowned out; thus the public becomes confused by the myriad of possible causes for various ailments, and confusion leads to inaction.
Levy explains that epistemic authority is properly constituted when it has the right structure, namely one in which “inquirers, methods and results are publicly available (especially, but not only to other members of the network), inquirers are trained in assessing knowledge claims according to standards relevant to the discipline, and rewards are distributed according to success at validating new knowledge and at criticizing the claims of other members of the network” (Levy, 2007, 188).

He also points out that knowledge thus produced is “deeply social”. There are some kinds of knowledge that, once acquired, are always available to individuals (e.g., the multiplication table); however, more specialized knowledge is only available in these networks, to which specialists from different fields contribute. Levy states that cutting ourselves off from these networks of knowledge does not merely breed scepticism and distrust; it “more radically means cutting ourselves off from our own best epistemic techniques and resources” (Levy, 2007, 188), which can halt societal progress.
Epistemic trust is the foundation of democratic progress, but in order to reach its potential, it needs to be complemented by institutions willing to put scientific discoveries into practice, and to act in good faith with respect to the available knowledge. This leads to the second type of trust that is affected by disinformation and the contestation it entails.

Dismantling institutional trust

What disinformation does is undermine institutional trust (powerful secret actors with nefarious interests have corrupted said institutions) and epistemic trust (scientists are not objective and do not engage in accurate research, but respond to financial incentives). This creates a vicious circle in which disinformation weakens trust, leading to a state of uncertainty, anxiety and perceived lack of control, which enhances the need to find meaning and explanations that, in turn, are readily available and easily understandable in conspiracy theories (as seen in chapter 1.3).

Democratic institutions need the citizens’ trust and support, to the same extent as they need the best knowledge available to inform their policies. However, if societies are polarized, and citizens are reluctant to trust anybody except their close circle of family, friends and acquaintances, this affects the institutions’ abilities to perform their societally appointed roles. For example, if citizens do not trust that the medical system works for their welfare, then they are unwilling to fund it through their taxes, which will lead to a decline in the quality of the services it provides. If citizens do not believe that the government works in their interests, but rather that it is the slave of a global cabal whose goal is to destroy society as we know it, then they will not participate in elections, on the one hand, and/or not abide by the results of those elections once they take place (e.g., the 1/6 insurrection in the USA, contesting the results of the presidential election).

Snyder (2018) introduces the concept of strategic relativism that is at the core of Russian propaganda. Strategic relativism is defined as the transformation of “international politics into a negative-sum game, where a skillful player will lose less than everyone else.” The idea behind strategic relativism is that progress and reform must seem impossible in all societies, not only in Russia; therefore, Russians would no longer seek them, Europeans and Americans would no longer see them as viable, and their democratic systems would disintegrate.
If the citizens of Europe and the United States joined in the general distrust of one another and their institutions, then Europe and America could be expected to disintegrate. Journalists cannot function amidst total skepticism; civil societies wane when citizens cannot count on one another; the rule of law depends upon the beliefs that people will follow law without its being enforced and that enforcement when it comes will be impartial. The very idea of impartiality assumes that there are truths that can be understood regardless of perspective. (Snyder, 2018)

Researchers notice that one of the most important pillars of participatory democracy is a strong and independent mass media ecosystem. It is the institution meant to safeguard democracy by bringing to light and holding accountable any deviations from the rule of law, from the principles and tenets of democratic societies. However, at present, the mass media is under assault as well. Several factors have led to the current fragmented media ecosystem, with highly ideologized broadcasts, and polarized audiences, in which the very foundation of a free press has been subverted, as it is no longer viewed as an objective informer, as a promoter of facts not opinions, but as just another biased voice in an already very crowded public space.

Kavanagh and Rich (2018) explain that the transformation of the media started with the emergence of the 24/7 news channels, which encouraged the development of talk shows in which experts and/or pundits appear and present their opinions on the events, rather than the events themselves, thus further blurring the line between fact and opinion, which leads to truth decay. Moreover, these media conglomerates are dependent on financing and advertising, meaning that there are financial constraints under which they operate in presenting the news; they have to respect the ideological lines dictated by their owners. Ratings also matter, which means that the news no longer presents events and facts objectively, but slanted in such a way as to appeal to target audiences and thus keep them glued to the screen and to the channel. Mass media has relinquished its educational goal, and has become subjected to viewers’ whims, biases and beliefs, which it has to respect and reinforce.

The news thus becomes a daily spectacle in which the shows which elicit the most emotional responses, the most anger, anxiety and revolt are the most successful, to the detriment of those which present facts objectively, without ideological bias. If it lacks emotion, the news is not compelling, which means it is not believable, and the viewers switch channels in search of the sensational.
This already challenging, fragmented and emotionally charged media ecosystem is further complicated by the emergence of social media. More and more people, all over the globe, report that they get their news from social media platforms. Snyder (2018) explains that “the internet is an attention economy, which means that profit-seeking platforms are designed to divide the attention of their users into the smallest possible units that can be exploited by advertising messages,” and the news on these platforms is not tailored to encourage reflection, but to fit a decreasing attention span and the hunger for reinforcement, thus forming a “neural path between prejudice and outrage” which does not encourage action, but rather a continuous spiral of discontent and distrust.

Filter bubbles (Pariser, 2011), selection algorithms (Oremus, 2016), and echo chambers (Sunstein, 2001) have customised the content that users access, based on their beliefs, and thus have personalised the information and knowledge they gain, as well as the interactions they engage in. Sunstein identifies three main problems that this continuous filtering raises:

1. Fragmentation, the creation of communities that only engage and listen to their own members and reject any external interventions. The danger here goes beyond mere fragmentation and the erosion of public discourse; fragmentation can lead to extremism, hatred and even outright violence.

2. Information is a public good, which should be shared with others who might benefit as well. What one person learns, they share with others, and thus knowledge remains the social product that Levy spoke of. However, in a system designed to insulate people from information that may contradict their already held beliefs, this becomes increasingly difficult, if possible at all, as people will relate to each other only the little they know, without accepting anything new.

3. The failure to understand the connection between freedom and the distinction between consumers and citizens. When speaking of consumers, filtering is a way of getting them what they want in a short amount of time. However, citizens need more than the satisfaction that their beliefs are the same as others’. They also need to be exposed to contrasting or even opposed beliefs in order to truly examine a problem from all angles and come to the best-informed decision. This is what freedom is truly about in a democracy: not the freedom to insulate oneself, but the freedom to express oneself and come into contact with others’ expressions.

Social media, as an increasingly insulated environment in which people interact only in small, tailored groups, leads to accentuated truth decay, since facts and opinions are not differentiated and content is actually segregated to be in tune with the groups’ pre-existing beliefs, which are thus reinforced (Kavanagh & Rich, 2018). Consequently, ruptures and polarization increase in societies, as people insulate themselves in communities with no contact with opposing or diverging views, where debate does not exist, only a spiral of confirmation and “tribal” belief reinforcement (Kavanagh & Rich, 2018; McIntyre, 2018). Nichols presents the results of a study performed at University College London which revealed that, despite having more available sources of information than ever before, students limited their reading to the very first lines of an article and then moved on to the next. He explains that this is not actually reading, but scrolling in search of confirmatory details for a pre-existing belief, and that it marks an unwillingness to engage in the attempt to follow and understand contradictory articles. Ultimately, this disengagement is detrimental to democracies because it “undermines [the] role of knowledge and expertise in a modern society and corrodes the basic ability of people to get along with each other in a democracy” (Nichols, 2017).

The factual common ground so necessary for informed democratic debate in the public sphere about what societies should do and how is fractured, due to a lack of adherence to common facts and consensual truth, and to a lack of constructive debate. If one compounds truth decay with trust decay, one has perfect fertile ground for conspiracy theories and unfounded rumours to spread through society.

Interpersonal trust as a vector for spreading disinformation

It is important to notice that interpersonal trust remains intact. Its foundation is mainly emotional; therefore, contestations of facts, scientific truths and explanations do not weaken it. As Kavanagh & Rich (2018) expound, “social relationships and networks play a large role in the formation of beliefs and attitudes”; however, such networks severely limit the diversity of information that one comes into contact with and reinforce echo chambers in which information is never externally verified and confirmation of even the most outrageous belief is readily available. In search of personalised content, people end up with personalised knowledge and facts, and while people are entitled to their own opinions, they are not entitled to their own facts. Unfounded rumours and conspiracy theories circulate freely in close(d) communities, in which people share them with their peers, who accept them on the basis of interpersonal trust.

But there are inherent limitations that come with relying too extensively on individuals and their knowledge and understanding of the world. Individuals do not possess as vast and as detailed a model of the physical and social world as they come to believe. Levy (2007, 184) explains that these models are actually located outside of individual cognition, in a social network; they are in fact external representations, which require fewer individual resources. This means that when an individual has a problem, they do not have to find the solution on their own. They know where to go and whom to contact to be provided with the best approach. But this requires trust, and trust beyond one’s own internal knowledge, or close(d) group knowledge. This brings back the issues discussed above: epistemic trust, the trust that there are knowledge communities in various fields whose expertise an individual should take advantage of and have confidence in, and institutional trust, the trust that democratic institutions have citizens’ best interests in mind when promoting policies to address their concerns and problems.

In Muirhead and Rosenblum’s opinion, this is a matter of common sense, defined as “our acceptance of the intractable facts about the world and our already existing shared experience and understanding about our social world” (Muirhead & Rosenblum, 2019, 127). Common sense, they argue, comprises “shared perceptions, experiences, and moral sensibilities, which make democracy possible.” (Muirhead & Rosenblum, 2019, 128) “Common sense creates a world in which it is possible for people to exchange reasons and feelings that “make sense” to one another— even under conditions of diversity and political conflict. Common sense is a resource against the tyranny that imposes its own reality.” (Muirhead & Rosenblum, 2019, 128) But they see common sense as affected and even betrayed by conspiracism (see 1.3), which contests facticity and common interpretation. In the absence of these two factors, political discussion becomes impossible, as there is no common, shared ground of understanding for it to rely on and resort to.
The emphasis on individuals is seen as detrimental for democracies, which presuppose communities, by Banti (2020, 47) as well. His grim view is that once individual success becomes the only yardstick for measuring welfare, the collective nature of democratic societies is discarded, and individuals can become mere anonymous pawns that follow the light of successful individuals, without reaping the benefits that cooperation and collaboration could bring about.

Snyder also analyses another effect that promoting individuality and emotion to the detriment of factuality has on democratic societies. His analysis highlights the Russian propaganda playbook and its goal of discrediting democratic societies, not only in the eyes of Russian society, but also in the eyes of citizens of democratic societies. If citizens doubt everything and trust nothing, then they cannot have sensible debates about reforms and progress, and cannot trust each other to organize politically to change the status quo. The discreditation of knowledge paves the way for inaction, leaving emotion as the only, sterile response to any discontent.

There is another downside to people trusting only like-minded individuals and grouping themselves according to their beliefs rather than remaining open to debate and exposed to new information. In an experiment carried out by Schkade, Sunstein and Hastie (2007), the researchers discovered that if people with similar views were grouped together and asked to deliberate on ideologically laden issues, even a 15-minute debate led the most moderate participants to adopt the most extreme views in the group: “deliberation increased consensus and decreased diversity”. Once the deliberation was over, the groups were more extreme in their convictions and fewer people remained in the middle. The researchers identified four factors contributing to increased polarization:

1. Informational influences – the group members provided information that pertained to one extreme of the ideological spectrum. As that was the only information provided, there seemed to be consensus and the group became more radical and uniform.

2. Corroboration effects – when people do not have confidence that they know enough, they stay away from extreme viewpoints and adopt a more neutral position. However, agreement from others, who corroborate what they might have only tentatively thought, pushes them towards the extreme pole of that belief. People become more confident once they find out that others share their beliefs.

3. Social comparison – people share views because they want to be perceived favorably by the other members of the group, and will adjust their positions to be more in line with those of the group.

4. Shared identity – people want to feel they belong, they are part of a group, they have a shared identity, and they are willing to polarize in order to achieve this sense of shared identity, to be part of the ingroup and more clearly separated from the outgroup.

The problems arise because this phenomenon of division into small groups which hold similar beliefs and reinforce one another to the point of extremism is deeply facilitated by online platforms that allow people to pick and choose who they listen to, and by the increasing tendency of mainstream media to conform to people’s preexisting beliefs rather than try to educate and mould them. As Kavanagh and Rich (2018, xvi) explain, the effects on democratic societies are dire: civil discourse is eroded as the parties involved are unwilling to listen to each other; political paralysis sets in as increasing uncertainty about facts makes it difficult to agree on the best policies; citizens feel that the government is letting them down further by not acting, which leads to disengagement from political action and civic institutions; and, in general, policy becomes uncertain and even inoperable at the national level. Given all these challenges, it is more important than ever to try to identify means of countering them and of rebuilding trust.

Main challenges and means to overcome them

In light of the aspects presented above, the greatest challenges in tackling the democracy-stifling effects of disinformation require a multi-faceted approach. A survey of the literature has revealed several methods that could be put into practice and that, if adhered to and carried through, could help rebuild trust in epistemic authorities, in democratic institutions and processes, and in democracy itself.

Believe in truth

Snyder explains that fighting tyranny and upholding democratic systems starts from the individual and from individual action. “To abandon facts is to abandon freedom. If nothing is true, then no one can criticize power, because there is no basis upon which to do so. If nothing is true, then all is spectacle. The biggest wallet pays for the most blinding lights” (2017, 65). Speaking and asserting truth helps spread the message and uphold the criteria of judgement for democratic institutions and experts. 

Speaking truth to disinformation

Muirhead and Rosenblum (2019, 14, 116-120) argue that this starts with small steps that presuppose a change in the way in which citizens interact with one another: the shift from small communities of shared and similar beliefs, whose members reinforce one another’s opinions, to a watchful and engaged civil society that bears witness and directly and transparently upholds established and proven knowledge. This step, combined with the mass media reassuming its role as a purveyor of information, facts and data, not (only) opinions, could counter the societally corrosive effects of disinformation.

Enacting democracy

This refers to “the scrupulous and explicit adherence to the regular forms and processes of public decision-making.” It entails a consistent, deliberate and sustained response to disinformation in order to mitigate the increasing distrust in democratic systems. “Enacting democracy makes government legible. That is, it gives citizens reasons to understand and appreciate the meaning and value of institutional integrity and ordinary democratic processes” (Muirhead & Rosenblum, 2019). This process, if repeated with perseverance, can lead to a cumulative effect which can help relegitimize democratic processes and institutions and rebuild institutional trust. It also means that the processes themselves are openly articulated, presented in a transparent manner to the citizens, thus educating them on how the institutions function and encouraging them to take an active part. Citizens need to witness institutional integrity and transparency in order to regain trust.

Defending knowledge producing institutions

These institutions, and the scientists and experts who serve them, might be wrong at times, whether due to error or to intentional corruption. However, scientific endeavours are continuously monitored and evaluated, and their mistakes are therefore more likely to be discovered and corrected. Experts and policy-makers should be held accountable for their judgements, but they should not be dismissed and distrusted wholesale. Rather, they should be questioned openly: required to present their evidence, explain their reasoning, and have their conclusions reviewed. The worlds of expertise and of democratic policy-making should be bridged so that they can inform each other and ensure societal progress.
Sunstein (2017) also proposes two methods that could help rebuild trust and societal cohesion:  

Rebuilding the media ecosystem 

Supporting “general-interest intermediaries” (Sunstein 2017, 26-27), such as newspapers, magazines and broadcasters, that present facts as facts and opinions as opinions encourages debate within the same common, shared framework of knowledge, allowing people with diverse opinions to interact with one another and be exposed to various points of view without losing sight of the facts. “A system in which individuals lack control over the particular content that they see has a great deal in common with a public street, where you might encounter not only friends but also a heterogeneous array of people engaged in a wide array of activities (including perhaps bank presidents, political protesters, and panhandlers)” (Sunstein, 2017). This helps create a democratic system that fosters deliberation not among like-minded individuals but among all citizens. Snyder also speaks of the importance of investing time and effort in mainstream, objective mass media outlets. He argues that it is every citizen’s obligation in democratic societies to make a habit of investigating and of not taking things for granted just because they are pleasing to hear or easy to understand: “Figure things out for yourself. Spend more time with long articles. Subsidize investigative journalism by subscribing to print media. Realize that some of what is on the internet is there to harm you. Learn about sites that investigate propaganda campaigns (some of which come from abroad). Take responsibility for what you communicate to others.” (Snyder, 2017) It is not only the responsibility of epistemic and institutional authorities to promote transparency and information; it is just as much the responsibility of each citizen to stay engaged and active in the sometimes tiring process of countering disinformation and maintaining a healthy information ecosystem.

Participating in public forums

Sunstein (2017, 40-44) promotes the idea of forming public forums, on the basis of the freedom of speech in public places, with three main goals in mind:
a) To give speakers access to a range of citizens with varying opinions and beliefs and thus expose those citizens as well as the speakers to new information and other points of view that may make them reconsider their positions and/or may bring to light points of public discontent, thus aiding the improvement of public policies and institutions.
b) To give access not only to heterogeneous groups of citizens but also to specific groups and/or institutions that citizens may be discontented with so they could make their dissatisfaction and complaints known.
c) To enhance the possibility that people will come into contact with a wide variety of people and views, and will have unexpected encounters that expose them to diverse viewpoints and experiences, thus reducing the risks of polarization and promoting the (re)development of civil discourse and empathy.

A similar idea is proposed by Snyder when he argues in favor of practicing “corporeal politics. Power wants your body softening in your chair and your emotions dissipating on the screen. Get outside. Put your body in unfamiliar places with unfamiliar people. Make new friends and march with them.” (2017, 83) Stepping out of the boundaries of what is known and familiar and experiencing new ideas leads to a less fragmented society, in which individuals are more open to constructive debate and to the free exchange of ideas in the public sphere, which could lead to the improvement of the democratic societies they live in.

1. Banti, Alberto Mario. La democrazia dei followers. Neoliberalismo e cultura di massa. GLF: Editori Laterza, 2020.
2. Glüer, Kathrin & Åsa Wikforss. “What is knowledge resistance” (29-48) Knowledge Resistance in High-Choice Information Environments. Strömbäck, J., Wikforss, Å., Glüer, K., Lindholm, T., & Oscarsson, H. (eds.). New York & London: Routledge, 2022.
3. Hardin, Russell. “Conceptions and Explanations of Trust” (3-39) in Cook, Karen, ed. Trust in society. New York: Russell Sage Foundation, 2001.
4. Kavanagh, Jennifer & Rich, Michael D. Truth Decay. RAND report. Santa Monica, California: the RAND Corporation, 2018.
5. Larson, H. J., Clarke, R. M., Jarrett, C., Eckersberger, E., Levine, Z., Schulz, W. S., & Paterson, “Measuring trust in vaccination: A systematic review”. Human vaccines & immunotherapeutics, 2018, 14.7: 1599-1609.
6. Levy, Neil. “Radically socialized knowledge and conspiracy theories”. Episteme, 2007, 4.2: 181-192.
7. McIntyre, Lee. Post-truth. MIT Press, 2018.
8. Moore, Alfred “On the democratic problem of conspiracy theories” (111-134) in Conspiracy Theories and the People Who Believe Them, ed. Joseph E. Uscinski. Oxford University Press, 2019.
9. Möllering, Guido. The nature of trust: From Georg Simmel to a theory of expectation, interpretation and suspension. Sociology, 2001, 35.2: 403-420.
10. Muirhead, Russell & Nancy L. Rosenblum. A Lot of People Are Saying: The New Conspiracism and the Assault on Democracy. Princeton and Oxford: Princeton University Press, 2019.
11. Nichols, Tom. The death of expertise: The campaign against established knowledge and why it matters. Oxford University Press, 2017.
12. Nicodemo, Francesco. Disinformazia. La comunicazione al tempo dei social media. Venezia: Marsilio Editori, 2017.
13. O’Connor, Caitlin and James Owen Weatherall. The Misinformation Age. How False Beliefs Spread. New Haven & London: Yale University Press, 2019.
14. Oremus, Will. Who Controls Your Facebook Feed. A small team of engineers in Menlo Park. A panel of anonymous power users around the world. And, increasingly, you, 2016, available at http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html?via=gdpr-consent
15. Pariser, Eli. The filter bubble: How the new personalized web is changing what we read and how we think. Penguin, 2011.
16. Schkade, D., Sunstein, C. R., & Hastie, R. “What happened on deliberation day”. Calif. L. Rev., 95, 915, 2007.
17. Snyder, Timothy. On Tyranny. Twenty Lessons from the Twentieth Century. New York: Tim Duggan Books, 2017.
18. Snyder, Timothy. The Road to Unfreedom: Russia, Europe, America. Tim Duggan Books, 2018.
19. Strömbäck, Jesper, Åsa Wikforss, Kathrin Glüer, Torun Lindholm and Henrik Oscarsson. “Introduction. Toward understanding Knowledge Resistance in High-Choice Information Environments” (1-28) in Knowledge Resistance in High-Choice Information Environments. Strömbäck, J., Wikforss, Å., Glüer, K., Lindholm, T., & Oscarsson, H. (eds.). New York & London: Routledge, 2022.
20. Sunstein, Cass R. #republic. Divided Democracy in the Age of Social Media. Princeton and Oxford: Princeton University Press, 2017.
21. Sunstein, Cass R. Echo Chambers: Bush v. Gore, Impeachment, and Beyond. Princeton, N.J.: Princeton University Press, 2001.
22. Thompson, Damian. Counterknowledge. How we surrendered to conspiracy theories, quack medicine, bogus science and fake history. New York, London: W. W. Norton & Company, 2008.
 

Co-funded by European Commission Erasmus+
ANIMV
University of Malta
University Rey Juan Carlos
New Strategy Center

Project: DOMINOES Digital cOMpetences INformatiOn EcoSystem  ID: 2021-1-RO01-KA220-HED-000031158
The European Commission’s support for the production of this publication does not constitute an endorsement of the contents, which reflect the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.


Ivan, Cristina; Chiru, Irena; Buluc, Ruxandra; Radu, Aitana; Anghel, Alexandra; Stoian-Iordache, Valentin; Arcos, Rubén; Arribas, Cristina M.; Ćuća, Ana; Ganatra, Kanchi; Gertrudix, Manuel; Modh, Ketan; Nastasiu, Cătălina. (2023). HANDBOOK on Identifying and Countering Disinformation. DOMINOES Project https://doi.org/10.5281/zenodo.7893952