Challenges to Science Communication in a Post-Truth World

This is an original manuscript of an article published by Taylor & Francis in Communicatio on November 8, 2021, and available online from the publisher as “Challenges to Science Communication in a Post-Truth World.”

Introduction

Communicating scientific research to a lay audience – or, for that matter, communicating any contestable or potentially controversial ideas in areas such as politics or policy – would be a significantly easier task if the audience agreed on what the relevant facts are, and also on the extent to which the facts are even relevant (rather than facts being regarded as of secondary importance to political or pragmatic interests).

The first issue (“what are the facts?”) is an empirical matter, and one which reasonable people can disagree on in situations of uncertainty or incomplete knowledge. The second issue is of more concern when communicating complex or possibly controversial ideas, particularly in light of the increased polarisation of opinion in public discourse, alongside an increasingly widespread mistrust of mainstream media and “authorities” in general.

Facts certainly still matter, but they sometimes don’t appear to matter as much to others as they do to ourselves – or at least, as much as we might like them to matter to others. And even those of us who do care about facts are familiar with situations in which bare facts – clear propositions that we consider to be uncontroversial – are impractical or impolitic to convey in emotionally- or politically-charged contexts.

Consider this empirical question: is it really true that fewer people care for robust evidence and sound reasoning now than in the past? Or is it instead the case that we’re mistaking what is merely the amplification – via, for example, Twitter and other social media platforms – of the voices of those who don’t care about facts for a proportional increase in the number of people who don’t care about evidence as much as we think they should?

Similarly, apparent demonstrations of the rise of an “idiocracy” via events such as the Brexit vote in 2016, and the election of Donald J. Trump as the President of the USA later in that same year, could be evidence of a greater mobilisation of a group who care less for sound evidence and reasoning than they do about securing a certain outcome, rather than evidence for a widespread decline in the willingness or ability of the public to distinguish between truth and falsity.

These sorts of details matter, for at least two reasons, if we are to improve our communicative efficacy in an online world where noise easily drowns out signal, and where conversations are frequently more performative than constructive. Participants often use public communication platforms to demonstrate membership of a group that holds a particular worldview, and that worldview is both formed and then validated by the similarly-amplified arguments and evidence they are exposed to, in what Eli Pariser (2012) termed the “filter-bubble” of the Internet and social media.

First, because when people who claim membership of a reason-based community – one which takes the virtue of objective evidence and fair argument for granted – say things that can sound like “more and more people are idiots”, one consequence might be to exacerbate the anti-elitist, and anti-evidence, tendencies of the opposing group, whatever its size. We won’t encourage people to respect non-partisan facts or arguments through mockery and ridicule, as the philosopher Daniel Dennett (2013, 33) reminds us in his summary of Rapoport’s Rules for composing a “successful critical commentary,” where he notes that rebuttal or criticism is best received only after one has noted points of agreement, expressed from a position of argumentative charity.

Second, a misdiagnosis of the problem will get in the way of finding any solutions. It is clear that at least some of the growth in anti-elitism, often manifesting as anti-science, stems from a feeling of being misunderstood, derided or under threat. For example, Craig and Richeson (2014) found that when white Americans – even liberal white Americans – are reminded that demographic projections indicate that current minority groups are likely to become demographic majorities in America by 2050, their views on questions of political ideology tilt significantly to the conservative side.

Major et al. (2016) further found that this reminder of declining influence led to increased support for Trump. Demographic change is a threat to an established way of life, and to settled power dynamics, but instead of seeing the possibilities in multiculturalism, this perceived threat activates zero-sum thinking along the lines of “if some other group is growing, surely they are taking resources that should be ours?” One consequence of this is that defenders of reason and evidence need to find a language for, and then to clearly articulate ways of thinking about, facts and their importance without triggering any sort of “Backfire Effect” (Nyhan and Reifler 2010).

The Backfire Effect describes an under-appreciated psychological consequence of being told that you’re wrong about something. Instead of weighing the counterarguments or evidence you have been presented with, you might instead react to a threat to your beliefs by committing yourself to those beliefs even more strongly, rather than being motivated to take the counterarguments seriously. None of us operate in a space of pure reason, and the problems introduced by emotional investment in beliefs could in fact sometimes be exacerbated if you’re a proudly rational person, because you might start with an inflated estimation of the integrity and virtues of your existing views. Furthermore, when our deep convictions are challenged and we dig in our epistemic heels, we are perhaps better able to explain to ourselves why the perceived threat was never worthy of serious attention in the first place.

In summary, combating pseudoscience – and irrationality in general – is not simply a matter of asserting the evidence, but also a task that is aided by an awareness of the political and rhetorical drivers of false belief. Those who are tasked with communicating complex and controversial ideas need to incorporate these and other factors into their strategies to increase the chances of their message being conveyed effectively.

What follows is an outline of some of the key concepts that can impede or enhance our ability to communicate effectively in a digital age.

Detached Rationality versus Contextual Rationality

In addition to the Backfire Effect, a key challenge for those who think that argument and good evidence should trump political or emotional interests – summarised as the “reason-based community” – might lie in their assumptions regarding what rationality even is, and in the belief that it is self-evident that people value rationality at all, or that they can easily be persuaded of the virtues of doing so. But the claim that it’s rewarding to be rational will only have motivating force if the proper incentives are in place: if being rational will likely not improve someone’s life as much as we tell them it will, it is no longer at all clear that most people have much reason to care about being rational. By way of justifying this perhaps implausible-sounding claim, imagine two different ways of understanding what “rationality” means.

First, we can understand rationality as holding the beliefs best justified by the evidence, as it is communicated and understood. This view implies all the usual sound epistemic habits that we’ve been teaching to undergraduate philosophy students for centuries. Relevant concepts would include focusing on the amount and quality of evidence, trying to avoid biases and fallacies, and most of all, being a dispassionate seeker of truth. Let’s call this “detached rationality” for the sake of the contrast being offered, leaving aside the debate on whether detachment is in fact ideal.

A second way of understanding rationality, though, is to describe it as involving the efficient pursuit of your goals or values – whatever those goals or values might be. “Contextual rationality,” used as a similar shorthand to the above, involves two critical departures from detached rationality. First, it is driven by subjective values, rather than the objectivity prized in detached rationality. Second, whether you succeed or fail in pursuing those values will be significantly determined by the context in which you’re operating, and whether that context provides incentives that reward those values or beliefs, even if they would not always be prized from a detached rationality perspective.

Even properly irrational (on the detached view) beliefs can be both subjectively valued, and also reinforced by the community in which you hold and pursue them – and this community now includes the vast numbers of people who might either support or denounce you on social media. So, if you are a climate-change denier or a Trump supporter who spends all your time in a (virtual) community of like-minded people, consuming all your news from like-minded sources, that will be identity-affirming, offering you a reason to persist with that belief.

This partially explains why 34% of Republican voters might have believed “that a secretive power elite with a globalist agenda is conspiring to eventually rule the world through an authoritarian world government” compared to just 15% of Democrats (Public Policy Polling 2013) – it’s not only the case that we live in filter-bubbles, but also that sometimes, we don’t care that we might be living in such a bubble, because we don’t share a commitment to detached rationality in the first place. And by extension, it means that we might assess the value of communication by its efficacy in persuasion, rather than by its accuracy.

If you are a member of a self-reinforcing community that shares and affirms falsehoods on Twitter (or even just propositions that are less likely to be true than others), being told that your beliefs are “irrational” is a charge that simply won’t gain any traction, because sometimes there are no obvious negative consequences to those beliefs – they are instead positive markers of group solidarity, and are rewarded as such in boosting your standing in that group. To be persuaded, the interlocutor would need to care about detached rationality in the first place (Ståhl and van Prooijen 2018), but we don’t – and perhaps can’t – provide reasons to do so that are as convincing to others as they are to us, which is a significant impediment to communicating any information or counterarguments persuasively.

To summarise the point: we self-identified skeptics tend to care about the detached version of rationality so deeply that we often think its value is self-evident. However, it is not clear that our communication strategies, and even (perhaps especially) our politics, take into account the fact that we might be in the minority regarding even caring about detached rationality (never mind acting in ways consistent with that commitment). The pessimistic implication of this point is that teaching probability, rationality, logic and so forth might not work as well as one would hope. What, after all, is the incentive for people to care about rationality in the detached sense? If you live in a robust and large enough filter-bubble, the fact that you believe some irrational things might never be an issue in your community, and might never attract the negative consequences we try to warn people about.

Before proceeding to consider other aspects of this disconnect between how proponents of detached rationality think people should reason, and how they in fact do so, I want to caution against despondency with regard to the notion of our now living in a “post-truth” or “post-fact” world. While it is undeniable that evidence appears to matter less, and to fewer people, than we think it should, it is nevertheless true that many people continue to be outraged by lies – it’s rather that they often disagree on what counts as a lie, because they regard different sources as authoritative purveyors of reliable information.

We should also be wary of misdiagnosing the issue. A conspiracy theorist, for example, remains committed to the truth, and often appears more committed to discovering the truth than the average person is. What counts as evidence for truth is the key issue here, so it would be a mistake to diagnose the conspiracy theorist as a postmodernist or even an epistemic relativist (who, in summary, rejects the possibility of objective truth altogether).

The lever we would need to pull for changing a postmodernist’s mind, or even for the more conservative goal of causing them to entertain a contrary idea, is to find just one claim they would regard as absolutely true (in order to disabuse them of the notion that “truth” is necessarily contextual), while the lever relevant to persuading the conspiracy theorist involves looking at what count as good reasons to believe a claim is true. These distinct situations require different responses, and communication which flattens distinctions such as these (on the presumption that bare facts and objectivity are still the gold standard of persuasion) will be immediately compromised in its prospects for success.

The Failure of Facts, and Challenges to Truth

The British philosopher Julian Baggini recently published A Short History of Truth (2017b), which is indeed short, and is furthermore highly accessible even to non-philosophers thanks to its clear arguments and uncomplicated language. In it, he “identifies ten types of supposed truth and explains how easily each can become the midwife of falsehood” (Quercus n.d.). In a blog post discussing some of the themes in the book, he focuses on six challenges to truth, starting – perhaps counterintuitively to some – with science itself. As he points out, “such is … [science’s] power to overturn common sense that its growth has arguably contributed to skepticism that we can ever know the truth at all” (Baggini 2017a).

Put another way, it is perhaps obvious to some that all scientific knowledge is contingent on what we know now, and that our views will – and should – change in light of new evidence. But for others, dramatic shifts in theories and models might be perceived as evidence that “real” knowledge isn’t possible, and that science is all about confirming your biases or satisfying your funders. In this context, it shouldn’t surprise us that conspiracy theories abound, and that you might well be regarded as a better scientist by a significant, and now amplified (by social media), subset of the public if you embrace contrary rather than mainstream views.

Then he discusses two related aspects, namely liberty and the decline of elites. Speaking of liberty, Baggini describes the trend towards “an exaggerated sense that we are sovereigns of our selves, the sole authors of our lives,” where the “right to a point of view became so sanctified it became detached from the responsibility to ensure that view is well-grounded” (ibid). The decline of elites is related, in that the reasonable – and necessary – revolt against power being in the hands of a privileged few has been replaced by a rejection of authority more generally.

Statements such as “all opinions merit respect” or “that’s just one view among many” mistake a moral claim for an epistemological one. When we speak about respecting opinions, we are generally talking about respecting people and their right to sometimes hold peculiar opinions, rather than respect for the opinions themselves. We are (or should be) making the claim that interesting and valuable contributions to debate can come from anyone, even if that source seems an unlikely one, and that we should be open to hearing what people have to say without prior judgment based on any preconceptions we might have.

The phrase about respecting opinions means that all expressed viewpoints should be given a fair hearing. But it does not mean that the opinions themselves merit equal respect, once evaluated, and it certainly does not mean that we can’t interrogate opinions at all. My opinion that Lagavulin is the finest Islay whisky doesn’t matter to anyone except perhaps others I inflict the beverage upon, or my accountant, so we might not feel inclined to bother interrogating it. But if I held – and communicated – the opinion that one gender or race was superior to others, I’d hope that someone would challenge that outlook, because we don’t want to live in a world where such an opinion is treated with respect, given how easily it can be defeated by evidence and rudimentary thought, and because of its negative implications for society.

You can nevertheless choose to treat a person who holds such a view with respect, or at least not in such a way that it is immediately apparent to them that you consider them defective in some way, because productive communication is not possible when people are not listening, due to the insults they believe you have subjected them to. Furthermore, keeping the communicative space open in this way will in turn often help you understand why they and others might hold that view. This understanding can then sometimes help you persuade them of the wrongness of that position, or inform future attempts at communicating these ideas (or others, to similar groups of people).

Globalisation and a free press are also addressed by Baggini. Globalisation matters because – similarly to science – exposure to vast differences of opinion between people can lead you to the mistaken conclusion that all knowledge is determined by time and place: surely, some might think, if a proposition were true, it would be more widely accepted as such? But claims being contingent on available knowledge is not the same thing as complete subjectivity. And while properly-conducted science and communication about science would ideally entail that people of all cultures or nations eventually converge on the same answers, the mistaken impulse arising out of globalisation is that all we can possibly have are “culturally relative ‘truths’, none of which is more valid than any other” (ibid).

Finally, and to emphasise the key role that communication plays, Baggini points out that the problem with a free press is that the press is also free to lie, or at least, free to present a distorted view of a situation. If you had the time and attention to spare, you might read various publications carefully enough to realise that it’s often less a matter of outright lying than one of partisan bias or poor journalism, and you could then do the work of assigning each newspaper or website some relative position in an ideological spectrum. But, if you subscribe to the contextually rational view articulated earlier, you might instead just conclude that certain newspapers or broadcasters tell untruths in general. Even worse, you might conclude that no sources can be trusted and that, in Baggini’s words, “suspension of all belief seems preferable to commitment to any truth” (ibid).

The unifying theme of these points is that they all describe positive developments in human history, which have on the whole brought us far more benefits than costs. They will likely continue to do so, and we will hopefully – in time – come to see the present moment as an interruption of a trend of increased global prosperity, justice and knowledge, as the cognitive psychologist and popular science author Steven Pinker argues (2018). But at the same time, the concerns described above warn us that some of the very foundations of knowledge, truth and authority – which we might take for granted in trying to communicate a particular idea – are subject to varying interpretations and responses that we need to be sensitive to if we are to have a productive dialogue with others.

Skepticism versus Hyperskepticism[1]

The complications and confounders related to truth – and to the task of persuading others of that truth – discussed above are also part of what leads to what is known as “hyperskepticism”, which is more akin to an epistemically crippling cynicism than to skepticism. What scientific skepticism means is elegantly captured in Canadian skeptic Daniel Loxton’s essay titled Why is there a Skeptical Movement?, where he defines it as “the practice or project of studying paranormal and pseudoscientific claims through the lens of science and critical scholarship, and then sharing the results with the public” (Loxton 2013, 2).

Put briefly, skepticism should not be conflated with outright cynicism or “hyperskepticism”, as Caleb Lack and I argue in Critical Thinking, Science and Pseudoscience: Why We Can’t Trust Our Brains (2016). Skeptics are open to being wrong, and follow the evidence where it leads them, but they do not automatically distrust authorities or consensus – as far as it is expressed for public consumption – on principle. A skeptic would likely not believe in chemtrails,[2] or in a tabloid’s claim that “Big Pharma” is deliberately not healing you in order to sell more medicine, whereas a hyperskeptic – or more plainly, conspiracy theorist – would be more inclined to think that the “official narrative” is disguising the fact that these things are happening, and that cellphone tower masts are giving you cancer too.

It should be noted that scientific skepticism both permits and encourages asking questions about received wisdom. The thing about asking questions, though, is that those who are genuinely interested in the answers tend to stop doing so once a plausible, evidence-based account is presented. Not so with hyperskeptics (or toddlers). The available scientific consensus usually provides us with sufficiently robust answers, even if those answers will sometimes be challenged by future evidence.

It’s not a failure of skepticism to trust the scientific consensus until good reasons for mistrusting it come to light. But it is a failure of the “scientific” part of scientific skepticism – and therefore an example of hyperskepticism – to pathologically distrust perfectly reasonable answers, especially when no equally plausible (or superior) alternatives to those answers are available. When our reasons for doubting the consensus arise from paranoia, poor research, or “cherry-picking”[3] evidence to support a conspiracy theory, we are engaging in hyperskepticism and sometimes even outright denialism, as Daniel Dennett and other Fellows of the Committee for Skeptical Inquiry (2014) have argued in calling for “climate skeptics” to more accurately be described as “deniers”, in that they traduce the reputation and importance of scientific skepticism when appropriating that label.

While the world waits for an effective Covid-19 vaccine to become widely available [readers from November 2021 and later, please note that this section was written before any Covid-19 vaccine had been rolled out], the existence of hyperskeptics, their possible influence on public understanding of vaccine science, and ways in which we can mitigate possible cynicism about – and outright distrust of – a vaccine all need to be taken into account when communicating on these issues.

Sensational claims and conspiracy theories spread faster and more easily than soberly reported and verified information, thanks to the algorithms of Facebook and Twitter, which is part of the reason that 20% of UK residents said they would refuse a vaccine, and only 49% said they would willingly have it administered (Boseley 2020). In September 2020, half of the respondents in a similar survey in the USA said they would “probably or definitely” not accept a vaccine if one were available at the time of polling (Osborne 2020).

While the reasons for this mistrust of a possible Covid-19 vaccination are not yet clear, they would include at least a contribution from fears provoked by the politicisation of science, as well as conspiracy theories around funding for research, such as the claim that Bill Gates is funding vaccine research in order to put a microchip into the vaccine, so that (most) human movement could be tracked. As absurd as this claim is, 44% of Republican voters in the USA believed it to be true in April 2020, as did 19% of Democrats (Brown and Weise 2020).

As Bill Gates said in a media call on this topic, “It’s almost hard to deny this stuff because it’s so stupid or strange that even to repeat it gives it credibility” (ibid). Nevertheless, science communicators who seek to persuade the public to take such a vaccine need to understand the motivations and fears of those who are primed to believe such narratives, and they need to develop strategies to encourage trust in the credibility of the scientific process, without giving further cause for conspiracy theorising by being secretive or opaque about aspects such as funding.

The lesson for communication strategists is that this is yet another complication that must be anticipated, and addressed as effectively as possible given the audience and what is known about their existing levels of trust in authority and evidence, taking care not to inadvertently trigger the Backfire Effect discussed above.

Democratised expertise and the “Easiness Effect”

The current ubiquity of science denialism, and the perceived decline in the value of recognised experts, have arguably made hyperskepticism seem far more reasonable and attractive than is merited. The media are partly to blame: in an attempt to satisfy their audiences and create the impression of debate, they have allowed “false balance” to become far too prevalent in coverage of scientific controversies.

False balance describes the practice of presenting two sides of a story as equally credible when they are not in fact so, as for example in Andrew Wakefield’s[4] film “Vaxxed: From Cover-up to Catastrophe.” Wakefield was invited to screen the movie at Robert de Niro’s[5] Tribeca Film Festival, before scientific and public outcry forced its withdrawal (Goodman 2016). The film uses images of “vaccine-damaged” children and interviews with their parents to support the idea that vaccines cause autism, even though the most recent Cochrane Review on vaccines, covering more than 15 million children, still failed to find any such link (Demicheli et al. 2012). Furthermore, the film treats Wakefield as an expert, never mentioning that he has been stripped of his qualifications as a medical doctor, and that the paper purporting to demonstrate a causative link between vaccines and autism was retracted by the influential medical journal The Lancet (Dyer 2010).

The invitation and subsequent withdrawal of the film generated significant media attention, as one might expect given the involvement of a prominent actor such as de Niro. However, the concern that scientific skeptics rightly have about media coverage of such controversies is that when figures like Wakefield or de Niro are given a platform to label the outcry a free-speech violation (Smith 2016), and to dismiss the withdrawal as an example of intolerance of scientific dissent, it is in fact the media platform itself that introduces bias, by presenting a discredited viewpoint as worthy of equal attention to an evidence-based view rather than creating balance by highlighting a credible alternative view (hence the name “false balance”).

The introduction of a discredited – or simply unqualified – voice or viewpoint does nothing to create balance: at best, it discredits the understanding of “balance”; at worst, it furthers conspiracy theories, or contributes to public confusion about who and what to believe.

To compound the problem of how to communicate in such a way as to encourage considered debate without creating false balance, the widespread availability of information via the Internet, and the sense of liberty described by Baggini above, have arguably “democratised” the idea of expertise itself. The value of expertise – and of experts themselves – is under constant challenge from anybody with an Internet-connected device, who can easily find “evidence” that you are serving some nefarious agenda, or suppressing dissent, if you decline to create false balance by refusing to include already-debunked views when communicating about controversial issues.

This democratisation of expertise has also resulted in a trend of mistaking the popular voice for an opinion worth taking seriously, via consumer feedback in the form of site visits or clicks on salacious headlines. Christopher Hitchens coined a wonderful phrase in referring to conspiracy theories, dubbing them the “exhaust fumes of democracy” (Hodapp and Von Kannon 2008) – an unavoidable consequence of an uninformed public being subjected to unmanageable amounts of information – and one could say something similar about public understanding of science.

In response to this information overload, occurring in a context of scientific illiteracy, Scharrer et al. (2017) described the public response to scientific research as characterised by an “easiness effect,” where popular representations of scientific controversies make people overconfident in their understanding of the relevant science. Scharrer’s research found that the perceived validity of scientific claims is influenced by the values, interests, and non-scientific knowledge held by laypersons, thus disrupting the essential “division of cognitive labour” (ibid) whereby experts are best placed to tell non-experts what to think regarding complex topics in science.

A clear lesson here is that those of us who host platforms for debate on matters such as this, whether journals, podcasts or shows broadcast on other media, need to understand and embrace our responsibility to ensure credible dialogue and debate, rather than creating spectacle for the sake of audience attention.

Guidelines for communicating science

In light of these obstacles to popular understanding of reasoning and evidence, how might science communicators reinforce belief in the value of truth and expertise? It would be short-sighted to do so by avoiding complexity entirely, or by mimicking the attitudes of charlatans such as Dr. Oz or Gwyneth Paltrow (Rousseau 2015), who seem to have enthusiastically embraced the idea that all that matters is persuasion rather than speaking with integrity, or who are so committed to a particular belief or ideology that any opposition is dismissed as prejudiced, or even sometimes as purchased by a lobby-group or other sinister forces.

A useful starting point for science communication is a reminder to its practitioners, and to the public, that certainty is rarely warranted. Speaking in the language or tone of certainty encourages similar arrogance on the part of others, and also feeds the conspiratorially- or relativistically-minded, when you later discover that you have good reason to change your mind. This attitude of epistemic prudence that I am encouraging here – not making stronger claims than are warranted by the evidence – alongside a certain humility (showing an awareness of the possibility that you might be wrong), are valuable resources for encouraging sober debate on, and hopefully understanding of, scientific controversies in the public sphere.

In light of the issues addressed above, in particular the value of epistemic humility when communicating complex issues to a polarised audience, consider the following suggestions as a diagnostic check on the communicator’s end, which can hopefully assist in effectively communicating policy or theory in such a way as to encourage receptiveness on the part of the audience.

1. Offering explanations rather than reasons

Besides setting an example of sound epistemic habits, as described in the previous section, we might also usefully be reminded of Rozenblit and Keil’s (2002) description of “the illusion of explanatory depth,” later updated for the Internet age by Fisher et al. (2015) to demonstrate how “searching the Internet for explanatory knowledge creates an illusion whereby people mistake access to information for their own personal understanding of the information.”

Humans are generally inclined to believe that we have a robust understanding of how things work (especially things we’re emotionally committed to), whereas in fact our understanding might be superficial, or even when not superficial, nevertheless difficult to convey to a less-informed interlocutor. Now that science is in the public domain, and scientific hypotheses are increasingly being marketed to an uninformed (but often overconfident) public, scientists and scientific communicators need to learn to become effective communicators, rather than only being knowledgeable about the science.

Philip Fernbach and his co-authors (2013) describe an interesting and promising approach that invites communication practitioners to recognise and leverage the illusion of explanatory depth, not only to focus on the quality of explanations, but perhaps also to aid in persuading others that they are wrong (when this is merited): instead of providing reasons, try providing explanations. For example, instead of asserting that universal healthcare is a moral imperative because all humans are equal in worth, and therefore equally entitled to care from the state, one might try approaching the problem from the so-called bottom up, explaining how an envisaged universal healthcare scheme would work – how it would be implemented, what it would cost, who would pay, and who would benefit from it.

Fernbach’s work suggests that this approach stands a better chance of persuading others that you are right, because you have shown your workings, rather than simply asserted a view. While a brashly asserted view might trigger a reflex rejection from an interlocutor with a fixed and contrary position, taking the time to explain, rather than merely assert, your preferred approach invites objections that are similarly thoughtful, rather than reflex or reactionary dismissals.

This approach is also beneficial for communicators themselves, in that the exercise of explaining can reveal to us where and if we are wrong, because sometimes, our workings don’t stand up to scrutiny. Articulating the case through explanation affords us an extra opportunity to detect and correct errors in our reasoning.

Put simply, being able to offer reasons and explanations for your own viewpoint demonstrates the process of holding an evidence-based position to the person you’re trying to persuade (which is more likely to succeed than a simple “you’re wrong”), while also forcing you to examine and defend your own stance, allowing for its refinement or correction.

2. Inoculation theory

Another technique that subscribers to the detached rationality perspective could find to be of value is “inoculation theory,” as presented by van der Linden et al. (2017). Van der Linden and his colleagues explored “how people evaluate and process consensus cues in a polarized information environment,” and explained how, while facts are important, they are often not sufficient to convince others if presented in a context where misinformation is also present. According to their experiments, we need to actively inoculate people against misinformation, through measures such as explaining the fallacies underlying that misinformation.

Their experiment on perceptions related to climate change, for example, used the Oregon Petition (a well-known piece of misinformation, often used to support the notion that there is no consensus on climate change).[6] When the Oregon Petition was presented with introductory text informing participants that “Some politically-motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists,” along with details regarding the specific flaws in the Oregon Petition, participants agreed that there was in fact a consensus on climate change at significantly greater rates than when the misinformation was presented without inoculation.

3. Re-asserting the value of experts

While taking care not to trigger the Backfire Effect or the anti-elitist impulses of some members of the public, we also need to re-assert the value of scientific expertise, and to combat the growing influence of laypersons and seductive celebrities on scientific debates. One can certainly garner media attention through rejecting consensus, or by asserting that the claimed consensus is generated by scientists to whom you ascribe malice (or whom you claim to be in the pockets of corporate paymasters). But a consensus emerging from expert analysis is not the same thing as dogma, misinformation, or a sinister cover-up. These things are indeed antithetical to science. However, when a preponderance of evidence points in a consistent direction, the resulting consensus guidelines would not be suspicious in these or other ways.

Consensus guidelines that emerge out of honest engagement with the evidence, and that are open to correction, are not anti-science – they are instead the product of good science, ideally understood by a broad proportion of the public, thanks to effective communication. The subsequent (hypothetical) overturning of a previous consensus in favour of a new one is also the product of good science. This is because, contrary to the “wisdom of the crowd” perspective, we don’t measure or identify good science via its conclusions – because we don’t know that those conclusions will survive challenges presented by future data – but by method, and by openness to correction in light of further evidence.

Science is certainly dependent on disbelief and skepticism, but skepticism is compatible with a well-justified consensus. It is only on the hyperskeptic view that one is instinctively suspicious of consensus on scientific matters.

4. Promoting scientific literacy, starting with children

While there are ways in which science communication can be improved, it is also possible that a pessimistic conclusion – that the contextual sort of rationality described above has triumphed over the detached view – should be embraced. If so, the focus should shift to scientific education and communication to younger generations, so that they – and those future societies – are better able to cope with uncertainty in the face of information overload. And if this pessimistic conclusion is rejected, the same focus is nevertheless advised, even as it should be complemented by interventions directed at adult audiences.

A promising experiment that Andy Oxman and colleagues ran in Uganda (Nsangi et al. 2017) showed that it’s possible to teach children as young as 10 to differentiate unfounded health claims from plausible ones. When tested on whether they could detect bogus health claims, more than twice as many of the children who had received lesson plans (on medical myths, conflicts of interest, the unreliability of anecdote, and more) achieved a passing score on the test as did children who had no such intervention.

Educating young children in basic scientific reasoning is essential not only because it seems to work, but because it’s far more difficult to undo bad epistemological habits than it is to stop them from being developed in the first place. But there are plentiful resources for adults also, many of them free – SenseAboutScience.org, as one example, has a website for both the US and the UK, and offers consumers lessons in how to understand statistics, peer-review and systematic review. Askforevidence.org can give you guidance on what counts as reliable evidence, and why, across topics as diverse as food additives and lie detectors. And initiatives such as Dr. Ben Goldacre’s Alltrials.net are trying to ensure that the totality of evidence becomes available, rather than the current norm of mostly reporting only on trials that offer positive results, which could mislead readers to an inflated estimation of the value of the particular intervention being assessed.

Conclusions

It is part of the scientific communicator’s job to present and argue for nuance, and to demonstrate – partly through showing a willingness to embrace uncertainty – why others should be persuaded by one conclusion rather than another. We devalue our skeptical currency and credibility by insisting on certainty – and we do the political and rhetorical cause of skepticism harm. This doesn’t mean we can’t take sides, and it also doesn’t entail the sort of false balance that would require one to give a vaccine-denier a seat at the adult table. Instead, there is a palpable need to be more sensitive to the political and communicative effectiveness of how we talk about science, while simultaneously emphasising the value of scientific education, particularly among the youth.

The rise of anti-elitism and the mistaken faith in the wisdom of the crowd cannot be ignored, even if they are misguided. As a matter of strategy, rather than of epistemology, we need to reassert the value of epistemic humility, and to highlight the dangers of dogmatism. This is because the long-term goal of scientific communication is not about a particular issue occupying the attention of social media at any given time, but rather about encouraging a certain mode of thought and approach to both evidence and dissent regarding that evidence, regardless of the nature of the particular case at hand.

Furthermore, it is about separating – and encouraging others to separate where justified – political and emotional concerns from scientific ones. The value of good science and good communication about science relates not only to helping people to reach correct conclusions, but also to fostering an understanding of the manner in which we reach those conclusions.

Being effective in such communication requires paying attention to, and strategically adapting to, the political and psychological drivers of our audience(s), rather than falling for the temptation of consigning a significant proportion of the population to the category of “irrational” while continuing to speak – ineffectually – amongst ourselves.

References

Baggini, J. 2017a. Six Things… that Challenge Truth. https://www.southbankcentre.co.uk/blog/six-things-challenge-truth (accessed 9 November 2019).

Baggini, J. 2017b. A Short History of Truth. London: Quercus.

Boseley, S. 2020. “Coronavirus: Fifth of People Likely to Refuse Covid Vaccine, UK Survey Finds.” The Guardian. https://www.theguardian.com/world/2020/sep/24/a-fifth-of-people-likely-to-refuse-covid-vaccine-uk-survey-finds (accessed 18 November 2020).

Brown, M. and Weise, E. 2020. “Fact check: Bill Gates is Not Planning to Microchip the World Through a COVID-19 Vaccine.” USA Today. https://www.usatoday.com/story/news/factcheck/2020/06/12/fact-check-bill-gates-isnt-planning-implant-microchips-via-vaccines/3171405001/ (accessed 18 November 2020).

Committee for Skeptical Inquiry. 2014. Deniers are Not Skeptics. http://www.csicop.org/news/show/deniers_are_not_skeptics (accessed 9 November 2019).

Craig, M. A., and Richeson, J. A. 2014. “On the Precipice of a ‘Majority-Minority’ America: Perceived Status Threat From the Racial Demographic Shift Affects White Americans’ Political Ideology.” Psychological Science, 25(6), 1189–1197. https://doi.org/10.1177/0956797614527113

Demicheli, V., Rivetti, A., Debalini, M. G., and Di Pietrantonj, C. 2012. “Vaccines for Measles, Mumps and Rubella in Children.” Cochrane Database of Systematic Reviews, Issue 2, Art. No. CD004407. DOI: 10.1002/14651858.CD004407.pub3

Dennett, D. 2013. Intuition Pumps and Other Tools for Thinking. New York, NY: W. W. Norton and Co.

Dyer, C. 2010. “Lancet Retracts Wakefield’s MMR paper.” BMJ 340:c696

Fernbach, P. M., Rogers, T., Fox, C. R., and Sloman, S. A. 2013. “Political Extremism Is Supported by an Illusion of Understanding.” Psychological Science, 24(6), 939–946. Retrieved from http://journals.sagepub.com/doi/abs/10.1177/0956797612464058

Fisher, M., Goddu, M. K., and Keil, F. C. 2015. “Searching for Explanations: How the Internet Inflates Estimates of Internal Knowledge.” Journal of Experimental Psychology. General, 144(3), 674–687. https://doi.org/10.1037/xge0000070

Goodman, Stephanie. 2016. “Robert De Niro Pulls Anti-Vaccine Documentary From Tribeca Film Festival.” The New York Times, March 26. https://www.nytimes.com/2016/03/27/movies/robert-de-niro-pulls-anti-vaccine-documentary-from-tribeca-film-festival.html (accessed 9 November 2019).

Hodapp, Christopher and Alice Von Kannon. 2008. Conspiracy Theories and Secret Societies For Dummies. New Jersey: John Wiley and Sons.

Lack, C. and Rousseau, J. 2016. Critical Thinking, Science and Pseudoscience: Why We Can’t Trust Our Brains. New York: Springer.

Loxton, Daniel. 2013. Why is there a Skeptical Movement? https://www.skeptic.com/downloads/Why-Is-There-a-Skeptical-Movement.pdf (accessed 9 November 2019).

Major, B., Blodorn, A., and Major Blascovich, G. 2016. “The Threat of Increasing Diversity: Why many White Americans Support Trump in the 2016 Presidential Election.” Group Processes and Intergroup Relations. https://doi.org/10.1177/1368430216677304

Nsangi, A., Semakula, D., Oxman, A., Austvoll-Dahlgren, A., Oxman, M., Rosenbaum, S., Morelli, A., Glenton, C., Lewin, S., Kaseje, M., Chalmers, I., Fretheim, A., Ding, Y., and Sewankambo, N. 2017. “Effects of the Informed Health Choices Primary School Intervention on the Ability of Children in Uganda to Assess the Reliability of Claims about Treatment Effects: a Cluster-randomised Controlled Trial.” The Lancet, 390(10092), 374–388. https://doi.org/10.1016/S0140-6736(17)31226-6

Nyhan, B., and Reifler, J. 2010. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior, 32(2), 303–330. https://doi.org/10.1007/s11109-010-9112-2

Osborne, H. 2020. “Anti-Vaxxers Feed Off Democrats’ Skepticism of COVID Vaccine.” Newsweek. https://www.newsweek.com/anti-vaccine-covid-trust-skepticism-democrat-politicization-1535559 (accessed 18 November 2020).

Pariser, E. 2012. The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think. New York: Penguin Random House.

Pinker, S. 2018. Enlightenment Now: The Case for Reason, Science, Humanism, and Progress. New York: Viking.

Public Policy Polling. 2013, April 2. Democrats and Republicans Differ on Conspiracy Theory Beliefs. https://www.publicpolicypolling.com/wp-content/uploads/2017/09/PPP_Release_National_ConspiracyTheories_040213.pdf (accessed 9 November 2019).

Quercus Books. n.d. A Short History of Truth: Consolations for a Post-Fact World. https://www.quercusbooks.co.uk/books/detail.page?isbn=9781786488886 (accessed 9 November 2019).

Rousseau, S. 2015. “The Celebrity Quick-Fix.” Food, Culture and Society: An International Journal of Multidisciplinary Research, 18(2), 265–287.

Rozenblit, L., and Keil, F. 2002. “The Misunderstood Limits of Folk Science: an Illusion of Explanatory Depth.” Cognitive Science, 26(5), 521–562. https://doi.org/10.1207/s15516709cog2605_1

Ryzik, M. 2016, April 1. “Anti-Vaccine Film, Pulled From Tribeca Film Festival, Draws Crowd at Showing.” The New York Times. https://www.nytimes.com/2016/04/02/nyregion/anti-vaccine-film-pulled-from-tribeca-film-festival-draws-crowd-at-showing.html (accessed 9 November 2019).

Scharrer, L., Rupieper, Y., Stadtler, M., and Bromme, R. 2017. “When Science Becomes too Easy: Science Popularization Inclines Laypeople to Underrate their Dependence on Experts.” Public Understanding of Science, 26(8), 1003–1018. Retrieved from http://journals.sagepub.com/doi/abs/10.1177/0963662516680311

Smith, N. 2016, March 21. “Director of Controversial Vaxxed film Calls Tribeca Snub a Free Speech Issue.” The Guardian. https://www.theguardian.com/film/2016/mar/30/vaxxed-andrew-wakefield-tribeca-robert-de-niro-free-speech (accessed 9 November 2019).

Ståhl, T. and van Prooijen, J. 2018. “Epistemic Rationality: Skepticism Toward Unfounded Beliefs Requires Sufficient Cognitive Ability and Motivation to be Rational.” Personality and Individual Differences, 122, 155–163. https://doi.org/10.1016/j.paid.2017.10.026

van der Linden, S., Leiserowitz, A., Rosenthal, S., and Maibach, E. 2017. “Inoculating the Public against Misinformation about Climate Change.” Global Challenges, 1(2), 1600008–n/a. https://doi.org/10.1002/gch2.201600008

Footnotes

[1] The American spelling of “skeptic” rather than the British “sceptic” is used throughout for consistency, given that the citations provided are for texts published primarily for an American audience.

[2] “Chemtrails” are the condensation trails left in the sky by aircraft, which conspiracy theorists believe to consist of chemical agents sprayed into the environment for mind-control or other sinister (and always undisclosed) purposes.

[3] Attending to or citing evidence selectively, in order to support a particular outcome or to discredit alternative conclusions.

[4] A discredited former British doctor, who was struck off the UK medical register for misconduct and fraud, and who is influential in the anti-vaccine movement, with his (retracted) research frequently cited in defence of the claim that vaccines can cause autism.

[5] De Niro has an autistic child, and the discussion of “contextual rationality” above is perhaps salient to understanding his sympathy to Wakefield’s thesis.

[6] “31,487 American scientists have signed this petition, including 9,029 with PhDs”. http://www.petitionproject.org/index.php

By Jacques Rousseau

Jacques Rousseau teaches critical thinking and ethics at the University of Cape Town, South Africa, and is the founder and director of the Free Society Institute, a non-profit organisation promoting secular humanism and scientific reasoning.