Earlier this year, I had the opportunity to present a paper at The Amaz!ng Meeting, held in Las Vegas. Here’s the YouTube video of my presentation, with the text pasted below that.
It addresses concerns I have regarding epistemic humility and prudence – basically, remembering that we’re often wrong, no matter how smart we might think we are, and also that the claims we make need to be offered with a level of conviction that is appropriate to the evidence at hand.
The virtues of (epistemic) agnosticism, and the responsible believer
Jacques Rousseau, Free Society Institute.
One of the important lessons that the skeptical community can teach others is that things are often uncertain. We might have very good reason to believe something, yet not feel entitled to claim that we are sure of it. This attitude of epistemic prudence – not making claims that aren’t warranted by the evidence – and a certain humility, the ability to accommodate the possibility that you might be wrong, are essential resources for triangulating on the truth.
The idea of epistemic prudence is worth dwelling on for a moment: one striking difference between skeptical thinkers and non-skeptical thinkers is their attitude towards certainty. I’d suggest that a skeptical thinker would be far more likely to recognise that what we typically call ‘true’ is merely the best justified conclusion given the evidence available to us – we are rarely making the claim that it is, in fact, certain to be true. Justification, or warrant, is our proxy for truth, and our strategy for triangulating on the truth.
By contrast, the tone of much popular discourse, including coverage of important fields within science in newspapers and on social media, proceeds as if things can be known, for certain. This leads to absurd contestations where things are “proved” and then “disproved” with each new bestseller, and where apparent “authorities” rise and are then quickly forgotten as our attention shifts to the next sensation.
The diet wars are a current and fitting example of this, where moderation is drowned out by inflated claims that Paleo is the only way to go, or that sugar is not just something to be careful of, but something that is “addictive”, even “the new tobacco”.
The temperature of these debates might sometimes be far more comfortable, and the outcomes more fruitful, if proponents adopted a more considered tone and resisted claiming the final word. After all, knowledge is contingent on what we can possibly know at any given time, and the chances that you’ve got things exactly right at some particular point in time are therefore often small.
Then, epistemic humility – being willing to recognise that you might be wrong – is an essential component of holding a robust set of beliefs. Smugness or overconfidence regarding the set of ideas that you regard as true might sometimes be justified – leaving aside the issue of how attractive or politically effective smugness might be – but it can also be a sign that your belief could have ossified.
Your conviction can, in these instances, become something closer to an item of faith, rather than a belief held responsibly – meaning one that you still regard as potentially falsifiable (even as you think it unlikely that it would ever be falsified).
And it’s exactly because we skeptics tend to be relatively virtuous in terms of these two attributes – humility and prudence – that we need to remind ourselves to perform a diagnostic check every now and then, and perhaps especially in situations where our point of view is being challenged by others.
Being better at avoiding confirmation bias and other common ways of getting things wrong doesn’t make us immune to those mistakes. In fact, our confidence in getting things right might be a particularly problematic blind spot, because if we think we’ve learnt these lessons already, we might falsely believe that there’s little need to keep reminding ourselves of various mistakes we might still be making.
So in light of the fact that the world is complex, and that none of us can be a specialist in everything, should it not strike us as odd how seldom we hear anyone say something like “I simply don’t know enough about that issue to have a position on it” – instead of taking a view, and then eagerly defending that view?
Jonathan Haidt’s account of how moral reasoning works is a useful analogy for many claims regarding contested positions. Where we find ourselves committed to a view and stubbornly defending it, rather than letting our view develop in light of the evidence, we should recognise that the psychology of belief and the politics of debate mean that it’s often the case that “the emotional dog wags the rational tail”.
In other words, we emotionally commit ourselves once we take a position, and then make rational-sounding noises to justify it, rather than being able to admit that we’re not sufficiently informed to take a position on this matter, or that our position is far more tentative than we’d like you to believe.
Too many of us seem to despise doubt or uncertainty, even when that’s the position best supported by the evidence we have access to. We like to have strong opinions, and with the rise of social media – where robust and hyper-confident expression gets the most attention – the space for being uncertain, and for expressing that uncertainty, closes off just that little bit more.
To add to the difficulty of entertaining and encouraging considered debate, the widespread availability of information via the Internet has arguably “democratised” expertise itself. The idea of authority, and authorities themselves, are under constant challenge from everybody with an Internet-connected device. In other words, from everybody.
While it’s of course true that we shouldn’t accept the testimony of authorities in an uncritical way, we nevertheless need to accept that expertise and privileged access to information are real things that can’t easily be replaced by Googling. Sometimes – most of the time, in fact – someone will know more than you, and you could quite possibly be wrong.
What the death of authority means is that no matter what your point of view, you can find communities that share it, and that reinforce your belief while you reinforce theirs, with all of you walking away believing that you are the authorities and everyone else bafflingly obtuse.
Eli Pariser’s concept of the “filter bubble” articulates this point well – if you’re looking for evidence of Bigfoot on a cryptozoology website, you’ll find it. Chances are you’ll end up believing in the Loch Ness monster too, simply because the community creates a self-supporting web of “evidence”. When these tendencies are expressed in the form of conspiracy theories, the situation becomes even more absurd, in that being unable to prove your theory to the doubters is taken as evidence that the theory is true – the mainstream folk are simply hiding the evidence that would embarrass or expose them.
Combine the filter bubble and the democratisation of expertise with the nonsense of a blanket “respect for the opinions of others”, and we quickly end up drinking too deeply from the well of postmodernism, where truth takes a back seat to sensation, or where simply being heard takes too much effort, and we withdraw from debate.
Despite these complications, we can all develop – as well as teach – resources for separating unjustified claims from justified ones, and for being more responsible believers. By “responsible believers”, I mean both taking responsibility for our beliefs and their implications, and holding beliefs responsibly – in other words, forming them as carefully as possible, and changing our minds when it’s appropriate to do so.
Peter Boghossian’s superb 2013 book, “A Manual for Creating Atheists”, introduces the concept of “street epistemology” – simple but effective rhetorical and logical maneuvers that we can deploy in everyday situations. In a similar vein, I’d like to articulate a few concepts that can serve as resources for making it more likely (as there are no guarantees available) that we end up holding justified beliefs in a responsible fashion.
THE POLITICS OF KNOWLEDGE
Many of our blind spots and failures in argument involve the politics of the situation, rather than errors in cognition. What I mean by that is that even though we might be quite aware of how our thinking can be flawed when we consider these things abstractly, we forget what we know about good reasoning in the heat of “battle”, especially when engaging in areas of frequent contestation.
The sorts of contestations I’m imagining are atheists debating theists, or scientific naturalists versus Deepak Chopra, debates around gender and sex, or the extent to which atheism, skepticism, humanism and the like are supposed to intersect. We can become so obsessed with being right, and being acknowledged as right, that we forget about being persuasive, and about what it takes to get people to listen, rather than leap to judgement.
Debates occur in a context. Your opponent is rarely stupid, or irredeemably deluded – they more often simply have different motivations to yours, as well as access to a different data set (regardless of its quality relative to yours). So, to paraphrase Dan Dennett, we might usefully be reminded of the importance of applying Rapoport’s Rules when in argument.
If you haven’t encountered Rapoport’s Rules before, they invite us to do the following:
- Attempt to re-express your target’s position so clearly, vividly and fairly that your target says: “Thanks, I wish I’d thought of putting it that way.”
- List any points of agreement (especially if they are not matters of general or widespread agreement).
- Mention anything you have learned from your target.
- Only then are you permitted to say so much as a word of rebuttal or criticism.
One immediate effect of following these rules is that your targets will be a more receptive audience for your criticism than they would otherwise have been.
EXPLANATIONS & REASONS
Then, we could bear in mind what Leonid Rozenblit and Frank Keil from Yale University dubbed “the illusion of explanatory depth”. We’re inclined to believe that we have a robust understanding of how things work (especially things we’re emotionally committed to), whereas in fact our understanding might be superficial, or at least difficult to convey to a less-informed interlocutor.
Those of us who teach professionally, as I do, know this phenomenon well. You might launch into an exposition on a topic you think you know well, but then quickly realize that you don’t quite have the words or concepts to explain what you’re trying to explain – even though it seemed crystal clear to you when planning your lesson.
Philip Fernbach, of the University of Colorado, wrote up an interesting 2013 experiment that invites us to recognize and leverage this illusion, not only to make ourselves focus on the quality of our explanations, but perhaps also to help us persuade others that they are wrong: instead of providing reasons, try providing explanations.
For example, instead of asserting that we need universal healthcare because everyone is morally equal in this respect, and therefore equally entitled to care from the state, try explaining how your envisaged universal healthcare scheme would work – how it would be implemented, what it would cost, who would pay, and who would benefit.
This approach stands a better chance of persuading others that you are right, because you have “shown your workings”, rather than asserted your view. It also – crucially – stands a better chance of showing you where (and if) you are wrong, because sometimes those workings don’t stand up to scrutiny, and exposing them allows you to spot that.
Another way of putting this is that if you’re offering reasons, it’s likely that you’re mostly trying to demonstrate that you’re right. If you’re explaining, it’s more likely that you understand why you’re probably right, and therefore, that you’d be able to effectively articulate to others why they should be persuaded to subscribe to your point of view.
BACKFIRE EFFECT
Whether you’re explaining or not, keep in mind that we don’t operate in a space of pure reason. We’re often emotionally invested in our beliefs, to the extent that we’re prone to what’s known as the “backfire effect”.
While we like to think that when our beliefs are challenged with strong evidence, we’d be willing to alter our opinions – adjusting our claims in light of the new evidence that has been presented to us – the truth is not always that flattering.
When our deep convictions are challenged, the perceived threat can mean that we dig in our epistemic heels, allowing our existing beliefs to become more entrenched rather than considering the virtue of the challenge that has been put to us.
Consider the possible longer-term implications of this: once you rule one set of considerations out as irrelevant to the truth of your thesis, how much easier might it be to cut yourself some slack on some future set of considerations too – and how easy, therefore, to end up with beliefs that are essentially unfalsifiable, and so no more virtuous than those of an astrologer?
CONCLUSION
In learning about various ideas related to scientific reasoning and how to assess evidence, we shouldn’t forget that we can be the victims of various biases ourselves. Furthermore, congregating as self-declared skeptics shouldn’t be allowed to obscure the fact that we can create our own filter bubbles at events like TAM, and need to guard against that possibility.
To return briefly to where I started, the epistemic prudence I was speaking of might properly lead – more often than we think – to a conclusion that is essentially agnostic. In other words, the most justified position for you to take on a particular issue might be to say something like, “I simply don’t know”, or perhaps to engage in debate mostly for the sake of argument, rather than for the sake of defending a view you’re not fully qualified to hold.
Agnosticism of this sort does not necessarily entail thinking that two perspectives on an issue are equally well justified. The agnostic can believe – and even be strongly convinced – that one side of the argument is superior to the other. Agnosticism of this modest sort simply means that we recognize we are not justified in claiming certainty, or in speaking in ways that presume certainty. Our discourse should acknowledge the limits of what we can know.
This is an important attitude or style to cultivate, because for as long as we are resisting unwarranted confidence or the appearance of it, we’re signaling to others and reminding ourselves that the evidence still matters, and that our minds can still be changed.
I’m emphasizing this idea because our considered views are – or should be – always contingent on the information we have access to. And we are often in a position to confess in advance that this information is inadequate for conviction to be a reasonable attitude. We nevertheless feign conviction in conversation, partly because many of these debates take place on social media platforms that eschew nuance.
But it’s our job to fight for nuance, and to demonstrate, partly through showing that we’re willing to embrace uncertainty, why you should take us seriously when we claim that some conclusion or other is strongly justified. We devalue our skeptical currency, or credibility, when we assert certainty – and we do the political cause of skepticism harm.
To repeat, this doesn’t mean we can’t take sides, and it also doesn’t entail the sort of false balance that would require one to give a creationist a seat at the adult table. Instead, I’m urging that we become more aware of our own fields of expertise, and that when we do step beyond those fields, or beyond our knowledge more generally, we express our views in a qualified way, aware of our limitations.
We might say that while we’re not sure, we think it’s overwhelmingly likely that some position is wrong or right. The point is that avoiding dogmatism and its more diluted manifestations reminds us that it’s possible to change an opinion when new evidence comes to light.
Our worth as skeptics is not vested in conclusions, but in the manner in which we reach conclusions. Skepticism is not about merely being right. Being right – if we are right – is the end product of a process and a method, not an excuse for some sanctimonious hectoring.
Sometimes we need to remind ourselves of what that method looks like, and the steps in that process, to maximize our chances of reaching the correct conclusion. Focusing simply on the conclusions rather than the method can make us forget how often – and how easily – we can get things wrong.
As skeptics, we need to set an example in the domain of critical reasoning, and show others that regardless of authority or knowledge in any given discipline, there are common elements to all arguments, and that everybody can become an expert – or at least substantially more proficient – in how to deploy and critique evidence and arguments.
As humanism can be to ethics – a woo-woo-free inspiration and guide for living a good life – skepticism should be to science, providing resources and examples of how to be a responsible believer, and of the importance of holding yourself responsible for what you believe.
So if we’re spending excessive skeptical energy in self-congratulation for how smart we are compared to some gullible folk out there, rather than in helping them develop the intellectual resources we’re so proud of, perhaps we should consider whether we might be doing it wrong – or at least, whether we could be doing it better.