SSHRC Insight Grant: Epistemic Courage, Oppression, and the Duty to Believe
In 2023 I was awarded a SSHRC Insight Grant for a research project on Epistemic Courage, Oppression, and the Duty to Believe, running from 2023 to 2027.
Here are some project-related thoughts and outputs (some of which predate the project). Below, I reproduce the summary of the project I gave in my grant proposal.
Epistemic Courage, a monograph with Oxford University Press
Normative Inference Tickets, a co-authored journal article with Jen Foster
You Ought To Have Known: Positive Epistemic Norms in a Knowledge-First Framework
Summary of the Project
The objective of my project is to explain and defend positive epistemic norms and their theoretical and political significance. I aim to correct a negative bias: many intellectuals think a lot about avoiding bad beliefs, but pay too little attention to the opposite mistake: remaining open-minded when there is enough evidence to settle the question.
Epistemology is about how to manage our beliefs: how to believe the things we should, and avoid believing the things we shouldn't. Few questions are more urgent or more salient. We live in a world of ideologically driven misinformation. From vaccine hesitancy, to conspiracy theories, to Donald Trump's "big lie", it has become increasingly obvious to many people that epistemological mistakes — errors about what to believe — are a major source of social strife.
Epistemology prescribes norms about when to believe. Negative epistemic norms tell us what not to believe; positive epistemic norms tell us what we should believe. I'll argue that there is too much focus on the former kind of norm, at the expense of the latter. We worry a lot about the mistake of believing things we shouldn't believe — both in ourselves and, especially, in our assessments of others. This bias in emphasis is widespread both in academic philosophy and in general nonacademic ideas about rationality. Epistemologists talk a lot about not believing beyond the evidence. "Don't believe something if you don't know it." "Be careful when you form beliefs, especially if the stakes are high." These negative epistemic norms echo popular nonacademic conceptions of care, thoughtfulness, and rationality, e.g. worries about "jumping to conclusions" or embracing "conspiracy theories" and "misinformation".
Negative epistemic norms have an important role to play. But the near-exclusive focus on them is, this project argues, a grave mistake. It leaves out half the picture when it comes to the question of what to believe. We need epistemology to help us decide not merely when we shouldn't believe, but also when we should. As I'll explain in the detailed description, I'll argue that the traditional epistemic emphasis on the negative is both theoretically incomplete and politically harmful.
My project emphasizes cases where we must believe, and where it would be a major mistake not to do so. Not believing when you should can be a form of epistemic cowardice. Virtuous epistemic agents exhibit epistemic courage, which involves believing what they should. This is part of deciding well what to believe, and it is also socially, morally, and politically important. I develop the theoretical and practical importance of epistemic courage, drawing on both academic philosophy and anti-oppressive activism. This project restores epistemology's missing half, explaining positive epistemic norms and their importance, both for philosophical theorizing and for working towards a more just world.