I mentioned to someone that I thought people often mistake signs for proof, when signs aren’t even evidence. And that person asked for clarification, so here is the clarification.
What I was trying to say is that some people support their point via signs rather than evidence. I’ve often made the mistake of thinking that the people who appeal to signs rather than evidence misunderstand how evidence is supposed to work, but I eventually figured out that they don’t care about evidence. They care about signs. Explaining that point means going back over some ground.
A lot of people are concerned about our polarized society, identifying as the problem the animosity that “both sides” feel toward each other, and so the solution seems to be some version of civility—norms of decorum that emphasize tone and feeling. I have to point out that falling for the fallacy of “both sides” is itself part of the problem, so this way of thinking about our situation makes it worse. The tendency to reduce the complicated range of policy affiliations, ideologies, ways of thinking, ways of arguing, depth of commitment, openness to new ideas, and so many other important aspects of our involvement with our polis to a binary or continuum fuels demagoguery. It shifts the stasis from arguing about our policy options to the question of which group is the good one. That is the wrong question, and it can only be answered by authoritarianism.
I think we also disagree about ontology. I’ve come to think that a lot of people believe that the world is basically a stable place, made up of stable categories of people and things (Right Answer v. Wrong Answers, Us v. Them). It isn’t just that the Right Answer is out there that we might be able to find; it’s that there is one Right Answer about everything, and it is right here–the Right People have it or can get it easily. We just need to listen to what the Right People tell us to do. I want to emphasize that these stable categories apply to everything—physics, ethics, religion, politics, aesthetics, how you put the toilet paper on the roll or make chili, time management, child-raising….
There are many consequences of imagining the world is a place of fake disagreement in which there is one Right Answer that we are kept from enacting, and I want to emphasize two of them. First, in this world, there is no such thing as legitimate disagreement about anything. If two people disagree, one of them is wrong, and needs to STFU. Second, the goal of thinking is to get one’s brain aligned with the categories that are in the structure of the world (to see the Right Answer), and people who think about the world this way generally believe there is some way to do that. In my experience, people who believe the world presents us with problems that have obvious solutions are naïve realists of some kind, but it’s important to recognize that there are various kinds of naïve realism (with much overlap).
There are naïve realists all over the political spectrum. That doesn’t mean I’m saying all groups are equally bad–that’s an answer to the question we shouldn’t waste our time asking [which group is the good one]. Instead of arguing about which group is good, we should be arguing about which way of arguing is better. I don’t think that there is some necessary connection between political ideology and epistemology—there are very few relativists (it’s hard to say that it’s wrong to judge other beliefs without making all the nearby cats laugh), but realists of various stripes I’ve read or argued with have self-identified as anarchist, apolitical, conservative, fascist, leftist, Leninist, liberal, Libertarian, Maoist, Nazi or neo-Nazi (aka, Nazi), neoconservative, neoliberal, objectivist, progressive, reactionary, socialist, and I’ve lost interest in continuing this list.[1] I’ve also argued with people from those various positions who are not realists (which is a weird moment when I’m arguing with objectivists), and it’s often the people who insist on the binary of realist v. relativist who actually appeal to various forms of social constructivism (Matthew McManus makes this point quite neatly).[2]
I’ve talked a lot about naïve realism in various writings, but I’ve relatively recently come to realize that there are a lot of kinds of naïve realism, and there are important differences among them. They aren’t discrete categories, in that there is some overlap as mentioned above, but you can point to differences (there are shades of purple that become arguably red, but also ones that are very much not red–naïve realism is like that). For instance, some people believe that the Truth is obvious, and everyone really knows what’s true, but some people are being deliberately lazy or dumb. These people believe you can simply see the Truth by asking yourself if what you’re seeing is true. I’ve tended to focus on that kind of naïve realism, and that was a mistake on my part because not all naïve realists think that way.
Many kinds of naïve realists believe that the Truth isn’t always immediately obvious to everyone, because it is sometimes mediated by a malevolent force: political correctness, ideology, Satan, chemtrails, corrupt self-interest, unclean engrams, or the various other things to which people attribute the inability of Others to see the obvious Truth.[3] These people still believe it’s straightforward to get to the Truth. It might be through sheer will (just willing yourself to see what’s true), some method (prayer, econometrics, reading entrails, obeying some authority), being part of the elect, identifying a person who has unmediated access to the Truth and giving them all your support, or through paying attention to signs, and that last one is the group I want to talk about in this post.
Belief in signs is still naïve realism—the Truth (who/what is Right and who/what is Wrong) can be perceived in an unmediated way, but not always; the Truth is often obscured, but also often directly accessible. These people believe that there are malevolent forces that have put a veil over the Truth, but that the Truth is strong enough that it sometimes breaks through. The Truth leaves signs.
It is extremely confusing to argue with these people because they’ll claim that one study is “proof” of their position (they generally use the word “proof” rather than evidence, and that’s interesting), while openly admitting that the one study they’re citing is a debunked outlier. They’ll use a kind of data or argument that they would never accept as valid in other circumstances—that some authors say there is systemic racism is a sign that those authors are Marxists, since that’s also what Marxists say. But, that the GOP says that capitalism tends toward monopoly doesn’t mean the GOP is Marxist, although that’s also what Marxists say. That one Black man, scientist, “liberal,” or expert says something is proof that it’s true, but that another Black man, scientist, and so on says it isn’t true doesn’t matter. That a hundred Black men, scientists, and so on say it’s wrong doesn’t matter. Or, what I eventually realized is that it does sort of matter—it’s further proof that the outlier claim is True. That the knowledge is stigmatized is proof that it is not part of the cloud that malevolent forces place over the Truth—it’s one of the moments of Truth shining through. If you’ve argued with people like this, then you know that pointing out that a photo, quote, or study appears nowhere outside their in-group doesn’t suggest to them that there are problems with that datum; on the contrary, they take it as a sign that it’s proof.
Because they believe that the Truth shines through a cloud of darkness, or leaves clues scattered in the midst of obscurity, they prefer autodidacts to experts, an unsourced but heavily shared photo to a nuanced explanation, polymaths or people whose expertise is irrelevant to the question at hand to specialists, and people who speak with conviction and broad assertion to people who talk in terms of probabilities.
Fields that use evidence, such as law, spend quite a bit of time thinking about the relative validity of kinds of evidence. Standards of good evidence are supposed to be content-free, so that there are standards of expertise that are applied across disciplines. We can argue about the relative strength of evidence, and whether it’s a kind of evidence we would think valid if it proved us wrong, but neither of those conversations has any point for someone who believes in signs rather than evidence. They’ll just keep repeating that there are signs that prove their point.
People who believe in degrees and kinds of evidence are likely to value cross-cutting research methods, disagreement, and diversity. People who believe that the Truth is generally hidden but shines out in signs at moments are prone, it seems to me, to see cross-cutting research methods and diversity as a waste of time, if not actively dangerous. They don’t see a problem with getting all their information from sources that confirm their beliefs; they think that’s what they should do. Yes, it’s one-sided, they’ll say—the side of Truth.
It’s because of that deep divide about perception that I often say that we have a polarized public not because we lack civility, as though we just need to be nicer in our disagreements, but because we disagree about the nature of disagreement.
[1] Yes, I’ll argue with a parking brake, if it seems like an interesting one.
[2] I really object to the term “populist,” since it implies that the “elite” never engage in this way of thinking. That’s a different post.
[3] As an aside, I’ll mention that these people often believe that you either believe that there is a Truth, and good people perceive it with little or no difficulty, or you believe that all beliefs are equally valid (a belief that pro-GOP media attribute, bizarrely enough, to “Marxism”—Marxists hate relativism). Acknowledging uncertainty doesn’t make one a relativist, let alone a Marxist. If it does, then Paul was both a relativist and a Marxist. He did, after all, say that “we see as through a glass darkly.” If you’d like to argue that Paul was a relativist and Marxist, I’m happy to listen.