From the inside, via introspection, each of us feels that our beliefs are pretty damn sensible. Sure we might harbor a bit of doubt here and there. But for the most part, we imagine we have a firm grip on reality; we don't lie awake at night fearing that we're massively deluded. But when we consider the beliefs of other people? It's an epistemic shit show out there. Astrology, conspiracies, the healing power of crystals. Aliens who abduct Earthlings and build pyramids. That vaccines cause autism or that Obama is a crypto-Muslim — or that the world was formed some 6,000 years ago, replete with fossils made to look millions of years old. How could anyone believe this stuff?! No, seriously: how?
The people who believe these things aren't fundamentally different from the rest of us; their brains work the same way ours do. So whatever processes beget their delusions are at work in our minds as well. We therefore owe it to ourselves to try to reconcile the inside and outside views. Because let's not flatter ourselves: we believe crazy things too. We just have a hard time seeing them as crazy.
By way of analogy, let's consider how beliefs in the brain are like employees at a company; call it Acme. This isn't a perfect analogy, but it'll get us 70% of the way there.
We can think of beliefs as ideas that have been "hired" by the brain. And we hire them because they have a "job" to do, which is to provide accurate information about the world.
The closer our beliefs hew to reality, the better actions we'll be able to take, leading ultimately to survival and reproductive success. That's our "bottom line," and that's what determines whether our beliefs are serving us well. If a belief performs poorly — by inaccurately modeling the world, say, and thereby leading us astray — then it needs to be let go.
I contend that the best way to understand all the crazy beliefs out there — aliens, conspiracies, and all the rest — is to analyze them as crony beliefs: beliefs that have been "hired" not for the legitimate purpose of accurately modeling the world, but rather for social and political kickbacks. They're the mental equivalent of Robert, the mayor's nephew, whom Acme keeps on the payroll not for the work he does, but for the goodwill his employment buys with the mayor.
As Steven Pinker says, "People are embraced or condemned according to their beliefs, so one function of the mind may be to hold beliefs that bring the belief-holder the greatest number of allies, protectors, or disciples, rather than beliefs that are most likely to be true."
The human brain has to strike an awkward balance between two different reward systems: Meritocracy, where we monitor beliefs for accuracy out of fear that we'll stumble by acting on a false belief; and Cronyism, where we don't care about accuracy so much as whether our beliefs make the right impressions on others.
And so we can roughly (with caveats we'll discuss in a moment) divide our beliefs into merit beliefs and crony beliefs. Both contribute to our bottom line — survival and reproduction — but they do so in different ways: merit beliefs by helping us navigate the world, crony beliefs by helping us look good.
Even mild incentives, however, can still exert pressure on our beliefs. Russ Roberts tells the story of a colleague who, at a picnic, started arguing for an unpopular political opinion — that minimum wage laws can cause harm — whereupon there was a "frost in the air" as his fellow picnickers "edged away from him on the blanket." If this happens once or twice, it's easy enough to shrug off. But when it happens again and again, especially among people whose opinions we care about, sooner or later we'll second-guess our beliefs and be tempted to revise them.
But it can also be helpful to take a different perspective, one in which our brains actively adopt crony beliefs in order to strategically influence other people. In other words, we use crony beliefs to posture.
There's a whole range of agendas we can accomplish with our beliefs.
As Homo sapiens, we make mistakes that are stubborn, systematic, and (in some cases) exaggerated by runaway social feedback loops. And this, I claim, is because our lives are teeming with other people. The trouble with people is that they have partial visibility into our minds, and they sometimes reward us for believing falsehoods or punish us for believing the truth. This is why we're tempted to participate in epistemic corruption — to think in bad faith.
First, it's important to remember that merit beliefs aren't necessarily true, nor are crony beliefs necessarily false. What distinguishes the two concepts is how we're rewarded for them: via effective actions or via social impressions. The best we can say is that merit beliefs are more likely to be true.
A given belief can serve both pragmatic and social purposes at the same time — just as Robert could, in principle, be a productive employee even while he's the mayor's nephew.
Something in our brains has to be aware — dimly, at least — of which beliefs are cronies, or else we wouldn't be able to give them the coddling that they need to survive inside an otherwise meritocratic system. (If literally no one at Acme knew that Robert was a crony employee, he'd quickly be fired.) The trick, then, is to look for differences in how merit beliefs and crony beliefs are treated by the brain.
We should expect merit beliefs to be treated with level-headed pragmatism. They have only one job to do — model the world — and when they do it poorly, we suffer. This naturally leads to such attitudes as a fear of being wrong and even an eagerness to be criticized and corrected. As Karl Popper and (more recently) David Deutsch have argued, knowledge can't exist without criticism. If we want to be right in the long run, we have to accept that we'll often be wrong in the short run, and be willing to do what that demands: discard the beliefs that aren't working. This may sound psychologically difficult, even vaguely heroic, but it's not. A meritocracy experiences no anguish in letting go of a misbelief and adopting a better one, even its opposite. In fact, it's a pleasure. If I believe that my daughter's soccer game starts at 6pm, but my neighbor informs me that it starts at 5pm, I won't begrudge him the correction — I'll be downright grateful.
Crony beliefs, on the other hand, get an entirely different treatment. Since we mostly don't care whether they're making accurate predictions, we have little need to seek out criticism for them. (Why would Acme bother monitoring Robert's performance if they never intend to fire him?) Going further, crony beliefs actually need to be protected from criticism. It's not that they're necessarily false, just that they're more likely to be false — but either way, they're unlikely to withstand serious criticism. Thus we should expect our brains to take an overall protective or defensive stance toward our crony beliefs.
Crony beliefs, then, will tend to show a number of telltale features relative to good-faith merit beliefs.
But perhaps the biggest hallmark of epistemic cronyism is exhibiting strong emotions, as when we feel proud of a belief, anguish over changing our minds, or anger at being challenged or criticized. These emotions have no business being within 1000ft of a meritocratic belief system — but of course they make perfect sense as part of a crony belief system, where cronies need special protection in order to survive the natural pressures of a meritocracy.
The better — but much more difficult — solution is to attack epistemic cronyism at the root, i.e., in the way others judge us for our beliefs. If we could arrange for our peers to judge us solely for the accuracy of our beliefs, then we'd have no incentive to believe anything but the truth. In other words, we do need to teach rationality and critical thinking skills — not just to ourselves, but to everyone at once.