It's a commonplace that politics is becoming more polarized and tribal: think of Remainers vs Brexiters or Trumpites vs Clintonites. This isn't simply because people have retreated into intellectual trenches in which they associate only with the like-minded and get their news only from sources that echo their prejudices. It's also because of another mechanism which is variously called attitude polarization, the backfire effect or asymmetric Bayesianism.
This is the process whereby people confronted with evidence that disconfirms their beliefs do not, as Bayesianism predicts, revise those beliefs towards the evidence, but instead become more dogmatic. The earliest evidence on this came from a 1979 paper (pdf) by Charles Lord, Lee Ross and Mark Lepper. They showed people mixed reports on the effects of the death penalty. They found that, after reading these reports, supporters of capital punishment became stronger in their support, whilst opponents of the death penalty also became more dogmatic. This arises from a form of confirmation bias: we look favourably upon evidence that confirms our position but sceptically upon disconfirmatory evidence.
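The contrast can be made concrete with a toy model (my own illustration, not anything from the Lord, Ross and Lepper paper): a symmetric Bayesian weighs confirming and disconfirming evidence by the same rule, so mixed evidence leaves her belief roughly where it started; an asymmetric updater discounts the force of disconfirming evidence, so the same mixed evidence makes him *more* confident. The `discount` parameter and the updating functions here are hypothetical, chosen purely for illustration.

```python
def bayes_update(prior, l_true, l_false):
    """Standard Bayes' rule for a binary hypothesis:
    l_true/l_false are the likelihoods of the evidence
    given that the hypothesis is true/false."""
    numerator = prior * l_true
    return numerator / (numerator + (1 - prior) * l_false)

def asymmetric_update(prior, l_true, l_false, discount=0.5):
    """An asymmetric Bayesian: evidence that cuts against the
    current belief has its likelihood ratio shrunk towards 1,
    i.e. treated as weaker than it really is."""
    ratio = l_true / l_false
    disconfirming = (ratio < 1) if prior >= 0.5 else (ratio > 1)
    if disconfirming:
        ratio = ratio ** discount  # sceptical scrutiny weakens the evidence
    posterior_odds = (prior / (1 - prior)) * ratio
    return posterior_odds / (1 + posterior_odds)

# Mixed evidence, as in the death-penalty study: one report for
# (likelihood ratio 2), one against (likelihood ratio 1/2).
p_bayes = p_asym = 0.8  # both start out believing the hypothesis
for l_t, l_f in [(0.6, 0.3), (0.3, 0.6)]:
    p_bayes = bayes_update(p_bayes, l_t, l_f)
    p_asym = asymmetric_update(p_asym, l_t, l_f)

print(round(p_bayes, 3))  # back where it started: mixed evidence cancels
print(round(p_asym, 3))   # above the prior: belief has strengthened
```

For the symmetric Bayesian the two reports cancel exactly; for the asymmetric updater the congenial report counts in full while the uncongenial one is discounted, so mixed evidence polarizes.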
Reasonable people lament all this.
But here's my problem: I find it damnably hard not to be an asymmetric Bayesian. One reason for this is that having learned about cognitive biases I now see them everywhere. Yes, I know - that's the confirmation bias. But of course, it's easier to see them in others than in oneself: physician, heal thyself.
But my opponents don't help me. Many who favour tough immigration controls, for example, look like hysterical nativists who don't even have the wit to recycle Rawlsian law of peoples-type arguments. And many aren't even trying to convince me; perhaps most political discourse is preaching to the choir rather than an attempt to persuade opponents. (The left are probably as guilty as the right here.)
So, how can I resist the pressure to be an asymmetric Bayesian? Here are four things I try:
- Construct one's own argument for something one doesn't believe in. I tried this here, here and here.
- Look for weaknesses in one's own theory. For example, in yesterday's post I didn't establish that people in power actually are racist, but merely expressed scepticism about the validity of other explanations. That wasn't good enough. (In fairness, though, proof on this matter is perhaps unobtainable.)
- Look for disconfirming evidence. To take yesterday's post again, I might have pointed out that Indian men earn more than similarly qualified white men. This might be evidence against employer racism. (But is it proof? It's possible both that Indians sort into high-paying occupations and that they suffer racism which causes their wage premium over whites to be lower than it otherwise would.)
- Look for intelligent arguments against one's case. The fact that most of one's opponents are silly and hysterical doesn't mean they all are. For example, I especially valued this post by Ben Cobley, and welcome the writing of Bryan Caplan and his colleagues or Andrew Lilico.
Of course, in saying all this I don't claim to succeed in resisting the pressures of asymmetric Bayesianism. No doubt I fail.
But here's the snag: my efforts to do so arise solely from intrinsic motivation. External incentives actually encourage asymmetric Bayesianism. The MSM select for overconfident blowhards; Toby Young, remember, actually earns a living. And most politically-engaged people on left and right would rather associate with a fellow partisan than with a pointy-headed sceptic. When Michael Gove said that the country has had enough of experts, he was in part expressing a truth.