Here's paragraph 863 of Chilcot:
Ground truth is vital. Over‑optimistic assessments lead to bad decisions. Senior decision‑makers - Ministers, Chiefs of Staff, senior officials - must have a flow of accurate and frank reporting. A "can do" attitude is laudably ingrained in the UK Armed Forces - a determination to get on with the job, however difficult the circumstances - but this can prevent ground truth from reaching senior ears. At times in Iraq, the bearers of bad tidings were not heard. On several occasions, decision‑makers visiting Iraq...found the situation on the ground to be much worse than had been reported to them.
This looks familiar. It echoes Kenneth Boulding (pdf):
All organizational structures tend to produce false images in the decision-maker, and that the larger and more authoritarian the organization, the better the chance that its top decision-makers will be operating in purely imaginary worlds.
He wrote those words in 1965.
There's a depressing inference here. The mistakes Blair made in Iraq were well known - or at least they should have been. This is not just true of the importance of ground truth. In the day job I list nine errors of judgment described by Chilcot, most of which were familiar in the early 00s. Kahneman and Tversky's classic Judgment under Uncertainty was published way back in 1982, for example, and Richard Thaler's collections of papers, Quasi-Rational Economics and Advances in Behavioral Finance, appeared in the early 90s. Any serious decision-maker in the early 00s should therefore have been well aware of the vast research on cognitive biases.
But as Chilcot documents, the decision to go to war in Iraq seems to have been taken in utter ignorance of this research.
Why? It would be nice to think that the errors were Blair's idiosyncratic ones and that, now we are rid of him, there's no problem.
I fear this is too optimistic. That hope is itself a manifestation of cognitive biases (the optimism bias and the fundamental attribution error) and of the "leader-?????-success" fallacy. The very fact that the country is again embarking upon a risky venture without a plan, unsupported by good evidence and fuelled by wishful thinking, suggests we've learned nothing from Iraq. And when we have a semi-credible candidate for Number Ten wanting to "banish the pessimists", it seems we are still plagued by the mindless optimism that led us into Iraq.
Instead, I fear there's a deeper problem here. We should ask: does politics select for rationality by weeding out cognitive biases, or not?* I fear not. Politicians are selected for overconfidence, and the narrow class background and lack of cognitive diversity of politicians and journalists can promote groupthink.
For me, Chilcot thus poses a systemic question: how can we ensure that political structures favour rational decision-making?
This question will, of course, be ignored. For some, the mere fact that the report discredits Blair is sufficient. And others, of course, have no desire to let politics be tainted by even the faintest whiff of rationality: as Gove said, people "have had enough of experts". I fear that British politics is like English football: people love talking about it, but they hate thinking about it.
* There's an analogy with markets here. One argument for markets is that well-functioning ones select against stupid businesses or strategies. Sadly, though, this is only sometimes the case.