More information does not mean that we make better decisions, or even good ones.
Some laboratory evidence for this comes from experiments (pdf) at Princeton. Alexander Todorov and colleagues asked subjects to predict the results of basketball games. Half of them were told the teams' season records and the half-time score in the match in question. The other half were also told the names of the teams. Those who were told the names made worse predictions, but with more confidence, than those who weren't. "More knowledge can decrease accuracy and simultaneously increase prediction confidence", they concluded.
A couple of examples from my day job corroborate this. Economists at the University of Mannheim show that even sophisticated investors buy expensive but poorly performing actively managed funds. And economists at the University of Maryland show that when informed investors are given advice from a portfolio optimisation tool, they trade more often without improving their investment performance.
You might think these examples are small beer. But a similar thing might have lain behind some of the most catastrophic decisions of recent years. If the government had had zero information about Iraq in 2003 rather than the military intelligence it actually had, it might not have gone to war. And if banks had known nothing, they might have held fewer credit derivatives in 2007 than they were led to own by AAA credit ratings and by risk management systems which told them that big losses would be 25 standard deviation events.
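To see how implausible that claim was, here is a back-of-the-envelope sketch in Python, assuming (as those risk models did) that returns are normally distributed:

```python
import math

# Probability of a daily loss at least 25 standard deviations below the
# mean, under the normality assumption baked into the banks' risk models.
sigma = 25
tail_prob = math.erfc(sigma / math.sqrt(2)) / 2  # one-sided Gaussian tail

print(f"P(loss >= {sigma} sigma) = {tail_prob:.2e}")  # roughly 3e-138
# At ~252 trading days a year, that is about one such day per 10**135
# years. If you actually observe one, the sensible inference is that
# the model is wrong, not that you were spectacularly unlucky.
```

The point is not the precise number but its order of magnitude: a risk system quoting 25-sigma losses is reporting the failure of its own assumptions, and treating its output as information was worse than having none.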
There are at least two mechanisms whereby more information leads to worse decisions.
One is that it encourages overconfidence, thereby emboldening people to do things they wouldn't otherwise do, or to place bigger bets than they otherwise would. They fail to see that even good knowledge still leaves a lot of unavoidable uncertainty about the future.
A second is that information distracts people from using simple but effective rules of thumb of the sort described (pdf) by Gerd Gigerenzer. Simple rules such as "invest in passive funds" or "don't start wars" work reasonably well: they at least protect us from terrible errors. Knowledge - and especially the illusion of knowledge - can lead us to think we can do better than this, even if we cannot.
All this might seem abstruse. But in fact it bears upon two of this week's big issues.
One is the idea of ranking universities by graduates' earnings. Some people responded to my scepticism about this exercise by saying that if the rankings are prepared carefully they'll be better than nothing. Weak - and necessarily out-of-date - information might, however, distract students from using effective heuristics such as "go to the university with the best reputation you can, given your grades".
The other issue is the Skripal affair. People on all sides seem wholly confident about what happened and what the UK should do on the basis of information that seems to me to be scant at best. The heuristic "if you don't know, shut up" goes unheeded. But then, perhaps in this case the evidence is not the point.