The A levels U-turn has prompted the question: what went wrong with the algorithm?
The answer is: nothing, zip, diddly, jack.
What went wrong wasn't the equation, but people. People create algorithms, and they do so according to the principle of garbage in, garbage out. The error here was not one of high-level maths. It was a basic failure to appreciate the nature of statistics. Statistics cannot discover what isn't there. And the information we really needed - how to compare students across schools - just wasn't there. As the great Dan Davies puts it:
No statistical method in the world is going to be able to give you good results if the information you're looking for is fundamentally not there in the dataset that you're trying to extract it from
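To make that concrete, here is a minimal, purely illustrative sketch - not Ofqual's actual model - which assumes the only inputs for each student are their school's historical grade distribution and their teacher-assigned rank within the school. Run whatever method you like on such a dataset: an individual who outperforms their school's history cannot be recognised, because that information was never collected.

```python
GRADES = ["A*", "A", "B", "C", "D", "E", "U"]

def allocate(historical_share, ranks):
    """Hand out grades by within-school rank so the cohort matches the
    school's historical distribution - whatever individuals actually did."""
    n = len(ranks)
    awarded = []
    for grade, share in zip(GRADES, historical_share):
        awarded.extend([grade] * round(share * n))
    awarded = (awarded + [GRADES[-1]] * n)[:n]          # pad or trim to cohort size
    order = sorted(range(n), key=lambda i: ranks[i])    # best-ranked student first
    return {student: awarded[pos] for pos, student in enumerate(order)}

# Two equally able students, each ranked top of a five-person class:
strong_history = allocate([0.3, 0.3, 0.2, 0.1, 0.05, 0.03, 0.02], ranks=[1, 2, 3, 4, 5])
weak_history = allocate([0.0, 0.05, 0.1, 0.2, 0.3, 0.2, 0.15], ranks=[1, 2, 3, 4, 5])

print(strong_history[0])  # A* - inherited from the school's past cohorts
print(weak_history[0])    # C  - the student's own attainment was never in the data
```

The point is Davies's: the failure lies upstream, in what the people in charge chose to collect, not in the arithmetic performed on it.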
The problem wasn't the algo. It was that the people in charge of it combined stupidity with class hatred. And it is, of course, not good enough for Williamson to claim that "Ofqual didn't deliver the system that we had been reassured and believed that would be in place." Ofqual isn't some organic food company delivering a strange-looking vegetable. It is, or should be, under government control.
What's going on here is in fact an old and widespread error. It's what Georg Lukács called reification - the process whereby "a relation between people takes on the character of a thing". What actually happened was that people in power allocated A level results. That's a relation between people. But to some, this relation appears as a thing, an algorithm. Which effaces the basic fact that those in power are the enemy of young working-class people.
With algorithms and big data becoming increasingly important, the danger is that the reification fallacy will increasingly serve as a means of effacing the reality of class power.
In fact, of course, it has done so for decades. When Luddites smashed machines, they were transferring their anger from people to things - just as people do today when they rail against IT systems failures rather than against the mismanagement that creates them.
And in 1845, Marx described how capitalism appears to people not as the product of human action but rather as "an alien power existing outside them, of the origin and goal of which they are ignorant, which they are no longer able to control."
A classic example of this is how some people use the phrase "market forces" to defend income inequality. What this misses is that "the market" is not some entity existing above us but is really only a relation between people. And it is a relation of power. As Rick said, "all pay is, ultimately, a function of power." Bosses and bankers have power; they must be bribed not to misappropriate their firms' assets. But care-workers do not - often because they are women and migrants who lack outside options.
All this, of course, matters for the reason Marx thought it did. If we regard inequalities and inefficiencies as arising from abstractions - be they algos or market forces - we are apt either to attack the wrong target or, worse still, to resign ourselves to fate. We need not do either. Technology - which includes social technologies such as markets - is a human construct.