The complexity of Kwasi Kwarteng

What the Chancellor’s disastrous tax cuts can teach us about trying to change complex, human systems

Kwasi Kwarteng has had a rough week following his ludicrous “mini-budget” on 23 September. Kwarteng’s plan was, in short, to borrow money in order to fund tax cuts, mostly for the very wealthy. These tax cuts would mean more money in those people’s pockets, which would – he hoped – mean more consumer spending, which would – he hoped – kick-start economic growth that would in the long term – he hoped – more than pay for the borrowing.

The problem is that the Bank of England, which has a mandate to control inflation, was worried that such economic growth would be a “sugar rush”, an increase in demand without a matching increase in supply. Such growth would make the current problem of inflation even worse. And so it set out to curb the inflationary effects of Kwarteng’s changes, signalling that it would make government borrowing more expensive in the future.

The markets responded, pricing in future increases in interest rates, and so the cost to the government of borrowing money rose to about ten times what it was a year ago. But the upcoming rate rises won’t just increase the cost of government borrowing; they’ll also increase the cost of borrowing for private companies. This will make private companies less likely to invest, further harming the chances of the economic growth that Kwarteng’s gamble needed in order to pay off.

Kwarteng seemed, bizarrely, to have thought about his policy in isolation: if he cut taxes, people would have more money; if people had more money, they’d spend more money; and so the economy would grow. He neglected to think about what else would happen as a result of his changes – how the Bank of England and the markets would respond, and whether those responses might undermine or even cancel out his intended results. When you put it in these terms, it seems colossally moronic. And… it is. You’d expect a Chancellor (and the holder of a PhD in economic history!) to understand this. But it’s a mistake that happens, albeit on a smaller and less catastrophic scale, in all sorts of organisations.

In a small, simple, controlled system – a “linear” system – the relationship between cause and effect is direct. The weather changes; the temperature falls; the thermostat clicks on the central heating; the house warms up; the thermostat clicks it off. But the economy is far from a simple, controlled system, and the relationship between cause and effect is indirect and obscure. The same is true in any system that features observant humans who respond to the people around them.
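The thermostat loop above can be sketched as a toy simulation – all the numbers (setpoint, heating and cooling rates) are invented for illustration, but the point is that cause and effect is direct, and the system settles obediently around its target:

```python
# Toy simulation of a linear feedback loop: a thermostat and a house.
# All numbers here are invented purely for illustration.

SETPOINT = 20.0   # target temperature, degrees C
HEAT_RATE = 0.5   # degrees gained per step while heating
COOL_RATE = 0.2   # degrees lost per step to the cold outside

def run_thermostat(temp: float, steps: int) -> float:
    for _ in range(steps):
        # Cause and effect is direct: below target -> heat on; at target -> heat off.
        heating = temp < SETPOINT
        if heating:
            temp += HEAT_RATE
        temp -= COOL_RATE
    return temp

# Start in a cold house; after enough steps the temperature hovers near the setpoint.
final_temp = run_thermostat(temp=15.0, steps=100)
```

Nothing in the house observes the thermostat and changes its behaviour in response, which is exactly what makes the system predictable – and exactly what the economy lacks.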

This is the problem of reflexivity: human relationships are bidirectional, and people change their behaviour according to changes in others’ behaviour. This leads to things like self-fulfilling prophecies (e.g. if the government says there’ll be a recession, it might dent consumer confidence to the point that one actually occurs) and the observer effect (e.g. when you involve people in a sociological study, they no longer behave how they would have if they hadn’t joined the study; their natural behaviour is impossible to observe).
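The self-fulfilling prophecy can be made concrete with a toy model – the confidence numbers and thresholds below are entirely invented, but they show the reflexive loop: the announcement itself changes behaviour, which brings about the announced outcome:

```python
# Toy model of a self-fulfilling prophecy: a recession forecast dents
# confidence, which cuts spending, which produces the recession.
# All numbers and thresholds are invented purely for illustration.

def spending(confidence: float) -> float:
    # Households spend in proportion to how confident they feel.
    return 100.0 * confidence

def economy(forecast_recession: bool) -> str:
    confidence = 0.9
    if forecast_recession:
        # Reflexivity: people react to the announcement itself.
        confidence -= 0.3
    return "recession" if spending(confidence) < 70.0 else "growth"

# Same underlying economy; the only difference is what was said about it.
quiet = economy(forecast_recession=False)   # -> "growth"
warned = economy(forecast_recession=True)   # -> "recession"
```

In a linear system the forecast would be a passive observation; here the observation is itself a cause.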

One of the neatest summaries of reflexivity is Goodhart’s law, named for the economist Charles Goodhart and slightly wonkish in its original phrasing:

“Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.”

In other words, if you make something a target, people will change their behaviour in response. Sometimes that means actively gaming the system, but it needn’t be so extreme; it’s enough that people notice the target and adapt to it. Either way, the focus on the target will almost inevitably ensure that the underlying measurement stops being useful.
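A toy simulation makes Goodhart’s law concrete. Here a hypothetical support team is measured on “tickets closed” as a proxy for “problems solved”; once closures become a quota, workers hit the number by closing tickets unresolved, so the proxy inflates while the real quantity stands still. All names, numbers, and behaviours are invented for illustration:

```python
# Toy illustration of Goodhart's law: once a proxy metric becomes a target,
# it decouples from the underlying quantity it was meant to track.
# All numbers and behaviours here are invented purely for illustration.

def week_without_target() -> tuple[int, int]:
    # No quota: workers just solve problems, and each solved problem
    # closes one ticket, so the proxy (closures) tracks reality (solutions).
    solved = 10
    closed = solved
    return closed, solved

def week_with_target(quota: int) -> tuple[int, int]:
    # Closures are now a target: workers solve what they can, then
    # make up the quota by closing tickets without solving anything.
    solved = 10
    closed = max(solved, quota)
    return closed, solved

before = week_without_target()       # proxy equals reality: (10, 10)
after = week_with_target(quota=30)   # proxy inflated, reality unchanged: (30, 10)
```

The measurement looked informative precisely until pressure was placed on it – which is Goodhart’s observation in miniature.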

Played out on the scale of the whole economy, Goodhart’s law leads to the beautifully named iatrogenic volatility, coined by Larry Summers to refer to cases where a policy designed to improve the situation actually makes it worse:

“Iatrogenic illness is when you go into a hospital and you catch an infection there. Iatrogenic volatility is when policymakers, whose role is to stabilise markets, destabilise them with their actions.”

Kwarteng’s behaviour is a perfect example of this. By treating the economy as though it were a simple system, he applied a cure much worse than the disease. But this isn’t just a problem that affects Chancellors of the Exchequer; it’s something we all need to watch out for. Every time we introduce a new incentive scheme, a set of quarterly targets, or a new company policy, we risk falling foul of Goodhart’s law – and achieving the very opposite of what we set out to do.