Every time you poke or prod a complex system, you run the risk of unexpected behaviour – both good and bad
The phrase “unintended consequences” has become so commonly used that it’s hard to believe it was only popularised in the 1930s, by the eminent sociologist Robert K. Merton. It has passed into common language in a way that few technical terms from sociology have, and with good reason: virtually everything we do has both intended and unintended consequences, especially when other human beings are involved. The intended ones might be quite straightforward, but predicting the unintended ones is fiendishly difficult – and that’s what leads us collectively to make many of the mistakes we do.
There are three broad types of unintended consequence:
- Unexpected benefits
- In other words, luck or serendipity; positive outcomes that were accidental. You were aiming for positive outcome “A”, but you got positive outcome “B” instead (or maybe in addition!)
- Unexpected drawbacks
- Negative outcomes that were unintended. You were aiming for positive outcome “A”, you might well have obtained it, but you got negative outcome “X” as well.
- Perverse results
- Like the malicious genie that interprets your wishes in the worst possible way, you got what you were aiming for – but that turned out to have exactly the opposite consequences to the ones you wanted, making the problem you were trying to solve worse.
From the 1950s, the border between West Germany and East Germany became one of the most heavily fortified in the world. The border served its intended purpose with brutal efficacy: around a thousand people died trying to cross it between the 1950s and the 1990s, when Germany was unified. But it also had unexpected benefits. On the western side of the East German fortifications was a strip of land known as the “death strip”, still part of East Germany but cut off from it by the border fortifications, a no-man’s land that made the border harder to cross. Left untouched by humans and freed from agriculture, the land flourished; after reunification, it was preserved as part of a green belt that stretches 1,400 km through the centre of Germany and remains a vital home for wildlife.
Unexpected benefits are common in research and scientific experimentation. Perhaps the most famous is the invention of Sildenafil, better known as Viagra. The drug had been synthesised by Pfizer scientists looking for a treatment for hypertension and angina. But, during the original clinical trial, it quickly became apparent that the drug had very different effects, and could conceivably be used as a treatment for erectile dysfunction. The drug was patented in 1996, and even in 2016, years after its patents began to expire, it brought in $1.6 bn for Pfizer. It’s the quintessential example of a whole class of research called “drug repositioning”, which seeks to find unexpected benefits of existing drugs.
Unexpected drawbacks are sadly common too, though, and are often a result of policymaking and human behaviour colliding. One infamous example is the prohibition of alcohol in the US in the 1920s. While alcohol consumption certainly did decline under prohibition – the expected benefit the policymakers were aiming for – it was by no means eliminated. Instead, the production of the alcohol that was still consumed became, by definition, the preserve of criminals. Organised crime groups, which had previously involved themselves in prostitution, gambling, intimidation and theft, found a new and highly profitable market in bootlegging. Organised crime networks in Chicago tripled in size during prohibition, and were able to consolidate and centralise – surely an unexpected drawback of the policy.
These unexpected drawbacks aren’t limited to human behaviour; we’re constantly discovering them in our interactions with the complexity of nature, too. One example comes from Australia in the 1930s, where attempts to grow sugar cane were being challenged by the native cane beetle. The beetle’s larvae feasted on the roots of the sugar cane plants, destroying thousands of tons of crops. The Bureau of Sugar Experiment Stations hatched a plan: rather than use pesticides, they’d import cane toads from Hawaii, which would eat the beetles. 62,000 toads were released in Queensland in 1937. Without natural predators, the toads flourished, and by the mid-2000s the descendants of this original population had spread across Australia and numbered over 200 million. They’ve caused countless environmental problems of their own. In an ironic twist, it turned out that they also failed to eat the cane beetles that were attacking sugar crops: they couldn’t jump high enough to reach the beetles, which tended to live at the tops of the canes.
Perverse results are different from unexpected drawbacks. It’s not that you wanted something good to happen and something bad happened instead: it’s that you made the very thing you were trying to fix worse. My favourite example is the possibly apocryphal tale of the British Raj’s attempt to cull the cobra population of Delhi, as told by the economist Horst Siebert. In order to encourage the population to kill cobras, a reward was offered for each dead cobra handed in to the authorities. This worked well, and many cobras were handed in. Soon, however, entrepreneurial people started breeding cobras in order to claim the reward for them; when the government discovered this, it abandoned the reward programme. Without any reason to keep the cobras, the breeders turned their now-worthless snakes out into the wild – with the perverse result that there were more wild cobras than when the policy was first introduced.
More recently, Barbra Streisand gave her name to the “Streisand effect”. In 2003, the California Coastal Records Project published photos of the California coastline to keep track of erosion levels. One of the 12,000 photos was of Streisand’s house, which she objected to. She sued the photographer in an attempt to get him to remove the images, but in doing so drew far more attention to them than they would ever have received otherwise. The image of her home had been viewed six times before the lawsuit (twice by Streisand’s own lawyers), but was viewed over 400,000 times in the month following it. To add insult to injury, she lost and was forced to pay the photographer’s legal fees. The effect has been repeated countless times: attempts by famous people to suppress obscure content online generally result in that obscure content becoming more famous.
Unintended consequences occur whenever people mess with complex systems and expect them to operate in simple, easy-to-understand and linear ways. They think they can anticipate the way that the system, and everything within it, will respond. The engineer John Gall, in his legendary book The Systems Bible, makes the observation that “systems in general work poorly or not at all” – but also, dismayingly, that everything is a system, and that all systems are part of larger systems, a hierarchy that scales infinitely upwards and downwards. It’s perhaps no surprise, then, that all our actions carry with them the possibility of unintended consequences, both positive and negative. We’ll never be free of that – it goes with the territory of complexity. But by understanding the broad types of unintended consequence that exist, we stand a slightly better chance of not making the same mistakes again and again.