We analytical types can easily fall prey to the temptation of over-analysis; more reckless folk, on the other hand, tend to leap before they look and act rashly. They’re two sides of the same coin: a mismatch between knowledge and action – in one case endlessly gathering information before acting, and in the other acting with too little.
A key part of sense-making is knowing when to stop – and knowing when you probably need to think a little longer. It’s a question of sufficiency, and it’s the subject of this week’s article, which continues my series on sense-making.
This week’s article
You’re never going to know everything. So how do you know when you know enough to act?
Sense-making is about making sense of the world in order to act in it. But how do you know when you’ve made enough sense, and therefore when to act? In uncertain situations, you’re never going to know everything. Sometimes knowing everything is merely impractical; sometimes it’s impossible. In sense-making, understanding when you know enough to act, and when to wait and gather more information first, is often called sufficiency.
Sufficiency is partly a philosophical question about knowledge, but it’s also a question of comfort – of developing the ability to feel confident acting even in the absence of a complete picture and in the absence of perfect information.
That confidence comes from reducing the risk of getting things wrong. And getting it wrong is quite easy! You can think you have sufficient knowledge when you don’t and act too soon, too rashly. Or you can fail to act despite having enough information, and cause delay and inertia – or perhaps miss an opportunity entirely.
In the most recent article in the series, we looked at the broad concepts of Cynefin, and how it gives us the tools to recognise different decision-making situations. Different types of situation demand different tools in order to make decisions effectively. (We probably shouldn’t probe and experiment when baking a cake, but instead should follow a recipe; we probably shouldn’t expect an organisation with hundreds of people in it to behave in predictable and ordered ways, but instead should change things gradually and carefully.)
The same is true of sufficiency. What constitutes sufficient information in order to act is different in different contexts.
In the “clear” domain, sufficiency is about having all of the knowledge you need in order to apply best practice. That means understanding what the problem is that you’re facing, how that same problem has been solved many times before, and running through an established process. (My cake has been in the oven for 20 minutes, but is still raw in the middle; I have enough information to know that best practice suggests leaving it in for another ten minutes. An iron fence is at risk of rusting; I know enough to know that best practice suggests painting it. A customer is complaining that they’ve forgotten their password; I know enough to know that best practice suggests resetting it for them.)
The “complicated” domain is similar, but “best practice” doesn’t exist; you’re in the realms of merely “good practice”, where there are several different viable solutions. Sufficiency is about having collected enough data and done enough analysis to evaluate those different potentially useful solutions, and to make a choice between them. (If you’re drilling for oil, there might be arguments for both site A and site B, with complicated analysis to justify each; there’s no obviously correct solution, but analysis can help you decide. Likewise, there might be several different viable approaches to a legal case that a lawyer could take, and they have to use their judgement and experience to choose between them.)
In the “complex” domain, sufficiency is much less clearly defined, because there’s not even “good practice” to rely on. Sufficiency in these scenarios is about having confidence in your understanding of patterns that seem to be emerging; your actions are less about making permanent decisions, and more about giving more resources to the desirable and useful patterns and fewer resources to the harmful ones.
You gain this confidence using “probes”: small, safe-to-fail experiments that nudge the situation in a particular direction in a controlled way. Liz Keogh defines the characteristics of a probe as having:
- Indicators of success
- Indicators of failure
- A way of amplifying the probe if it succeeds
- A way of dampening the probe if it fails
- Coherence – a “realistic reason for thinking that the probe will have a positive impact” – in effect a narrative about a positive future where the probe has been successful
The important thing to note here is that, in a complex world, you act first. Complexity doesn’t become clearer through analysis; this is definitely the domain where “analysis paralysis” is most common. But you keep your actions small and safe-to-fail; it’s also easy to make the opposite mistake, going too big too soon and throwing the system out of whack. Sufficiency here is reached when you start seeing indicators of success or failure; the further action you take is to amplify or dampen the probe accordingly.
Getting a feel for what level of knowledge is sufficient before you act, then, is a crucial part of sense-making, and relies on understanding the context you’re in. But once you understand sufficiency in your current situation, you can avoid analysis paralysis, where you continue thinking about the situation long past the point where you should have acted; and you can avoid leaping before you look, taking rash decisions that could have been avoided with a little more thinking.
This week’s five interesting links
Great life and career advice from Y Combinator founder Paul Graham:
“Once you’ve found something you’re excessively interested in, the next step is to learn enough about it to get you to one of the frontiers of knowledge. Knowledge expands fractally, and from a distance its edges look smooth, but once you learn enough to get close to one, they turn out to be full of gaps.
“The next step is to notice them. This takes some skill, because your brain wants to ignore such gaps in order to make a simpler model of the world. Many discoveries have come from asking questions about things that everyone else took for granted.”
Anton Corbijn’s new film, about legendary album cover designers Hipgnosis, looks great.
“Thorgerson and Powell were very different individuals, but that difference worked perfectly. Corbijn explains their dynamic: ‘They loved making things,’ says Corbijn. ‘One with great ideas and one with the technical skills to execute these ideas.’ He knows first-hand how demanding it is to deliver album design in its entirety: ‘I have done a lot of record sleeves in my life, but I’ve not designed that many. I may have taken the photo on the sleeve. Hipgnosis however, did everything. It’s amazing they came from nothing in a way. Neither of them were educated in the visual sense. They found ways to do the impossible.’”
A fascinating New York Times piece about Guam, a part of America that both is and isn’t part of the USA:
“Guam, with its strategic location, quickly became home to Andersen Air Force Base, where B-52 bombers deploy on a rotational basis, and Naval Base Guam was expanded. The Guam tourism board’s slogan, Where America’s day begins!, was everywhere. The Guam Chamber of Commerce proudly proclaimed the island America in Asia! while Guam’s license plates read Guam, U.S.A.; but underneath that they also said Tano Y Chamorro — ‘the land of the CHamoru.’”
As tensions between China and the USA ratchet up, Guam is uniquely and unfortunately placed:
“In every iteration of war games between the United States and China run by the Center for Strategic and International Studies (C.S.I.S.), Beijing’s first strike on U.S. soil has been to bomb Guam.
“Yet the island is largely forgotten by most Americans. Guam plays a central role in ‘homeland defense,’ though it rarely shows up on maps or in textbooks about the homeland — no place tries harder to show its patriotism and gets so little recognition in return.”
An interesting New Statesman profile by Katie Stallard of Alexander Lukashenko, Belarus’s dictator and the recent mediator of the Wagner mutiny. It charts his rise from humble beginnings to absolute power in Belarus:
“Born in 1954 in a poor village in eastern Belarus, Lukashenko was raised by a single mother who worked as a milkmaid, and he was bullied at school by other boys because he didn’t have a father. He served in the Soviet army and worked his way up through the Communist Party ranks to become the director of a collective pig farm in 1985, where he was once accused of beating a tractor driver with a shovel for coming to work drunk.”
Paul Millerd reviews Bill Perkins’s fascinating book Die With Zero. Most people plan their lives on autopilot, working hard and saving as much as possible for retirement, but that’s a mistake:
“This is a push against the default or what he calls ‘autopilot.’ Most people look at life and retirement as a money problem because, well, that is what everyone else is doing. By looking at life as a life energy allocation problem you might make different choices like giving money to your children while they are still alive, deprioritizing work and buying back time before you are ‘supposed’ to retire, and spending lavishly on experiences that might pay ‘memory dividends’ to you and people in your life.”
Perkins emphasises the importance of making memories while you can:
“As we get older, we spend more time reflecting on our lives. This is why Perkins argues that more people should think about old age as a combination of savings AND memories. Through this lens, having memorable experiences earlier in life, especially before retirement, can be valuable because they will pay memory dividends. And there’s good evidence that this is what makes people happy.”