
Why Your Judgement Often Suffers in Times of Uncertainty


There’s no shortage of uncertainty at the moment. What’s going to happen to the economy? When will our country fully open up, and if it does, will it be the same as before? Will there be a second or even a third wave of the virus? What’s going to happen in the US election?

All this uncertainty makes for some difficult decision making, because uncertainty forces us to weigh statistical probabilities, and most of us are surprisingly bad at statistics. To add to the difficulty, most of us don’t even know that we suck at probabilities, mostly because while we have a terrific memory for the times when we were right, we fail to keep track of our many, many failed predictions.

To make sense of uncertainty, we tend to rely on three different mental strategies, or heuristics, as identified by psychologists Amos Tversky and Daniel Kahneman in their seminal 1974 study on judgement under uncertainty. The problem with these strategies is that they trap us into limited thinking, limited imagination and limited possibility.

Representativeness

The first of these traps is representativeness, which is how well an event, situation, person or thing does or doesn’t fit into a particular category. Our brains are wired for quick decisions based on past experience – especially when we’re under stress. Unfortunately, this is not always the best tool for predicting the future. We forget things like how small sample sizes can create outliers, how correlation doesn’t equal causation, and how our impressions of the quality of ideas, people and situations tend to regress toward the mean (which is why the first bite of chocolate cake is always so much better than the last).

And this gives rise to biases that get us into all kinds of trouble. For example, after an hour of watching a certain news network, one could form the representativeness bias that all protestors are also looters. Or, after watching the stock market for the past few months, that the economic impact of the COVID crisis is over and that the economy will continue to grow. Or, after spending some time in Silicon Valley, that the only good software engineers are straight, white, Asian or South Asian men under the age of forty.

And, as we’re making these poor judgements based on our own internal biases, we feel pretty damn confident. And comfortable. It doesn’t occur to us to look beyond our assumptions to understand the actual truth of what’s going on around us, even if the information we’ve based our judgement upon is slim, unreliable or outdated.

What to do?

The way to avoid falling into the representativeness trap is to look past what heuristics pioneers Tversky and Kahneman call “the illusion of validity”. Expand your horizons. Seek multiple perspectives. Let go of the need to be right, the need for stability and the need to cling to ideas that are no longer working.

Availability

The next trap is availability, which is the ease with which a particular idea or alternative can be brought to mind. When problem solving under uncertainty, our natural tendency is to limit our focus to a few known possibilities and to overestimate the importance of the ideas that are most available to us. The thing is, though, just because an idea comes to us easily doesn’t make it true.

The hard truth is, we don’t know what we don’t know. No one likes this fact because not knowing makes us uncomfortable. So, we magnify the importance of what we do know (or what we think we know). The danger here is that when we’re trapped in limited availability, we can’t even imagine what’s possible. This creates a form of self-soothing tunnel vision. From here, it’s impossible to sense what’s emerging around us, leaving us vulnerable to chasing red herrings.

What to do?

The best way to know more is to invest in continuous learning. Read outside of your profession or industry. Form a large network of contacts with different backgrounds and perspectives. And most importantly, diversify the people you work with. If everyone on your team is the same gender or went to the same school or is the same color or came from the same socio-economic background, you don’t know what you don’t know.

Anchoring

The final trap is anchoring, which is the tendency to base our decisions relative to the first piece of information that becomes available to us. The problem with anchoring is that different starting points result in different guesses, and that the anchor itself often has nothing to do with the thing we’re judging.

Here’s how it works. Let’s say I ask you to spin a wheel (numbered 1-100) before guessing the number of countries in Africa. If the wheel lands on a high number, you will guess significantly more countries than if it lands on a low one.

Just think about the real-life implications of this bias. How does it impact your forecasting, your budgeting, or your purchase decisions? (Clever salespeople are often very astute at strategically placing anchors by presenting the most expensive option first. After that, the other options look like bargains.)

What to do?

Well for starters, remember that the anchoring bias is a thing and watch for it. Delay your decision-making and take time to gather information and compare options. Then decide. Better yet, do the research in advance and drop your own anchor.

The Point

The common denominator in avoiding all of these biases is to stop your navel gazing. Lift your head up and become a humble, curious, questioning, complexity-loving force for good in the world – even if that makes you uncomfortable.
