Reducing Bias: The Linda Problem and the Cost of Neglect
The Linda Problem
The “Linda problem” (Tversky and Kahneman, 1983) is a well-known experiment showing that people do not always decide rationally. Participants read a description of Linda and were asked which was more probable: that she is a bank teller, or that she is a bank teller and active in the feminist movement. Over 80 percent of participants chose the second option, even though a conjunction of two events can never be more probable than either event alone. The experiment demonstrated two things: (1) that this particular bias, the conjunction fallacy, exists, and (2) that bias is a predictable judgment error. Try it for yourself.
It also implied that decision-making needs to be de-biased.
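The conjunction rule behind the Linda problem can be sketched in a few lines. The probabilities below are purely illustrative assumptions, not data from the experiment:

```python
# Conjunction rule: for any events A and B, P(A and B) <= P(A).
# The numbers below are hypothetical, chosen only for illustration.
p_teller = 0.05                  # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # assumed P(feminist activist | bank teller)

# Probability of the conjunction "bank teller AND feminist activist"
p_both = p_teller * p_feminist_given_teller

# The conjunction can never be more probable than either event alone,
# yet most participants judged that it was. That gap is the bias.
assert p_both <= p_teller
```

Whatever values you plug in, the assertion holds; the bias lies in our intuition, not in the arithmetic.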
The Cost of Neglecting Bias
A good friend of mine who has been a captain of industry for as long as I can remember tells a fine story about bias.
Early in his career, he was sent to the Far East to manage a chain of wholesalers. He went out and bought a local business to add to the chain. It set him back a few million, only for him to find out that the new addition would not stay profitable and had to be dismantled. He had fallen victim to biases that could have been avoided: overconfidence and confirmation bias (looking only for confirmatory evidence).
His boss called him in to talk about it. He went with sweaty palms, sure that he would be fired. Instead, he was told: "It was a very costly mistake. Don't do it again." Surprised, my friend double-checked: "I'm not being let go?"
"No," was the answer. "You cost us plenty, but I'm sure you have learned your lesson. Now go and make me some money to make up for the loss."
I bet it doesn't take you long to remember a stupid judgment error of your own. Was it during the last management team meeting, or the one before? What did it cost you?
This type of judgment error is an unnecessary waste of time, money, and opportunities. If only we had paid more attention to the quality of our information and what it meant.
The Most Common Biases
Wikipedia has a list of decision-making errors you can check against. The problem, however, is that checking for bias doesn't help much: even when we know that we are biased, and how, we still make the same judgment errors.
To reduce bias, you have to take action and seek out discussions with 'dissenters': people who hold different views because they have different backgrounds than you, from other industries, other companies, other countries, other anything.
Learn to De-Bias Actively
De-biasing takes effort, because we need to keep our minds open to information we don't want and to structure discussions in ways we don't like. Our aversion stems from unfamiliarity and from not wanting to change our habits.
That's a real problem for executives, because the more decisions you have to take on a daily basis, the more you suffer from decision fatigue: your willpower is depleted and you fall back on the same old solutions.
Take 20 Seconds to Set Yourself Up for Habitual De-Biasing
You need to increase the energy it takes to make 'normal' decisions and decrease the energy it takes to de-bias. Then the path of least resistance won't be so tempting, even when you suffer from decision fatigue. Here's how you do it in 20 seconds:
- Remind yourself that you've had it with decision errors such as the ones you made recently (the fed-up feeling increases willpower).
- Have a system in place to jump to (this takes the effort out of new habits). For instance, keep a flip chart in your room with felt pens and always use it to unravel the arguments on the table into claims and evidence (see these tips).
- But first: test your decision-making for errors with this quick, one-question test (no email required):