In the series so far, I have explained techniques to remove subjectivity and to make decisions objectively.
But, there is an elephant in the room when it comes to decision-making. Unfortunately, we are the elephant.
We have this problem because of our cognitive biases: systematic errors of statistics, social attribution, and memory that are common to all human beings. These errors can lead to perceptual distortion, inaccurate judgment, illogical interpretation, or even outright irrationality.
This article is not meant as a thesis on a well-documented branch of psychology, in much the same way that the earlier articles were not meant as statistics and mathematics treatises. Its purpose is to highlight how cognitive bias affects decision-making, how to recognise it, and some tools to mitigate it.
Let us start with a simple experiment. I ask you to rate yourself as either above average or below average on how honest you are, how modest you are, and how unbiased you are.
I would be most surprised if you answered anything other than you are above average on all counts.
I now ask you to perform the same experiment with a group of people. Again, I would be surprised if you came up with a result that was anything other than an overwhelming number of people thinking they are above average when asked each of those questions.
You are now probably thinking that this can’t be right. Surely there must be some above average and some below?
What you can see here is cognitive bias. People tend to think they are better at things than they actually are; they have a blind-spot about themselves, hence the name blind spot bias.
Let’s take this one step further. Some of those who rated themselves above average on honesty will have surprised you: your own assessment of their honesty might be well below average, and you probably have a reason for that. This again is cognitive bias, and this time it is yours.
We now have a subject’s assessment of themselves, your assessment of the subject and, of course, somewhere there is an objective, accurate, and incontrovertible assessment. I promise you that none of these are likely to correspond.
This paradox is where the phrase “perception is reality” comes from. Your assessment is your reality and your subject’s assessment is their reality.
I am going to examine a few use-cases of cognitive bias in the remainder of this article, but before then I offer the following infographic, which arranges over 180 recognised cognitive biases. If 180 is too big a number to work with, they are grouped into about 20 clusters, and then into four categories.
For this article I am just going to exercise a small handful of these biases that relate to some of the themes we have explored earlier.
Basically, our cognitive biases are filters, and they operate to simplify the world we perceive.
Still, a man hears what he wants to hear
And disregards the rest
Paul Simon, The Boxer
Generally speaking, the information we need to make a decision is already available to us. However, we tend to look for more in order to justify ourselves, and this exposes us to information that can impede the decision. Sociologists and psychologists call this the information bias: the belief that we need more information before we can decide, even when the extra information does not bear on the decision at all.
This bias is one of the greatest contributors to procrastination or “analysis paralysis”.
overconfidence bias

Overconfidence affects those of us who consider ourselves experts in a field. Once that happens, it becomes much harder to listen to others, even when their suggestion or advice is valid and would benefit us in the long run.
It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so.
We all have a different perspective of the same situation and your interpretation might be wrong. Therefore, to avoid making stupid mistakes, you should be conscious of being overconfident and question everything, including your biases, knowledge, and assumptions.
the bandwagon effect

The bandwagon effect is all too common in consumerism, and is the reason for fashion trends. It is also evident in politics and business.
At the time of this article, cloud computing is fashionable, and people want to get onto this particular bandwagon. But, bandwagons often run out of steam and people soon start abandoning them to look for a new one.
While the bandwagon effect can sometimes be harnessed positively, it stifles your creativity because you are heavily influenced by group-think. You can break its hold by asking whether what the group thinks or decides is rational. If it is not, don’t ride the bandwagon.
the gambler’s fallacy

When this cognitive bias is at play, we believe that past events influence future outcomes, even when those outcomes are independent.
In the gambler’s mind, the fact that the last six tosses of the coin have produced all heads means that the likelihood of a tail next has grown to an almost certainty. It hasn’t. The next toss of the coin has the same 50/50 chance that it always had.
This gambler’s fallacy is so ingrained that when we fail at something, we forget that repeating it will almost certainly have the same outcome as last time.
To avoid succumbing to gambler’s fallacy, you must treat each event independently. You have to understand that the odds of a specific outcome happening again are the same as they always were.
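The independence of each toss is easy to verify with a quick simulation. The sketch below (in Python; the trial count and random seed are arbitrary choices of mine) estimates the probability of heads on the toss immediately following a run of six heads:

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def heads_after_streak(streak_len=6, trials=200_000):
    """Estimate P(heads) on the toss immediately following a run
    of `streak_len` heads. Independence says it stays at 0.5."""
    heads_after = 0   # heads seen immediately after a qualifying streak
    observed = 0      # tosses that immediately followed a streak
    run = 0           # current run of consecutive heads
    for _ in range(trials):
        toss = random.random() < 0.5  # True = heads
        if run >= streak_len:
            observed += 1
            heads_after += toss
        run = run + 1 if toss else 0
    return heads_after / observed

p = heads_after_streak()
print(f"P(heads after six heads) ~ {p:.3f}")  # hovers around 0.5
```

However long the preceding streak, the estimate stays near 0.5, which is the whole point: the coin has no memory.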
stereotyping and confirmation bias

Most, if not all, of us are prone to stereotyping. The problem with this cognitive bias is that it tends to affirm what we believe rather than test it.
When our beliefs are based on stereotypes, the information becomes distorted which, in turn, can lead to a distorted decision. Worse, stereotypes cause us to reject other information that does not affirm our belief.
Confirmation bias happens when our desire begins to influence our belief. That means if we want a certain idea to be true, we will soon believe that it is true. When this happens, we stop thinking objectively and only accept information that confirms what we want to believe.
Just like the blind spot bias, combatting confirmation bias is difficult because of the element of self-deception. One way to mitigate it is to take advantage of group-think, as discussed in plan A and B. This will force you to consider approaches you may have rejected earlier.
optimism, pessimism and errors of option
When assessing options, we tend to either over-estimate or under-estimate the likely option value. Option value is defined by the following formula:
(value of option) = (probability of outcome) × (value of outcome)
We often agree on the value of an outcome, as it can in many cases be determined objectively. But we do differ on assessing the probability, because it is far more subjective. Consider the following situation, where we have two options. Option A is worth $1,000, and Option B is worth $600. We agree that Option B is more likely to succeed than Option A, and we agree that both options are likely to succeed. But our individual assessments of their likelihood differ, and this quite significantly affects the decision.
Person 1 assesses the probabilities as 0.55 and 0.95 respectively, which gives Option A an option value of $550, and Option B a value of $570. Option B is therefore the winner.
Person 2 assesses the probabilities lower, at 0.50 and 0.80, which produces option values of $500 and $480 respectively, leading to a decision that is the opposite of Person 1’s. Option A is now the winner.
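The worked example above can be reproduced in a few lines of Python (a minimal sketch; the probabilities and dollar values are those from the text):

```python
def option_value(probability, value):
    """Value of an option: P(outcome) * value(outcome)."""
    return probability * value

VALUE_A, VALUE_B = 1_000, 600  # agreed outcome values in dollars

# Person 1's probability estimates for Options A and B
p1_a = option_value(0.55, VALUE_A)  # $550
p1_b = option_value(0.95, VALUE_B)  # $570 -> Option B wins

# Person 2's lower probability estimates
p2_a = option_value(0.50, VALUE_A)  # $500
p2_b = option_value(0.80, VALUE_B)  # $480 -> Option A wins

print(f"Person 1: A=${p1_a:.0f}, B=${p1_b:.0f}")
print(f"Person 2: A=${p2_a:.0f}, B=${p2_b:.0f}")
```

Small shifts in the probability estimates, well within the range of honest disagreement, are enough to flip the decision.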
But, this doesn’t seem right, does it? They agree on so many points, yet have arrived at different outcomes. Once again, the effect is due to cognitive bias.
Let us play the scenario with Person 2 forwards a little. The fact that the calculated outcome favours the riskier approach does not sit well with their zero-risk bias. Consequently, their information bias takes over: they re-examine the data, and even change it, to support their preconception – the confirmation bias.
Statisticians describe this as “torturing a confession from the data”.
Whilst we have all done this, or similar, in the past, we need to be rigorous in ensuring we avoid such thinking when taking important decisions.
Which is not to say that all cognitive biases are bad, because they aren’t. Some lead us towards ethical choices, empathetic choices and selfless choices.
What is important is to understand we have these biases, and recognise when they are influencing us. If we know we are thinking this way, we can take steps to mitigate the effect.
If you find you are re-defining, re-calculating or ignoring some of the information to make it conform to an outcome, then I suggest your cognitive biases are in play.