I’m going to share with you one of the most important discoveries, maybe the most important discovery, of modern psychology.
Consider an iceberg. Let’s imagine that it represents the complete set of cognitive processes that are causally responsible for what we think, how we reason, how we form beliefs and how we make decisions. Out of all of this complex cognitive activity emerges our beliefs, judgments, decisions and behavior.
Notice that most of the iceberg is submerged, below the waterline. In this picture, the waterline represents the distinction between conscious and unconscious cognitive processes.
So what this image represents is that most of the cognitive processes that are causally responsible for what we think and how we behave are operating unconsciously. We don’t have conscious access to them; we can’t consciously observe them. They’re operating below the surface, behind the scenes.
In a way this isn’t news; we’ve known about the importance of the unconscious since Freud. But what we’re talking about here isn’t connected to Freud’s theory in any way. It’s really a product of the modern cognitive revolution in psychology, which views brain functioning as a type of information-processing activity.
But it does have this very counter-intuitive conclusion, that to a large extent we’re strangers to our own minds.
Cognitive biases, like any other complex cognitive process, operate unconsciously as well.
I’m not saying that all of our thinking is unconscious; that’s clearly not true. What I’m saying is that the kinds of processes that result in biased judgments, like the ones we saw with the gambler’s fallacy, typically have roots in cognitive processes that we have no conscious access to or control over.
We might be able to consciously override a quick intuitive judgment when we’re prepared for it (this is one way that training in cognitive biases can be helpful), but we have very little control over the judgments themselves, like the intuition in the coin-tossing case that after a string of six heads, “tails” is somehow “due”.
These intuitions just happen to us. They “bubble up” from below the waterline, so to speak.
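If you want to see for yourself why the coin-tossing intuition is mistaken, a minimal simulation makes the point concrete: conditioning on a run of six heads, the next flip still comes up heads about half the time. This is an illustrative sketch (the helper name and parameters are my own, not from the lecture):

```python
import random

random.seed(42)

def heads_rate_after_streak(streak_len=6, trials=200_000):
    """Flip a fair coin many times; whenever the previous `streak_len`
    flips were all heads, record the outcome of the next flip."""
    outcomes = []
    streak = 0
    for _ in range(trials):
        flip = random.random() < 0.5  # True = heads
        if streak >= streak_len:
            outcomes.append(flip)     # this flip follows a run of heads
        streak = streak + 1 if flip else 0
    return sum(outcomes) / len(outcomes)

p_heads = heads_rate_after_streak()
print(f"P(heads | six heads just occurred) is about {p_heads:.3f}")
```

The fraction hovers around 0.5: the coin has no memory, so “tails” is never “due”, however strongly the intuition bubbles up.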
A consequence of this view is that we can’t detect bias simply by inspecting, or introspecting, our conscious beliefs and reasoning.
This is important to understand if we want to get into the right headspace to talk about debiasing strategies, which we’ll be getting to later in this course.
One of the biases that we’re all prone to is a bias against viewing our own judgments as biased. This is a well-studied cognitive bias, known as the “bias blind spot”.
One of the reasons for this bias is that we have a strong intuition that we know our own minds, that we’re experts on the causes of our own beliefs and decisions. So if we can’t detect the presence of bias in our own thinking, we conclude that it’s not there.
But this is a delusion.
The truth is that for the most part we don’t know our own minds, and we need to accept this if we’re going to be truly open to the debiasing strategies that we’ll look at in the third section of this course.