One of the most important lessons I’ve learned: don’t believe anything completely.
As humans, we seek certainty.
We are particularly talented at telling stories to help us maintain our certainty.
Justifying the purchase we made, our career path, our decisions.
We are storytellers, both in the way we describe things and how we remember and understand them. Linking events together in a logical way helps us remember things, and reassures us that we understand the world around us.
The problem is, those stories aren't always true.
Richard Feynman said, “You must not fool yourself, and you are the easiest person to fool.”
The author Nassim Taleb agrees.
He has argued repeatedly that we overestimate our understanding, constructing stories and narratives to fit events after the fact, and that we ignore extreme, unexpected events because they don’t fit our models of the world.
A whole host of other studies show similar psychological effects.
The solution?
Don't believe anything with certainty. There is always some level of uncertainty, whether or not we can assess it accurately, and a blanket rule is easier to follow than judging each belief case by case.
Erring on the side of uncertainty is likely to cause some mental discomfort. But it opens us to the possibility of other options.
Believing something with high probability is okay. In most circumstances, that's good enough.
But believing something with complete certainty closes us to the possibility of another view, another reason, another answer.
And it makes us vulnerable to a whole host of other psychological mistakes.
I'd rather accept a little mental discomfort if it means I fool myself just a little bit less often.