I think, therefore I err

Most of us are familiar with the René Descartes quote, “I think, therefore I am.” It’s woven into the fabric of modern life that we equate thinking with consciousness, with awareness and selfhood. But what if a large part of our thinking is completely hidden from us?

I’m currently reading a book by Daniel Kahneman called Thinking, Fast and Slow. It’s a 500-page tome that sets out to make you question everything you think you know – really, everything you think, period. Kahneman’s claim to fame is winning the Nobel Prize in Economics – not in spite of being a psychologist, but because of it. You see, economic theory historically took the view that humans are generally rational decision makers. As a psychologist, Kahneman knew this to be absurd! He has spent a rich career uncovering a rather daunting list of cognitive biases humans are prone to. We are often irrational and illogical. We make mistakes in our thinking and are often blind to them.

In his book, Kahneman introduces us to the two systems on which our thinking is based. They’re named, simply enough, System 1 and System 2. System 1 is fast, intuitive, based on associations, and unconscious. System 2 is slow, rational, logical, and accessible to our conscious awareness. System 1 feeds input (such as general impressions, knee-jerk reactions, and well-learned information) into System 2, which then uses it in its more laborious, diligent manner. It seems like pretty good teamwork, and as Kahneman points out, it does work pretty well most of the time. The main problem is that System 2 is a little lazy. It seeks to preserve its limited resources and rarely fact-checks System 1 – which means we are working with potentially unreliable thoughts and ideas, and worse yet, are not even aware of it.

We rapidly take in, process, and retain information in a way that fits our expectations and preferences. Even if our reasoning were flawless, we aren’t basing it on facts, but rather on perceptions and ideas that have been man-handled to fit in nicely with everything else. And yet we are generally quite confident! In fact, we are far more confident than we are accurate in our judgment – this has been borne out time and again by research. Over-confidence is even more rampant and troublesome among leaders. People who rise to leadership positions tend to exhibit high levels of self-confidence, and continued success breeds further confidence. This can lead us to narrow our perspective, fail to gather sufficient input on issues, and poorly manage risk.

We cannot believe everything we think. Question your own reactions and assumptions. Learn to scan for and be aware of your own biases. Think, and then think again: is this really true, or does it just “feel true” because it’s familiar, or because I prefer it to be true?

Another fun fact from Kahneman’s book: System 2 pools resources from everywhere in the brain – both emotional and cognitive labor draw on it. Your emotional state has an enormous impact on how deeply you process information and how effectively you reason. Be aware of this influence. You’re more accurate and critical in negative states, but more creative and intuitive in positive states. You’ll be more effective if you don’t try to brainstorm when you’re grumpy or assess risk when you’re feeling jolly. This is also one of the reasons that trust and psychological safety are crucial ingredients of well-functioning, high-performing teams. When we are afraid – when we fear we will be punished, ridiculed, or marginalized for making mistakes or seeming foolish – it quite literally diminishes the brain power we can bring to bear on anything else, like solving problems and making plans. When we operate from a fear state, we are less rational, have a narrower perspective, and are less resilient, motivated, and persistent.

Kahneman’s book is a treasure trove of data and research that is both bleak and beautiful. I’ll leave you with just a few nuggets of wisdom:

“A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.”

“Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

“The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little.”

“This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.”

“Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients. An unbiased appreciation of uncertainty is a cornerstone of rationality—but it is not what people and organizations want.”