How the Dueling Systems of Our Minds Influence Business Decisions

Psychology shows that the mind’s dueling systems make humans flawed thinkers. How can leaders account for those flaws as they assess the way they make decisions?

In 2002, Daniel Kahneman won the Nobel Prize in economics for his work on the psychology of judgment and decision-making. His seminal research illuminated the processes by which humans evaluate information and make decisions, and exposed a number of patterned, recurrent “cognitive biases” that influence human judgment.

Kahneman presented a new way to imagine the thinking brain, most notably in his 2011 book “Thinking, Fast and Slow.” In his telling, the mind is not one cohesive whole; it can be thought of as two distinct systems of thought. Kahneman describes System 1 as the “brain’s fast, automatic, intuitive approach.” System 1 includes all of the split-second recognitions our brain makes millions of times a day.

Consider a brief thought exercise. When you see a dog on the street, how do you know you’re looking at a dog? Do you recall all of the characteristics that define a dog, and then match them against the creature in front of you? No, you just see a dog and know it’s a dog. We’ve seen enough dogs in our lives that recognition is immediate and effortless.

Our minds have filled themselves with all sorts of these automatic classifiers. From forming sentences to recognizing faces, these classifiers are the effortless, intuitive thought processes encoded into every facet of our lives. And they serve a crucial purpose. Unlike newborn children, we aren’t paralyzed by every stimulus we encounter. Instead, we draw on millions of split-second mental shortcuts accumulated over a lifetime, which allow us to function at a high level.

System 2, on the other hand, is our “slower, analytical mode, where reason dominates,” Kahneman writes in his book. A second thought exercise shows how it works. What is 11 multiplied by 23? It’s obviously a harder problem than recognizing a dog, but it also triggers a telling physical reaction. As you thought about the question, did your eyes narrow or your brow furrow? That happens because System 2 is effortful; it requires focus. Our bodies instinctively react to prepare our minds for maximum attention.
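(If you worked through the multiplication, System 2 likely broke it into deliberate steps: 11 × 23 = 11 × 20 + 11 × 3 = 220 + 33 = 253. No classifier hands you that answer; you have to compute it.)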

Heuristics and Cognitive Biases

A crucial insight of the System 1-System 2 framework is that, whenever possible, our minds prefer the easier path. In psychology, heuristics are the simple, efficient rules that people use to form judgments and make decisions. They’re the mental shortcuts of our System 1, our accumulated classification systems. A heuristic usually involves focusing on one or two aspects of a complex problem, and the vast majority of the time, it works perfectly.

Sometimes, however, our classification shortcuts lead us astray, and our System 1 responses prove puzzlingly inaccurate. We answer instinctively and confidently, without engaging System 2. Only when told of our mistake do we revisit the problem, focus our System 2 processes on calculating the right answer, and slap our foreheads in frustration when we realize our error.

These heuristics can have profound consequences in the real world. Consider the decision errors of the Iraq War or the 2008 financial crisis. Questions like “Do WMDs exist?” and “Are housing prices inflated?” are difficult; they require advanced knowledge and thorough investigation. So instead, most people substituted questions like, “Do other smart people think WMDs exist?” and “How do financial experts value current housing prices?”

These questions are easier to answer, and they usually offer an effective shortcut: years of cumulative experience have trained our System 1 minds to believe that when other smart people hold something to be true, it usually is. Most of the time, this shortcut works perfectly. But when it doesn’t, it can be disastrous.

In the examples above, the heuristic created confirmation echo chambers. As more and more people answered the easier question, the heuristic grew stronger, and the truth became even more elusive.

Understanding Decisions in the Workplace

To understand how cognitive biases influence our decisions in the workplace, it helps to recognize how workplace decisions differ from one another. On one end of the spectrum are “bet the company” decisions: high-impact and typically infrequent. Mergers and acquisitions, market entries and large R&D investments all fall within this category.

While these types of decisions can certainly be influenced by cognitive biases, firms are getting smarter about imposing processes to ensure decision quality. For example, boards may nominate a “resident devil’s advocate” to counter groupthink or confirmation bias. Alternatively, executives may build a “pre-mortem” into the decision process: a preliminary reflection on how a decision might go wrong, which has been shown to counteract overconfidence bias. Finally, bet-the-company decisions are often supplemented by rigorous analyses, including methodologies like Monte Carlo simulations and decision trees.
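To make the Monte Carlo idea concrete, here is a minimal sketch in Python. Every figure and distribution in it is hypothetical; the point is simply that simulating thousands of scenarios turns a gut-feel bet into an explicit payoff distribution.

```python
import random

# Hypothetical Monte Carlo sketch: estimate the payoff distribution of a
# large R&D investment. All figures and distributions are illustrative.

INVESTMENT = 50.0   # upfront cost in $M (assumed)
N_TRIALS = 100_000  # number of simulated scenarios

def simulate_payoff() -> float:
    """Draw one scenario from simple, assumed input distributions."""
    market_size = max(random.gauss(200.0, 60.0), 0.0)  # $M, roughly normal
    market_share = random.uniform(0.05, 0.25)          # fraction captured
    succeeds = random.random() < 0.6                   # assumed 60% R&D success rate
    return market_size * market_share if succeeds else 0.0

outcomes = [simulate_payoff() - INVESTMENT for _ in range(N_TRIALS)]

expected = sum(outcomes) / N_TRIALS
p_loss = sum(o < 0 for o in outcomes) / N_TRIALS

print(f"Expected net payoff: ${expected:.1f}M")
print(f"Probability of losing money: {p_loss:.0%}")
```

Instead of a single point estimate, decision-makers see an expected value alongside a probability of loss, which is much harder for an overconfident System 1 to wave away.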

On the other end of the spectrum are “rule-driven algorithms”: decisions that occur with a high level of frequency and that are relatively uniform. Consider credit approvals, recommendation systems, marketing campaign optimizations and, soon, vehicle navigation. For these decisions, the high number of instances, coupled with a consistent, rule-based architecture, usually makes artificial intelligence solutions the most effective.

AI algorithms are capturing an increasing share of the decision space, and this growth will likely continue for the foreseeable future. AI systems are also an extremely effective weapon against cognitive biases. A credit officer at a bank is subject to all sorts of biases when deciding whether to approve an individual for a loan. An AI system, on the other hand, knows only the strict rules of the algorithm. The catch is that if the AI is biased in its calibration, every decision it makes will be biased.
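A minimal sketch of a rule-driven approval makes both points visible. The rules and thresholds below are hypothetical, not any real lender’s policy.

```python
# A minimal sketch of rule-driven credit approval. The rules and
# thresholds are hypothetical, not any real lender's policy.

MIN_CREDIT_SCORE = 650      # if these thresholds were calibrated on skewed
MAX_DEBT_TO_INCOME = 0.40   # historical data, that skew repeats in every
MIN_ANNUAL_INCOME = 30_000  # single decision the system makes

def approve_loan(credit_score: int, annual_income: float, monthly_debt: float) -> bool:
    """Apply the same strict rules to every applicant, with no mood,
    fatigue, or halo effects, and no judgment beyond the rules."""
    debt_to_income = (monthly_debt * 12) / annual_income
    return (
        credit_score >= MIN_CREDIT_SCORE
        and annual_income >= MIN_ANNUAL_INCOME
        and debt_to_income <= MAX_DEBT_TO_INCOME
    )

print(approve_loan(credit_score=700, annual_income=55_000, monthly_debt=1_200))  # True
print(approve_loan(credit_score=620, annual_income=80_000, monthly_debt=500))    # False
```

Every applicant faces identical rules, which is exactly what eliminates the credit officer’s day-to-day biases. But the flip side is visible in the thresholds themselves: a miscalibrated rule doesn’t err occasionally, it errs uniformly, in every single decision.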

Everyday Significant Decisions

The final set of workplace decisions is the most challenging: the catchall bucket in the middle, which comprises the vast majority of all decision types. Carl Spetzler of Strategic Decisions Group calls these “Everyday Significant Decisions,” or ESDs. ESDs are challenging because they are too varied for an AI algorithm, but too frequent for a formal decision-quality process. They include anything from hiring new people to planning projects to purchasing equipment to prioritizing daily tasks. And while a single ESD may not be as impactful as a bet-the-company decision, the cumulative effect of ESD quality throughout an organization can be massive.

Because ESDs are so varied in nature, their quality will continue to depend on individual humans. And because most humans are highly susceptible to cognitive biases, the decision-making quality of ESDs is often quite poor. This was such an important issue for U.S. national intelligence agencies that they developed video games to train analysts, with astounding results: in a study of more than 400 people, the games proved up to three times more effective than other training formats.

Decision-making competency can be one of the most important indicators of long-term success, yet it is rarely addressed through systematic training. Too often, companies rely on the trial and error of real-world experience to improve employee judgment, where the errors can be costly. Game-based training programs offer a new solution: an immersive, interactive environment without the real-world cost of failure. Companies should take note.

Andrew Strong is the head of decision solutions at Correlation One. To comment, email andrew@correlation-one.com.