Managerial mistakes you’re wired to make… and how to avoid them (#1: circumventing confirmation bias)

Posted in leadership

By “wired” I mean that all of us, humans, are subject to a fascinating set of cognitive biases – systematic errors in thinking that affect the decisions and judgments we make. Generally they relate to memory (simply put, we “create” inaccurate memories which further bias our thinking & decision-making), attention (we are selective about what we pay attention to in the surrounding world, so we “create” limited realities), attribution, or other mental mistakes. As with everything else about human automatism, the brain’s intentions are good – these shortcuts sit on the survival side of our species’ Golden Path, helping us make swift decisions.

As managers though, such phenomena lead to mistakes of all shapes & colors and to a negative impact on people’s motivation… so it’s really important that we try our best to step outside the automatic beliefs in our own minds and engage in protective routines against them. There is very little we can do to increase our attention span or improve our memory, but there are things we need to pay attention to – and things we can do.

Confirmation bias = we favor information that fits our existing beliefs (including our expectations in a given situation and our predictions about a particular outcome) and tend to discount evidence that does not conform. We are much more able to process information rationally, giving equal weight to multiple viewpoints, when we are emotionally distant from the issue (although a low level of confirmation bias can still occur even when an individual has no vested interest).

There are four main explanations for why we function this way:

1) According to Robert MacCoun, most biased evidence processing occurs through a combination of “cold” (cognitive) and “hot” (motivated) mechanisms:

  1. Cognitive explanations for confirmation bias are based on limitations in people’s ability to handle complex tasks and on the shortcuts (“heuristics”) they use – such as the availability heuristic (how readily a particular idea comes to mind). It is also possible that people can only focus on one thought at a time, and so find it difficult to test alternative hypotheses in parallel.
  2. Motivational explanations involve an effect of desire on belief. It is known that people prefer positive thoughts over negative ones in a number of ways (the “Pollyanna principle”): wishful thinking, i.e. forming beliefs and making decisions based on what might be pleasing to imagine rather than on evidence, rationality, or reality (a product of resolving conflicts between belief and desire). A generalized application is that we want to feel we’re smart – so information suggesting we hold an inaccurate belief or made a poor decision may signal that we lack intelligence.

2) James Friedrich – an exponent of evolutionary psychology – suggests that people do not primarily aim at truth in testing hypotheses, but try to avoid the most costly errors.

3) Psychologists Jennifer Lerner and Philip Tetlock distinguish two different kinds of thinking process. Exploratory thought neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while confirmatory thought seeks to justify a specific point of view.

4) Developmental psychologist Eve Whitmore has argued that the beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe, which becomes “the basis for more complex forms of self-deception and illusion into adulthood.” The friction brought on by adolescent questioning, as critical thinking develops, can lead to the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.

To make it even worse, biases sometimes combine into “perfect mistakes”:

  • combined with the positive bias, it makes us focus on positive evidence confirming that a hypothesis is true, rather than on information that would prove the view false if it is false.

Surely you’ve experienced the effects of such mental mistakes in business – if any of this rings a bell, here are some ideas on how to protect yourself from them:

  1. attitude polarization – disagreements become more extreme even though the different parties were exposed to the same evidence… leading to the failure of all those long “alignment” and “information” meetings. Solution: whenever you catch yourself thinking/saying “even more reason to”, get a grip and ask yourself & the others “what’s the middle ground here?”.
  2. belief perseverance – beliefs persist even after the evidence for them is shown to be false… leaving people disgusted and demotivated by the long hours put into gathering, analyzing and presenting relevant data with no impact. Solution: routinely ask yourself “what’s the other side of this?” and allow a couple of moments to genuinely ponder the opposite view… just give it a fair chance!
  3. the irrational primacy effect – greater reliance on information encountered early in a series… leading to the “battle” between employees to be the first to share a piece of information (sometimes just gossip!). Solution: when you must make a decision, religiously observe the data gathering – analysis – decision routine and have all the arguments piled up in front of you when you enter analysis mode.
  4. illusory correlation – falsely perceiving associations between two events or situations, especially when distinctive and unusual information is presented… hence jumping to conclusions. Solution: whenever you feel you’ve just made a “clever” correlation and that “I’m so smart” feeling sets in, remember to ask yourself “what if it’s simpler / more complicated than that?”… and seek more data if you can, or refrain from believing insufficiently tested inferences (billions have been invested on the strength of those infamous conclusions of so-called “research” or “root cause analyses”).
  5. groupthink – occurs when a group of individuals reaches a consensus without critical reasoning or evaluation of the consequences or alternatives. Solution: whenever you have an important decision to make in your otherwise homogeneous team and everybody reacts with an “of course!” type of response, make sure you allocate a devil’s advocate role to someone in order to challenge the “it is known” mindset and its potentially damaging group-imagination effects.
  6. impression formation – if people are told what to expect from a person they are about to meet, they’ll look for information that supports their expectations and even ask one-sided questions… imagine the effect in job interviews or client meetings! Solution: either start fresh (don’t ask for the recruiter’s notes on candidates) so you can look at the person objectively, or run a thorough analysis (competency-based assessment, stakeholder analysis, etc.)… the middle ground we so often settle for is the worst possible way!

Personality traits influence and interact with biased search processes. Individuals vary in their ability to defend their attitudes from external attacks in relation to selective exposure – searching for information that is consistent, rather than inconsistent, with one’s personal beliefs. The higher your confidence level, the more likely you are to seek out information that contradicts your personal position in order to form an argument. This holds for this bias as well as for the next ones we’ll examine here.