"The most dangerous decision-making fallacy is that informed decision-makers will naturally make better, more objective decisions. Making consistently timely, effective, informed decisions takes hard work. Trust me – it’s worth it. Effective decision-making is the essential common ingredient behind every successful step, initiative and strategy that people, organizations and national governments undertake." – Ozzie Paez
Strong beliefs can undermine our ability to analyze facts and make informed decisions. They promote biases that compromise our ability to reconsider positions and beliefs in light of new evidence. One of the most common and damaging of these is confirmation bias. This cognitive trap influences us to accept information that validates our beliefs while filtering out information that disconfirms them. Internet technologies can magnify the influence of biases like confirmation bias. For example, while Google search is a great research tool, it does not assure the quality and balance of the documents in its search results.
Can we avoid confirmation bias? Research suggests that we cannot. According to psychologist Daniel Kahneman, this bias is connected to the “operations of associative memory,” which suggests that it is an integral aspect of human cognition. As with other biases, such as hindsight bias, we can learn strategies to mitigate its negative effects, but we cannot completely escape its influence.
Confirmation bias filters out information that challenges existing beliefs. As a result, decision-makers may stick with their positions in spite of mounting evidence to the contrary.
Do strong beliefs and confirmation bias lead to poor decision-making? Not necessarily. It depends on whether existing beliefs are valid. If they are, then confirmation bias can promote quick, effective and decisive action. Unfortunately, when beliefs turn out to be wrong or conditions change, confirmation bias undermines our ability to adapt. It can influence leaders to reject new evidence and stick with damaging positions and beliefs. Roberta Wohlstetter, a RAND military and foreign policy scholar, referred to this tendency as a “stubborn attachment to existing beliefs.”
RAND's Roberta Wohlstetter concluded that a “stubborn attachment to existing beliefs” undermined American preparations and decision-making in the lead-up to the Pearl Harbor attack. US Navy photo.
Psychologists Lee Ross and Craig Anderson explained how confirmation bias promotes and sustains existing beliefs: “First, any pattern of evidence processed in this fashion, even evidence that is essentially random, will tend to bolster the initial belief. Second, once evidence has been processed in this fashion it gains the capacity to sustain the prior belief when that belief is subjected to new empirical disconfirmation or to attacks on its original evidential basis.”
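Ross and Anderson's first point – that even random evidence bolsters a belief when processed asymmetrically – can be illustrated with a toy simulation. The sketch below is my own illustration, not a model from their paper: an agent fully credits confirming evidence but discounts disconfirming evidence, and its confidence drifts upward even though the "evidence" is pure coin-flip noise.

```python
import random

def update_belief(belief, confirms, discount=0.3, step=0.05):
    """Asymmetric update: confirming evidence counts in full,
    while disconfirming evidence is discounted (the bias)."""
    if confirms:
        return belief + step * (1 - belief)        # full credit
    return belief - step * discount * belief       # heavily discounted

random.seed(42)
belief = 0.5  # start undecided
for _ in range(1000):
    evidence_confirms = random.random() < 0.5  # evidence is pure noise
    belief = update_belief(belief, evidence_confirms)

print(round(belief, 2))  # confidence well above the 0.5 the noise warrants
```

With the fully symmetric update (discount = 1.0) the belief hovers near 0.5; with any discount below 1.0 it settles at an inflated equilibrium, which is the self-bolstering pattern the quote describes.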
Confirmation bias can undermine timely awareness and responsiveness, which are central to good decision-making, particularly in dynamic environments. It can blind decision-makers to evidence suggesting that their current strategies, tactics and beliefs are no longer valid. While we can’t avoid biases like confirmation and hindsight, there are means and methods available to check their effects. These can be effective when thoughtfully applied to analytical and decision-making processes. They work best, however, when tailored to each organization’s structure, systems and culture.
 Daniel Kahneman, Thinking, Fast and Slow, p. 84, Nook edition, Farrar, Straus and Giroux, 2011.
 Roberta Wohlstetter, Pearl Harbor: Warning and Decision, p. 393, Stanford University Press, 1962.
 Lee Ross, Craig A. Anderson, Shortcomings in the Attribution Process: On the Origins and Maintenance of Erroneous Social Assessments, in Judgment Under Uncertainty: Heuristics and Biases, edited by Daniel Kahneman, Paul Slovic and Amos Tversky, pp. 149-150, Cambridge University Press, 1982.