A common and dangerous decision-making fallacy is the belief that better-informed decision-makers predictably make better, more objective decisions. It's an understandable assumption given the promises of our digital age and decades of Star Trek indoctrination, which is why I've long called this damaging notion the Star Trek fallacy, despite being a long-time Trekkie myself. In practice, data and information lead decision-makers to 'informed' good decisions and 'informed' bad decisions alike, as I noted in two books on nuclear weapons negotiations and nuclear strategy in the Middle East.
Consider the incident on November 9, 1979, when “a training tape was mistakenly loaded onto the North American early warning computer systems. The software displayed a realistic scenario of a massive incoming Soviet nuclear first strike at the North American Defense Command’s (NORAD) Cheyenne Mountain Complex in Colorado Springs, the Pentagon National Military Command Center in Washington, and the Alternate National Military Command Center in Fort Ritchie, Maryland. The imaginary nightmare scenario triggered alerts to Minuteman ICBM missile silos and the continental air defense system, which launched some of its fighters. Even the National Emergency Airborne Command Post (the “doomsday plane”) was sent up, although the President was not aboard.” [1]
In this instance, senior American military authorities received compelling information of an unprovoked Soviet attack and faced a launch-on-warning decision. Well-rehearsed verification procedures, however, quickly established that it was a false alarm. Still, when the alarms first sounded at the three command centers, these well-informed decision-makers could have chosen to act on the warning before verification was complete. Thankfully, good training, proven processes, and steady minds kept the unthinkable in check.
Similar, though potentially less catastrophic, incidents occur in other settings, including cockpits, where crews must quickly make sense of cascading warnings and control problems to save their planes and passengers. Ditto in emergency rooms, where doctors must decipher unusual symptoms and apply differential diagnostic strategies to save lives. In these and other stressful situations, data and information can both clarify and confound decision-makers. It's their training, experience, and proven processes that help them make timely, effective decisions under challenging, life-threatening conditions. These are lessons my colleagues and I take to heart when architecting information-driven responsive processes, systems, and organizations.
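One way to encode that lesson in software is to require independent corroboration before a high-stakes alert can trigger action, much as the 1979 verification procedures checked other sensors before trusting a single alarming display. The sketch below is a minimal Python illustration of that pattern; the sensor names, the two-source quorum rule, and the response labels are hypothetical, chosen for the example rather than drawn from any real warning system.

```python
from dataclasses import dataclass

@dataclass
class SensorReport:
    """One independent source's view of the situation."""
    source: str        # e.g., "display-feed", "satellite", "radar" (hypothetical names)
    threat_seen: bool  # does this source indicate an attack?

def verified_alert(reports: list[SensorReport], quorum: int = 2) -> str:
    """Escalate only when independent sources corroborate the warning.

    A single alarming feed (like the 1979 training tape) is treated as
    unverified until other, independent sensors agree.
    """
    confirming = {r.source for r in reports if r.threat_seen}
    if len(confirming) >= quorum:
        return "ESCALATE: corroborated warning"
    if confirming:
        return "HOLD: single-source warning, run verification procedures"
    return "STAND DOWN: no threat indicated"

# The display feed screams 'attack', but independent sensor data disagrees:
reports = [
    SensorReport("display-feed", True),
    SensorReport("satellite", False),
    SensorReport("radar", False),
]
print(verified_alert(reports))  # HOLD: single-source warning, ...
```

The specific quorum rule matters less than the principle it embodies: the process, not the most vivid data stream, decides when action is warranted.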
References
1. Ozzie Paez, Decision Making in a Nuclear Middle East – Lessons from the Cold War, https://a.co/9EhnTaw