Availability heuristic and decisions on thin ice
THIS POST was first published in April 2012. It follows my previous post on heuristics and decision-making. It has been updated to address the implications of smart technologies, including artificial intelligence.
The two buddies had enjoyed ice fishing on the Great Lakes for many years. They had caught their fair share along the way and expected the unseasonably warm weather to make their outing more enjoyable. So, they drove out on the ice as they had countless times, set up their gear, drilled their holes, and settled in with warm coffee and conversation. It was just as they expected, until a loud crack signaled a break in the ice that sent their truck floating away. Oops!
Recurring activities with predictable outcomes create easily retrievable memories that promote rule-of-thumb (heuristic) decision-making strategies like the availability heuristic.
I heard this story from one of the fishermen, an engineer who worked at a facility near Lake Erie. It was easy for them to recognize their mistake with the benefit of hindsight. If they had gathered and analyzed more data, including wind, temperature, and ice sheet conditions, they would have avoided the embarrassment of watching their truck sit out on the ice for months, until it sank to the bottom in the spring. So why didn't they take the time to think their decision through?
One explanation is that they fell victim to the availability heuristic, a decision-making strategy and memory process that usually serves us well but can also lead to predictable mistakes. Specifically, frequent events that result in consistent outcomes are more clearly and readily available in memory. We therefore quickly remember them and expect similar events to result in similar outcomes. Since the two friends had safely parked on the ice many times before, they easily remembered those experiences and expected the same results. The "voice of experience" convinced them that it was safe to park on the ice without analyzing existing conditions.
The availability heuristic is a useful shortcut, indispensable for broadly exploiting experience. Doctors use it to diagnose recognizable symptoms, and experienced troubleshooters rely on it to quickly identify and fix equipment problems. Unfortunately, it can also lull decision-makers into false expectations and dangerous assumptions. For example, troops that repeatedly patrol an area without incident will usually become less alert and face greater risk of ambush. Availability heuristic errors can be aggravated by framing errors when areas are declared "safe," causing troops on patrol to further let down their guard.
Leveraging smart technologies to improve decisions
Smart technologies can help decision-makers avoid framing bias and availability errors – if they are designed for that purpose. Interestingly, artificial intelligence strategies like machine learning can reflect their own versions of the availability heuristic: data sets built from consistent experience (patrols going out and returning safely) can lead to predictions that simply mirror past performance. In this context, it is critically important to incorporate "artificial skepticism" into artificial intelligence analysis systems. In other words, artificial intelligence systems can quickly get "stupid" when they don't reflect real-life operational constraints or are fed poorly constructed and unrepresentative datasets!
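To make the point concrete, here is a minimal, purely illustrative sketch (not any particular production system) of how a naive frequency-based model trained only on uneventful patrols ends up with its own availability problem, and how a simple form of "artificial skepticism" – here, Laplace smoothing, which never assigns zero probability to an unseen outcome – tempers it. All names and numbers below are hypothetical.

```python
from collections import Counter

# Hypothetical training data: 500 past patrols, every one of them safe.
# The model has literally never "seen" an incident.
history = ["safe"] * 500

counts = Counter(history)
total = sum(counts.values())

def predict_risk(outcome="incident"):
    """Naive maximum-likelihood estimate: frequency in past data only."""
    return counts[outcome] / total

def predict_risk_skeptical(outcome="incident", pseudo_count=1):
    """Laplace-smoothed estimate: adds a pseudo-count for each possible
    outcome, so an unseen event keeps a small but nonzero probability."""
    n_outcomes = len(counts) + 1  # leave room for at least one unseen outcome
    return (counts[outcome] + pseudo_count) / (total + pseudo_count * n_outcomes)

print(predict_risk())            # 0.0 -- the availability trap in data form
print(predict_risk_skeptical())  # small but nonzero: built-in skepticism
```

The naive estimate concludes an ambush is impossible simply because one never happened, exactly the trap the troops (and the fishermen) fell into; the smoothed estimate encodes the assumption that absence of evidence is not evidence of absence.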