• Ozzie Paez

Seeing ≠ Believing: the vision-perception disconnect

Updated: Sep 28

THIS POST was first published in July 2012. It's been updated to include new lessons learned from growing deployments of smart technologies, including Artificial Intelligence.


We often fail to grasp the implications of what is in front of our eyes. The culprit is the vision-perception disconnect: we see with our eyes but perceive with our minds. Our brains often register what we expect to see, not what is actually there, a condition with sometimes tragic consequences. Specialists trained in human factors have long recognized the need to tweak designs so that operators will quickly recognize and respond to the unexpected. Engineers use proven design techniques, such as flashing lights, color, and physical stimuli, to attract operators’ attention and ensure that crucial, time-sensitive information is not ignored. Remove these cues and even experienced operators are rendered partially blind, unaware of conditions requiring their immediate response.


A tragic case that illustrates the vision-perception disconnect is the crash of Northwest Flight 255 on August 16, 1987, which killed 148 passengers and six crew members—miraculously, a four-year-old girl survived. The final report from the National Transportation Safety Board (NTSB) found that the pilots had not properly configured the aircraft for takeoff by extending the wing flaps and slats—and then failed to notice and correct their mistake. The airplane stalled on takeoff and crashed a short distance from the airport.[i]


Airplane wings have slats at the front and flaps at the back to increase surface area and lift during takeoff and landing. Their positions are visible and identifiable to trained pilots.

It's instructive that, when NTSB investigators interviewed pilots whose planes were lined up behind Flight 255, they reported seeing its flaps and slats properly deployed. Inspection of the wreckage and data from the plane’s flight recorder proved conclusively that their observations were in error. These pilots perceived what they had expected to see, based on years of following preflight checklists that require aircrews to validate aircraft configuration before takeoff. The NTSB’s findings were in line with previous lessons learned: there are no guarantees that operators will follow checklists, and eyewitness accounts are often flawed.


I was not surprised by the NTSB’s findings. Years before, while serving in the US Air Force, I was called to a T-33 trainer aircraft with a malfunctioning engine. The pilot reported that he could not get the engine to start. It took me a few seconds to scan the cockpit and notice that the plane’s fuel shutoff valve was in the closed position – no fuel, no engine start. Interestingly, on arrival I had asked the pilot to validate the fuel system’s configuration. He scanned his controls and reported that they were properly configured. He perceived what he expected to see. As an experienced and skeptical troubleshooter, I was unaffected by the pilot’s expectations; in that instance, my vision and my perception matched.


Looking forward


The vision-perception disconnect will likely get worse with increasing automation. Its effects will be masked by the same technologies, which will prevent unapproved configurations, override improper operator inputs, and reduce cognitive demands – until something goes wrong and humans are left to make decisions without the aid of computers. Operators (pilots, drivers, surgeons, and others) will be expected to quickly take over and prevent looming disasters. To succeed, they will have to establish situational awareness, interpret conflicting signals[ii], understand their implications, and make timely decisions under pressure and uncertainty. Many will fall short, leading to tragic consequences. These are some of the tradeoffs of our increasing use of smart technologies to make and execute decisions once reserved for human operators.


References

[i] Aircraft Accident Report, Northwest Airlines Inc., McDonnell Douglas DC-9-82, N312RC, Detroit Metropolitan Wayne County Airport, Romulus, Michigan, August 16, 1987. National Transportation Safety Board, May 10, 1988. https://libraryonline.erau.edu/online-full-text/ntsb/aircraft-accident-reports/AAR88-05.pdf

[ii] Clark, Nicola. “Report on ’09 Air France Crash Cites Conflicting Data in Cockpit.” New York Times, July 5, 2012. http://www.nytimes.com/2012/07/06/world/europe/air-france-flight-447-report-cites-confusion-in-cockpit.html