Who decides? GPS, smart tech, and risky decision-making
Jonathan Turley’s post on the tragic death of a young husband and father highlights the difference between technologies that help us make better decisions and those that erode human awareness and decision-making authority. That distinction is critical in safety-significant applications. In this instance, a North Carolina husband and father of two reportedly followed GPS guidance in his Jeep onto a bridge that heavy flooding had destroyed years earlier. We will never know how much his reliance on GPS diminished his situational awareness, but the research record suggests that even experienced operators are often distracted by technology and automation. Distraction and over-reliance on automation are recognized causal factors in accidents, including those involving drivers distracted by cell phones.
GPS provides critical services beyond geographic positioning, and history shows that all of them are vulnerable to intentional and accidental disruptions. GPS satellites, for example, provide timing and synchronization signals made possible by multiple onboard atomic clocks that resolve time to billionths of a second. Tiny errors in these services can have major repercussions across industries and governments. A 2016 incident involving the decommissioning of satellite SVN 23 and routine software maintenance introduced a 13-microsecond error, just 13 millionths of a second, that disrupted telecommunication companies in Europe and the United States.
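The scale of that timing error is easier to appreciate when converted into distance. GPS receivers compute position from signal travel time, so a clock error translates directly into a range error at the speed of light. The following back-of-the-envelope sketch illustrates the conversion; it is a simplified illustration of the physics, not a model of the actual 2016 failure mode, which affected timing services rather than a single receiver's fix.

```python
# Back-of-the-envelope: range error implied by a GPS clock error.
# Radio signals travel at the speed of light, so a timing error of
# t seconds corresponds to a pseudorange error of roughly c * t meters.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second

def range_error_m(clock_error_s: float) -> float:
    """Approximate pseudorange error for a given clock error, in meters."""
    return SPEED_OF_LIGHT_M_PER_S * clock_error_s

# A 1-nanosecond error is already about 0.3 m of range error.
print(f"{range_error_m(1e-9):.2f} m per nanosecond")

# The 13-microsecond error from the 2016 SVN 23 incident,
# expressed as an equivalent range error:
print(f"{range_error_m(13e-6) / 1000:.1f} km for 13 microseconds")
```

A 13-microsecond error, "tiny" by everyday standards, is equivalent to nearly four kilometers of range error, which is why timing faults ripple so widely through systems that depend on GPS.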
GPS signal jamming is also a major concern because of its effects on guidance and navigation systems. South Korea and its allies have accused North Korea of intentionally jamming GPS signals, forcing hundreds of passenger planes to rely on alternate navigation systems. A Royal Navy test found that inexpensive jammers aboard a navy ship “caused electronic chart displays to show false positions” and led the autopilot to “steer the ship quietly off course.” The most dangerous scenarios involve GPS spoofing and hijacking, which trick navigation systems into responding as if they were at a different location.
Traditionally, engineers respond to these threats by adding layers of detection and control to improve safety and prevent catastrophic failures. These strategies increase complexity, particularly in systems employing artificial intelligence, because they make it even more difficult for operators, human and machine alike, to foresee the consequences of decisions and actions on other parts of interconnected systems. Making matters worse is the tight coupling within and across systems, which imposes time-dependent requirements on system response and performance. Together, these characteristics create the conditions that make accidents normal and inevitable despite all efforts to prevent them. AI systems and subsystems raise uncertainty and risk further because, to human operators, they are black boxes with nondeterministic behaviors; operators cannot foresee the actions of AI-based controls or their consequences.
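One common form these added detection layers take is cross-checking the GPS fix against an independent position estimate, such as inertial dead reckoning, and refusing to trust a fix that diverges too far. The sketch below is a simplified illustration of that idea only; the divergence threshold, function names, and the assumption that a dead-reckoned estimate is available are mine, not any specific system's design.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def gps_fix_plausible(gps_pos, dead_reckoned_pos, max_divergence_m=500.0):
    """Flag a GPS fix that diverges too far from the dead-reckoned estimate.

    A large gap suggests jamming, spoofing, or sensor failure, and should
    trigger a fallback to alternate navigation rather than silent trust.
    """
    divergence = haversine_m(*gps_pos, *dead_reckoned_pos)
    return divergence <= max_divergence_m

# Agreeing fixes pass; a fix kilometers from the inertial estimate is rejected.
print(gps_fix_plausible((51.5, -0.12), (51.501, -0.121)))  # True (~130 m apart)
print(gps_fix_plausible((51.5, -0.12), (51.55, -0.12)))    # False (~5.6 km apart)
```

Note that each such check is itself another component that can fail or be fooled, which is precisely the complexity-and-coupling problem described above.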
The demands placed on GPS services, their vulnerabilities, and growing system complexity raise questions about the future of self-navigating platforms such as cars, planes, and ships. Most autonomous systems assume that human actors can intervene when necessary to prevent disasters; however, our analysis of the Uber self-driving car accident in Tempe, Arizona highlighted the limitations of human intervention. Similar limitations apply to all smart technologies that reduce human engagement, situational awareness, and decision-making in fields like medicine and warfighting. I will discuss their implications in more detail in upcoming posts.
Jonathan Turley, North Carolina Man Killed After GPS Sends Him Over Destroyed Bridge, jonathanturley.org, October 10, 2022, https://jonathanturley.org/2022/10/10/north-carolina-man-killed-after-gps-sends-him-over-destroyed-bridge/#more-194916.
Ozzie Paez, Dean Macris, Automation and the Unaware Caretakers, Ozzie Paez Research, May 1, 2018, https://www.ozziepaezresearch.com/post/2018/04/30/automation-and-unaware-caretakers.
Distracted Driving, NHTSA, https://www.nhtsa.gov/risky-driving/distracted-driving, accessed October 11, 2022.
Satellite Navigation – GPS – Space Segment, Federal Aviation Administration, https://www.faa.gov/about/office_org/headquarters_offices/ato/service_units/techops/navservices/gnss/gps/spacesegments, accessed October 11, 2022.
Chris Baraniuk, GPS Error Caused ‘12 Hours of Problems’ for Companies, BBC, February 4, 2016, https://www.bbc.com/news/technology-35491962.
Space Threat 2018: North Korea Assessment, Aerospace Security, https://aerospace.csis.org/space-threat-2018-north-korea/, accessed October 11, 2022.
Chris Whitty and Mark Walport, Satellite-Derived Time and Position: A Study of Critical Dependencies, London: Government Office for Science, 2018, p. 29, https://navisp.esa.int/news/article/study-of-critical-dependencies.
Shah Zahid Khan, Mujahid Moshin, Waseem Iqbal, On GPS Spoofing of Aerial Platforms: A Review of Threats, Challenges, Methodologies, and Future Research Directions, National Library of Medicine, May 6, 2021, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8114815/.
Charles Perrow, Normal Accidents: Living with High-Risk Technologies, Princeton University Press, October 12, 2011.
Ozzie Paez, Dean Macris, Who’s Responsible for the Uber Self-Driving Car Accident?, Ozzie Paez Research, June 15, 2018, https://www.ozziepaezresearch.com/post/2018/06/15/uberselfdrivingvehicleaccident.
Ozzie Paez, Dean Macris, The Fatal Uber Self-Driving Car Crash – Update, Ozzie Paez Research, October 28, 2020, https://www.ozziepaezresearch.com/post/2018/07/12/the-fatal-uber-self-driving-car-crash-update.