Who’s responsible for Uber’s self-driving vehicle accident?
Updated: Oct 28
The harrowing video of a pedestrian being struck by a self-driving Uber vehicle shocked the nation. Tragically, the victim later died at a local hospital. In-car video captured the distracted backup driver, who failed to intervene in time to prevent the tragedy and was widely blamed for the accident. We had studied other cases in which human operators failed to intervene in time to prevent catastrophe, and in those cases the initial impressions were frequently disproved by the investigations that followed. Intrigued, we decided to take a second look at this incident and its implications. This post summarizes our findings, which proved surprising.
The accident by the numbers
The accident took place on March 18, 2018, in Tempe, Arizona. The New York Times reported that the Uber self-driving vehicle was traveling at approximately 40 miles per hour, or 58.6 feet per second (fps), when it struck the pedestrian. The video shows the incident taking place at night; visibility and road conditions were good. There were no reports of the car’s autonomous systems alerting of a failure or malfunction prior to the accident.
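The 58.6 fps figure is a straightforward unit conversion; a quick sketch of the arithmetic (the article truncates the result to one decimal place):

```python
# Convert the reported impact speed from miles per hour to feet per second.
MPH_TO_FPS = 5280 / 3600  # feet per mile / seconds per hour

speed_mph = 40
speed_fps = speed_mph * MPH_TO_FPS
print(f"{speed_mph} mph = {speed_fps:.2f} fps")  # → 40 mph = 58.67 fps
```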
Assumptions and questions
Our analysis assumed that the Uber vehicle had a modern, high-efficiency braking system capable of decelerating at 25 feet per second squared (fps^2). This is roughly halfway between a car without a high-tech braking system (15 fps^2) and a high-performance racing car and driver (32 fps^2).
The backup driver’s role was to monitor the vehicle’s self-driving performance and take control in case of failure or unexpected conditions. Since the vehicle did not report a malfunction, the study focused on the following questions:
At what point would an alert driver have realized that the self-driving car was not braking as expected?
What is the vehicle’s estimated braking distance?
How much time would an alert driver have required to react, gain control of the vehicle and take appropriate action (swerve, slow down, stop)?
What distance would the car have traveled during that time?
Could an alert driver have intervened and stopped the vehicle in time to prevent the accident?
These are deceptively difficult questions to answer. There are many studies of autonomous car-driver interactions during normal operations and emergencies. None of them provided definitive answers on how quickly alert drivers are likely to identify an autonomous braking failure and take control of the vehicle. We based our estimate on multiple studies and research on situational awareness, response and reaction times.
The study estimated that under normal driving conditions, deceleration under braking would be approximately half of the car’s maximum capabilities. None of the studies we reviewed included deceleration rates under varying conditions. Our estimate was based on driving experience and field experiments. This incident involved a taxi service in an urban environment, so customer experience would be a factor. Hard braking is uncomfortable for most passengers, yet traffic forces us to stop within a narrow range of distances. We used these insights as input into multiple stopping exercises, which resulted in an estimated deceleration rate.
We developed a simplified mathematical model of autonomous vehicle-driver performance, the Driver-Automation Performance Model (DAVPM), and used it for most calculations. The software takes initial speed and other variables as inputs and calculates braking times and distances. Using 40 mph as the initial speed and 25 fps^2 as the deceleration rate, DAVPM calculated a braking distance of approximately 68.8 feet. Because stopping distance is inversely proportional to deceleration, halving the braking rate to account for passenger comfort doubles that distance to roughly 137.7 feet, or approximately 46 yards. That is the distance the car would need to come to a stop under normal braking conditions.
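The DAVPM software itself is not public, but its core figures can be checked against the constant-deceleration stopping formula d = v²/(2a). A minimal sketch (small differences from the figures above come from rounding the speed):

```python
# Constant-deceleration stopping distance: d = v^2 / (2a)
speed_fps = 40 * 5280 / 3600   # 40 mph ≈ 58.67 fps
decel_max = 25.0               # assumed maximum braking deceleration, fps^2

braking_distance = speed_fps**2 / (2 * decel_max)
comfortable_distance = 2 * braking_distance  # halving deceleration doubles the distance

print(f"hard stop: {braking_distance:.1f} ft")         # → hard stop: 68.8 ft
print(f"comfortable stop: {comfortable_distance:.1f} ft "
      f"(~{comfortable_distance / 3:.0f} yards)")      # → comfortable stop: 137.7 ft (~46 yards)
```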
Backup Driver Performance Limitations
A generally alert backup driver in a self-driving vehicle should recognize by feel when autonomous braking fails. That recognition, however, comes only after the vehicle gets closer to its expected stopping point than it should without slowing down. In this case, the vehicle would have been less than 46 yards from the desired stopping point before the driver began the process of taking control. This process involves three phases: (1) establishing situational awareness, (2) recognizing the problem, and (3) physically responding.
Establishing situational awareness
Developing situational awareness in this case refers to the mental processes that take sensory information (primarily visual) and make sense of the driver-vehicle environment. Specifically, it makes the driver aware of the position and direction of his vehicle and the pedestrian, the location and speed of surrounding vehicles, other objects, traffic lights, etc. Based on the results of multiple research projects, we estimated that an alert driver requires approximately 2 seconds to establish situational awareness.
Recognizing and Responding
In the recognition phase, the brain identifies what is happening and decides on a course of action, e.g., swerve, slow down, or stop. Estimates based on multiple studies suggest that a skilled, alert driver requires about 1.5 seconds to process information and decide on a course of action. In the response phase, the brain sends signals to the arms, hands, legs, and feet to act as instructed – braking in this case. This step takes about 0.7 seconds.
Could an alert driver have prevented the accident?
Our calculations estimated the time to establish situational awareness, formulate a response, and physically act at 4.2 seconds (2 + 1.5 + 0.7). At 40 mph, or 58.6 fps, the Uber vehicle would have traveled 4.2 * 58.6, or approximately 246 feet. That is 82 yards, more than 80% of the length of an American football field. More importantly, it is nearly 1.8 times the 46-yard distance the car needs to come to a stop under normal braking. We therefore concluded that an alert, skilled driver could not have prevented the tragic accident that killed the pedestrian. The calculations also show that the driver would not even have applied the brakes before the car reached the pedestrian; as a result, the impact would have taken place at full speed, or 40 mph.
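Putting the pieces together, the time-and-distance budget can be sketched as follows (the phase durations are the estimates above; the comparison distance is the ~46-yard comfortable stop from the model):

```python
# Time and distance budget for an alert backup driver at 40 mph.
speed_fps = 40 * 5280 / 3600            # ≈ 58.67 fps

phases = {
    "situational awareness": 2.0,       # seconds, estimates from the studies cited
    "recognition/decision": 1.5,
    "physical response": 0.7,
}
reaction_time = sum(phases.values())                  # 4.2 s
reaction_distance = reaction_time * speed_fps         # distance covered before braking even starts
normal_stop_distance = 2 * speed_fps**2 / (2 * 25.0)  # comfortable stop, ~137.7 ft (~46 yards)

print(f"reaction time: {reaction_time:.1f} s")                         # → reaction time: 4.2 s
print(f"distance traveled while reacting: {reaction_distance:.0f} ft "
      f"({reaction_distance / 3:.0f} yards)")                          # → 246 ft (82 yards)
print(f"ratio to normal stopping distance: "
      f"{reaction_distance / normal_stop_distance:.1f}x")              # → 1.8x
```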
Our investigation suggests that having a backup human driver in a self-driving car may not significantly improve the margin of safety. Alert drivers can make a difference in some situations, but only where they have the time and distance to recognize a problem or safety concern, establish situational awareness, and respond in time to avoid a collision. The Tempe incident shows that in many cases drivers face limitations that effectively render them incapable of responding in time to prevent catastrophe.
Our investigation also concluded that it’s unreasonable for self-driving car designers to expect drivers to remain alert, aware, and ready to intervene at all times. Experience in other industries suggests that automation promotes operator disengagement, which in turn undermines alertness and situational awareness. The results can be tragic. For example, in the loss of Air France Flight 447 on June 1, 2009, the pilots lost situational awareness and failed to respond in time to save their aircraft and the 228 passengers and crew aboard.
Lack of operator engagement and awareness is a recognized risk factor in airplanes with glass cockpits and advanced flight management systems. Investigators have found that large numbers of pilots fall asleep during long flights in direct violation of regulations and flight rules. Distracted aircrews have even flown past their intended destination while focusing on tasks unrelated to flying. These incidents, which involve highly trained, experienced professional crews, challenge the assumption that less trained human operators (drivers) can be relied upon to back up autonomous systems.
The results of our investigation suggest that the Uber driver could not have intervened in time to prevent the accident. The outcome was essentially baked into the vehicle’s design and the limits of human performance. A proper human factors assessment might have uncovered these limitations and their implications; we could not determine whether such an assessment had been conducted as part of the vehicle’s safety analysis. Based on these facts and the analysis above, we concluded that responsibility for the accident, to the degree it could be assigned, primarily belonged to the designers of the autonomous driving system. The human driver’s behavior was careless, but the numbers suggest that the outcome would have been the same even if he had been more attentive.
Basis and References
Numerous studies were consulted in developing driver response times. Their results varied, often widely. The values used in this study took these differences into account. They are generally on the conservative side of the range of values considered. The references below were selected to support the methodology and illustrate the studies that were consulted.
 Daisuke Wakabayashi, Self-Driving Uber Car Kills Pedestrian in Arizona, Where Robots Roam, March 19, 2018, New York Times, https://www.nytimes.com/2018/03/19/technology/uber-driverless-fatality.html
 Vehicle Stopping Distance and Time, National Association of City Transportation Officials (NACTO), https://nacto.org/docs/usdg/vehicle_stopping_distance_and_time_upenn.pdf
 Cheryl A Bolstad, The measurement of situational awareness for automobile technologies of the future, SA Technologies, http://drivingassessment.uiowa.edu/drivingmetrics/SA1_Bolstad_SAE%20Workshop%202008.pdf
 Knowing what is going on around you (situational awareness), Leadership and Worker Engagement Forum, http://www.hse.gov.uk/construction/lwit/assets/downloads/situational-awareness.pdf
 Zhenji Lu, Xander Coster, Joost de Winter, How much time do drivers need to obtain situational awareness, April 2017, Applied Ergonomics, https://www.sciencedirect.com/science/article/pii/S0003687016302630
 Dean Macris, Ozzie Paez, Automation and the unaware caretakers, May 1, 2018, Ozzie Paez Research, https://www.ozziepaezresearch.com/single-post/2018/04/30/Automation-and-unaware-caretakers