Dismissing Uber’s own self-driving errors as mere “mistakes” feels wrong, too (although on a different order of magnitude), especially given the raft of documents released last week by the federal transportation safety watchdog, the National Transportation Safety Board, which has spent the last 20 months investigating the context of the accident in which a car killed a woman named Elaine Herzberg. During Sunday’s interview, Primack asked whether the crash boiled down to a “bad sensor.” “Yes, yeah,” Khosrowshahi responded, before Primack cut him off. But according to the documents, that’s not quite true. In fact, a series of poor decisions appear to have led to that moment on a dark Arizona road. (In May, an Arizona prosecutor said there was “no basis for criminal liability for the Uber corporation arising from” the fatal crash. On November 19, the NTSB will announce the final results of its investigation, saying who and what it believes is at fault for the crash.)
According to the NTSB investigation, Uber’s software was not created to recognize pedestrians outside of crosswalks. “The system design did not include a consideration for jaywalking pedestrians,” one of the documents said. As a result, Uber’s system wasted some 4.4 seconds trying to “classify” Herzberg, and to use that information to predict her movement.
Then, with just 1.2 seconds until impact, the Uber system again did what it was designed to do: It held off braking for one second. This aspect of the system was meant to give the “mission specialist” hired to monitor the self-driving car from behind the wheel time to verify “the nature of the detected hazard” and take action. According to the NTSB documents, Uber created the “action suppression” system because the self-driving program’s developmental software kept having false alarms—that is, identifying hazards on the roads where none existed—and so kept executing unnecessary but “extreme” maneuvers, like swerving or hard braking. But on that night in March, the woman behind the wheel of the car didn’t look up during that second-long period, and the system only began to slow down 0.2 seconds before impact. In the end, the car was traveling at 43.5 mph when it hit Herzberg.
And if the self-driving system had flaws, maybe those can be traced to a series of decisions Uber made around its organizational structure. The NTSB documents note that, while Uber’s self-driving unit did have a system safety team, it didn’t have an operational safety division or a safety manager. Nor did it have a formal safety plan, or a standardized operating procedure or guiding document for safety—the stuff of a well-thought-out “safety culture.” In fact, the company had only recently decided to depart from industry standards and have just one person in each testing vehicle instead of two. (“We deeply value the thoroughness of the NTSB’s investigation into the crash and look forward to reviewing their recommendations once issued after the NTSB’s board meeting later this month,” an Uber spokesperson said last week in a statement.)
So, does Uber get to be forgiven? That’s probably for Uber and Uber customers to decide. For part of Monday morning, #BoycottUber trended nationwide on Twitter. Uber says it has completely revamped its self-driving testing procedures since the crash, and has added another person to each of the vehicles it tests on public roads, to cut down on mistakes.