The software inside the Uber self-driving SUV that killed an Arizona woman last year was not designed to detect pedestrians outside of a crosswalk, according to new documents released as part of a federal investigation into the incident. That’s the most damning revelation in the trove of documents related to the crash, but other details indicate that, in a variety of ways, Uber’s self-driving tech failed to consider how humans actually operate.

As it turned out, the vehicle's software did detect the victim, 49-year-old Elaine Herzberg, with more than enough time to stop, but it did not stop because it did not recognize that Herzberg was, in fact, a human. (The car did have a human back-up driver, Rafaela Vasquez, but she didn't see Herzberg until it was too late.)
It never guessed Herzberg was on foot for a simple, galling reason: Uber didn’t tell its car to look for pedestrians outside of crosswalks. “The system design did not include a consideration for jaywalking pedestrians,” the NTSB’s Vehicle Automation Report reads. Instead, the system cycled through other classifications for her, such as vehicle, bicycle, or other object, and every time it tried a new guess, it restarted the process of predicting where the mysterious object—Herzberg—was headed. It wasn’t until 1.2 seconds before the impact that the system recognized that the SUV was going to hit Herzberg, that it couldn’t steer around her, and that it needed to slam on the brakes.
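To make concrete what "restarted the process of predicting" means, here is a minimal sketch in Python. This is not Uber's code; the TrackedObject class, the labels, and the naive linear extrapolator are all my own illustrative assumptions. It simply shows why a tracker that discards an object's motion history on every reclassification can never build up a trajectory prediction.

```python
from dataclasses import dataclass, field

@dataclass
class TrackedObject:
    label: str                                    # current best-guess classification
    history: list = field(default_factory=list)   # positions observed under that label

    def observe(self, position, new_label):
        if new_label != self.label:
            # The flaw: reclassification throws away the motion history,
            # so path prediction restarts from scratch.
            self.history.clear()
            self.label = new_label
        self.history.append(position)

    def predicted_next_position(self):
        # Needs at least two observations under one label to extrapolate.
        if len(self.history) < 2:
            return None  # no trajectory available
        (x0, y0), (x1, y1) = self.history[-2], self.history[-1]
        return (2 * x1 - x0, 2 * y1 - y0)  # naive linear extrapolation

obj = TrackedObject(label="unknown")
for pos, label in [((0, 0), "vehicle"), ((1, 1), "vehicle"),
                   ((2, 2), "bicycle"), ((3, 3), "other")]:
    obj.observe(pos, label)
    print(label, "->", obj.predicted_next_position())
```

With the labels flip-flopping, the predictor almost always returns None. A tracker that retained motion history across label changes would keep projecting the object's path regardless of what it was currently called.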
That braking decision triggered what Uber called “action suppression,” in which the system held off braking for one second while it verified “the nature of the detected hazard”—a second during which the safety operator, Uber’s most important and last line of defense, could have taken control of the car and hit the brakes herself. But Vasquez wasn’t looking at the road during that second. So with 0.2 seconds left before impact, the car sounded an audio alarm, and Vasquez took the steering wheel, disengaging the autonomous system. Nearly a full second after striking Herzberg, Vasquez hit the brakes.

Self-driving vehicles will only be as safe as the design of the systems, sensors and software upon which they operate. The software design flaw that occurred in this case - an inability to recognize a human outside of a marked crosswalk - is so obvious as to beggar belief, and it makes me wonder how many less-obvious but nevertheless lethal programming errors might still be embedded in Uber's self-driving software. This is one of many reasons why we still have a long way to go before self-driving cars become commonplace on streets and highways.
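That timeline invites a back-of-the-envelope look at what the one-second suppression window cost. The 1.2-second and 0.2-second figures come from the report; the vehicle speed and braking deceleration below are my own assumed round numbers, so treat this as a rough sketch, not a reconstruction of the actual crash.

```python
import math

SPEED_MPS = 17.0    # assumed vehicle speed, roughly 38 mph (not from the report)
DECEL_MPS2 = 7.0    # assumed hard-braking deceleration on dry pavement

def impact_speed(brake_lead_s: float) -> float:
    """Speed when the car reaches the pedestrian, if full braking begins
    brake_lead_s seconds before the moment of (unbraked) impact."""
    distance_to_pedestrian = SPEED_MPS * brake_lead_s   # gap left to cover
    v_sq = SPEED_MPS**2 - 2 * DECEL_MPS2 * distance_to_pedestrian
    return math.sqrt(v_sq) if v_sq > 0 else 0.0         # 0 means it stops short

print(f"brake 1.2 s out (at recognition):    hit at {impact_speed(1.2):.1f} m/s")
print(f"brake 0.2 s out (after suppression): hit at {impact_speed(0.2):.1f} m/s")
```

Under those assumptions, braking the moment the hazard was recognized brings the car nearly to a stop (about 1.8 m/s at the point of impact), while waiting out the suppression window leaves it traveling at roughly 15.5 m/s.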
Uber settled a lawsuit with Herzberg's family shortly after her death, and has made changes to its safety program for automated vehicle testing.
There's also this tidbit:
Another factor in the crash was the Tempe road structure itself. Herzberg, wheeling a bicycle, crossed the street near a pathway that appeared purpose-built for walkers, but was 360 feet from the nearest crosswalk.

Yikes! Along with the safety of autonomous vehicles themselves, we cannot ignore the design of the environment in which they - and the pedestrians and bicyclists they are supposed to detect - travel. The safe deployment of self-driving cars could require significant and expensive modifications to streets, sidewalks, pathways and bikeways.
A lot of work remains to be done.