Feds: Uber self-driving SUV saw pedestrian, did not brake


DETROIT — The autonomous Uber SUV that struck and killed an Arizona pedestrian in March spotted the woman about six seconds before hitting her, but did not stop because the system used to automatically apply brakes in potentially dangerous situations had been disabled, according to federal investigators.

In a preliminary report on the crash, the National Transportation Safety Board said Thursday that emergency braking is not enabled while Uber’s cars are under computer control, “to reduce the potential for erratic vehicle behavior.”

Instead, Uber relies on a human backup driver to intervene. The system, however, is not designed to alert the driver when braking is needed.

The findings, which are not final, should serve as a warning to all companies testing autonomous vehicles to verify that their systems brake automatically when necessary in the environments where they are being tested, said Alain Kornhauser, faculty chairman of autonomous vehicle engineering at Princeton University.

Uber, he said, likely determined in testing that its system braked in situations where it shouldn't have, possibly for overpasses, signs and trees. "It got spoofed too often," Kornhauser said. "Instead of fixing the spoofing, they fixed the spoofing by turning it off."

In the Tempe, Arizona, crash, the driver began steering less than a second before impact but didn’t brake until less than a second after impact, according to the NTSB, which has yet to determine fault.

A video of the crash showed the driver looking down just before the vehicle struck and killed 49-year-old Elaine Herzberg in what is believed to be the first death involving a self-driving test vehicle.