Robocar Death Analysis: Part Two

Article By : Junko Yoshida, EE Times

As more details surface, we revisit last week's Uber incident, in which a woman crossing the street lost her life, the first pedestrian fatality caused by an autonomous vehicle.

The Uber dashcam footage released by Tempe, Arizona, police this week has raised a host of fresh questions among the technology community, including many EE Times readers.

It’s prudent, of course, to await reports by the National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) before assigning blame for the self-driving Uber vehicle’s fatal accident last Sunday night.

But for the tech community, it is past time to start thinking about what could have prevented the autonomous car from killing a woman crossing the street.

Footage of the collision shows that the self-driving car not only failed to stop; it did not even slow down when a human, whose movements were neither fast nor sudden, crossed its path. Why the car did not react at all is what Mike Demler, senior analyst at The Linley Group, calls “the million-dollar question.”

He asked: “Uber needs to answer — what was the purpose of the driver being on the road at that hour? Was it a nighttime test? Were the radar/lidar functioning? Is their software just totally incapable of responding to emergency conditions?”

What shocked many automotive experts is that none of the sensors embedded in Uber’s self-driving car (a Volvo XC90), including radar, lidar, and vision, seemed to have its eyes on the road. Nor, as the fully functional driver-facing camera indicates, did the so-called “safety driver.”

Phil Magney, founder and principal of VSI Labs, told us, “The fact that the AV stack failed to detect is surprising.” He added a caveat: “Unless some sensors or features were temporarily disabled for testing or isolating certain functions.” That is the most generous possible interpretation of what might have happened.

Roles of sensors
Before jumping to any conclusion, I asked Demler and Magney which of the sensors deployed on Uber’s robocar was, in theory, best positioned to detect objects appearing in the street.

Self-driving Uber vehicle’s sensor suite. (Source: Uber)

“VSI Labs believes that lidar would have been the most beneficial in this accident, although it is entirely possible that the radar and/or the camera could have picked up the pedestrian as well,” said Magney.

Demler said, “At the distance we see in the video, all of the sensors should have picked up the woman crossing the street with her bike. As soon as she was illuminated by the headlights, which was one to two seconds before impact, the vehicle cameras should have detected her.”

But considering that the accident happened in the dark, Demler said, “The radar and lidar should have picked her up first. At the reported 38 mph, the Volvo traveled approximately 17 m/s. Even short-range radar (SRR) has a range of up to ~100 m. For example, in a TI whitepaper, they describe a device with an unambiguous range of 80 m. That would give the vehicle four to five seconds to respond.”

Demler added, “But the vehicle must have long-range radar (LRR), too. This Bosch LRR has a detection range of up to 250 m.”

Furthermore, he noted, “There’s the lidar. The Uber Volvo has a rotating 64-element Velodyne lidar on its roof. That has a range of 120 m.”
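
To put those figures in perspective, here is a quick back-of-the-envelope calculation of our own (not from Uber or the analysts quoted): dividing each quoted detection range by the car’s roughly 17 m/s speed gives the time the system would have had to respond.

# Rough illustration only: reaction time bought by each sensor's quoted range
# at the reported 38 mph. Ranges are the figures cited above; nothing else
# here is vendor data.
MPH_TO_MPS = 0.44704
speed_mps = 38 * MPH_TO_MPS  # roughly 17 m/s, as Demler notes

sensor_ranges_m = {
    "short-range radar (TI example)": 80,
    "lidar (64-element Velodyne)": 120,
    "long-range radar (Bosch LRR)": 250,
}

for sensor, range_m in sensor_ranges_m.items():
    print(f"{sensor}: {range_m} m -> ~{range_m / speed_mps:.1f} s to respond")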

Asked which of these sensors was the weakest link, Demler described the cameras as “obviously the weakest sensor to use at night.” He noted, “Radar can give the best long-range detection and nighttime immunity, but the lidar is close to radar in range and much higher in spatial resolution.” In fact, the Uber Volvo had 360° radar/lidar coverage with apparently no obstructions, giving it a clear field of view, he added.

Given all that, Demler believes, “There’s no excuse for the sensor system not detecting the woman crossing the road.”

While the vision system might have been the least capable for nighttime sensing, Magney theorized that vision sensors, especially those with high dynamic range, or — better yet — thermal cameras could have helped. “Ideally, thermal cameras looking out the front corners could be used to reduce this kind of pedestrian accident where the target is dressed in dark clothing and it is dark.” Millimeter-wave radar would have certainly picked up the pedestrian, he added.

What about software failure?
The greatest difficulty in understanding this autonomous car accident is that, beyond possible hardware failures, a software failure could also have prevented the car’s highly advanced hardware from functioning properly.

At issue isn’t just the capability of each sensor device, cautioned Magney. It all depends on “how the device is programmed to see versus what gets filtered out.”

A system failure does not necessarily indicate that the sensors failed. Magney stressed, “It all depends on how those sensors are programmed. The algorithms are trained to detect certain classes of objects and to filter out others. Meanwhile, other algorithms determine the pose and trajectory of other actors.”
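
As a rough illustration of Magney’s point, consider this simplified sketch of a detection filter. The class names, confidence threshold, and data structure are our own assumptions for illustration, not Uber’s code; the point is only that a sensor can register a return that the software then discards.

# Hypothetical sketch: a perception stack keeps only detections whose class
# and confidence pass a filter, so "the sensor saw it" does not guarantee
# "the planner reacted to it." All names and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # class assigned by the trained model
    confidence: float  # model confidence, 0.0 to 1.0
    distance_m: float

TRACKED_CLASSES = {"vehicle", "pedestrian", "cyclist"}
MIN_CONFIDENCE = 0.6

def filter_detections(raw):
    # Keep only detections the planner is allowed to react to.
    return [d for d in raw
            if d.label in TRACKED_CLASSES and d.confidence >= MIN_CONFIDENCE]

raw = [
    Detection("pedestrian", 0.45, 40.0),  # seen by the sensor, dropped: low confidence
    Detection("unknown", 0.90, 40.0),     # seen, dropped: class not tracked
    Detection("vehicle", 0.95, 80.0),     # kept
]
print(filter_detections(raw))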

Roles of ‘safety drivers’
The police-released video footage also includes a peek inside the car. It shows that Uber’s human assistant was looking down. This “safety driver” appeared shocked at the last minute, just as the car failed to stop.

The pedestrian did not become visible in this video until less than two seconds before being struck. One could argue that it may have been difficult for a human to react in time to avoid this accident.

“Nevertheless, there is a reason the safety driver is in this vehicle to begin with,” said Magney.

It is unclear what qualifications, responsibilities, and tasks Uber requires of these safety drivers.

Of course, having seen other autonomous vehicle vendors’ promotional video clips showing off robocar test drives with a safety driver behind the wheel, I naively assumed that these so-called “safety drivers” were engineers testing certain conditions on autonomous vehicles (sensors and software) while driving on a public road.

That doesn’t appear to be the case with Uber in Arizona. Magney said, “Surely the safety drivers would have been trained and agreed to pay full attention to the vehicle performance and be prepared to take over immediately under imminent danger.” He added, “I assume this safety driver was informed and agreed to pay FULL attention.”

“Short of a complete system failure, which the driver should have been aware of, if it happened, the technology in that vehicle is completely capable of minimizing the damage in those conditions,” noted Demler.

He added, “There’s a rule every driver learns about nighttime driving — never drive past your headlights.”

“In other words,” he said, “your speed should be no faster than the time it takes to come to a stop in the range your headlights are illuminating. For the autonomous vehicle, it should be ‘don’t drive past your sensors.’ If their system has so much latency that it couldn’t brake for this woman crossing in front of it, it needs to be taken off the road.”
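
Demler’s rule of thumb reduces to simple arithmetic: total stopping distance (reaction distance plus braking distance) must stay within whatever range the driver, or the sensor suite, can actually see. The sketch below uses generic textbook values for reaction time, braking deceleration, and headlight reach; none of these figures come from the Uber vehicle.

# Illustrative only: "never drive past your headlights" as arithmetic.
REACTION_TIME_S = 1.5     # assumed reaction time (generic value)
DECELERATION_MPS2 = 7.0   # assumed hard braking on dry pavement
MPH_TO_MPS = 0.44704

def stopping_distance_m(speed_mph):
    v = speed_mph * MPH_TO_MPS
    return v * REACTION_TIME_S + v ** 2 / (2 * DECELERATION_MPS2)

visible_range_m = 50  # assumed low-beam headlight reach
for mph in (25, 38, 45):
    d = stopping_distance_m(mph)
    verdict = "within range" if d <= visible_range_m else "driving past your lights"
    print(f"{mph} mph: ~{d:.0f} m to stop vs {visible_range_m} m visible -> {verdict}")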

Work of art
Asked how something like a thermal camera can see a pedestrian in the dark, Magney explained that the requirement isn’t just a capable sensor device but also capable software. “Your software has to be able to predict the behavior of the pedestrian. Albeit complex, you can program based on context cues that help the AV predict the pedestrian’s motion and trajectory.”

He acknowledged, however, that the methods an automated vehicle must use to examine and predict the movement of bystanders are “a work of art.” There are motion models for vulnerable pedestrian prediction that are good at predicting where the bystander will end up in a few seconds, explained Magney. “I assume Uber has these models in place, but perhaps the speed and lack of contrast limited the cameras’ ability to perform these calculations.”
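
For a sense of what such a motion model looks like in its very simplest form, here is a minimal constant-velocity prediction sketch. Real pedestrian-prediction models are far richer, incorporating context cues and uncertainty, and nothing here reflects Uber’s actual software.

# Minimal sketch: predict a pedestrian's position a few seconds out,
# assuming constant velocity. Positions and speeds are made-up examples.
def predict_positions(pos, vel, horizon_s, dt=0.5):
    # Return predicted (x, y) points at dt intervals out to horizon_s.
    x, y = pos
    vx, vy = vel
    steps = int(horizon_s / dt)
    return [(x + vx * dt * i, y + vy * dt * i) for i in range(1, steps + 1)]

# A pedestrian 30 m ahead and 8 m off the lane center, crossing at 1.4 m/s.
for point in predict_positions(pos=(30.0, -8.0), vel=(0.0, 1.4), horizon_s=4.0):
    print(point)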

Sensor confusion?
Experts do not suspect any sensor fusion confusion in this accident.

“The video shows the pedestrian, so the computer had enough info to react. Either the software is bad, the computer is too slow, the radar/lidar weren’t working, or all of the above,” concluded Demler.

In his opinion, Uber should be able to provide an answer right now by releasing the radar/lidar data. Of course, it’s great that the police released the video, he said. But the other sensor systems (radars and lidars) should show more, he added.

— Junko Yoshida, Chief International Correspondent, EE Times
