Are we getting HMI issues right?

By Junko Yoshida

Although the U.S. regulator looked into human-machine interface issues related to Autopilot, analysts don't find NHTSA's assessment of Tesla particularly helpful.

« Previously: US regulator exonerates Tesla in 2016 fatal crash
 

The U.S. National Highway Traffic Safety Administration (NHTSA) has found no defects in Tesla's Automatic Emergency Braking (AEB) and Autopilot systems. But was the finding by NHTSA's Office of Defects Investigation (ODI), that "braking for crossing path collisions… are outside the expected performance capabilities of the system," well understood by Tesla drivers?

Phil Magney, founder & principal advisor at Vision Systems Intelligence, told EE Times, “I do not think that cross-path collision is understood by the drivers. Nor do I think Tesla explicitly warns against this although ODI does not blame Tesla.”

In fact, instead of posing further questions, NHTSA apparently took Tesla’s word for it and wrote in the report:

It appears that Tesla’s evaluation of driver misuse and its resulting actions addressed the unreasonable risk to safety that may be presented by such misuse.

The Linley Group's Mike Demler agrees. Asked about cross-path collision, he told EE Times, "I think that's an important point, and it's the one Mobileye made immediately after the accident. In my opinion, NHTSA let Tesla off easy on that one."

According to the report, NHTSA’s crash reconstruction indicates that the tractor-trailer involved in the crash should have been visible to the Tesla driver for at least seven seconds prior to impact.
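
Seven seconds is a long time at highway speed. A back-of-the-envelope calculation makes the point (the 65-mph speed below is our assumption for illustration, not a figure from the report):

```python
# Rough distance covered during the 7 seconds NHTSA says the truck
# should have been visible. The speed is an assumed value for
# illustration only, not a number taken from the crash report.
MPH_TO_MPS = 0.44704                  # miles per hour -> meters per second

speed_mph = 65                        # assumed highway speed
visibility_s = 7                      # per NHTSA's crash reconstruction

distance_m = speed_mph * MPH_TO_MPS * visibility_s
print(f"~{distance_m:.0f} m covered in {visibility_s} s at {speed_mph} mph")
# prints: ~203 m covered in 7 s at 65 mph
```

In other words, the system had roughly two football fields of closing distance in which neither AEB nor Autopilot reacted.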

This opens a new line of questioning. Demler asked, "NHTSA says the truck should have been observable for at least seven seconds. Isn't the fact that the vehicle totally ignored the truck a defect?"

Demler added, “Again, I think the complication is Autopilot. Drivers are more likely to treat AEB as a backup system for when they happen to be momentarily distracted, or if the car needs to react faster than they can. But Autopilot says ‘let me drive,’ regardless of what warnings [Tesla] throws up on the display. Lateral crossing is an example of one of the critical issues that needs to be resolved for Level 3 [car] deployments. Just because you can automate acceleration, braking and steering doesn’t mean you can do it safely.”

Are we getting HMI right?

Although NHTSA's investigation also looked into human-machine interface (HMI) issues related to Autopilot, industry analysts don't find the agency's assessment of Tesla's HMI particularly helpful. In their view, NHTSA once again simply accepted Tesla's explanation and its methodology for warning drivers.

[Tesla Autosteer strike-out alert]
__Figure 1:__ *Autosteer ‘Strike-out’ Alert (Source: Tesla)*

Here’s what NHTSA wrote in the report:
The Autopilot system is an Advanced Driver Assistance System (ADAS) that requires the continual and full attention of the driver to monitor the traffic environment and be prepared to take action to avoid crashes. Tesla's design included a hands-on the steering wheel system for monitoring driver engagement. That system has been updated to further reinforce the need for driver engagement through a "strike out" strategy. Drivers that do not respond to visual cues in the driver monitoring system alerts may "strike out" and lose Autopilot function for the remainder of the drive cycle.

A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted. Accordingly, this investigation is closed.
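
The "strike out" strategy NHTSA describes amounts to a simple escalation policy: unanswered alerts escalate until Autopilot is withdrawn for the rest of the drive cycle. Here is a minimal sketch of such a policy; the stage names, timings and structure are our assumptions, not Tesla's actual design:

```python
# Hypothetical sketch of a "strike out" escalation policy, based only on
# NHTSA's description. Stage names, timings and structure are assumptions.
class StrikeOutPolicy:
    STAGES = ("visual_alert", "audible_alert", "strike_out")
    STAGE_TIMEOUT_S = 10.0      # assumed time before escalating one stage

    def __init__(self):
        self.stage = -1         # -1 = no active alert
        self.elapsed_s = 0.0
        self.locked_out = False # True = Autopilot lost for the drive cycle

    def update(self, dt_s, hands_on_wheel):
        """Call periodically with the hands-on status from the wheel sensor."""
        if self.locked_out:
            return "autopilot_unavailable"
        if hands_on_wheel:
            self.stage, self.elapsed_s = -1, 0.0  # driver responded: reset
            return "ok"
        self.elapsed_s += dt_s
        if self.elapsed_s >= self.STAGE_TIMEOUT_S:
            self.elapsed_s = 0.0
            self.stage += 1
            if self.STAGES[self.stage] == "strike_out":
                self.locked_out = True            # locked out until restart
        return self.STAGES[self.stage] if self.stage >= 0 else "ok"
```

Note what the lock-out does and does not do: it withdraws Autopilot for the remainder of the drive cycle, but nothing stops a driver from restarting the car and repeating the pattern.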

NHTSA seems almost unconcerned about the critical issues carmakers are about to face with drivers in cars that are increasingly automated. Magney said, "The biggest problems with these systems are drivers getting overconfident. And while Tesla measures steering wheel 'micro-movements,' I don't think this is completely adequate for measuring attentiveness."
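
Magney's skepticism is easy to illustrate. A micro-movement check typically reduces to a threshold test on the steering-torque signal, and such a test confirms contact with the wheel, not attention on the road. A minimal sketch, with noise floor and window size as assumed values:

```python
# Hypothetical hands-on detector of the kind Magney describes: small
# torque fluctuations ("micro-movements") above the sensor noise floor
# are taken as evidence of hands on the wheel. Values are assumptions.
from statistics import pstdev

NOISE_FLOOR_NM = 0.02   # assumed sensor noise floor, newton-meters
WINDOW = 50             # assumed number of recent samples examined

def hands_detected(torque_samples_nm):
    """True if recent steering-torque variation exceeds the noise floor."""
    recent = torque_samples_nm[-WINDOW:]
    return pstdev(recent) > NOISE_FLOOR_NM

# The weakness: a hand merely resting on the wheel passes this test just
# as well as an attentive driver's does. It measures touch, not attention.
```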

Demler went a step further. Speaking of Tesla’s HMI issues, he asked: “Doesn’t the fatality prove there’s a weakness, or at least there was before Tesla added the ‘strikeout’ feature?”

Further, he added, “Why does NHTSA just accept that Tesla tells drivers to keep their hands on the wheel when using Autosteer? What’s the point of that? We shouldn’t be letting people take their hands off the wheel under any circumstances!”

Over-the-air (OTA) software upgrades

NHTSA's report clarified one thing: while regulatory investigations tend to take too long, Tesla, through the power of software upgrades, was able to outfox NHTSA and fix the problem very quickly.

Demler said, “I agree. Unfortunately, technology advances much faster than our ability to regulate its use.” He added, “Nevertheless, Tesla didn’t exactly fix the problem. Their cars are no less susceptible to lateral collisions than before, unless they happen to occur after the driver ‘strikes out,’ at which point a human would presumably be monitoring their environment.”

Magney agreed, "OTA is vital for managing advanced features like this, where software is being updated all the time." However, he predicted, "Going forward, NHTSA will require extreme documentation through Operational Design Domains, and there will be Enhanced Data Recording with careful monitoring of software versions."

In short, Magney said, “I don’t think OTA will have the capacity to cover up for a previous version of the software that may have lacked a certain feature.”
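
Magney's version-tracking point is concrete: if every logged incident is pinned to the software build that was running at the time, a later OTA update cannot obscure what an earlier version did or lacked. A minimal sketch of such an event record follows; the field names are our assumptions, not any regulatory schema:

```python
# Hypothetical enhanced-data-recording entry of the kind Magney
# anticipates. Field names and values are assumptions for illustration.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass(frozen=True)
class EventRecord:
    timestamp_utc: str
    software_version: str  # exact build active when the event occurred
    odd_status: str        # inside/outside the declared design domain
    event_type: str        # e.g. "aeb_trigger", "autopilot_strike_out"

record = EventRecord(
    timestamp_utc=datetime.now(timezone.utc).isoformat(),
    software_version="8.0.2",              # assumed version string
    odd_status="inside_odd",
    event_type="autopilot_strike_out",
)
print(json.dumps(asdict(record), indent=2))
```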

Narrow focus

“NHTSA was obviously careful to apply a very narrow focus to this investigation,” Demler observed. “I’d say it was about 90% AEB, and only 10% Autopilot.”

NHTSA "could have dug into Autopilot more, at least as a warning to other manufacturers building similar systems," he noted. In summary, what they said was, "we accept Tesla's descriptions of how Autopilot works, the warnings they provide, and the fix they implemented."

"That's pretty weak," Demler added.

This article first appeared on EE Times U.S.

 