On the night she struck and killed a pedestrian, the safety driver behind the wheel of an Uber self-driving test vehicle spent more than a third of her time staring at a cell phone instead of the road ahead.

An analysis conducted by the National Transportation Safety Board found Rafaela Vasquez spent 34 percent of her time looking at her phone while streaming the TV show “The Voice.”

At one point, along the same stretch of road where the crash later occurred, she looked down at the phone for 26 consecutive seconds. In the three minutes before the crash, she glanced at her phone 23 times.

In its final report on the March 2018 crash, released Wednesday, the NTSB said the probable cause of the crash was Vasquez’s failure to monitor the driving environment.

But that failure did not occur in a vacuum. While not diminishing Vasquez’s role, the report zeroes in on the notion that automation can breed complacency in the humans assigned to monitor its performance.

The phrase “automation complacency” appears 18 times in the report’s 78 pages, sounding an alarm for those in the auto industry working on everything from driver-assist features to conditional autonomy.

Automation complacency is “present in many crashes and seen in all modes of transportation,” NTSB Vice Chairman Bruce Landsberg wrote in a supplemental statement to the final report. “Automation performs remarkably well most of the time, and therein lies the problem.”

The better the automation, the more substantial the potential for complacency.

Vasquez’s prolonged distraction was a “typical effect of automation complacency,” according to the report. In part, she had been lulled into this mindset because she had made the same trip around Tempe, Ariz., 73 times without incident before the night of the crash.

Automation complacency has been a factor in mishaps as varied as the 1979 Three Mile Island nuclear partial meltdown and an assortment of aviation crashes. In the Uber report, the NTSB cites the grounding of the Panamanian passenger ship Royal Majesty off the coast of Nantucket, Mass., in 1995.

More recently, the NTSB, a federal agency charged with investigating transportation crashes and making safety recommendations, has examined automation complacency’s role in car crashes, particularly as manufacturers such as Tesla and General Motors have rolled out advanced driver-assist systems.

Josh Brown, a driver killed in a May 2016 crash that occurred in Williston, Fla., while his Tesla vehicle’s Autopilot feature was engaged, was “inattentive to the driving task,” according to an NTSB investigation, and “his pattern of use of the Autopilot system indicated an overreliance on the automation.”

In August, the NTSB released a brief related to its investigation of another Autopilot-related crash, which occurred when a driver barreled into a parked fire truck in Culver City, Calif. As in the Williston crash, the agency found the probable cause was the driver’s inattention and overreliance on the driver-assistance system.

But the NTSB report from Culver City also cited as a probable cause the design of the Autopilot system itself, because it “permitted the driver to disengage from the driving task.”

Decisions on how to implement such features may cut to the core of a company’s safety culture. Much of the NTSB’s final report addresses failures in that culture as it related to the testing of self-driving systems in Uber’s Advanced Technologies Group.

Across its recent investigations at the intersection of the automotive and technology industries, the NTSB suggests that scrutiny of safety culture should extend beyond testing and into the way the industry conceptualizes, implements and brings to market fledgling automated technologies.

“Automation in vehicles has great potential, but it must be developed and managed carefully,” Landsberg wrote. “That didn’t happen here. … It’s a dynamic environment, but evolution in nature and in technology where lives are at stake is a brutal process.”
