The National Transportation Safety Board has investigated landmark car crashes involving self-driving vehicles, and its findings could shape the future of a technology the auto industry has spent billions of dollars developing.
Is anybody paying attention?
Robert Sumwalt isn’t so sure. The NTSB chairman lamented last week that few are listening to the federal agency charged with probing crashes so that safeguards can be implemented and future tragedies averted.
At a hearing Tuesday, Feb. 25, that stemmed from a fatal crash involving Tesla’s Autopilot driver-assist system, he said California highway agency Caltrans, the U.S. Department of Transportation and the automaker had not yet responded to NTSB questions about the crash, which killed Walter Huang. Nearly two years have passed since the March 23, 2018, crash, and Sumwalt’s frustration was palpable.
“How we effect change is through our recommendations becoming implemented, and it’s, frankly, disheartening that these are not being responded to,” he said.
Tesla’s lack of cooperation comes as no surprise. In the early days of the investigation, the NTSB revoked Tesla’s status as a party to the probe after the company released select crash data suggesting the driver was at fault. As last week’s hearing proceeded, Elon Musk tweeted about ice cream and Kit Kats.
The surprise, perhaps, was the lack of response from other agencies. Tensions have simmered between the NTSB and the National Highway Traffic Safety Administration for some time, at least partly because NHTSA has relied on voluntary guidance to automakers for handling the influx of new technology rather than promulgating regulations.
If that friction has played out largely behind the scenes, it spilled into stark view last week. The NTSB swung, and swung hard: excoriating NHTSA’s lack of standards for driver-assist systems that fall under the Level 2 automation definition, calling out the federal regulator for putting industry profits ahead of safety, and identifying shortcomings in NHTSA’s own investigations into Tesla’s Autopilot system.
As she did in a recent hearing examining Uber’s fatal self-driving car crash, board member Jennifer Homendy reminded NHTSA of its core mission to safeguard the driving public, then scolded the agency for a Jan. 7 tweet that said it was “working to keep regulations reasonable” in an effort to keep cars affordable.
“Let me be clear,” she said. “NHTSA’s mission is not to sell cars.”
The hearing felt as much like a trial of NHTSA’s competence as a crash summary. Shortly after her first comments, Homendy asked Robert Malloy, director of the NTSB’s Office of Highway Safety, a question: In his opinion, had NHTSA protected the driving public from unreasonable risks stemming from automated technology?
“No, I believe NHTSA has not taken the approach that is best for safety in this situation,” Malloy said. “For my staff and myself, there’s nothing more disappointing than investigating a crash, coming up with a good solution, and having no response from Tesla, and in NHTSA’s case, ‘No, we don’t need to do that, and it’s not happening.’ ”
He referenced a series of recommendations on driver-assist systems that the NTSB issued after its first Tesla-related investigation, recommendations that NHTSA has thus far not acted upon.
The NTSB’s response last week was to double down, and then some. The agency reiterated seven of its previously stated safety recommendations, and it added nine new ones.
The recommendations include an examination of Autopilot to determine whether its limitations, potential for misuse and ability to operate outside its intended design pose an unreasonable risk to safety.
The board also recommended that automakers develop driver-monitoring systems that help guard against “automation complacency.” And it recommended that companies such as Apple develop mechanisms that disable distracting cellphone functions while a vehicle is in motion.
At the hearing’s conclusion, Sumwalt’s closing remarks took on a demoralized tone.
“We’ve issued very important recommendations today, and we’ve reiterated recommendations,” he said. “If those recommendations are not implemented, if people don’t even bother to respond to them, then we’re wasting our time.”
Yet the NTSB’s findings on fledgling driver-assist systems such as Autopilot, and on self-driving cars, could in the long run help these technologies fulfill their promise of reducing traffic deaths and injuries. The haphazard arrival of these systems, however, could scare off potential customers already wary of them.
The NTSB remains a fail-safe against that worst-case scenario. Maybe the only one left.