WASHINGTON — U.S. auto safety regulators on Thursday said they were opening a formal regulatory proceeding that could eventually result in the adoption of new safety standards for autonomous vehicles.
The National Highway Traffic Safety Administration (NHTSA) said it was issuing an advance notice of proposed rulemaking to get public input on how to ensure the safety of future self-driving vehicles. Companies like General Motors’ Cruise unit, Alphabet Inc.’s Waymo and Tesla Inc. are working on vehicles that can drive themselves.
“This rulemaking will help address legitimate public concerns about safety, security and privacy without hampering innovation in the development of automated driving systems,” said Secretary of Transportation Elaine Chao in a statement.
NHTSA said the proceeding could result in the agency issuing new guidance documents addressing best industry practices, providing information to consumers, or adopting formal regulations ranging from rules requiring reporting and disclosures to new legally binding safety standards on automated driving systems. Any final rules are still likely years away.
The agency said it is focused on the primary functions of self-driving systems.
NHTSA seeks input to develop “a framework that meets the need for motor vehicle safety and assesses the degree of success in manufacturers’ efforts to ensure safety,” it said.
John Bozzella, CEO of the Alliance for Automotive Innovation, said in a statement that the group would be releasing a policy roadmap for self-driving vehicles.
“AVs can enhance roadway safety and increase access to mobility, and that’s why Auto Innovators applauds the Department of Transportation’s continued work to advance this important technology,” he said. “In the coming days, Auto Innovators will release its own AV Policy Roadmap with additional recommendations for policymakers. Both of these announcements demonstrate the forward-looking, positive steps being taken to create a regulatory framework to advance and govern this technology.”
The National Transportation Safety Board has faulted NHTSA for adopting what it called “a nonregulatory approach to automated vehicle safety” and said the agency has failed to develop a method for verifying that manufacturers of “partial automation systems are incorporating system safeguards.”
On Thursday, NHTSA said it “has no desire to issue regulations that would needlessly prevent the deployment of any (automated-driving system)-equipped vehicle,” adding that “an ill-conceived standard may fail to meet the need for motor vehicle safety and needlessly stifle innovation.”
The NTSB has criticized NHTSA in its investigations of fatal crashes, including one involving an Uber self-driving test vehicle and another involving a California driver killed while using Tesla’s driver-assistance system, Autopilot.
There are also no regulations governing the performance of systems like Autopilot that allow drivers to keep their hands off the wheel for extended periods, but NHTSA can demand a recall if it believes any vehicle poses an unreasonable risk to safety.
NHTSA’s special crash unit is investigating a dozen Tesla crashes in which it suspects Autopilot or another advanced driver-assistance system was in use.