TEL AVIV, Israel — Amnon Shashua, the CEO of automated-driving technology developer Mobileye Global Inc., wants to change the conversation around automated driving.

Shashua told Automotive News the number-based nomenclature the auto industry relies on to describe automation capabilities must be more straightforward and must adequately convey the division of responsibilities between human and robotic drivers.

“You either have eyes-on or eyes-off,” he said. “With eyes-on, you are responsible. With eyes-off, you are not responsible.”

As automated-driving technology has developed over the past decade, the auto industry has relied on what is known as the SAE Levels of Driving Automation.

Developed by SAE International, the automotive engineering standards organization, the taxonomy is numbered from zero to five to demarcate the gradations of automated driving. The range runs from zero, where the driver operates the vehicle manually, to five, where the car pilots itself regardless of conditions.

The gradients describe and categorize the capabilities and limitations of automated driving systems and clarify who or what is responsible for driving at a given time.

While this works from an engineering perspective, everyday drivers aren’t generally familiar with the terminology and how it applies to their vehicle.

Level 2, for example, requires a human to remain responsible for driving even while the system has active control of the vehicle. Its features combine steering and throttle control, pairing technologies such as lane centering and adaptive cruise control. General Motors’ Super Cruise system and Tesla’s Autopilot are in this category.

The existing lexicon leaves too much room for confusion, Shashua said. He’s not the only befuddled industry insider.

“I am an expert in this, and half the time, I don’t understand what Level 2 and Level 3 are,” said Kelly Funkhouser, manager of vehicle technology at Consumer Reports. “I can relate to what most consumers are facing.”

A study published by the National Institutes of Health this year looked at drivers operating Tesla models with Autopilot engaged.

It “found that drivers became complacent over time with Autopilot engaged, failing to monitor the system, and engaging in safety-critical behaviors, such as hands-free driving, enabled by weights placed on the steering wheel, mind wandering or sleeping behind the wheel.”

Shashua is pitching a different lexicon based on product-oriented language.

He’s divided driver assistance and autonomous driving into four broad categories, using simpler language: eyes-on/hands-on, eyes-on/hands-off, eyes-off/hands-off and a no-driver robotaxi mode.
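To make the distinction concrete, here is a minimal sketch in Python of how the proposed categories might be encoded, with Shashua’s responsibility rule attached. The class and function names, and the rough SAE correspondences, are illustrative assumptions, not anything Mobileye has published.

```python
from enum import Enum

class DrivingMode(Enum):
    """Shashua's four proposed consumer-facing categories."""
    EYES_ON_HANDS_ON = "eyes-on/hands-on"
    EYES_ON_HANDS_OFF = "eyes-on/hands-off"
    EYES_OFF_HANDS_OFF = "eyes-off/hands-off"
    ROBOTAXI = "robotaxi (no driver)"

# Rough correspondence to today's SAE levels; an interpretation for
# illustration only, not an official crosswalk.
SAE_EQUIVALENT = {
    DrivingMode.EYES_ON_HANDS_ON: "Levels 0-2",
    DrivingMode.EYES_ON_HANDS_OFF: "Level 2 (hands-free variants)",
    DrivingMode.EYES_OFF_HANDS_OFF: "Levels 3-4",
    DrivingMode.ROBOTAXI: "Levels 4-5",
}

def who_is_responsible(mode: DrivingMode) -> str:
    """The core distinction: eyes-on means the human is responsible."""
    if mode in (DrivingMode.EYES_ON_HANDS_ON, DrivingMode.EYES_ON_HANDS_OFF):
        return "human driver"
    return "automated system and its makers"

for mode in DrivingMode:
    print(f"{mode.value}: responsible = {who_is_responsible(mode)} "
          f"(SAE equivalent: {SAE_EQUIVALENT[mode]})")
```

The appeal of the scheme is visible in the code: responsibility follows from a single eyes-on/eyes-off test, rather than from memorizing what each numbered level implies.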

“Having something simple and intuitive like hands-on/eyes-on would facilitate consumers’ understanding, while it might be necessary for engineers and professionals to be familiar with the SAE levels,” said Xiaopeng Li, a civil and environmental engineering professor at the University of Wisconsin.

Most advanced driver-assistance systems require eyes on the road and hands on the steering wheel. They help the driver by issuing warnings, automatically braking in emergencies and making steering corrections within a lane. But the driver must be alert and remain responsible for the vehicle’s operation at all times. The assistance works as backup and support, not as a substitute driver.

The next level is eyes-on/hands-off. Mobileye’s Supervision advanced driver-assistance system is one example. The driver can take their hands off the wheel and let the car steer itself. Early systems work only on highways but will expand to broader settings as developers such as Mobileye add more sophisticated sensors, widening what is known as the operational design domain, the set of conditions under which a system is designed to operate. Still, the driver remains responsible for the vehicle’s operation and must pay attention to traffic and road conditions.
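The operational design domain acts, in effect, as a gate on when hands-off operation is offered. The sketch below illustrates the idea under assumed constraints; the field names and limits are hypothetical and not drawn from any production system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OperationalDesignDomain:
    """Conditions under which hands-off operation is permitted (illustrative)."""
    road_types: frozenset
    max_speed_kph: float
    allows_night: bool

# A hypothetical highway-only ODD, like early eyes-on/hands-off systems.
HIGHWAY_ODD = OperationalDesignDomain(
    road_types=frozenset({"divided_highway"}),
    max_speed_kph=130.0,
    allows_night=True,
)

def hands_off_available(odd: OperationalDesignDomain, road_type: str,
                        speed_kph: float, is_daytime: bool) -> bool:
    """Gate hands-off mode on the ODD; outside it, the driver must take over."""
    return (road_type in odd.road_types
            and speed_kph <= odd.max_speed_kph
            and (is_daytime or odd.allows_night))

print(hands_off_available(HIGHWAY_ODD, "divided_highway", 110.0, is_daytime=True))  # True
print(hands_off_available(HIGHWAY_ODD, "urban_street", 50.0, is_daytime=True))      # False
```

Expanding the domain means loosening those constraints, which is why richer sensing, not just better software, determines where a hands-off system can run.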

Next up is eyes-off/hands-off. This is where autonomous driving begins; at this point, the car, rather than the driver, is responsible for the vehicle’s operation.

“Our belief is that the vehicle essentially becomes the ‘driver’ in that scenario, and any responsibility or liability risk would fall on the automaker and the system provider,” said Mobileye spokesperson Justin Hyde.

There is “significant work going on behind the scenes,” he said, as automakers and system providers validate eyes-off systems to better-than-human safety levels, a prerequisite for building consumer trust and keeping liability risk manageable.

A vehicle with that level of autonomy might switch between modes, requiring eyes-on driver engagement in urban settings but driving itself on the highway.

Finally, there is the no-driver, passengers-only category. That type of autonomous system guides a Waymo or Cruise robotaxi.

“Mobileye is headed down the right path with the taxonomy they proposed,” Funkhouser said. “In theory, it is the correct way to go, but manufacturers need to design their systems to work with it.”

That means better monitoring of motorists and ensuring they’re engaged with driving tasks as needed, she said.

Cameras trained on a driver’s eyes can see whether they are open and whether the person is looking ahead, but they can’t gauge whether the driver is scanning the roadway, checking mirrors and engaging with traffic.

“They just know that the driver is awake and looking forward,” Funkhouser said.
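As a rough illustration of that limitation, the sketch below models the only thing such a monitor can verify. The GazeSample fields are hypothetical stand-ins for camera outputs, and the check deliberately stops at “awake and facing forward.”

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """Hypothetical per-frame signals from a driver-facing camera."""
    eyes_open: bool
    gaze_forward: bool

def driver_appears_attentive(sample: GazeSample) -> bool:
    # All this check can establish is that the driver is awake and facing
    # ahead; it cannot confirm the driver is scanning the road, checking
    # mirrors or otherwise engaging with traffic.
    return sample.eyes_open and sample.gaze_forward

print(driver_appears_attentive(GazeSample(eyes_open=True, gaze_forward=True)))   # True
print(driver_appears_attentive(GazeSample(eyes_open=True, gaze_forward=False)))  # False
```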