New Ratings Aim To Make Partial ‘Self Driving’ Safer


Vehicles on the market today often have partially automated systems that “share” the driving with motorists, like Autopilot, Pilot Assist and Super Cruise. But most of this new technology doesn’t provide adequate backup protection and lacks features that help prevent or reduce misuse of the systems – both deliberate and unintentional – according to the Insurance Institute for Highway Safety, a nonprofit financed by the insurance industry.

“Partial automation systems may make long drives seem like less of a burden, but there is no evidence that they make driving safer,” David Harkey, president of the Insurance Institute, said in a statement. “In fact, the opposite may be the case if systems lack adequate safeguards.”

On Thursday the safety group announced a new ratings program that aims to evaluate how well partial automation systems provide important protections, which it hopes will encourage automakers to create better designs that will help keep drivers focused and out of danger. 

The first set of ratings, currently in development, is expected to be issued this year.

Despite misleading messaging from some manufacturers, Insurance Institute researchers noted, human drivers must still handle many routine driving tasks that the systems aren’t designed to do, and must also monitor how well the automation is performing and be ready to take over if problems arise. While most partial automation systems have some safeguards in place, none includes all of the safeguards the group recommends, it said.

Systems currently on the market use cameras, radar or other sensors to “see” the road, and combine adaptive cruise control (ACC) and lane centering with other driver assistance features. For example, ACC maintains a driver-selected speed, but will automatically slow to keep a set following distance from a slower moving vehicle ahead and then accelerate when the way is clear.
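That ACC behavior can be sketched in a few lines of code. This is purely an illustrative model of the logic described above, not any automaker’s actual implementation; the function name, speeds and the 40-meter following-distance threshold are invented for the example.

```python
# Illustrative sketch of how adaptive cruise control picks a target speed:
# hold the driver-selected speed unless a slower vehicle ahead is within
# the set following distance. All names and values are hypothetical.
from typing import Optional


def acc_target_speed(set_speed: float,
                     lead_speed: Optional[float],
                     gap_m: Optional[float],
                     min_gap_m: float = 40.0) -> float:
    """Return the speed ACC aims for, in the same units as set_speed."""
    if lead_speed is None or gap_m is None or gap_m > min_gap_m:
        return set_speed                   # way is clear: hold set speed
    return min(set_speed, lead_speed)      # match the slower lead vehicle


print(acc_target_speed(110, None, None))   # no car ahead -> 110
print(acc_target_speed(110, 90, 30))       # slow car close ahead -> 90
print(acc_target_speed(110, 90, 60))       # slow car, but far away -> 110
```

Real systems blend this with smooth acceleration and braking profiles, but the core decision — driver’s set speed versus the lead vehicle’s speed — is the part ACC automates.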

To date, even the most advanced systems require active supervision by the driver, but some manufacturers have oversold the capabilities of their systems, prompting drivers to treat the systems as if they can drive the car on their own, the researchers said. “In egregious cases, drivers have been documented watching videos or playing games on their cellphones or even taking naps while speeding down the expressway,” they added, citing the fatal crash in Mountain View, California, in 2018, involving a Tesla Model X and its Autopilot system as an example. 

“The way many of these systems operate gives people the impression that they’re capable of doing more than they really are,” Alexandra Mueller, an Insurance Institute research scientist who is spearheading the new ratings program, said in a statement. “But even when drivers understand the limitations of partial automation, their minds can still wander. As humans, it’s harder for us to remain vigilant when we’re watching and waiting for a problem to occur than it is when we’re doing all the driving ourselves.”

The safeguards will be rated good, acceptable, marginal or poor.

To receive the top rating of good, the Insurance Institute said, automakers should ensure that their safeguards:

  • monitor both the driver’s gaze and hand position. No technology can directly determine whether a person is focused on driving, but features that track drivers’ gazes, head postures or hand positions can assess whether they are actively engaged.
  • use multiple types of rapidly escalating alerts (chimes, vibrations, pulsing the brakes or tugging on the driver’s seat belt), delivered through various channels and with greater urgency as time passes, to remind drivers to look at the road and return their hands to the wheel when they’ve looked elsewhere or left the steering unattended for too long. Evidence shows that the more types of alerts drivers receive, the more likely they are to notice and respond.
  • slow the vehicle to a crawl or stop if the driver fails to respond, notify the manufacturer for emergency intervention, and keep automation off for the remainder of the drive.
  • make sure automated lane changes are initiated or confirmed by the driver.
  • ensure that adaptive cruise control does not automatically resume after a lengthy stop or if the driver is not looking at the road.
  • make certain that lane centering, which continuously adjusts the steering to help keep the vehicle centered in the travel lane, does not discourage steering by the driver; it should encourage the driver to share in the steering rather than switching off automatically when the driver adjusts the wheel.
  • design automated systems to prevent drivers from using them when their seat belts are unfastened or when automatic emergency braking or lane departure prevention/warning is disabled.
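The first three recommendations describe a single escalating sequence: brief inattention is tolerated, sustained inattention triggers increasingly urgent alerts, and continued non-response leads to a slowdown and a lockout for the rest of the drive. A minimal sketch of that sequence, with stage names and timing thresholds invented purely for illustration (real systems would use their own triggers and timings):

```python
# Illustrative model of the escalating driver-attention sequence the
# Insurance Institute recommends. Stage names and the 5/10/15-second
# thresholds are hypothetical, chosen only to make the escalation concrete.

def escalation_stage(seconds_inattentive: float) -> str:
    """Map continuous inattention time to the system's response stage."""
    if seconds_inattentive < 5:
        return "none"                 # attentive, or a brief glance away
    elif seconds_inattentive < 10:
        return "visual_alert"         # dashboard warning, chime
    elif seconds_inattentive < 15:
        return "multi_channel_alert"  # add vibration, brake pulse, belt tug
    else:
        return "slow_and_lock_out"    # slow to a crawl, notify the
                                      # manufacturer, disable automation
                                      # for the rest of the drive


for t in (2, 7, 12, 20):
    print(t, escalation_stage(t))
```

The key design point the group emphasizes is the final stage: once the driver has ignored every alert, the automation should not simply keep driving — it should bring the car to a safe stop and stay off until the next trip.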

“Nobody knows when we’ll have true self-driving cars, if ever,” Harkey added. “As automakers add partial automation to more and more vehicles, it’s imperative that they include effective safeguards that help drivers keep their heads in the game.” 

