Safety systems are becoming more sophisticated, prompting many design teams to monitor drivers to help determine what actions should be taken. The ability to watch drivers will also play a role in the transition to autonomous driving.
Driver monitoring is an important consideration for engineers developing advanced driver assistance systems (ADAS) that will take control of brakes and steering systems when emergencies arise. If drivers are distracted, these systems may take over sooner than they would for attentive drivers.
Determining what must be monitored to understand a driver’s attention is one of the challenges facing developers of both autonomous and ADAS systems. Some developers contend that cameras don’t have to monitor many points on the face. Simply watching whether the driver’s eyes are closing, or where they’re focused, can provide all the information that is necessary. Knowing this attention level will help determine when to warn drivers and when to take preventive action.
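The escalation logic described above can be sketched as a simple decision function. This is an illustrative example only, not any supplier's actual algorithm: the eye-closure metric is a PERCLOS-style ratio, and the thresholds are assumed values chosen for demonstration.

```python
# Hypothetical sketch: grading driver attention from an eye-closure
# ratio (PERCLOS-style) and gaze-on-road status. Thresholds are
# illustrative assumptions, not values from a production system.

def attention_action(eye_closure_ratio: float, gaze_on_road: bool) -> str:
    """Return the escalation level the ADAS should apply.

    eye_closure_ratio: fraction of recent frames with eyes closed (0.0-1.0)
    gaze_on_road: whether the gaze vector falls on the road ahead
    """
    if eye_closure_ratio > 0.4:          # sustained closure: likely drowsy
        return "intervene"               # arm preventive action early
    if eye_closure_ratio > 0.15 or not gaze_on_road:
        return "warn"                    # audible or haptic alert
    return "monitor"                     # attentive: normal thresholds
```

For example, a driver whose eyes are open but aimed away from the road would receive a warning, while sustained eye closure would trigger earlier intervention.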
“Other driver-monitoring systems, such as eye tracking, can detect if the driver is alert so warnings may be less frequent than if the driver is not focused on driving,” said Brian Daugherty, Associate Director of Advanced Development at Visteon.
This condition monitoring can extend beyond watching the driver. Some systems can watch external driving conditions, preventing distractions when drivers need to be focused on driving. Distractions such as incoming texts can be held until the driver’s cognitive load is lower.
“We also need to understand driver stress,” said Doug Patton, Senior Vice President of Engineering for Denso International America Inc. “Someone driving in a snowstorm with a semi on the left and a snowplow on the right probably won’t want to answer a phone call. Many things can wait until the driver is more relaxed.”
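The workload-management idea in the two passages above, holding distractions until cognitive load drops, can be sketched as a small notification gate. The class name, the load threshold, and the assumption that a load estimate arrives from external sensing are all illustrative.

```python
# Hypothetical workload-management sketch: incoming notifications are
# queued while estimated cognitive load is high and released once the
# driver is more relaxed. The load estimate itself is assumed to come
# from external sensing (weather, traffic density, driver state).

from collections import deque

class NotificationGate:
    def __init__(self, load_threshold: float = 0.6):
        self.load_threshold = load_threshold
        self._held = deque()

    def submit(self, message: str, cognitive_load: float) -> list:
        """Queue the message, then deliver whatever the load allows."""
        self._held.append(message)
        return self.flush(cognitive_load)

    def flush(self, cognitive_load: float) -> list:
        """Release all held messages once load falls below the threshold."""
        if cognitive_load >= self.load_threshold:
            return []                    # keep holding: driver is busy
        delivered = list(self._held)
        self._held.clear()
        return delivered
```

A text arriving during Patton's snowstorm scenario (high load) would be held, then delivered on a later low-load update.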
Many systems will have to work together to keep the driver’s eyes on the road and hands on the wheel. Today’s active-safety collision-mitigation systems will probably be integrated with new driver-state sensing and workload-management systems, as well as with connected infotainment systems.
“Driver-state sensing and real-time workload-management systems monitor the level of attentiveness of the driver and can provide alerts or scale back available connectivity content when the driver is not paying adequate attention to the road,” said Jeff Owens, Delphi’s Chief Technology Officer. “The integration of these systems is critical to the future of safe and connected driving.”
Techniques for watching drivers will remain important when engineers start making the transition to autonomous driving. Before vehicles take control of braking and/or steering systems to avoid accidents, they need to know whether the driver is ready to take control of the vehicle.
“Our philosophy today is that the driver is always able to oversteer, but that becomes more difficult when you go to semi-autonomous driving,” said Alois Seewald, Global Director, Research and Development and Cognitive Safety Integration, at TRW Automotive. “There’s a lot of discussion about what’s necessary to bring drivers back to put their hands on the wheel. Some studies say it takes four seconds, or even eight if they’re comfortably relaxed.”
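One way to act on those findings is to scale the takeover warning time to how disengaged the driver appears. The mapping below is a hypothetical sketch: the 4- and 8-second figures echo the studies Seewald cites, while the engaged-driver value and the sensing inputs are assumptions.

```python
# Illustrative takeover-lead-time sketch: choose how many seconds of
# warning to give before handing control back, based on driver
# engagement. The 4 s and 8 s values follow the studies cited in the
# article; the mapping itself is an assumption for illustration.

def takeover_lead_time(hands_on_wheel: bool, eyes_on_road: bool) -> float:
    if hands_on_wheel and eyes_on_road:
        return 2.0   # engaged: short confirmation window (assumed value)
    if eyes_on_road:
        return 4.0   # hands off but watching the road
    return 8.0       # comfortably relaxed or distracted: longest lead time
```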
Autonomous controls will need to know whether drivers are ready to take control of the vehicle or whether they’re disengaged while reading, texting, or performing other tasks. Many Tier 1 suppliers are developing systems that watch drivers to see whether they’re poised to take action if an emergency arises.
“The driver’s condition is a big part of this,” Patton said. “We have a device that tracks 17 points on the face to see if the driver’s drowsy or inattentive.”
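One common way facial landmarks like those Patton describes are turned into a drowsiness signal is the eye aspect ratio (EAR): the eye's vertical extent shrinks relative to its width as the eyelids close. The sketch below is a generic illustration of that technique, not Denso's 17-point method; the six-landmark layout and the 0.2 threshold are assumptions.

```python
# Minimal eye-aspect-ratio (EAR) sketch for drowsiness detection from
# facial landmarks. Six (x, y) points per eye and the 0.2 threshold
# are illustrative assumptions, not any supplier's actual parameters.

import math

def ear(eye):
    """EAR from 6 landmarks ordered [outer, top1, top2, inner, bot2, bot1]."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])   # lid separation
    horizontal = dist(eye[0], eye[3])                        # eye width
    return vertical / (2.0 * horizontal)

def is_eye_closed(eye, threshold: float = 0.2) -> bool:
    """A low EAR over consecutive frames suggests drowsiness."""
    return ear(eye) < threshold
```

In practice the ratio would be averaged over both eyes and tracked across frames, so that a blink is not mistaken for drowsiness.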