Sound the alarm

05-Jun-2015 10:05 EDT

Watching the driver’s eyes will help Visteon's systems determine which alert will best focus attention on the potential hazard.

Lane keeping and adaptive cruise control combine to let humans relax on highways and in traffic jams, but people are still needed to maintain safety when situations stray from normalcy. Human-machine interfaces (HMIs) are being designed to alert drivers when they need to take control and respond to events that are beyond the understanding of the electronic controls.

When the vehicle is doing the driving, people will no doubt start doing other things, like sending texts or e-mails, reading, or even dozing. But when a deer or bicyclist pops up, or when other unexpected events go beyond the electronic controls’ capabilities, drivers will have to take control.

Getting drivers’ attention back is going to be a big challenge for HMIs. Alerts must not startle people, yet they must quickly return the driver’s focus to the road. It may also help if alarms guide the driver toward the potential hazard.

“When the driver’s immediate attention is needed, there may be a loud, strange noise,” said Ingo Krueger, Business Unit Director at IAV. “If the driver is inactive because they’re doing work, a voice command may be effective for focusing their attention to the left, right, or back.”

HMI developers are trying out several types of alerts. There are three prominent techniques: audible, visual, and haptic. Many planners note that the type of alert will vary depending on whether the driver is watching the road or not.

“Within those alerts are two different types of elements: contextual and directional,” said TC Wingrove, Senior Manager, Global Electronics Innovation, at Visteon. “Contextual alerts are useful when the vehicle system recognizes you aren’t likely to notice something. For example, if eye-tracking is enabled and the driver is looking left while a bicyclist is approaching from the right, an alert would be issued. Directional alerts inform the driver which direction a potential issue could be coming from. If an ambulance is approaching from the right rear, a visual indicator and audio signal could come from the rear right speaker.”
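To make the contextual/directional distinction concrete, the sketch below shows one way such an alert arbitration could be wired up. It is a minimal illustration based on Wingrove's bicyclist and ambulance examples; the names (choose_alert, DriverState, Hazard) and the decision logic are assumptions, not Visteon's implementation.

```python
# Hypothetical sketch of contextual vs. directional alert selection.
# Names and logic are illustrative only, not any supplier's actual API.

from dataclasses import dataclass
from enum import Enum


class Direction(Enum):
    LEFT = "left"
    RIGHT = "right"
    FRONT = "front"
    REAR = "rear"


@dataclass
class Hazard:
    kind: str            # e.g. "bicyclist", "ambulance"
    approach: Direction  # where the hazard is coming from


@dataclass
class DriverState:
    gaze: Direction      # where eye tracking says the driver is looking
    eyes_on_road: bool


def choose_alert(driver: DriverState, hazard: Hazard) -> dict:
    """Directional cues point toward the hazard's source; a contextual
    escalation is added when the driver is unlikely to notice it."""
    alert = {
        "visual": hazard.approach.value,          # light up the matching side
        "audio_channel": hazard.approach.value,   # e.g. rear-right speaker
    }
    # Contextual case: gaze is off the road or away from the hazard.
    if not driver.eyes_on_road or driver.gaze != hazard.approach:
        alert["priority"] = "high"
        alert["voice_prompt"] = (
            f"{hazard.kind} approaching from the {hazard.approach.value}"
        )
    else:
        alert["priority"] = "low"
    return alert


if __name__ == "__main__":
    # Driver looking left while a bicyclist approaches from the right.
    print(choose_alert(DriverState(gaze=Direction.LEFT, eyes_on_road=True),
                       Hazard(kind="bicyclist", approach=Direction.RIGHT)))
```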

Most developers feel that it’s necessary to observe drivers to know whether they’re alertly watching the road or are distracted. Cameras can watch the driver’s eyes and head position to determine where he or she is focused.

“The next challenge is to get more technology to focus on the driver,” Krueger said. “With highly automated driving, you need to know about the condition of the driver, seeing if they’re reading the paper or have their eyes closed. You need to know how long it will take to bring the driver back to take control.”
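Krueger's point about bringing the driver back can be sketched as a simple lookup from a camera-derived attention state to the lead time an alert would need. The states and timing values below are assumptions chosen for illustration, not published figures from IAV or any OEM.

```python
# Hypothetical sketch: how much warning lead time the driver needs to
# retake control, keyed by an assumed camera-derived attention state.

TAKEOVER_LEAD_TIME_S = {
    "eyes_on_road": 2.0,   # already monitoring; a short warning may suffice
    "reading": 6.0,        # head-down task; needs an earlier, stronger alert
    "eyes_closed": 10.0,   # possibly dozing; maximum lead time and escalation
}


def required_lead_time(driver_state: str) -> float:
    """Return the assumed seconds of warning needed before handing back control."""
    # Fall back to the most conservative value for unknown states.
    return TAKEOVER_LEAD_TIME_S.get(driver_state, max(TAKEOVER_LEAD_TIME_S.values()))


if __name__ == "__main__":
    for state in ("eyes_on_road", "reading", "eyes_closed"):
        print(f"{state}: plan alerts {required_lead_time(state):.1f} s ahead")
```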
