When a future fully autonomous vehicle must hand control of the steering wheel and brake pedal back to a human during an emergency take-over in a complex situation, a distracted driver is bad news.
“The seamless exchange between automated and manual driving modes will be absolutely critical. We think having a driver who is alert and ready to take over will be fundamental to automated driving,” Jeffrey Owens, Chief Technology Officer and Executive Vice President at Delphi, said in an Automotive Engineering interview during an August media event at the Tier-1 supplier’s customer technology center in Auburn Hills, MI.
Much as a commercial jet pilot handles take-off and landing rather than leaving the autopilot in control of the plane for the entire flight, initial real-world automated driving scenarios are not likely to replace a driver’s input all of the time.
“On the big jets you still have a pilot ready to take over at any point during the flight, and there is usually a co-pilot just in case the captain has an issue. And I think that illustrates one of the reasons we’ll have difficulty getting to a confidence level of being able to extract the driver from the entire driving process,” Owens said.
Delphi’s first public showing of its automated vehicle demonstration concept was at the 2014 Consumer Electronics Show in January in Las Vegas, NV. Since that unveiling, automotive customers, consumers, and media have taken a seat in the static prototype that features various Delphi technologies in a retrofitted 2013 Tesla Model S sedan.
The mission of the demonstrator vehicle’s driver-state-sensing system is to determine whether the person in the driver’s seat is ready for the transition from automated driving.
According to John Absmeier, Director of Delphi Labs in Silicon Valley and the Global Business Director for Automated Driving, the system is in the production development phase. “The automated vehicle is really an extension of driver-assistance technologies, and many OEMs have already sourced versions of a driver-state-sensing system,” said Absmeier.
Delphi’s prototype driver-state-sensing system captures vehicle data as well as information about the driver’s behaviors. “There are algorithms that take into consideration steering, throttle, and brake inputs as well as cameras that look at the driver’s head position, eye gaze, and eye closure in order to make a decision as to whether the driver is distracted, drowsy, or even intoxicated,” Absmeier said.
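The decision logic Absmeier describes can be sketched as a simple rule-based classifier that fuses vehicle inputs with camera-derived head and eye measurements. Every signal name and threshold below is an illustrative assumption, not Delphi’s actual algorithm; a production system would use calibrated, validated models.

```python
from dataclasses import dataclass

@dataclass
class DriverSignals:
    steering_reversal_rate: float  # reversals/min; erratic steering is high (assumed metric)
    eyes_off_road_s: float         # continuous gaze away from the road, seconds
    eye_closure_pct: float         # PERCLOS-style percent of time eyes are closed
    head_yaw_deg: float            # head turned away from the driving direction

def classify_driver_state(sig: DriverSignals) -> str:
    """Return 'drowsy', 'distracted', or 'attentive' from fused signals.

    Thresholds are hypothetical placeholders for illustration only.
    """
    if sig.eye_closure_pct > 20.0:  # sustained eye closure suggests drowsiness
        return "drowsy"
    if sig.eyes_off_road_s > 2.0 or abs(sig.head_yaw_deg) > 30.0:
        return "distracted"        # gaze or head pose away from the road
    if sig.steering_reversal_rate > 15.0:
        return "distracted"        # erratic steering corrections
    return "attentive"

# Example: eyes off the road for 3.1 s trips the distraction rule.
print(classify_driver_state(
    DriverSignals(steering_reversal_rate=4.0, eyes_off_road_s=3.1,
                  eye_closure_pct=5.0, head_yaw_deg=10.0)))
# prints "distracted"
```

In practice such rules would be one layer of a larger system; the article notes the same inputs are also used to flag possible intoxication, which this sketch omits.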
If the driver is distracted, staged alerts are issued.
The first cue a driver receives in the prototype demonstrator is a flashing amber light on the dashboard directly in front of the steering wheel. If another alert is needed, a visual signal appears on the center stack infotainment screen, any infotainment content not pertinent to driving fades into the background, and the touch screen interface is disabled. If a further alert is needed, a voice message of "place both hands on the steering wheel and look ahead in the driving direction" is broadcast as the driver’s seat shakes.
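The escalation sequence above amounts to a small state machine: each stage adds stronger cues until the driver responds. The sketch below mirrors the three stages the article describes; the class structure and action names are assumptions for illustration.

```python
# Staged-alert escalation: amber dashboard light, then a center-stack
# visual with infotainment faded and touch input disabled, then a voice
# prompt with seat vibration. Stage actions follow the article's account.
STAGES = [
    ["flash_amber_dashboard_light"],
    ["show_center_stack_alert", "fade_non_driving_content",
     "disable_touch_screen"],
    ["voice_prompt_hands_on_wheel", "shake_driver_seat"],
]

class AlertEscalator:
    def __init__(self) -> None:
        self.level = -1  # -1 means no alert is active

    def escalate(self) -> list[str]:
        """Advance one stage and return its actions (held at the last stage)."""
        self.level = min(self.level + 1, len(STAGES) - 1)
        return STAGES[self.level]

    def reset(self) -> None:
        """Driver re-engaged: clear all active alerts."""
        self.level = -1

esc = AlertEscalator()
print(esc.escalate())  # prints ['flash_amber_dashboard_light']
```

Holding at the final stage rather than cycling reflects the intent of the design: cues only intensify until the driver takes control, at which point the system resets.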
“The concept of full automation, where the driver is fully out-of-the-loop, is years out from being a production application. In the near-future, though, there will be limited-use cases such as low-speed traffic jams or expressway driving at highway speeds. So it will be important to develop technology that will help transition control back to the driver and make sure that the driver takes control of the vehicle,” said Absmeier.