Rear- and forward-facing cameras have made big inroads, but cameras that monitor interiors haven’t seen much acceptance. That may change, driven in part by the shift to semi-autonomous driving.
Years ago, interior cameras seemed poised for growth as a tool for occupant sensing for airbag deployment. That market didn’t materialize, nor did drowsy-driver monitoring, another potential application.
The push to add more autonomy on freeways or in traffic jams is now a major factor for putting cameras in cabins. Semi-autonomous driving systems perform well at keeping vehicles in their lanes and at safe distances from the cars ahead, but electronic controls aren’t yet capable of handling many unexpected events. Many observers feel cameras can help alert drivers when they’re needed, giving this market a bright future.
“I think we’ll start seeing some cameras inside the vehicle within three to five years,” said John Capp, Director, Global Vehicle Safety at General Motors.
Tier 1s are working hard to make this happen, in both the short and long term.
“We are working actively with multiple OEMs in various areas of development of interior camera systems—advanced development or concept studies, pre-development for next generation vehicles, as well as applications for series production vehicles,” said Tejas Desai, Head of Interior Electronic Solutions, Continental North America.
The cameras will help semi-autonomous systems determine whether they need to alert drivers when human cognitive skills are required or whether an alert may prove more distracting than beneficial. To do that, the safety system needs to know whether the driver’s paying attention, dozing, texting, or otherwise distracted.
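The decision described above can be sketched in a few lines. This is purely illustrative, not any OEM's actual logic; the state names and the rule that an already-attentive driver is not re-alerted are assumptions for the sake of the example.

```python
# Illustrative sketch of the alert decision described in the article:
# warn the driver only when a takeover is needed AND the warning would
# help rather than distract. State names are hypothetical.

from enum import Enum

class DriverState(Enum):
    ATTENTIVE = "attentive"
    DROWSY = "drowsy"
    DISTRACTED = "distracted"  # e.g., texting

def should_alert(takeover_needed: bool, state: DriverState) -> bool:
    """Suppress alerts that would only distract an already-attentive driver."""
    if not takeover_needed:
        return False
    return state in (DriverState.DROWSY, DriverState.DISTRACTED)

print(should_alert(True, DriverState.DISTRACTED))  # True
print(should_alert(True, DriverState.ATTENTIVE))   # False
```

A production system would of course grade the alert (visual, audible, haptic) rather than make a binary choice, but the camera's role is the same: supplying the driver-state input.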
“An inward-looking camera knows whether the driver is engaged or disengaged; you know if the driver is yawning or not paying attention to the road,” said Walter Sullivan, Head of Elektrobit’s Silicon Valley Innovation Lab.
Once cameras are installed, they may take over jobs now handled by other sensors. For example, a driver-monitoring camera may be used to see if passengers are in a safe position for airbag deployment. The safety factor makes the interior a high-reliability environment.
“Interior cameras need very high reliability,” Capp said. “Incorrectly deploying an airbag has a higher potential for problems than a false alert for a braking system.”
While conventional CMOS (complementary metal-oxide semiconductor) cameras will dominate this application, an alternative technology may make inroads, partly because it can handle gesture recognition along with other tasks. Time-of-flight cameras work like radar, sending out light beams and measuring the time it takes for the light to return.
“Time-of-flight technology enables 3D imaging so it can be used for gesture recognition as well as for occupant detection and driver monitoring,” said Gaetan Koers, Product Line Manager at Melexis. “Time-of-flight cameras are a couple generations away from CMOS imagers, so the volumes are still relatively small, but there will be an introduction on a car later this year.”
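The underlying principle is simple enough to show in a few lines: distance is recovered from the round-trip time of the emitted light. The pulse time below is a hypothetical illustrative value, not a figure from the article.

```python
# Minimal sketch of the time-of-flight principle: an emitted light pulse
# reflects off a surface, and distance is half the round-trip path length.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting surface from a measured round-trip time."""
    return C * round_trip_time_s / 2.0

# A reflection returning after ~6.67 ns corresponds to roughly 1 m:
print(round(tof_distance_m(6.67e-9), 2))  # 1.0
```

The sub-nanosecond timing this implies is why real time-of-flight sensors measure phase shift of modulated light rather than timing individual pulses directly, and it is part of why the technology trails CMOS imagers in maturity.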
While cameras may make inroads, they aren’t the only technology suitable for driver monitoring. As with occupant sensing years ago, alternative sensors may be used to observe drivers. Some sensors are already on vehicles, so using them only requires the development of clever software.
“You can measure steering changes and pressure on the gas pedal; when people are drowsy, their foot often eases back and they make fewer and bigger steering corrections,” said Martin Duncan, Business Unit Director at STMicroelectronics. “Sensors on the steering wheel and maybe even heart-rate monitors may also be used to monitor drivers.”
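The steering signature Duncan describes can be turned into a rough heuristic: drowsy drivers make fewer, larger corrections. The sketch below is a simplified assumption of how such software might work; the thresholds and the definition of a "correction" as a reversal in steering direction are illustrative, not from any production system.

```python
# Hedged sketch of a steering-based drowsiness heuristic: few but large
# steering reversals in a sample window suggest a drowsy driver.
# Thresholds and the reversal definition are illustrative assumptions.

def correction_stats(wheel_angles):
    """Count direction reversals and their mean magnitude over a window
    of steering-wheel angle samples (degrees)."""
    deltas = [b - a for a, b in zip(wheel_angles, wheel_angles[1:])]
    corrections = [abs(d) for prev, d in zip(deltas, deltas[1:])
                   if prev * d < 0]  # sign flip = a steering reversal
    mean_mag = sum(corrections) / len(corrections) if corrections else 0.0
    return len(corrections), mean_mag

def looks_drowsy(wheel_angles, max_corrections=3, min_magnitude=4.0):
    """Flag a window with few but large corrections (hypothetical thresholds)."""
    count, mean_mag = correction_stats(wheel_angles)
    return count <= max_corrections and mean_mag >= min_magnitude

alert_driver = [0, 1, 0.5, 1.2, 0.8, 1.5, 1.0, 1.8]  # many small reversals
drowsy_driver = [0, 2, 7, 1, -4, 3]                  # few large reversals
print(looks_drowsy(alert_driver), looks_drowsy(drowsy_driver))  # False True
```

In practice such a classifier would fuse steering data with pedal pressure and, where available, camera input, which is exactly the redundancy argument made later in the article.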
Another major challenge for cameras is that there’s no good way to illuminate interiors without using the infrared spectrum. CMOS cameras have some infrared capability, though it may not be enough given the low lighting levels of many vehicle cabins. That could force developers to add IR cameras, which drives up cost.
“At night you don’t get much reliable information unless you have infrared light,” said Shalini Gupta, Senior Research Scientist at NVIDIA Research. “We developed a multi-sensor system with color and depth sensors along with a short-range radar. By having redundant sensors, you always have at least two sensors so you get reliable information.”