Volkswagen made its first-ever appearance at the Consumer Electronics Show in Las Vegas in January 2015. In a display of Vegas-style razzle-dazzle, the German automaker unveiled the Golf R Touch concept vehicle, demonstrating how intricate contactless gestures can control infotainment and cabin features.
VW used the term “switchless” to describe the concept car. The vehicle demonstrated the technical feasibility of, for example, a driver controlling music volume by pointing one finger toward a touchscreen, several inches away from the glass, and sliding it left or right. The same sliding motion with two fingers adjusted the voice-navigation volume, and with three fingers, the phone volume. A swipe of the entire hand to the right advanced to the next song; a swipe back to the left returned to the previous tune. Similar gestures allowed contactless operation of the sunroof, lighting, and mirrors, turning the air space in front of the center console into a field for gesticulation.
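The finger-count mapping VW demonstrated amounts to a simple dispatch table. The sketch below is purely illustrative (not VW's software); the channel names, volume range, and function signature are assumptions made for the example.

```python
# Toy sketch of the Golf R Touch finger-count gesture mapping described
# above. Channel names and the 0-100 volume range are illustrative
# assumptions, not VW's actual implementation.

VOLUME_CHANNELS = {
    1: "media",       # one finger: music volume
    2: "navigation",  # two fingers: voice-navigation volume
    3: "phone",       # three fingers: phone volume
}

def adjust_volume(volumes, fingers, delta):
    """Apply a left/right slide (negative/positive delta) made with
    `fingers` raised; unrecognized gestures are ignored."""
    channel = VOLUME_CHANNELS.get(fingers)
    if channel is None:
        return volumes  # not a volume gesture: no change
    updated = dict(volumes)
    updated[channel] = max(0, min(100, updated[channel] + delta))
    return updated

volumes = {"media": 50, "navigation": 50, "phone": 50}
volumes = adjust_volume(volumes, fingers=1, delta=+10)  # one-finger slide right
print(volumes["media"])  # 60
```

The point of such a table-driven design is that each added gesture is one new entry, not a new code path, which is one way an automaker could keep an expanding gesture vocabulary maintainable.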
So, is VW ready to ditch buttons and knobs and introduce Nintendo Wii-like means for vehicle operations? Not quite.
“We implemented the Golf R Touch to get our heads around gestures,” said Dr. Andreas Titze, head of Volkswagen’s electrical development for the interactive electronics department, based in Wolfsburg, Germany. “We pushed the concept of a switchless gesture-controlled car to the limit in order to learn what we can do, what we should do, and what our customers want us to do.”
Make no mistake: Volkswagen and other manufacturers experimenting with gesture-based human-machine interfaces (HMI) are fully vetting these concepts with customers, making sure that familiar controls are preserved before new systems are introduced. “We are not playing with customers or the cars,” said Titze. “We are about rock-solid high customer value.”
At the same time, the entire industry is doing its best to respond to the evolving ways consumers use phones, tablets, and other electronic devices. “Our customers are changing, and we are adjusting to that,” he said.
Benjamin Oberkersch, a spokesman for research, development, and environmental communications at Daimler, said that Mercedes-Benz concept cars introduced controls via touchpads, cameras, voice, and eye tracking as early as 2010, but for evaluation, not immediate implementation. “We always show our latest innovations, but long before these technologies are used in series production cars,” he wrote in an email.
Market demand, and confusion
“It’s not like you have crowds of people with pitchforks storming the capital,” said Mark Boyadjis, Senior Analyst of HMI and Infotainment at IHS Automotive, a market research firm. “People aren’t demanding gesture recognition, but they are demanding an intuitive user interface.”
The problem, according to both Titze and Boyadjis, is that offering full-blown gesture control could add rather than reduce user confusion. Boyadjis cited lack of industry standards as a stumbling block. “A left swipe in a Toyota could mean something totally different than a left swipe in a Hyundai,” he said. Even if standards are quickly developed, users will still face a learning curve—not something you want to encounter while speeding down the highway.
Proximity sensors are close
Gesture controls can be viewed as an extension of touchscreens, which have evolved from resistive (requiring a firm push on the screen), to capacitive (needing only a light touch of the finger), to “touch with no touch,” in which fingers can be close to the screen but not actually in contact with it.
The use of proximity sensors started in 2012 with the Cadillac Cue system, followed by Volkswagen in 2014, which put proximity sensors in the standard radio units of the seventh-generation Golf.
Proximity sensors aren’t only about making targets easier to press. They also allow interfaces to change as your hand approaches. For example, if you are using a navigation system, all the screen real estate could be devoted to the map to maximize legibility; when your hand approaches, additional menus or inputs are offered. Similarly, eye tracking could be used to anticipate a desired function simply by noting where the driver is looking.
Infrared sensors are sufficient for proximity or crude whole-hand gesture recognition of up, down, left, and right. However, more robust gestures that recognize individual fingers will require either a high-resolution stereo camera or a combination of sensors.
Delphi announced at the 2015 CES that its technology—which uses an overhead 3D infrared (mono) camera—will be put into production this year for finger recognition in select BMW models for the European market. “It’s the same basic technology as [Microsoft Xbox] Kinect,” said Doug Welk, Delphi’s Chief Engineer of Advanced Entertainment and Communications.
“This is the first gesture module to go to market. It’s a start,” said Welk. “We see a lot of interest in the ability to control the vehicle through non-tactile things, particularly in Europe where displays are very forward, and not convenient to reach out and touch.”
IHS forecasts that relatively simple proximity sensors will rise from 43,000 units in 2012 to a whopping 18 million by 2021. Over the same period, Boyadjis expects whole-hand gesture systems to grow to 4.3 million units, and the full battery of gesture controls to reach about 2 million.
Where to draw the line
Nobody expects that gestures will be used to control steering, acceleration, and braking—and already there are signs of backpedaling.
Boyadjis said that Ford discovered that customers were confused by overlapping speech, steering wheel, and touchscreen options, so it decided against complex haptics, proximity, and other controllers in its new Sync 3 platform. “Ford appears to be backing away from multimodal, and solidifying touchscreen as the primary input mechanism, with voice as a close second,” said Boyadjis.