Gesture control is becoming commonplace in many of the devices we use daily, from games consoles and the swipe-and-pinch interfaces of mobile phones to point-of-sale devices such as ticketing systems in train stations. The automotive segment is already adopting gesture control for infotainment systems, as well as for human-machine interfaces (HMIs) to subsystems such as the sunroof, climate control and audio.
BMW’s latest 7 Series and 5 Series offer an HMI system that recognizes gestures for four functions: setting the car’s navigation, browsing apps and starting the audio, answering phone calls, and controlling the on-board computer. Volkswagen last year announced gesture controls on its Golf and pledges to spread the technology across its model ranges. Other major OEMs have the technology in their product pipelines.
Early reviews of gesture controls from the road-test media have been a mixed bag. While reviewers have been impressed with the technology, they have not been thoroughly pleased with the functionality. In the 7 Series, for example, the driver can change the audio volume with a hand-circling motion, answer or dismiss a phone call with a left or right swipe, and use a two-fingers-down swipe to trigger a user-configurable function.
For designers and engineers developing and integrating new HMIs, one challenge is letting the driver manipulate gesture-controlled equipment without taking their eyes off the road. The logical solution is virtual controls combined with mid-air haptic feedback. Future systems from Ultrahaptics could offer designs that are far more flexible.
The missing link
How can gesture-controlled interfaces integrate with the automobile’s infotainment and comfort-control systems in ways that are safe, simple and trustworthy? Current forms of touchless gesture control fall short on the key requirement for effective human-machine communication: feedback.
Voice-controlled systems can be difficult to operate; often exact phrases must be memorized and long menu chains navigated to obtain the desired effect. Voices vary so much from one user to another that the recognition software can make mistakes when ‘listening’; the result can be a frustrated driver who reverts to traditional controls.
Touch is a modern form of control and works to a point. The disadvantage is that drivers can lose sight of the road for precious seconds while manipulating the infotainment or comfort system controls. Again, it is often difficult to navigate multiple nested menus and options while driving safely.
Tried and true physical controls are becoming increasingly sophisticated with multi-function switches, navigation knobs and selection buttons. There is a lot of wiring and hardware for designers and engineers to consider here, not to mention packaging/real estate and placement.
Gesture control has become a simple and familiar approach to system control and is used to good effect in many industries. Mid-air gesture control, most commonplace in gaming systems and virtual reality, is well suited to the automotive environment: the driver can manipulate vehicle amenities while maintaining vigilance, seldom needing to glance away from the road. The shortcoming of mid-air gesture control lies in its lack of feedback; the driver may never know where the controls are, or whether a command was confirmed and executed. But that is about to change thanks to new developments.
Leap Motion Sensing is the key
Ultrahaptics' mid-air haptic technology is currently unique in the industry. The haptic system allows a user to feel and manipulate virtual objects in mid-air as if they were touching real physical controls.
Currently, the system's motion tracking is based mainly on the Leap Motion sensor, which tracks the user's hand in free space. The tactile interface is generated by an array of ultrasonic transducers, similar to those used in the reverse-warning systems of most modern cars.
The transducers generate ultrasonic waves at 40 kHz that interfere constructively where the waves meet. These interference points can be manipulated to create invisible turbulence points that the user can feel. The secret to controlling the touch sensation lies in sophisticated algorithms that shape this interference by modulating the ultrasound beams.
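The phase arithmetic behind that constructive interference can be sketched in a few lines. The Python sketch below assumes a flat 16 x 16 array with a 10-mm element pitch (the pitch and geometry are illustrative, not published Ultrahaptics specifications) and computes the per-transducer phase delay that makes all 256 waves arrive in phase at a chosen focal point:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 deg C
FREQUENCY = 40_000.0     # Hz, the ultrasonic carrier named in the article
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY  # ~8.6 mm, matching the quoted focal size

def focus_phases(transducer_positions, focal_point):
    """Phase delay (radians) per transducer so every wave arrives
    in phase at the focal point, i.e. interferes constructively."""
    fx, fy, fz = focal_point
    phases = []
    for (x, y, z) in transducer_positions:
        distance = math.sqrt((fx - x)**2 + (fy - y)**2 + (fz - z)**2)
        # A wave travelling distance d accumulates 2*pi*d/lambda of phase;
        # drive each element with the negative of that so the phases cancel
        # at the focus and the pressure peaks there.
        phases.append((-2 * math.pi * distance / WAVELENGTH) % (2 * math.pi))
    return phases

# 16 x 16 array centred on the origin, focal point 20 cm above the centre.
pitch = 0.010  # 10-mm element spacing (assumption)
positions = [((i - 7.5) * pitch, (j - 7.5) * pitch, 0.0)
             for i in range(16) for j in range(16)]
phases = focus_phases(positions, (0.0, 0.0, 0.20))
```

Elements symmetric about the array centre sit at equal distances from an on-axis focus, so they receive identical phases; moving the focal point simply recomputes the delay pattern.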
Using a 16 x 16 array, haptic sensations can be produced up to 1 m (3.2 ft) away, with a focal-point accuracy of 8.6 mm (0.33 in) in diameter, within a finger’s width.
The result of this modulation is that the user feels pressure where the beams focus on the hand. For example, a tightly focused beam could simulate the sensation of a curved knob or button, while a rapid modulation pattern would feel much like rubbing over a corrugated surface.
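As a rough illustration of that modulation: skin is largely insensitive to a steady 40-kHz carrier, so a tactile sensation is generally produced by varying the carrier's amplitude at a low frequency the skin can feel. The 200-Hz envelope below is an illustrative choice, not a published Ultrahaptics parameter:

```python
import math

CARRIER_HZ = 40_000.0    # ultrasonic carrier from the article
MODULATION_HZ = 200.0    # illustrative tactile frequency; the hand senses
                         # the slow envelope, not the carrier itself

def pressure_sample(t, mod_hz=MODULATION_HZ):
    """Instantaneous drive signal at time t (seconds): the 40-kHz
    carrier amplitude-modulated by a slow sinusoidal envelope."""
    envelope = 0.5 * (1.0 + math.sin(2 * math.pi * mod_hz * t))  # 0..1
    return envelope * math.sin(2 * math.pi * CARRIER_HZ * t)
```

Changing the envelope shape or frequency changes the perceived texture, which is how a single array can make one focal point feel like a smooth button and another like a ridged surface.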
In the case of automotive infotainment systems, progression toward gesture-based controls seems to be the natural industry evolution. Most consumers are already familiar with the control method. Mid-air gestures are well suited for the automotive environment. What might such gesture control look like?
One implementation might be to create an interaction zone somewhere central to the dashboard (in a traditional placement). If the driver passes their hand through an acoustic barrier, they could actively feel where the boundaries of the interaction zone are. They would then know they can make appropriate gestures to control the infotainment system.
Another method might track the driver's hand. Imagine the driver places a hand within an interaction zone; using the system's cameras and image processing, a virtual control could then be locked to the hand. Once armed by a predefined gesture, the ultrasonic transducers could keep a virtual button or knob continually available to the driver, even as the hand moves around within the interaction zone.
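Such a hand-locked control might be sketched as follows. The `HandFrame` type, the gesture name and the zone bounds are all hypothetical stand-ins for whatever the tracker actually reports; this is not the Leap Motion API:

```python
from dataclasses import dataclass

@dataclass
class HandFrame:
    """Hypothetical per-frame tracker output."""
    palm: tuple      # (x, y, z) palm position in metres, array-relative
    gesture: str     # e.g. "two_fingers_down" (illustrative label)

# Assumed interaction-zone bounds in metres, centred above the array.
ZONE_MIN = (-0.15, -0.15, 0.05)
ZONE_MAX = ( 0.15,  0.15, 0.50)

def in_zone(p):
    return all(lo <= c <= hi for c, lo, hi in zip(p, ZONE_MIN, ZONE_MAX))

def button_focus(frame, offset=(0.0, 0.0, 0.02)):
    """Return the haptic focal point for a virtual button hovering 2 cm
    above the palm, or None when the hand leaves the interaction zone
    or the arming gesture is absent."""
    if frame.gesture != "two_fingers_down" or not in_zone(frame.palm):
        return None
    return tuple(c + o for c, o in zip(frame.palm, offset))
```

Run once per tracker frame, this keeps the focal point glued to the palm, so the driver never has to hunt for a fixed control location.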
With these new design concepts, HMI for driver control of the HVAC, power windows, sunroof, seat adjustments and other systems could be integrated easily into the infotainment center, creating a new kind of all-inclusive control nexus within the cockpit.
Development of integrated, stylish and easy-to-use infotainment and comfort-control systems is now limited only by the designer's imagination. Displays could be set up in a conventional way, with a primary central screen that contains all the visual feedback the driver needs. This would leave controls as familiar to drivers as they have been for years.
Alternatively, automotive interior designers could explore new approaches to in-car infotainment. With new curved screen technologies, display information could integrate into the dash almost anywhere, in almost any shape. The ultrasonic sensor array could be placed appropriately to allow for effective mid-air feedback, creating an innovative, modern interior design.
A designer could also go as far as exploring new technologies. Head-up display options may be viable, or perhaps laser display projectors within the windshield. All are plausible options due to mid-air gesture control and feedback. The result is virtually endless design possibilities.
The technology that Ultrahaptics has introduced may well transform the way vehicle drivers control the technology at their disposal. Users bringing gesture experience from other tech gadgets will shorten the learning curve and make systems more intuitive to use. Line-of-sight demands would be reduced, leaving the driver’s eyes trained on the road rather than on various in-car distractions. And under the ‘skin,’ control systems can be made less complex in terms of physical wiring, mechanical mechanisms and the bill of materials.
David Owen is VP Business Development at Ultrahaptics. A graduate engineer, he has held senior management roles with Philips, GEC Plessey and Ferranti.