The best way to understand how a newly developed vehicle and its parts will look, fit together, and be built and serviced is through a full-scale prototype. Fashioning a prototype, however, is often too slow, too expensive, or completely impractical early in the development process, when so many decisions remain in flux.
That's one reason Virtual Reality and Augmented Reality technologies—often shorthanded to VR and AR—are emerging as immensely useful development tools.
“Using these technologies helps in the decision-making process,” explained Joe Guzman, Engineering Group Manager for Global Virtual Design for General Motors. “You are able to quickly mock up digitally and in 3D what the physical product will look like two years later,” he said.
Creating a VR prototype takes hours or maybe days—compared to weeks or months for a physical prototype, Guzman said.
But why take on the extra expense of VR or AR technologies when two-dimensional screen renderings of the CAD data are easily accessible? Although CAD renderings are good, they fall short when it comes to human interaction: the ability to walk around a vehicle, sit in it, or understand how to reach a part. Virtual reality and AR provide a sense of how a product will interact with the humans who use and build it, and how it will live in a three-dimensional world.
“It is especially useful for executives and program management, people who are responsible for the whole vehicle or the line of vehicles, but make no mistake, this technology is useful for every part of the business,” stated Guzman. That includes trainers for the assembly plant, D&R engineers trying to understand assembly and serviceability, and marketers creating brochures. Even physical effects, such as how well flush-and-gap conditions show up against lighter or darker paints, can be evaluated. “People use this from well before program kickoff all the way through to Job 1 and sometimes longer,” he said.
Guzman related that GM finds the company's four-sided, six-foot CAVE environment (see photo) particularly useful; any number of individuals can move around and observe a 3-D projection of an entire vehicle. He was also quick to point out that the company takes advantage of all levels of available VR, including the less-expensive mixed-reality headsets aimed at individual users.
“We are looking for strategic partners to help us,” he said, noting that a technology developed for the home market might have limitations in durability or safety in a work setting, hence the need for further development.
Collaboration and systems engineering
What currently fuels excitement in this area is the growing pace of development, especially for head-mounted systems. “I have been working in this technology for a long time and improvements were steady but small until about five years ago,” remarked Elizabeth Baron, Virtual Reality and Advanced Visualization Technical Specialist for Ford. “The big leap has been in headsets from companies like Oculus with its Rift and HTC with its Vive. Developed for gamers, those two headsets started this next revolution.”
While acknowledging the usefulness of 3-D VR and AR to any single individual, the contributions to systems engineering and collaboration are what make the technologies special for her. “It allows multiple disciplines in the company to communicate effectively,” she stated. An expert working on ergonomics and visibility can effectively talk to body structure engineers worrying about crash and safety as well as, say, an electrical engineer concerned with wiring-harness placement and clearance.
But increasing reliance on VR and AR for systems engineering requires careful selection of the CAD data to use, according to Baron. “By taking as much information from all of the engineering areas as we can, aimed at systems engineering, we get this immersive environment that crosses all of those disciplines,” she explained. Collaboration reaches across continents as well as disciplines: a Ford employee in Australia can interact through the VR room in real time with a colleague in Dearborn, MI.
The data also can include manufacturing processes and tolerances; users can see in 3-D the visual effect of GD&T tolerances from part to part at their extremes, or twist deformations from assembly. In the mathematical world of tolerance stack-up calculations, it is difficult to understand those effects as a customer would see them.
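To make the stack-up idea concrete, here is a minimal sketch of the two classic tolerance stack-up calculations; the part names and values are invented for illustration and are not taken from Ford's data:

```python
import math

# Hypothetical three-part stack: (name, nominal size in mm, +/- tolerance in mm)
parts = [
    ("bracket", 25.0, 0.20),
    ("spacer",   4.0, 0.05),
    ("panel",   12.0, 0.15),
]

nominal = sum(n for _, n, _ in parts)

# Worst-case stack-up: every tolerance at its extreme simultaneously.
worst_case = sum(t for _, _, t in parts)

# Statistical (root-sum-square) stack-up: tolerances treated as
# independent random variations, giving a tighter, more realistic band.
rss = math.sqrt(sum(t * t for _, _, t in parts))

print(f"nominal:    {nominal:.2f} mm")
print(f"worst case: +/- {worst_case:.2f} mm")
print(f"RSS:        +/- {rss:.3f} mm")
```

The numbers show why engineers want to *see* the extremes in 3-D: the worst-case band (0.40 mm here) is what a customer could conceivably encounter at a flush-and-gap condition, yet it is hard to picture from the arithmetic alone.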
Interestingly, the room in which Baron demonstrated Ford's key VR technology was not a more-involved CAVE; instead, it was one equipped with a "powerwall" and multiple fully immersive headsets. Donning a headset, an individual is enveloped by the 3-D environment, while the powerwall displays what the headset-wearer sees for the many others who might be attending a meeting. She also noted Ford continues to use actual parts and devices created on 3-D printers to mix into its immersive environments. “We might print out an instrument panel with knobs the user can touch, then virtually put colors and textures on the panel,” she explained.
“The user experience is the centerpiece of our VR technology,” Baron stated. “We believe that Ford is leading in the use of real-time ray tracing (a computationally intensive technique that creates scenes with realistic reflections and other specular effects) for immersive VR reviews.”
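The computational core of ray tracing is firing a ray from the eye through each pixel and solving for where it first strikes the geometry; that per-pixel intersection math, repeated millions of times per frame, is what makes the technique so expensive in real time. A minimal, self-contained sketch of the basic ray-sphere test (not Ford's renderer, just the textbook calculation):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    a quadratic in t. `direction` is assumed to be a unit vector.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None  # nearest hit in front of the origin

# A ray down the z-axis toward a unit sphere centered 5 units away:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

Production renderers extend this with reflection and refraction rays bounced from each hit point, which is where the realistic specular effects Baron mentions come from.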
Technology pull versus technology push
“The interesting thing is that in the last couple of years, customers have gotten ahead of us and are pushing us to develop AR and VR,” stated Mohsen Rezayat, Chief Solutions Architect for Siemens PLM. According to Rezayat, these customers see the potential and are eager to explore more; they find the technologies reduce error, increase training effectiveness, reduce build time, and facilitate the transfer of knowledge.
Rezayat, like others speaking here, points to the recent development of individual head-mounted displays as spurring development. “I think their manufacturers have begun to recognize that AR, in particular, is going to find its niche in business first,” he said, citing cost and aesthetics as barriers to wide consumer acceptance.
Dedicated to research, his group is looking ahead, particularly at the power of mixed-reality technology such as might be used in electronic work instructions.
“All of the data is right in front of you and displayed next to what you are trying to do,” he explained. “As we get closer to anchoring the real and virtual, we can display right on the vehicle how to install complex assemblies like a wire harness, where they can clearly see where the connections need to be and eliminate mistakes.”
Another scenario is mixed VR and AR, with an expert at a remote central location interacting with shop-floor or repair personnel. The remote expert could be fully immersed in VR, while the other needs to remain engaged with the immediate surroundings. Siemens is working on latency and data-transfer issues to enable real-time interaction for these kinds of remote connections.
Are there issues that still need to be addressed for even more adoption of VR and AR? “Proving ROI is important. With some wearables costing $5000 apiece, it might be hard to justify hundreds or thousands of units,” Rezayat speculated. Other key issues he pointed to were adaptability (some users can be subject to vertigo), battery life, limited field of view, insufficient memory to handle very large models—as well as the unknown effect of wearing such devices for many hours.
“There are challenges, but I want to emphasize that long-term there will be real benefits of AR and VR,” he said.
Action and interaction
Another focus is improving virtual interactions without relying solely on mixed realities, which is what the ESI Group is doing with its IC.IDO product, according to Eric Kam, the company's Product Marketing Manager for Immersive Experience solutions. As the company's flagship VR product, IC.IDO (pronounced “I see, I do”) provides—in addition to good visualization tools—a method to interact with virtual CAD designs inside virtual environments.
ESI's premise is that only through interaction will engineers understand key issues like product operation, assembly, and serviceability—while working exclusively with digital geometry means doing it faster and at lower cost. Physical limitations are imposed through real-time physics, such as solid mechanics, collision detection, and elastic and kinematic simulations.
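Collision detection in such environments typically starts with a cheap broad-phase test before any expensive mesh-level check. A minimal sketch of the standard axis-aligned bounding-box (AABB) overlap test (a generic illustration of the technique, not ESI's implementation):

```python
def aabb_overlap(a, b):
    """Axis-aligned bounding-box test, the usual first pass in collision detection.

    Each box is ((min_x, min_y, min_z), (max_x, max_y, max_z)).
    Two boxes intersect only if their extents overlap on every axis.
    """
    (amin, amax), (bmin, bmax) = a, b
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

# Two unit cubes, the second shifted 0.5 along x: they interpenetrate.
cube_a = ((0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
cube_b = ((0.5, 0.0, 0.0), (1.5, 1.0, 1.0))
print(aabb_overlap(cube_a, cube_b))  # → True

# Shift the second cube fully clear along x: no contact.
cube_c = ((2.0, 0.0, 0.0), (3.0, 1.0, 1.0))
print(aabb_overlap(cube_a, cube_c))  # → False
```

Only pairs that pass this broad phase are handed to the precise (and far more costly) geometry-level checks, which is how real-time rates stay achievable with full vehicle models.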
How IC.IDO provides the “do” part is through a pair of grips the user sees in the VR world as disembodied hands. A trigger on the grip keys the hand to “grab” onto objects and move them in the virtual world.
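The grab-and-move interaction described above amounts to a small state machine: press the trigger near an object to attach it to the hand, move the hand to carry it, release to drop it in place. A toy sketch of that logic (hypothetical class and field names, not ESI's API):

```python
class VirtualHand:
    """Toy model of a trigger-driven grab/release interaction in VR."""

    def __init__(self):
        self.held = None  # object currently grabbed, if any

    def trigger_pressed(self, nearby_object):
        # The trigger keys the hand to "grab" an object within reach.
        if self.held is None and nearby_object is not None:
            self.held = nearby_object

    def move(self, position):
        # While grabbed, the object follows the hand through the virtual world.
        if self.held is not None:
            self.held["position"] = position

    def trigger_released(self):
        # Releasing the trigger drops the object where it is.
        self.held = None

hand = VirtualHand()
bolt = {"position": (0.0, 0.0, 0.0)}
hand.trigger_pressed(bolt)
hand.move((0.2, 0.1, 0.0))
hand.trigger_released()
print(bolt["position"])  # → (0.2, 0.1, 0.0)
```

Real systems layer the physics constraints described earlier on top of this, so a grabbed part stops when it collides with other geometry rather than passing through it.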
A demonstration amply proved that one feels truly immersed in a virtual world when wearing a head-mounted display and using hand-like grips to move and manipulate objects. In many instances, the demonstration showed that interacting with the virtual world was indeed intuitive and effective for understanding how parts move and fit with one another. The system did not include haptic cues, which would have been useful; Kam noted that both ESI and the community at large are working to incorporate haptic feedback.
Kam emphasized that IC.IDO is scalable, from a desktop with a 2-D monitor to a fully immersive experience through a head-mounted display or larger-scale projected VR CAVEs. “Until this latest generation of head-mounted displays came out, it required expensive equipment in special rooms with trackers; now engineers can do their own validation at their own workstation,” he said. “I like to think we are democratizing virtual reality with this product and others we are in the process of developing.”