SAE 2014 Augmented and Virtual Reality (AR/VR) Technologies Symposium

Technical Session Schedule

Wednesday, November 19

Day Two: VR
(Session Code: ARVR2)


Time Paper No. Title
8:30 a.m. ORAL ONLY
Keynote: Beyond the Hype-cycle: VR/AR Opportunities and Challenges Ahead
Virtual and augmented reality (VR/AR) technologies have matured into viable problem-solving tools for a wide spectrum of industrial applications. However, despite this modest commercial success, the overall impact of these technologies is just beginning. In fact, the US National Academy of Engineering recently named “Enhanced Virtual Reality” as one of fourteen Grand Challenges of the 21st Century. This presentation describes recent advances in the technologies enabling VR/AR, summarizes the opportunities afforded by them, and outlines some of the open research challenges in VR/AR and other emerging interface technologies.
James H. Oliver, Iowa State Univ.
9:00 a.m. ORAL ONLY
Using AR for Product Development & Manufacturing Process Validation at Audi
During the development phase of new vehicles, prototypes are traditionally assembled manually by operators. Two of the main challenges for workers are positioning flexible parts (such as brake tubes, cables, and wiring) in exactly the right location and verifying that all parts are present during assembly. To complete this process, operators currently use drawings to compare the real prototype with the virtual CAD data. One issue is that no measurement tool can verify the exact position of flexible parts within a reasonable time frame. Augmented Reality solutions stand to accelerate the detection of construction and design errors while helping operators place components or flexible parts in the exact position required. By overlaying the virtual 3-D CAD model on the real prototype with high accuracy, operators can quickly compare potential points of failure and detect whether any components are missing. Furthermore, the system allows designers to preview or evaluate design alternatives at real scale while minimizing the production of physical prototypes. The keynote will focus on AR solutions supporting industrial processes and will be illustrated with a highlight use case developed for AUDI AG. By employing this solution, enterprises can save cost and time during the prototype assembly phase, detect construction and design errors at an early stage, and reduce the number of prototypes required during development.
Thomas Jouhanneau, Metaio
9:30 a.m.
10:00 a.m. ORAL ONLY
Digitalized Expertise: Acquiring and Transferring Workflow Knowledge Using Augmented Reality
Workflow knowledge comprises both explicit, verbalizable knowledge and implicit knowledge, which is acquired through practice. Learning a complex workflow therefore benefits from training with a permanent corrective. Augmented Reality manuals that display instructive step-by-step information directly into the user's field of view provide an intuitive and provably effective learning environment.
Nils Petersen, DFKI
10:30 a.m. ORAL ONLY
Making the Business Case for AR/VR: Lessons learned from Aerospace and Energy
NGRAIN is working on two significant AR/VR project implementations in the aerospace and energy industries. The aerospace industry project is centered on the combined use of 3D AR/VR and visual analytics to address fundamental problems in the manufacturing of large-scale composite aerospace components, while the energy industry project is centered on using AR for field-based performance support and the integration of field operations/maintenance personnel with enterprise logistics and asset management systems.

Similar challenges exist in defining the business case for AR/VR in each application. While focusing on the aerospace application, this presentation will share our lessons learned in making the business case for AR/VR with different stakeholders across the organization. Where appropriate, we will highlight the business-value similarities and differences across industries and organizations. We will also discuss the successes and challenges seen at the current stage of the projects and how they correspond to the initial objectives set for the projects.
Gabe Batstone, NGRAIN

11:00 a.m. ORAL ONLY
Workshop: Creation of Augmented Reality Scenarios
Augmented Reality (AR) enriches the user’s view by superimposing available digital information in context with the real world, in the right place at the right time. These systems usually consist of four different elements: tracking, real images, virtual content, and visualization. Tracking is the centerpiece of our technology: the perspective and coordinates of a camera (the so-called “pose”) are determined optically, with external or internal sensors. Depending on the use case, a wide range of virtual content is applicable, from simple text or symbols to 3D models that originate from a CAD system. Visualization means that the actual scene and virtual content are aligned correctly in perspective, scale, and position while being displayed to the user. The workshop will focus on the creation of an Augmented Reality scene, demonstrated through a live demo that shows how to create your own AR scenarios. It will also explain the different tracking possibilities, how to choose the most appropriate one, and how to optimize tracking, and will answer questions about tracking and content.
Thomas Jouhanneau, Metaio; Paul Robert Davies, Boeing
11:00 a.m. ORAL ONLY
Workshop: Location, Location, Location - Working with VR
This hands-on workshop will cover common motion tracking issues faced by Virtual Reality users. An optical motion capture system will be used for demonstration of the following techniques: Camera setup strategies for successful motion capture, Camera focus and calibration, and Rigid body tracking – how to align a rigid body model with the physical object being tracked. The latest Oculus HMD will also be utilized to demonstrate the capabilities of the built-in motion tracking hardware on the Oculus. In addition to the demonstrations, there will be time for discussion of motion tracking issues and sharing of best practices.
Kurt A. Chipperfield, John Deere Dubuque Works; Dan Lincoln, Ford Motor Co.
1:30 p.m. ORAL ONLY
Natural Human Interactivity in the World of AR & VR: A Pipe Dream or Reality?
As augmented and virtual reality gain traction in enterprises, smart glasses are graduating from a tiny monocular display to a 3D immersive experience, overlaying rich contextual information right where you need it. Now the questions coming into the spotlight are: “What is the optimal interaction model for enterprise-class workflows powered by smart glasses? Can we combine hand gestures, voice, eye tracking, head motion, and contextualization to build a more intuitive and natural user interaction?” Augmented interactive reality promises to increase productivity and streamline workflows in ways that haven’t been seen before.
Soulaiman Itani, Atheer
2:00 p.m. ORAL ONLY
LED Projection for Manufacturing Applications
Standard paper or electronic forms of work instructions have been used to standardize manual processes since the first days of manufacturing, and are still in wide use today in factories around the world. As augmented reality (AR) technology has continued to advance at a rapid pace, it is opening new doors to drive higher levels of quality, productivity, and training efficiency into worldwide manufacturing processes through its unique ability to place the right information, at the right place, and at the right time.
Paul Ryznar, OPS Solutions LLC
2:30 p.m. ORAL ONLY
Man Wearable AR/VR Systems
This presentation will discuss and demonstrate advances in man-wearable AR/VR systems under development at Lockheed Martin, including the trials and successes of designing, prototyping, and manufacturing plastic asymmetric lenses for AR displays that provide wide fields of view using COTS displays. A second display takes advantage of a specially designed Fresnel lens set to create a 170-degree VR display. These display system designs are focused on military training requirements but need not be limited to that use.
Richard Boggs, Danielle Holstine, Lockheed Martin Corp.
3:30 p.m. Panel
Expert Panel Discussion: Head Mounted Displays
A panel of 4-6 subject matter experts will be on hand to participate in a moderated discussion focused on the state of head mounted display technologies for augmented and virtual reality applications, and their implementation by end users in industrial use cases. The discussion will be centered on head mounted displays and what is possible in monocular, stereoscopic, and immersive platforms. Additional focus will be placed on what the industry accepts as value added for the end user and what criteria to look for in selecting a headset. The panel will also address technical barriers that currently exist in integrating headsets into production platforms and how technology is overcoming these challenges in the field, leading to a discussion of where head mounted displays are headed in the future. Audience participation is an integral part of the panel discussion, and questions and comments will be solicited by the moderator throughout.
Moderator - Jay S. Seddon, Boeing Company
Panelists -
Elizabeth S. Baron, Ford Motor Co.
Yuval Boger, Sensics
Kurt A. Chipperfield, John Deere Dubuque Works
Jeffrey Jacobsen, Kopin Corporation
Richard Moore, The Boeing Company