
Augmented Reality in the Aerospace Industry

There are many use cases for Augmented Reality in the aerospace industry, and its leaders have a long history with the technology. In this post, we review some of the milestones and provide highlights of the recent AREA webinar.

In 1969, while working in the Human Engineering Division of the Armstrong Aerospace Medical Research Laboratory (USAF), Wright-Patterson AFB, Thomas Furness presented a paper entitled “Helmet-Mounted Displays and their Aerospace Applications” to attendees of the National Aerospace Electronics Conference.

Over 20 years later, the paper was one of eight references cited by two Boeing engineers, Thomas Caudell and David Mizell. In their 1992 paper published in the Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, Caudell and Mizell coined the term “Augmented Reality.” The degree to which they drew from the work of Furness, who had started the Human Interface Technology Lab at the University of Washington in 1989, is unclear, but the Boeing team focused on reducing errors when building wire harnesses for aircraft and performing other manual manufacturing tasks in aerospace.


While the technology was not sufficiently mature to leave the lab or to deliver on its potential at the time, Caudell and Mizell suggested that, with an AR-assisted system, an engineer would one day be able to perform tasks more quickly and with fewer errors.

Proof of Concepts

Approximately fifteen years later, in 2008, Paul Davies, a research & development engineer at AREA member Boeing, began working with Boeing Technical Fellow Anthony Majoros. Together, Davies and Majoros picked up where the Caudell and Mizell paper left off. They used commercially available technologies such as Total Immersion’s D’Fusion platform to show how technicians building satellites could perform complex tasks with Augmented Reality running on tablets.

Airbus has also been experimenting with Augmented Reality for over a decade. In a paper published in the ISMAR 2006 proceedings, Dominik Willers explains how Augmented Reality was being studied for assembly and service tasks but was judged too immature for introduction into production environments. The paper, authored in collaboration with the Technical University of Munich, focused on the need for advances in tracking.

Since those proof of concept projects, AR technology has advanced to the point that it is being explored for an increasing number of use cases in the aerospace industry. In parallel with the expansion of use cases, the pace of applied research into AR-enabling technology components has not abated.

Augmented Reality in Aerospace in 2016

While today AR may not be found in many aerospace production environments, the promise of the technology to increase efficiency is widely acknowledged.

On February 18, David Doral of AERTEC Solutions, Jim Novack of Talent Swarm, and Raul Alarcon of the European Space Agency joined Paul Davies and me to discuss the status of Augmented Reality in their companies and client projects.

Each participant described the use cases and drivers for Augmented Reality adoption. For Boeing, the key metrics are reduction of errors and time to task completion. Use cases include training and work assistance. AERTEC Solutions, which works closely with Airbus, and Talent Swarm are both focusing on use cases where live video from a head-mounted camera can bring greater understanding of a technician’s context and questions, and permit more rapid analysis and resolution of issues.

The European Space Agency sees a variety of use cases on Earth and in space. Inspection and quality assurance, for example, could benefit from the use of Augmented Reality-assisted systems.

Turbulence Ahead 

During the discussion, webinar panelists explored the obstacles that continue to prevent full-scale adoption. Most barriers to adoption are technological in nature, but there are also significant obstacles stemming from human factors and business considerations. We also discussed the degree to which other industries may be able to apply lessons learned from aerospace.

To learn more about the state of AR in the aerospace industry, please watch the webinar archive.

Do you have use cases and projects that you would like to share with the AREA and our audiences? Please let us know in the comments of this post.

 




New Augmented Reality Case Studies Suggest Productivity Improvement

In the future, Augmented Reality could play a role in a variety of production or assembly processes. At one extreme, it can support those working on individual, custom products made in mom-and-pop shops or by specialized welders on location. At the other extreme, it can play a role in high-volume, low-mix manufacturing in factories full of automated and specialized machines.

In highly automated production facilities, workers are few and far between. Their role is to anticipate and respond to the needs of machines. These machines usually have dozens or even hundreds of sensors continually capturing information about the machine’s activities in the real world.

In today’s factories, most sensor data is sent directly to a control room. Human operators receive alerts or make decisions based on raw readings or on algorithms that analyze the sensor observations, and then go to the machine to perform planned and unplanned procedures on the equipment. The operator travels between the control room and the production machinery to determine status as procedures are implemented. The data may change while the operator is in transit, and the operator may make mental errors, forgetting or inverting figures when transcribing observations in the control room or at the machine.
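
The paragraph above describes a round trip through the control room. A minimal sketch of the alternative implied by AR-assisted systems follows; the machine names, sensor types, thresholds and display IDs are entirely hypothetical and only illustrate routing a threshold alert straight to a subscribed worker's head-worn display, with the control room kept as a fallback recipient.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Reading:
    machine_id: str
    sensor: str
    value: float

# Hypothetical alert thresholds per sensor type.
THRESHOLDS = {"bearing_temp_c": 90.0, "vibration_mm_s": 7.1}

def route_reading(reading: Reading, subscribers: Dict[str, List[str]]) -> List[str]:
    """Return the display IDs that should be shown this reading as an alert."""
    limit = THRESHOLDS.get(reading.sensor)
    if limit is None or reading.value <= limit:
        return []  # reading is in the normal range; nothing is pushed to the floor
    # Workers whose head-worn displays are subscribed to this machine see the
    # alert in their field of view; the control room remains a fallback recipient.
    return subscribers.get(reading.machine_id, []) + ["control_room"]

if __name__ == "__main__":
    subs = {"mill_07": ["helmet_display_A12"]}
    print(route_reading(Reading("mill_07", "bearing_temp_c", 96.5), subs))
    # -> ['helmet_display_A12', 'control_room']
```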

New case studies recently released by AREA member DAQRI provide a glimpse into the future.

Kazakhstan Seamless Pipe Steel Operators See More

A team of DAQRI solution architects visited the Kazakhstan Seamless Pipe Steel (KSP Steel) factory in Pavlodar, Kazakhstan and studied the problems facing machinery operators up close. They then developed and demonstrated an application for Hot Rolling Mill Line optimization using the DAQRI Smart Helmet (DSH).

Workers wearing the DSH on the shop floor could see live machine performance data in real time. The factory supervisor remarked that this technology has the potential to “decentralize” the control room and reduce the time it takes workers to respond to machinery performance data.

The results of the demonstration suggest that using Augmented Reality in the manner implemented by this project could reduce downtime by 50% and increase machine operator productivity by 40%.

More information about this project and a video of the DSH in use are available on the DAQRI web site.

HyperLoop Welders Receive Support on the Spot

A project involving the DSH on the HyperLoop, a transportation system proposed by Elon Musk and being prototyped in 2016, demonstrates another use case with a great deal of potential to offer productivity gains.

In a proof of concept with HyperLoop engineers and the DSH Remote Expert application, experts in a central “command” center view live video coming from remote robotic welders. The supervising engineer in the Los Angeles office sees construction progress and provides audio and telestration guidance while a welder performs a very specific spot weld. The description of the project and a video of the DSH in use are also available on DAQRI’s web site.

Tip of the Iceberg

These case studies reveal the potential for dramatic productivity improvements when workers are equipped with Augmented Reality-assisted systems such as the DSH.

Other enterprise customers are testing the use of Augmented Reality for manufacturing and production of a wide range of products. Stay tuned! New case studies with details about the potential for significant customer benefit will soon be coming to light.

If you have a case study that you would like to share, provide a link to it in the comments of this post or contact the AREA’s editorial team. We will be happy to support the preparation and publication of your case studies and testimonials.


 




Customers Are in Focus at Augmented World Expo

By Christine Perey and Ketan Joshi

Every enterprise AR project is a tremendous learning experience. While every enterprise AR project requires a team, there’s always that shining hero without whose commitment the project would not have come into existence. These heroes of enterprise AR will be the focus of attention during a full day of sessions of the Augmented World Expo 2016 Enterprise AR track.

The in-house managers of the first enterprise AR projects at customer organizations are a special breed. They are special by virtue of their vision, their passion, their persistence and their ability to span many disciplines and stakeholders.

On the one hand they must master dialects of an emerging “Augmented Reality” language that vendors speak, from the nitty gritty details of tracking technology to the subtleties of interactions like hand gestures and voice commands. On the other, they must know when and how to manage their company’s internal IT department priorities and constraints.

And they are rarely recognized for their role in bringing Augmented Reality from science project to enterprise-ready solution.

Bringing the Best and Brightest to the AWE Stage

The AREA is hosting the AWE Enterprise AR track. June 2 will be dedicated to presentations by, and discussions with, extraordinary enterprise project managers as they share their important AR project achievements.


While AREA members will bring these pioneering enterprise project managers to the AWE stage, we are sure there are many others who have gone unnoticed.

  • Are you a leader in a company that has been testing enterprise AR?
  • Did you sacrifice nights, weekends and holidays to make sure that your project stayed on course and could continue?
  • Do you feel you’ve had to reset every goal and yet have never forgotten the ultimate benefits that your company could gain from enterprise AR introduction?

We hope you will let us know if you are one of this special breed, or if you know a manager at a customer company who has such experiences to share.

A Simple Framework

During these AREA-hosted Enterprise AR track sessions, AWE delegates will learn about a variety of unique enterprise Augmented Reality pilot projects and deployments. The presentations will follow a framework that will provide practical guidance to those who will follow in their footsteps.

The case studies will cover:

  • Use cases
    • Tasks or processes prior to AR implementation and selection criteria
  • Custom or off-the-shelf tools and services used in the project
    • Selection process of project partners
  • Project time and resource requirements
  • Demonstration or a video of the solution in action
  • Project outcomes and their measurement
  • Future plans

With your support, we are looking forward to identifying and bringing together the heroes of enterprise AR projects and celebrating their achievements on June 2.




Advancing Toward Open and Interoperable Augmented Reality

Enterprise Augmented Reality engineers and content managers who published experiences created with Metaio’s software tools have encountered, or will soon encounter, a situation they didn’t anticipate: the publishing and delivery environments are unsupported and no longer evolving to take advantage of the latest enabling technologies.

Are you among this group? If so, you are not the only one to find yourself in this uncomfortable situation.

Customers of other AR software providers that no longer support or advance their platforms with the latest technology innovations have hit the same roadblock when mandated to continue providing the value of their AR experiences to end users. Prior to agreement on standards, they could not “port” their experiences to another AR platform. The only way forward was to evaluate and choose another proprietary AR technology platform, and then invest in re-authoring, testing and re-deploying AR experiences based on their original designs.

Unfortunately, some of those reading this blog are in this awkward position today.

Successfully addressing the root causes of low AR experience “portability” and the inherent lack of integration or interoperability between AR authoring and publishing systems is an important, highly collaborative process. Different parts of the AR ecosystem must first agree that there are issues, and then agree on principles for collaboration. Then, based on shared conceptual frameworks, they must work together to implement those principles in their workflows and solutions.

Supporting that collaborative process is the reason I’ve been leading the grassroots community for open and interoperable Augmented Reality content and experiences since 2009.

Is There Really a Problem?

Interoperable Augmented Reality is not a high priority for most people. Only about a hundred people are consistently investing their time in advancing the principles of open and interoperable Augmented Reality. We know one another on a first name basis; many of us compare notes in person a few times per year. Another few hundred people know of such activities but don’t directly invest in meaningful ways.

For most companies, the investment in AR has not been great. A few tens of thousands of dollars to rebuild and deploy a half dozen carefully handcrafted AR experiences is minor by comparison to investments in other enterprise technologies. 

“There’s still too much innovation to begin working on standards,” is another commonly heard refrain. Those who repeat it clearly haven’t been reading the posts or listening to the presentations made by AREA member IEEE Standards Association, or by leaders of other standards development groups. When designed collaboratively and aimed at interoperability in strategic places, standards have in many cases accelerated innovation rather than slowed it.

There are other reasons why many turn a blind eye to the problems, and they are valid for different people to different degrees.

This is a Serious Problem

In my opinion, ignoring the lack of open and interoperable Augmented Reality solutions and services is doing everyone a disservice.

The fact that only a relatively small amount of money has been invested to date is a poor justification for investing yet more time and money into building experiences with another proprietary platform, only to face the same scenario again in a matter of months or years.

In fact, innovation in Augmented Reality is not what it should be today because many of the best AR developers are busy building a better mousetrap: smart engineers are working to solve, in a different way, problems that have for the most part already been solved by others. Whether to avoid encroaching on a third party’s patents or for some other reason, this effort goes into highly integrated proprietary silos, at the expense of solving other problems that remain unaddressed.

There are three more serious problems with having only proprietary technology silos and very low use of widely agreed standards for Augmented Reality experiences. The first of these is that enterprises with assets that could be leveraged for AR experiences are unable to integrate production of AR experiences into their corporate workflows. This lack of integration between AR as a method of information delivery and other information delivery systems (e.g., web pages and mobile services without AR support) means we can’t seriously stand before a CIO and recommend they support the development of AR content. What we are recommending requires setting up another entirely separate and different content management system.

In the same vein, the second reason that enterprise CIOs and CFOs are justifiably reluctant to deepen their investment in AR projects is that they cannot deploy modular architectures in which multiple vendors can propose different components. In today’s landscape of offerings, it’s all or nothing. The customer can buy into provider A’s system or that offered by provider B. If provider C comes along with a better option, too bad.
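
To make the modular-architecture point concrete, here is an illustrative sketch of what a vendor-neutral component interface could look like if tracking engines were interchangeable. All names are invented for this example and do not correspond to any existing standard or SDK; the point is simply that application code written against a shared interface would let a customer swap provider A's tracker for provider C's without re-authoring its experiences.

```python
from abc import ABC, abstractmethod
from typing import Optional, Tuple

# x, y, z in metres and roll, pitch, yaw in degrees, for illustration only.
Pose = Tuple[float, float, float, float, float, float]

class TrackerComponent(ABC):
    """The contract any vendor's tracker would implement in a modular deployment."""

    @abstractmethod
    def load_target(self, target_uri: str) -> None:
        """Load a trackable target (marker, CAD model, point cloud) by reference."""

    @abstractmethod
    def estimate_pose(self, frame: bytes) -> Optional[Pose]:
        """Return the camera pose relative to the target, or None if tracking is lost."""

class VendorATracker(TrackerComponent):
    """Stand-in for one provider's implementation."""

    def load_target(self, target_uri: str) -> None:
        self.target_uri = target_uri

    def estimate_pose(self, frame: bytes) -> Optional[Pose]:
        return (0.0, 0.0, 1.2, 0.0, 0.0, 0.0)  # stubbed result for the sketch

def render_overlay(tracker: TrackerComponent, frame: bytes) -> None:
    """Application code depends only on the interface, not on any one vendor."""
    pose = tracker.estimate_pose(frame)
    if pose is not None:
        print(f"drawing work instructions at pose {pose}")

tracker = VendorATracker()
tracker.load_target("targets/pump_assembly.obj")
render_overlay(tracker, b"\x00" * 16)
```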

The third reason the lack of standards is a serious problem worthy of your support is closely related to the other two. Deep collaboration between AR-enabling technology vendors and service providers is currently very difficult. They are not working together to improve customer outcomes; they are working much more on competing with one another for attention and for the small investments that might be made.

Three serious enterprise AR obstacles that agreements on open and interoperable AR could reduce:

  1. Little or no content or experience portability between proprietary technology silos

  2. Strong customer aversion to risks due to vendor lock-in

  3. Low cooperation between competitors or ecosystem members to partner for best customer outcomes

This situation with lack of interoperability and fear of vendor lock-in would be addressed if the vendors took a more serious look at possible open interfaces and standards within a larger framework. Conversely, vendors might study new approaches and establish some level of interoperability if they believed that customers would respond by increasing their budgets for Augmented Reality.

This is all very serious.

Another recent development is not helping: it’s clear that some internet and IT giants are paying a lot of attention to AR. The lack of visibility into what highly competitive and successful companies like Microsoft, Google, Apple and PTC will do about AR interoperability and integration has cast a very cold spell over enterprise AR adoption.

Their lack of support for standards and their unwillingness (to date) to shed light publicly on how they will cooperate, or how their proposed (future) systems will interoperate, are causing a great deal of uncertainty. No CIO or CFO should seriously invest in enterprise Augmented Reality until these companies’ plans with respect to integration and interoperability are clearer.

Progress is Being Made

We should be open to the possibility that 2016 will be different.

Thanks to the dedication of members of the grassroots community, the situation is not as bleak as it could be. A few weeks ago a few dozen members met in Seoul, Korea, to compare notes on progress. SK Telecom, a strong supporter of open and interoperable Augmented Reality, hosted two days of sessions. We heard status updates from four standards organizations that have highly relevant activities ongoing (Khronos Group, Open Geospatial Consortium, IEEE and ISO/IEC). We also received reports from AR developers who are working to advance their solutions to support standards.

The fact that the ISO/IEC JTC1 Joint Ad Hoc Group for Mixed and Augmented Reality Reference Model is nearing completion of its work is a major development, about which I presented in Seoul.

In the spirit of full disclosure: the community of people in support of open and interoperable AR was the environment in which this work began, and I have been a member of that ad hoc group since its formation. If you would like to obtain a draft of the Mixed and Augmented Reality Reference Model, please send me an email request.

We are also seeing increased interest from industry-centric groups. There is a German government supported project that may propose standards for use in automotive industry AR. The results of an EU-funded project for AR models in manufacturing became the basis for the establishment of the IEEE P1589 AR Learning Experience Model working group (which I co-chair). In a recent meeting of oil and gas industry technologists, formation of a new group to work on requirements for hands-free display hardware was proposed.

These are all encouraging signs that some are thinking about open and interoperable Augmented Reality. If you want to monitor the activities of the grassroots community focusing on this topic, and to receive announcements of upcoming meetings, visit this page and register yourself for one or more of the mailing lists.

Have you seen other signs that there is increasing awareness of the problems? Do you know about any new standards that should be monitored by and presented during a future meeting of the grassroots community?




Enterprises Want to Use Wearables

Many workplace scenarios require the use of both hands to manipulate physical world objects. Having a display on the wrist or head (or both), with a variety of sensors and optional cloud services, offers attractive alternatives to tablets for supporting access to real-time or contextual information.

According to a Gartner Group report shared at the Enterprise Wearable Technology Summit (EWTS), sales of head-mounted displays will be greater in the enterprise than in the consumer market until at least 2020.


Unfortunately, the interest in enterprise wearable computing is not currently being addressed by consumer technology providers.

Connecting Those with Questions to Those with Experience

What are current enterprise customer requirements? What have enterprise wearable pioneers learned? What are enterprise customers’ best options today? These were among the questions that the EWTS organizer, BrainXchange, set out to answer.

BrainXchange chose Houston for its inaugural event on October 20-21, 2015. The city is a business center for the oil and gas industry and is served by an international airport, making it reachable from both coasts of the US.

Over 150 delegates from at least six countries gathered to hear from 60 speakers, including many veterans of the Google Glass Explorer program and vendors looking for new customers. The format offered plenty of networking in a convivial and relaxed atmosphere. 

AREA Members at EWTS

AREA Member                    Role
XMReality                      Sponsor
Augmate                        Speaker
EPRI                           Speaker
APX Labs                       Delegate in attendance
PEREY Research & Consulting    Delegate in attendance

Criteria for Enterprise Wearable Success

There is wide agreement with the simple guidance that Joe White, VP and GM of Enterprise Mobile Computing at Zebra Technologies, offered during his opening remarks. White recommends that enterprises focus on systems that:

  • Are technically sound
  • Are socially acceptable
  • Solve a problem

These criteria sound simple, but adhering to them requires careful research and planning. Many delegates at the summit who are shopping for wearable technologies don’t feel that the current commercial technology options are sufficiently mature for most of their use cases. One person confided that everything his team has evaluated to date “feels like a science project.”

Weight, balance and resolution remain significant technical obstacles, but short battery life as a result of high power consumption continues to be high on the list of technology barriers.

One test of wearable display technology reliability is how well it performs in a live demo on stage. There were more videos than live demos, but Rafael Grossman, a widely publicized surgeon in the Google Glass Explorer program, successfully demonstrated Atheer Labs’ AiR platform for the audience.

Another criterion added to White’s list over the course of the first day was cost. If devices are expensive to purchase, operate or maintain, adoption and use will remain limited.

Regardless of the criteria and how firmly an organization wants to adhere to them, customers remain divided about what’s truly going to solve their problems. Some feel that their use cases require true Augmented Reality in enterprise. Others are, at least for the present, finding the “simple” delivery of live information or images to a wearable display (as currently done by Google Glass or Vuzix M-100) sufficient. In the opinion of those who use information “snacking” devices, real time registration and tracking of data in the real world are still expensive and technically difficult.

Connecting Remote Experts with those in the Field

Real time consultation between a remote expert and a person wearing a camera and display while performing difficult tasks is a highly compelling use case for most of the EWTS speakers. Although a few speakers mentioned their experience with AR-assisted remote assistance, the majority shared numerous and immediate benefits of having another “set of eyes” focused on a particular procedure.


For example, emergency medical technicians working on MedEx ambulances as part of the Google Glass Explorer program can transmit more information about injuries or patient conditions to emergency room staff ahead of their arrival at the hospital.

In another case study, a tradesperson working on a Rogers-O’Brien Construction job site can see and transmit the details of the job site and get guidance or feedback from an architect or supervisor in real time.

Some Industries Are Further Along

While the medical and construction industries were highly represented among the Enterprise Wearable Technology Summit speakers in Houston, some case studies and presentations highlighted the promise of wearable technology in the logistics industry. DHL and Ubimax described how they are working together to put their warehouse picking solution into production and conducting research on their next generation systems for pallet packing. 

Energy production and distribution were also frequently mentioned. John Simmins of the Electric Power Research Institute (EPRI), an AREA member, spoke of projects underway in some power generating facilities. Speakers from CenterPoint Energy and Sullivan Solar Power also attested that they are actively exploring the use of wearables in their businesses.

Many Challenges Remain

An entire event could focus exclusively on expected and promised technology improvements. For example, uneven network coverage and issues preventing secure access to off-device content came up frequently. But, EWTS did not limit its scope to technology barriers.

Getting wearables into production requires companies in highly regulated industries such as healthcare and construction to educate decision makers and executives and to negotiate the creation of many new policies. Both are lengthy and costly processes.

Compliance

Complex regulatory environments are but one item in the list of business challenges.

Lack of trust is another significant obstacle to adoption. Large enterprises are looking for vendors that are on the one hand nimble and responsive to special requirements while on the other endowed with the financial resources to quickly ramp up production for large orders.

Despite these and other challenges, wearables continue to hold enormous promise and will increasingly demand the attention of enterprise technology buyers and users. We can expect these to be on the agenda at future BrainXchange summits. The company announced that it will produce its next event in June 2016 on the East Coast, although details were not provided.

Are there events you plan to attend to learn about enterprise wearable technologies?




Augmented Reality Developer Options after Metaio

This post originally was published in French on augmented-reality.fr.

Just before summer, we launched a survey to better understand the strategies of Augmented Reality developers following Metaio’s sudden change in circumstances. This blog post presents the results of our survey and our interpretations.

 


We launched the survey in mid-June and left it open over the summer of 2015. There was no specific respondent selection, so we cannot claim a truly representative sample. However, with 63 responses, approximately 30 to 50% of them from English speakers, we decided that the dataset was large enough to be informative.

First, we present the results of the survey. We then offer our interpretations.

Metaio Product Distribution


 

Respondents were mainly users of Metaio’s SDK, and slightly more than half were users of Metaio Creator. The Continuous Visual Search (CVS) tool was used by relatively few in our sample. Although it is not easy to know exactly how respondents use Metaio tools, we can assume that the majority work in or near development, because only 2 of the 63 respondents exclusively use Metaio Creator.

The Impact on Business

 


The impact of Metaio’s discontinuation of its products on developers’ businesses is significant, even if 16% of respondents see no effect. While 40% of respondents said they have alternatives to Metaio products, 34% said they do not.

Open Source Solutions

 


Sentiment about using an Open Source alternative to avoid a repeat of the current situation is mixed. Although the survey was not specific about the capabilities of such an offering, sixty percent of respondents said they would consider using an open source option, while a quarter remained uncertain.

Software Development Kits

 


Not surprisingly, developers responded that, alone or in combination, Vuforia and Wikitude were the best alternatives to the Metaio SDK. Other proposed alternatives included ARToolkit, Catchoom and ARmedia. However, it is important to note that the third most common answer among respondents was “I don’t know.”

Metaio Creator

 


Presently it seems that the vast majority of users have not found an alternative to Metaio Creator. Wikitude Studio is popular, but Layar Creator, though popular one or two years ago, no longer seems a viable alternative. It is surprising not to find Aurasma among the options considered by survey respondents.

Metaio Continuous Visual Search

 


The results concerning Metaio CVS proved difficult to interpret as few people use it. Although Vuforia Cloud Recognition gained slightly more traction than other proposed alternatives, CVS users are much more divided on alternatives overall.

Open Comments from the Survey

Comments we received from respondents raise a few salient points. In particular, Metaio’s technical expertise and advanced solutions were noted. Even though Wikitude and Vuforia offer similar capabilities, respondents feel there is currently no product in Metaio’s class.

We also see bitterness against Apple as well as an awareness of the potential fragility of other alternatives.

General Remarks

Today there is no obvious miracle solution to take Metaio’s place. The impact of the company’s change in circumstances on developers clearly demonstrates the overall fragility of the global Augmented Reality ecosystem. It is rather surprising to me that a third of respondents have no viable alternatives to Metaio technology. Rumors of Vuforia’s sale by Qualcomm may make the situation even more complicated in the coming months.

Paradoxically, these uncertainties do not help in the establishment of an Open Source solution. Although half of respondents believe this would be a good thing, a quarter remain uncertain. After discussions with several companies specializing in Augmented Reality, I sensed a certain reluctance to support an open source system, primarily due to fear of losing an advantage in terms of technical prowess. There is much to say about this, and I plan to prepare a more complete article in the coming weeks. In fact, RA’pro will soon issue an invitation to debate this topic via web conference.

Returning to alternative tools, there is not a lot of surprise in seeing the major market players mentioned: Vuforia, Wikitude, ARToolkit, ARmedia, Catchoom, etc. I am personally amazed at how few mentions Layar received, given that it seems to be a relatively major player in the AR print arena. However, it is true that the absence of a freemium model does not facilitate adoption by small businesses. The total absence of Aurasma and Total Immersion in the responses was also surprising.

As a final note, no one really knows whether Metaio’s place can be taken, since Apple has made no statement on the future of the product. We can, however, presume that Metaio technology will be integrated into future products and will, therefore, lose the cross-platform nature that made Metaio products successful.

What do you think? Please leave your comments below.




Augmented Reality’s Expanding Role in the Automotive Value Chain

 

Use Cases for the Factory Floor

With successful conclusions of pilots and trials, Augmented Reality continues to move into areas where the overlay of virtual information promotes vehicle quality and helps employees work faster and better, but also where more experience with the technology is a prerequisite. At the same time, higher numbers of AR implementations put greater technical and organizational demands on projects.

One key trend is the growing number of use cases for Augmented Reality in pre- and post-production processes in the automotive industry. Vehicle design and development, and then final verification after assembly are the most popular use cases.

Lina Longhitano of Mercedes-Benz Vans leads the transformation of advanced manufacturing facilities through the Van Technology Center and has a wealth of experience with digital transformation in manufacturing and the use of Mixed and Augmented Reality in vehicle development. The center provides high-end visualization and analysis for ergonomics and buildability of vehicles.

In particular, she mentioned three Mixed Reality use cases for engineering:

  • The visualization of out-of-position and validation of flexible parts.
  • The overlay of digital crash simulation data on physical crash vehicles.
  • Digital assembly and disassembly simulations with collision testing.

Mercedes-Benz Vans uses Augmented Reality for factory floor layout and design, as well as for visually inspecting components to assess differences between virtual and physical objects.

In a similar vein, Hermann Gross of Opel is putting AR to use in pre-production processes, especially in vehicle development and component integration. Opel’s Augmented Reality-assisted systems also verify the quality of physical vehicle mockups. Gross provided a number of examples, such as verifying the final position of parts and optimizing cable positioning. He cited a number of benefits of AR, including:

  • Shortening the duration of mockup builds and increasing their quality
  • Speeding up problem solving
  • Positively influencing data quality

At the other end of the production spectrum, Sebastian Rauh has in-depth knowledge of how Audi is using Augmented Reality for final assembly inspection. The use cases range from vehicle start-up to engine parameter optimization and the calibration of control units and sensor parameters. On behalf of Hochschule Heilbronn, Mr. Rauh is also working with Audi to design post-production verification workflows and to equip personnel with Google Glass and the Epson Moverio BT-200 to execute tasks.

The Industrialization of Augmented Reality

Juergen Lumera of Bosch, an AREA sponsor member, is one of the first in automotive to move beyond simple AR prototypes and into larger deployments involving greater numbers of users, departments, processes and tools. Taking a holistic approach to the human, technological, financial and organizational aspects of incorporating AR technology across an enterprise, he outlined ways to expand projects beyond pilots. Mr. Lumera emphasized that AR adoption is a journey whose destination, as well as roadmap, has to be carefully planned in order to reduce risk and promote success.

Bosch’s Common Augmented Reality Platform (CAP) is an example of a system that integrates authoring and publishing of AR content across business units and technology silos, and can become part of a wider move towards the digital factory.

Matthias Ziegler of Accenture presented a framework for enterprise Augmented Reality adoption by Accenture’s clients and confirmed the expanding interest in the use of wearables that support AR for hands-free workplace performance. Accenture expects 212 billion devices, as well as autonomously driven cars, by 2020, and a doubling of IP traffic between 2013 and 2016. Bulky form factors will delay adoption by consumers, but Accenture sees enormous opportunity for hands-free AR-enabled displays in the enterprise space.

The framework, based on a number of pilot projects, compiles statistics and experiences, defines business value drivers and use cases, and guides investment toward areas where AR can increase ROI. For example, if a company can quantify the time spent researching work instructions in paper documentation, and attribute a given number of errors to misinterpretations of drawings or procedures, then AR might promise higher returns.
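
As a rough illustration of that kind of reasoning, the sketch below turns time-savings and error-reduction assumptions into an annual savings estimate. Every figure and percentage in it is a placeholder chosen for the example, not a number from Accenture's framework.

```python
# Back-of-the-envelope sketch of the kind of calculation the framework implies.
def estimated_annual_savings(
    workers: int,
    hours_searching_docs_per_week: float,
    loaded_hourly_rate: float,
    rework_cost_per_error: float,
    errors_per_year: int,
    search_time_reduction: float = 0.5,   # assumed share of lookup time AR removes
    error_reduction: float = 0.3,         # assumed share of errors AR prevents
) -> float:
    search_savings = (
        workers * hours_searching_docs_per_week * 52
        * loaded_hourly_rate * search_time_reduction
    )
    error_savings = errors_per_year * rework_cost_per_error * error_reduction
    return search_savings + error_savings

# Example: 200 technicians spending 3 h/week looking up paper instructions at a
# $60/h loaded rate, plus 400 drawing-misinterpretation errors a year at $1,500
# of rework each.
print(f"${estimated_annual_savings(200, 3, 60, 1500, 400):,.0f} per year")
```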

Augmented Reality and Customer Experiences

Ashutosh Tomar of Jaguar Land Rover says the company’s vision is to use AR to enhance the driver experience in its vehicles. Today’s typical car is packed with sensors and features; one vehicle type has over 70 onboard computers and 170 “smart features.”

Customers no longer judge automobile features as a selling point alone; they also expect a better customer experience. How can cars automatically change settings (e.g., music station, seat and mirror adjustments, etc.) based on who’s driving? How can cars communicate with drivers via other sensory inputs such as haptics? JLR is making large investments in human factors research and in ways to increase driver safety via Augmented Reality, for example:

  • Visualization in the windshield of “ghost cars” driving ahead to demonstrate the safest way to make turns in a city.
  • The projection of cones in the windshield for training purposes.
  • “Transparent” B pillars that enhance a driver’s line of sight and situational awareness by making the car walls see-through in certain situations, such as narrow turns in cities.
  • Haptic feedback in the seat behind the driver’s shoulder to alert them when another vehicle is passing in their blind spot.

Legal Implications

New features such as the projection of information and images in the driver’s windshield will require new regulatory regimes. Brian Wassom, intellectual property attorney at Honigman Miller Schwartz and Cohn LLP, described the current regulatory environment and spoke about the principles of the National Highway Traffic Safety Administration’s “Visual-Manual NHTSA Driver Distraction Guidelines for In-Vehicle Electronic Devices.”

  • Distractions in all forms, including cognitive and visual, should be recognized by designers and regulators.
  • Displays should be as near the driver’s forward line of sight as possible.
  • A number of distracting features should be avoided entirely: glare, social media interactions and text that scrolls or contains more than 30 characters.
  • Glances away from the road should last no more than 1.5 to 2 seconds.

The above principles apply to current systems (dashboard layouts with navigation and phone information), but might also be the basis of conversations about Augmented Reality safety and liability.
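
As a simple illustration of how such principles might be applied during a design review, the sketch below checks a proposed display item against this post's summary of the guidance. The field names and limits are assumptions made for the example and do not encode the full NHTSA guideline text.

```python
from dataclasses import dataclass
from typing import List

MAX_TEXT_CHARS = 30
MAX_GLANCE_SECONDS = 2.0

@dataclass
class DisplayItem:
    text: str
    scrolls: bool
    is_social_media: bool
    estimated_glance_seconds: float

def distraction_issues(item: DisplayItem) -> List[str]:
    """Return the reasons a proposed display item would fail this simplified check."""
    issues = []
    if item.scrolls:
        issues.append("text scrolls")
    if len(item.text) > MAX_TEXT_CHARS:
        issues.append(f"text longer than {MAX_TEXT_CHARS} characters")
    if item.is_social_media:
        issues.append("social media interaction")
    if item.estimated_glance_seconds > MAX_GLANCE_SECONDS:
        issues.append("requires glances away from the road of more than 2 seconds")
    return issues

print(distraction_issues(DisplayItem("Turn left in 200 m", False, False, 1.2)))
# -> []  (this item passes the simplified check)
```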

In his presentation, Ashutosh Tomar had also emphasized the need to minimize the amount of information displayed to drivers to reduce distraction, as a basic tenet of safety.

Conclusions

In addition to those already mentioned, there were interesting presentations by Volkswagen, Ubimax, the German Research Center for Artificial Intelligence (DFKI), Feynsinn, the Fraunhofer Institute and others on topics ranging from showroom use cases to the latest research on AR user experiences.

Overall, it was encouraging to witness the depth of the questions about Augmented Reality being asked by companies in automotive manufacturing, research, design and other disciplines, and to get a sense of the technology’s evolving acceptance in the enterprise, complete with growing pains and successes.




ESA Puts Augmented Reality Through the Paces

There’s a lot of attention currently focused on how NASA is planning to send Microsoft HoloLens hardware to space to help astronauts perform tasks. According to a post published on the Trove blog in June 2015, the first use case being tested will permit NASA professionals on Earth to see what the astronauts see on the International Space Station (ISS). In Remote Expert Mode, HoloLens will be valuable when the astronaut encounters undocumented situations. It will also be possible for HoloLens to provide procedural guidance, for example, to retrieve objects or to put objects away in their correct place after use.

Tests of HoloLens, both on the ground and in underwater laboratories simulating space, will certainly validate the latest technology components Microsoft provides but will not be the first tests of Augmented Reality in space.


A First Use Case for Augmented Reality in Space

According to David Martinez, a simulation and visualization engineer and member of the European Space Agency (ESA) Software Systems Division, Augmented Reality was first evaluated by ESA for space use in a project beginning in 2006. Using the ESA-designed Wearable Augmented Reality (WEAR) system, Augmented Reality was tested on Earth and, eventually, on the ISS in 2009. The use case was for an astronaut to inspect and, if needed, to service ISS air quality system components. Before examining and changing filters on the air quality system, an astronaut had to remove a panel on the floor. Then cables and hoses needed to be repositioned. Once the filter was accessible, the color of an indicator had to be examined. 

“We learned a lot about what was and wasn’t possible with the technology at that point in time,” recalls Martinez.

Exploring Guidance and Remote Expert Assistance

ESA works with payloads designed for a wide variety of purposes, some of which end up on the International Space Station. As astronauts on the ISS cannot be trained on every possible payload in advance, ESA would like clear and compact Augmented Reality-assisted systems that help astronauts conduct experiments consistently and correctly, even without training on them before going into space.

In 2014 the ESA team collaborated with the Technical University of Delft to explore the use of Augmented Reality using hands-free and head-mounted displays to provide remote expert assistance for performing experiments. The study used a payload representative of what’s on the Columbus module, a science laboratory that is part of the ISS and one of the most important contributions to the ISS made by the European Space Agency.

“We demonstrated that the remote expert was able to support the hands-on use of the various dials, buttons and knobs,” explains Mikael Wolff, a senior software manager who manages several projects in the domain of crew informatics. 

“The remote expert could speak to the user and also annotate the object in the astronaut’s field of view with arrows and text messages that would remain in place with respect to the payload,” clarifies ESA engineer Sérgio Agostinho.

Technologies are continually advancing, and ESA is testing systems for their ability to track targets in 3D with far greater flexibility than earlier generations. “We’re not using fiducial markers on any of our current projects,” assures Agostinho. He feels that if a system is to be deployed on the ISS, it can’t rely on markers. “We’re aiming for the Iron Man quality of experience,” he says enthusiastically.

[Image: AR overlay usability study]

Long List of Use Cases

“We know that there are many ways Augmented Reality may bring value to projects and people on the ground and in space,” reports Wolff. “We’re always coming up with new ideas.”

In collaboration with partners in industry and academia, ESA is currently focused on several use cases it considers to be relatively low hanging fruit. One of these is support for complex product assembly, integration and testing on Earth. ESA and European aerospace industry engineers are routinely involved in, support or perform the final assembly and integration of parts procured from aerospace industry suppliers. Components include everything from printed circuit boards to large payload systems and harnesses that eventually go into space.

Augmented Reality could assist technicians during the assembly of telecommunication satellites. Currently the manual procedures take days or weeks to complete. By highlighting for users the steps directly on the parts of the satellite with Augmented Reality, the assembly, integration and testing processes could be performed with fewer errors and more quickly.

Barriers Remain

The ESA team has segmented its current and potential future Augmented Reality projects into those that could provide value when engineers perform tasks on Earth and those that could lead to AR being deployed in space for use by astronauts. This is because systems or components that meet requirements on Earth are not immediately ready to go to the ISS. Not only is hardware certification required for custom-built and commercial off-the-shelf devices, but software conflicts or bugs simply aren’t tolerated in space.

Before anything is sent to the ISS, it must undergo extremely rigorous testing and validation. “This means that almost everything on ISS is at least one generation behind what’s available on Earth, in terms of technology maturity,” explains Martinez.

“We also have real challenges with lack of interoperability,” says Wolff. “As an industry and as a public agency we can’t rely on a single supplier for any technology component. The Augmented Reality ecosystem needs to expand and different vendors need to provide components that are comparable or else we could put the agency or a mission at risk.”

Despite delays and the complex testing environments, ESA engineers continue to study AR use cases and to evaluate the latest technologies. As commercial solutions mature and pass required reliability and accuracy thresholds, having them in use on the ISS and on complex space assembly and integration projects on Earth could become commonplace.




Augmented Reality Use-cases at Newport News Shipbuilding

Shipbuilding has been the perfect environment for industrial innovation for hundreds of years. Sails to steam, wood to iron, rivets to welds, blueprints to CAD, stick-built to modular construction: all major innovations in building extraordinarily complex vehicles. At Newport News Shipbuilding, we constantly seek new innovations to improve our safety, quality, cost, and schedules. Since 2007, we have explored Augmented Reality as a means to shift away from paper-based documentation in our work.

Since we began looking into AR for construction, operation, and maintenance workflows, we’ve come up with hundreds of use-cases to improve tasks or processes. These range from assisting shipbuilders in painting, ship-fitting, electrical installation, pipefitting, and more, across new ship construction, ship overhaul, facility maintenance, and decommissioning. Every use-case improves our ability to deliver nuclear aircraft carriers and submarines, but to different degrees.

We’re always adding new use-cases to the list, and we’ve needed to devise an adaptable framework for organizing and categorizing existing, proven uses and prioritizing future, potential use-cases.

Genesis of a Use Case

Augmented Reality should be employed first in the places where it creates the most value, and that can be subjective. Sometimes this means helping people become more efficient and work more quickly, sometimes it is about helping to reduce errors and rework, and sometimes it is all about improving safety. At Newport News Shipbuilding, a dedicated team of AR professionals helps determine where AR is best suited, whether the technology is ready for the use-case, and how to best implement and scale a solution.

The first step in defining a use-case is performed by an AR industrial engineer, who determines where AR brings value in a workflow. She first meets with skilled craftsmen to understand their challenges and needs. The industrial engineer identifies pain points in processes, such as when and where shipbuilders must consult paper documentation to complete a task. She must also consider human factors and always balance the needs of the craftsman against the capability of the AR solution as it can be delivered today.

Then, the AR engineer works with an AR designer and an AR developer to deliver a product. The AR designer determines the available data, components, interfaces and models for the system to satisfy requirements. Once the use-case is fully defined and the data is assembled, an AR developer implements software solutions, tests the system, and ensures reliable and adaptable development tools. At the end of the process, a new use-case is addressed, and a high-value product is delivered to the skilled craftsman.

A Classification Scheme

Over the years we’ve devised hundreds of use-cases and needed a way to understand and prioritize them. We started by categorizing them into a taxonomy that we think of as general, though we admit it may be specific to our business. We call these our seven use-case categories.

  • Inspection (quality assurance): An inspector determines how well a component or part conforms to defined requirements.
  • Work instruction: Guides a person or otherwise provides information useful for task execution.
  • Training: AR as a new medium for training skilled craftspeople, especially on complex and/or expensive systems.
  • Workflow management: Helps a supervisor plan and execute workflows for a team.
  • Operational: Use-cases for visualizing data about ongoing operations or system states (energy in a circuit breaker, flow rate in a pipe, etc.).
  • Safety: Enhances situational awareness for craftspeople.
  • Logistics: Helps a craftsman or supervisor understand where people and things are in space.

These seven categories are then applied across three additional axes. Together, these variables create a volume of exploration, or “trade space,” for each use-case. The three application axes are as follows:

  • Product line: Ship types such as aircraft carriers, submarines, etc., are differentiated and determine the content available for a use-case, for example, what type of 3D CAD models, if any, are available. Products without 3D CAD can still benefit from AR, but require laser scanning, data collation, and other methods to create effective AR uses. Also, industrial processes for one product may differ from those for another, and these differences may make AR valuable on one product and unnecessary on another.
  • Product life cycle: Represents phases of a ship’s life cycle, such as new construction, operation, overhaul and inactivation. Understanding the life cycle provides purpose and scope for the content, and also defines the type of AR consumer: shipbuilder, sailor, engineering maintainer, etc.
  • Trade skill: Workshop roles such as welders, pipefitters, electricians, etc., which determine AR needs, personal protective equipment, user factors, and in many cases, content and tolerance requirements.

Return on Investment

When investing in new technology, it’s important to find those areas offering the highest return on investment (ROI) for every dollar spent. At the same time, there are potentially high value use-cases that are simply not conducive to an AR solution today. As a professional AR team, we pride ourselves on understanding when we can have an impact, when we can have a really big impact, and when AR technology simply isn’t yet up to the challenge. We primarily focus on advancing the seven use-case categories, and use the three variable axes to ensure we are maximizing customer value and ROI. As our expertise has grown, and as the technology matures, we have steadily increased value and readiness of AR throughout the entire trade space.

Today, we assess the highest potential ROI and use that as a metric for scaling priority. Our model shows the greatest ROI in use-cases for inspection, work instruction, and training, so our focus there is now on scalability. We also know that ROI is tied directly to the technology readiness level (TRL) of AR for those use-cases. While we are certain there will be benefit, maybe even higher ROI, in workflow management, operations, safety, and logistics, the readiness levels of AR for those use-cases within our trade space simply aren’t as high (today) as for the first three. You can’t scale what doesn’t yet work. For the latter four uses, therefore, the investment isn’t in scalability, but rather in improving the TRL.
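
For readers who prefer code to prose, here is one way the trade space and the scaling logic described above might be modeled. The axis values, ROI figures and TRL threshold below are placeholders for illustration, not Newport News Shipbuilding's actual data or scoring model.

```python
from dataclasses import dataclass
from typing import List

CATEGORIES = ["inspection", "work instruction", "training", "workflow management",
              "operational", "safety", "logistics"]
PRODUCT_LINES = ["aircraft carrier", "submarine"]
LIFE_CYCLE_PHASES = ["new construction", "operation", "overhaul", "inactivation"]
TRADE_SKILLS = ["welder", "pipefitter", "electrician", "painter"]

@dataclass
class UseCase:
    category: str
    product_line: str
    life_cycle: str
    trade_skill: str
    roi_estimate: float  # projected return per dollar invested (placeholder)
    trl: int             # technology readiness level, 1-9 (placeholder)

def scaling_candidates(use_cases: List[UseCase], min_trl: int = 6) -> List[UseCase]:
    """Scale what already works: rank sufficiently mature use-cases by ROI.
    Anything below min_trl is an R&D investment, not a scaling investment."""
    mature = [u for u in use_cases if u.trl >= min_trl]
    return sorted(mature, key=lambda u: u.roi_estimate, reverse=True)

# The trade space is the cross product of the four axes:
cells = len(CATEGORIES) * len(PRODUCT_LINES) * len(LIFE_CYCLE_PHASES) * len(TRADE_SKILLS)
print(cells, "trade-space cells to evaluate")  # 224 with these placeholder axis values
```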

As Augmented Reality technology becomes more capable and less expensive to implement, enterprises will find ever-increasing uses. We’d like to learn how others in different industries have been developing theirs. Please share your comments and experiences with us.




Augmented Reality Can Increase Productivity

Technological and cultural shifts that result in enhancements in manufacturing tend to increase complexity in products and processes. In turn, this complexity increases requirements in manufacturing and puts added pressure on organizations to squeeze out inefficiencies and lower costs where and when feasible.

This trend is acute in aerospace, where complexity, quality and safety require a large portion of final assembly to be done by humans. Corporations like AREA member Boeing are finding ways to improve assembly workflows by making tasks easier and faster to perform with fewer errors.

At ARise ’15, Paul Davies of Boeing presented a wing assembly study in collaboration with Iowa State University, showing dramatic differences in performance when complex tasks are performed following 2D work instructions versus Augmented Reality.

A Study in Efficiency

In the study, three groups were asked to assemble part of a wing, a task requiring over 50 steps and nearly 30 different parts. Each group performed the task using a different mode of work instruction:

  • A desktop computer screen displaying a work instruction PDF file. The computer was immobile and sat in the corner of the room away from the assembly area.
  • A mobile tablet displaying a work instruction PDF file, which participants could carry with them.
  • A mobile tablet displaying Augmented Reality software showing the work instructions as guided steps with graphical overlays. A four-camera infrared tracking system provided high-precision motion tracking for accurate alignment of the AR models with the real world.

Subjects assembled the wing twice; during the first attempt, observers measured first time quality (see below) before disassembling the wing and having participants reassemble it to measure the effectiveness of instructions on the learning curve.

Participants’ movements and activities were recorded using four webcams positioned around the work cell. In addition, they wore a plastic helmet with reflective tracker balls that allowed optical tracking of head position and orientation in order for researchers to visualize data about how tasks were fulfilled. Tracker balls were also attached to the tablet (in both AR and non-AR modes).

First Time Quality

To evaluate the ability of a novice trainee with little or no experience to perform an operation the first time (“first time quality”), errors are counted and categorized. The study revealed that tablet mode yielded significantly fewer errors (on average) than desktop mode.

In the diagram above, the blue bar represents the first assembly attempt and the green bar is the second. The diagram also shows that subjects using Augmented Reality mode made zero errors on average per person, indicating the potential of AR to improve first time quality for assembly tasks.


Rapid assembly

[Chart: time to complete assembly tasks by instruction mode, first and second attempts]

This chart shows the time taken to complete tasks by mode, for both the first and second attempts. AR-assisted participants completed tasks faster the first time than participants using the other modes.

Conclusions

Overall, the study found an almost 90% improvement in first time quality between desktop and Augmented Reality modes, with AR reducing the time to build the wing by around 30%. Researchers also found that when instructions are presented with Augmented Reality, people gain a faster understanding of tasks and need less convincing of their correctness.
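
To show how such comparisons can be derived, the sketch below computes the same kinds of percentages from invented per-mode averages. The study's raw data is not reproduced in this post, so the numbers are placeholders that merely land in the same ballpark as the reported results.

```python
# Illustrative sketch with invented numbers: how per-mode averages lead to the
# kind of comparisons reported above.
from statistics import mean

# errors per participant on the first build, by instruction mode (placeholders)
errors = {
    "desktop PDF": [8, 6, 7, 9],
    "tablet PDF": [4, 3, 5, 4],
    "tablet AR": [1, 0, 0, 1],
}
# minutes to complete the first build, by mode (placeholders)
minutes = {
    "desktop PDF": [52, 48, 55, 50],
    "tablet PDF": [45, 44, 47, 42],
    "tablet AR": [34, 36, 33, 35],
}

baseline_err, ar_err = mean(errors["desktop PDF"]), mean(errors["tablet AR"])
baseline_min, ar_min = mean(minutes["desktop PDF"]), mean(minutes["tablet AR"])

print(f"first-time-quality improvement: {(1 - ar_err / baseline_err):.0%}")
print(f"time reduction: {(1 - ar_min / baseline_min):.0%}")
```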

The bottom line: this study shows and quantifies how complex tasks performed for the first time can benefit from Augmented Reality work instructions. If a task is done faster and with fewer errors, the impact on productivity is highly significant.

Where can Augmented Reality make an impact in your organization?