
Augmented Reality for Production and Maintenance with NGRAIN

AREA member NGRAIN started in the enterprise training market and today is an AR solutions provider for a range of companies and industries.

We recently interviewed Barry Po, NGRAIN’s Senior Director of Product and Business Development, to discover the latest developments about NGRAIN’s offerings for industrial Augmented Reality.

What is your company’s mission and focus in the market?

NGRAIN has been working with customers for over 15 years to prepare and publish training programs and other types of information in rich and engaging ways. We develop solutions using both Augmented Reality and VR to meet the needs of our customers in aerospace and defense, energy and utilities, oil and gas, manufacturing and healthcare.

In each of these industries there are specialists who work with physical objects—whether to deploy, operate or to maintain and service these machines—and who need the right information in the right place at the right time. That’s what Augmented Reality brings: the ability to access information that would otherwise not be readily available or easy to understand, and equipping these people with knowledge they need to make better decisions. As a result, training time is shortened and they can perform tasks quickly and correctly every time it’s required.

A field technician’s work is often more complex than outsiders understand. Preparing and executing some tasks involves a staggering number of details, and many human errors happen when working with heavy assets, such as maintaining or operating heavy equipment like a vehicle or a complex assembly. The value of Augmented Reality in those situations is to reduce errors by cutting down on the missteps and omissions of technicians in the field. The technology makes it more practical for someone to do a complex job and ensures they don’t forget anything along the way.

Another major benefit is that a comparatively less experienced person can use the information without having to spend time in the classroom before becoming productive.

What products and technologies does NGRAIN offer?

We offer a full suite of solutions so that customers can reach the results they seek quickly. Our AR software development kit allows customers to build custom AR applications. NGRAIN Producer Pro is a GUI-based authoring tool for those who want to create their own AR applications without programming experience; it allows authors to create or import 3D content, link it to metadata, and display it on Windows, iOS and Android mobile devices. Lastly, NGRAIN provides customized Augmented Reality solutions tailored to customers’ specific needs.

Are there some use cases that, in your experience, are particularly well-suited for AR?

One major use case that NGRAIN addresses is maintenance training. Our AR-enabled solutions help someone in the field learn on the job rather than just in the classroom, figure out what they need to do and what their work requires, and get feedback. Having it all on a mobile device such as smart glasses makes it easily accessible.

Another major use case is visual inspection and damage assessment. Our solutions for battle damage assessment and repair are deployed in the field by Lockheed Martin, which has been an NGRAIN customer for eight years. US Air Force technicians use our technology to assess and organize repairs for F-22 and F-35 aircraft. This maximizes the amount of time the aircraft spend in flight and reduces maintenance costs and time spent in the hangar.

Which measurements or metrics for assessing AR’s impact do you prefer?

From an AR perspective, our customers are in the process of defining business cases and metrics, so measurements such as ROI have yet to be defined in a standardized way.

If we take a broader perspective that includes Mixed and Virtual Reality, however, we can make a few generalizations about KPIs. Based on NGRAIN’s experience deploying 3D applications for maintenance training, we find the technology can double knowledge retention, which in turn brings a variety of benefits. For example, technicians become less prone to missing steps or mixing up the sequence. This increased efficiency also enables them to focus more on the job as a whole and ensure it’s well executed.

A third interesting metric is how often a job is completed correctly the first time it’s performed. When we deploy the technology, we find customers are able to execute the job correctly nine times out of ten. In the oil and gas industry, for example, correct first-time job execution only occurs 30-40% of the time, so the technology’s impact can be significant.
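As a rough back-of-envelope illustration (using only the figures above, not actual customer data), a first-time-fix rate p implies an expected 1/p attempts per job, so the gap between 30-40% and 90% translates into a large reduction in repeat work:

```python
# Illustrative sketch of the first-time-fix figures above (not customer data).
# If each attempt succeeds independently with probability p, the expected
# number of attempts per job is 1/p (geometric distribution).

def expected_attempts(p):
    return 1 / p

baseline = expected_attempts(0.35)  # ~30-40% first-time-right in oil and gas
with_ar = expected_attempts(0.90)   # ~nine out of ten with the technology

print(f"baseline: {baseline:.2f} attempts per job")
print(f"with AR:  {with_ar:.2f} attempts per job")
print(f"repeat-work reduction: {1 - with_ar / baseline:.0%}")
```

Under these illustrative assumptions, moving from roughly 2.9 attempts per job to about 1.1 cuts repeat work by more than half.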

What is your approach to introducing customers to AR?

We look at the customer’s problems first, focusing on their business environment and organization. It’s important to understand a customer’s pain points in achieving their goals, and one way we do this is by spending time at their sites and observing their operations firsthand.

Recently we spent time in the field with an oil and gas customer’s technicians who were maintaining drilling equipment. We learned that much of the knowledge needed to do jobs correctly isn’t actually documented, but is nevertheless required later by less-experienced people. Our aim, with our 3D guidance solutions, is to provide this kind of tribal knowledge as a virtual mentor might.

What are the typical customer organization’s approaches with respect to new technology introduction?

Everyone agrees that technology is a valuable part of any organization, but we often find differences of opinion about how aggressively new technologies should be introduced. For example, many people who would benefit from AR don’t really care about the technology itself but are looking for the efficiency gains it provides.

In our view, introducing new technologies is less about imposing an approach on the customer or the end user. We make them a part of the process of discovering what works best for them. This ensures that everyone’s perspective is taken into account in the process, rather than the process being solely about the vision of a person or small group of people at the top.

A successful deployment of AR technology takes effort and is unique to each customer and group. Discovering the right approach for a particular customer is greatly helped by working with stakeholders at all levels.




AREA Members at Augmented World Expo 2016

If you only have a few days to get up to speed about the use of Augmented Reality to improve workplace productivity or safety, then you’ve come to the right website. You’ll quickly pick up the concepts and a working vocabulary of AR by browsing our site and watching our webinars. You’ll also learn about our members, leaders dedicated to providing AR-enabling technologies and solutions who offer a wealth of knowledge and experience.

But, for most people immersing themselves in this exciting new discipline, surfing the web isn’t sufficient. Let’s face it: enterprise AR experiences always involve a physical world component.

Putting hands on the technologies while they’re in use, seeing different options and meeting people in person are critical to tapping the potential that enterprise AR offers.

AWE Brings the AR Industry Together

The upcoming Augmented World Expo (AWE) on June 1 and 2, 2016, in Santa Clara, California, is an important event for AREA members. Over four thousand AR practitioners and enthusiasts will experience the latest technologies while gathering important data for decisions on behalf of their companies and projects.


Organized annually since 2010 by industry mover-and-shaker Ori Inbar and his team, the event has both a conference and a trade show. These bring together customers, vendors, researchers, investors and many others who are important to the continued expansion of this industry in a variety of formats. 

Whether you’re looking for something specific or just exploring, AWE provides an opportunity to get to know the experts, such as AREA members, and to try out the latest Augmented Reality technologies and products first hand.

AREA Members at AWE

The AREA and its members will be leading and speaking during the AWE 2016 enterprise AR track of sessions, as well as demonstrating solutions on the exhibition floor.  

In order to provide the greatest impact to our diverse audiences, we’ve divided the enterprise AR sessions into vendor-neutral insights and recommendations from a range of technology providers on June 1, followed by customer case studies and testimonials on June 2.

Chaired by Paul Davies, Technical Fellow at Boeing, an AREA founding sponsor member, our June 1 speakers will provide a lot of practical advice based on their experience across many industries. Since it is frequently the first major barrier to success, the day will begin with speakers sharing recommendations about how to select and prioritize enterprise AR use cases.


Then AREA members will offer their suggestions for how to prepare and deliver digital content for enterprise AR experiences. In this session, I will present the results of a research project on different AR authoring platforms. David Marimon of Catchoom will describe the results of recent studies with 3D sensing platforms for real-world object recognition, and Alex Hill, CTO of CN2 Technologies, will offer guidance on how to optimize 3D assets for use in AR experiences.

The rest of the day promises great talks on enterprise AR wearable technology strategies and the use of AR as a human interface to the Industrial Internet of Things.

On June 2, Bob Meads, CEO of iQagent, another AREA founding sponsor member, will chair three hours of sessions during which customers will share their experiences working in pilot and proof-of-concept projects. These sessions will feature case studies and testimonial presentations. The afternoon will offer round tables and panel discussions with customers, and we’ll hear the results of recent projects and lessons learned throughout the day.

AREA members will also be exhibiting in record numbers and many will have their booths in the AR for Enterprise Pavilion.

Will you be there? Stop by AREA member booths to introduce yourself to us and let us know how we can help you to get the greatest value from attending AWE 2016 and your enterprise AR investments.




Augmented Reality Boosts Efficiency in Logistics

Fulfilling customer orders at a warehouse, or order picking, can be costly. A well-known study on warehouse management cited the typical costs of order picking as being nearly 20% of all logistics costs and up to 55% of the total cost of warehousing. The use of technology to streamline order picking offers an important opportunity to reduce cost.  
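A quick back-of-envelope sketch (illustrative only, using the cost shares cited above) shows why even a modest improvement in picking can matter:

```python
# Back-of-envelope sketch of the cost shares cited above (illustrative only).
PICKING_SHARE_OF_LOGISTICS = 0.20    # "nearly 20% of all logistics costs"
PICKING_SHARE_OF_WAREHOUSING = 0.55  # "up to 55% of the total cost of warehousing"

def warehousing_savings(picking_cost_reduction):
    """Fraction of total warehousing cost saved by cutting picking cost."""
    return PICKING_SHARE_OF_WAREHOUSING * picking_cost_reduction

# A hypothetical 20% reduction in picking cost would save roughly:
print(f"{warehousing_savings(0.20):.0%} of total warehousing cost")
```

On these assumptions, a 20% reduction in picking cost shaves about 11% off total warehousing cost.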

While great strides have been made in automating warehouse processes, customer expectations also continue to rise. For example, Amazon offers same-day delivery in many US metropolitan areas and this is becoming a standard elsewhere. Increasing fulfillment and delivery speeds may result in increased errors that are not caught prior to shipment.


Augmented Reality can significantly increase order picking efficiency. An AR-enabled device can display task information in the warehouse employee’s field of view. Logistics companies such as DHL, TNT Innight and others have been collaborating with providers of software and hardware systems to test the use of Augmented Reality in their warehouses.

A recent study by Maastricht University conducted in partnership with Realtime Solutions, Evolar and Flos brings to light the impact smart glasses can have on order fulfillment. The research sought to:

  • Confirm prior research that smart glasses improve efficiency compared with paper-based approaches
  • Study usability, required physical and mental effort and potential empowerment effects of the technology in a real world environment
  • Assess the impact of an individual’s technology readiness on previously introduced performance and well-being measures

Design of the Study

Sixty-five business students at Maastricht University participated in a three-day study conducted in a controlled environment. Participants were instructed to pick individual items from storage bins and place them into the appropriate customer bins:

  • One group picked items from 28 bins using item IDs printed on paper and then matched those to IDs on customer bins. The study assessed order picking efficiency by measuring the ability and speed of participants to place the items in the correct customer bins.
  • The other group used AR-enabled smart glasses to scan barcodes in item bins and follow the displayed instructions to place them in the customer bins.

The researchers evaluated metrics such as:

  • Performance measures of error rates and picking times per bin
  • Health and psychological measures such as heart rate variability, cognitive load and psychological empowerment
  • Usability measures such as perceived ease of use
  • “Technology readiness” on a scale measuring personal characteristics such as optimism for, and insecurity with, new technologies

View through smartglasses

Faster with Smart Glasses

The researchers found that smart glasses using code scanners permitted users to work 45% faster than those using paper-based checklists, while reducing error rates to 1% (smart glasses users made ten times fewer picking errors than the control group).
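As a sanity check on the reported figures (an illustration, not the study’s own analysis, and assuming "45% faster" means 45% higher throughput), the numbers above imply roughly the following:

```python
# Sanity-check sketch of the reported study figures (illustrative only).
speedup = 0.45              # smart-glasses group worked 45% faster (throughput)
glasses_error_rate = 0.01   # ~1% picking errors with smart glasses
error_factor = 10           # "ten times fewer" errors than the paper group

paper_time_per_bin = 1.0                              # normalized baseline
glasses_time_per_bin = paper_time_per_bin / (1 + speedup)

implied_paper_error_rate = glasses_error_rate * error_factor

print(f"time per bin (glasses vs paper): {glasses_time_per_bin:.2f} vs 1.00")
print(f"implied paper error rate: {implied_paper_error_rate:.0%}")
```

Under these assumptions each bin takes about 31% less time with smart glasses, and the paper-based group’s implied error rate is around 10%.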

The smart glasses group also expended significantly less mental effort to find the items, with the same heart rate variability as the group using paper.

Overall, the use of smart glasses empowered users and engendered positive attitudes toward their work and the technology: in comparison with the group following checklists, they felt the successful completion of tasks was more attributable to their own behavior. This corroborates other studies of efficiency gains, such as this one, and demonstrates the level of impact Augmented Reality can have in the workplace.

You can read about more Augmented Reality research from Maastricht University and other university partners at this portal.





Data Visualization with 3D Studio Blomberg

AREA member 3D Studio Blomberg (3DS) excels at visualization of data and especially at enterprise solutions for Augmented Reality. The AREA asked Pontus Blomberg, founder and CEO of 3DS, about his company’s history and projects in the space.

Q. Where do you have the greatest number of projects or customers?

Our customers are mainly in heavy industry, and include both large and mid-sized companies. We are also targeting the educational and consumer sectors for our AR solutions.

Q. How did 3DS become popular as a supplier to the industries you just identified?

Since the company’s founding we’ve led the way to digital transformation through advanced content delivery systems to promote process efficiency, expert knowledge and overall quality.

In 2006 we recognized the potential of AR to boost productivity in industrial workplaces, and in 2008 we introduced the technology to Wartsila, a major Finnish power equipment supplier. At that time we evaluated ALVAR, Vuforia and Metaio to survey their functionality from a visualization standpoint and assess their capabilities in handling 3D scenes and animations. In 2012 we delivered a proof of concept to Wartsila, and in 2013 we joined a Finnish national R&D program to study the potential of AR in knowledge sharing solutions for field service personnel.

3DS Wartsila

This study showed that research and practical industry applications were not in sync, and many players were concerned with achieving efficiency through dynamic AR content and data integration. We entered an AR solutions provider partnership with Metaio in 2014 but realized the platform focused on technology functionality rather than on system utilization and process implementation, which is our focus today. We are currently studying the potential of Osterhout Design Group’s R-7 smart glasses and continue to perform proof of concept projects with emphasis on process analysis, system development and AR in production use.

Q. What are the most common metrics used to assess task performance or project success?

We recommend that customer metrics be in line with their quality management system for effective reference and comparison. Broadly speaking, examples of common metrics include:

  • Improvements in product and service quality
  • Effectiveness
  • Safety and risk reduction

Taking simple definitions of effectiveness (“doing the right thing”) and efficiency (“doing the thing right”), we believe it’s possible to work efficiently without contributing to productivity; productivity requires efficiently doing the right things at the right time.

Q. What is your approach to AR introduction at customer sites?

As AR is new to most organizations, we recommend detailed analysis of the customer’s business strategy. In order to achieve digital transformation in line with the AR solution, the project needs to be aligned with the business strategy all the way to the board room. We also recommend demos and proof of concept projects to help organizations gain knowledge and understanding.

Q. How is data prepared for your customer projects?

It’s all a question of knowledge and experience gained through project implementation. Initially data has to be prepared manually, but at later stages of the project we’re better able to develop ways of handling new types of content in existing enterprise content systems.

Q. Do you get involved in the design of content that goes into pilot projects?

Yes, this is where our long experience really shines. Our expertise in visualization, combined with the customer’s industrial product and process expertise, plays a significant role in achieving digital transformation through AR solutions. But no large-scale transformation can occur before new knowledge and tools are in place that allow for productivity and dynamic content.

Q. Do you study project risks with the customer or project leader?

We haven’t conducted major risk studies to date, but naturally new technologies bring risks with them. Imagine driving your car with GPS assistance in heavy traffic when you suddenly lose the signal.

Q. Do you know if your customers perform user studies prior to and following use of the proposed system?

Yes, the fact that we start to see significant achievements in implementing AR solutions drives these kinds of studies. We’ve also had the chance to work together with partners in bigger collaborative research projects.

Q. What are the attitudes of those in the workplace where AR projects are successfully introduced?

Employees at the customer site are very positive and even surprised. We often encounter statements similar to, “Wow! I’ve seen this on YouTube and the Internet. It’s incredible to see that it really works.”

Q. Describe the technologies at play. What types of components do you offer?

Through our key partner network we offer the entire pipeline of smart glasses, mobile solutions, UIs, server-client databases and content development.

We use world-class tracking technologies today but expect that Simultaneous Localization and Mapping (SLAM) technologies will gain ground. We realize this type of technology isn’t yet applicable in unique or dynamic situations at larger scales, although we’ve performed several demos and proof of concept projects with SLAM and the results are promising.

At the moment we see marker-based (or with code/ID) and geo-tracking as the most stable and flexible ways to acquire user context. We’ve built upon these technologies in our products and platforms.

At the same time we realize significant investment is needed in the modification of existing customer processes and new competences. To be successful, we aim to help our customers drive this change through systematic long-term cooperation.

Q. What must customers provide in terms of system components?

For rapid familiarization with the technology we recommend providing data to achieve a real look and feel. We recommend not overdoing it with complex UIs and information flows. Developing proof of concept projects with small, incremental steps for easy evaluation and quick changes is important to identify precisely the drivers of an AR introduction.

Q. With whom do you partner most often?

We partner with technology providers (hardware, software and tracking technologies), and we also see content providers as strategic because of their long-term customer relationships. To get all these complex systems to work together with business process changes is a team effort. It will take a few years. We aim to use what’s already been applied in an enterprise because we want to leverage the significant investments that have already been made in IT and visualization.

Q. What are the environmental conditions where customer projects are being conducted?

We’ve experienced both laboratory and real environmental conditions, especially in terms of lighting, vibrations and sound. Many of our customers use ruggedized solutions for their projects, which means unique and custom solutions for harsh, dynamic environments.

Q. What are your other offerings?

In terms of training, 3DS also provides competence development in combination with process development. For data, we use the customer’s cloud and offer commercial cloud solutions.

Q. What are the greatest challenges you currently face in AR introduction projects?

Customers often don’t have sufficient insight into the possibilities that emerging visualization technologies and content can provide. Therefore a clear understanding of customer expectations, goals and their business is needed. Customers also need a certain amount of trust that their expectations will be met.

Many times the only way forward is to agree on a proof of concept or demo that shows the technology, content, functionality, added value and supplier capabilities.

From the customer point of view, there are also uncertainties about the new types of content that will be needed to enrich the current PLM process to allow for visualization on a large scale. How will this information be connected and utilized together with the new visual content? We offer expertise in these questions and they need to be processed in very close cooperation with the customer as they touch the very core of their business.

Q. What are the future plans or next steps for your company?

We’ll continue to systematically monitor and build our international client base and partner network and develop state-of-the-art products and services.




Augmented Reality in the Aerospace Industry

There are many use cases for Augmented Reality in the aerospace industry and the leaders in this industry have a long history with the technology. In this post, we review some of the milestones and provide highlights of the recent AREA webinar.

In 1969, while working in the Human Engineering Division of the Armstrong Aerospace Medical Research Laboratory (USAF), Wright-Patterson AFB, Thomas Furness presented a paper entitled “Helmet-Mounted Displays and their Aerospace Applications” to attendees of the National Aerospace Electronics Conference.

Over 20 years later the paper was one of eight references cited by two Boeing engineers, Thomas Caudell and David Mizell. In their 1992 paper published in the Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, Caudell and Mizell coined the term “Augmented Reality.” The degree to which the team drew from the work of Furness, who had started the Human Interface Technology Lab at University of Washington in 1989, is unclear but the focus of the Boeing team was on reducing errors when building wire harnesses for use in aircraft and other manual manufacturing tasks in aerospace. 


While the technology was not sufficiently mature to leave the lab or to deliver on its potential at the time, they suggested that with an AR-assisted system an engineer would in the future be able to perform tasks more quickly and with fewer errors. 

Proof of Concepts

Approximately fifteen years later, in 2008, Paul Davies, a research and development engineer at AREA member Boeing, began working with Boeing Technical Fellow Anthony Majoros. Together, Davies and Majoros picked up where the Caudell and Mizell paper left off. They used commercially available technologies such as Total Immersion’s D’Fusion platform to show how technicians building satellites could perform complex tasks with Augmented Reality running on tablets.

Airbus has also been experimenting with Augmented Reality for over a decade. In this paper published in the ISMAR 2006 proceedings, Dominik Willers explains how Augmented Reality was being studied for assembly and service tasks but judged too immature for introduction into production environments. The paper, authored in collaboration with the Technical University of Munich, focused on the need for advances in tracking. 

Since those proof of concept projects, AR technology has advanced to the point that it is being explored for an increasing number of use cases in the aerospace industry. In parallel with the expansion of use cases, the pace of applied research into AR-enabling technology components has not abated.

Augmented Reality in Aerospace in 2016

While today AR may not be found in many aerospace production environments, the promise of the technology to increase efficiency is widely acknowledged.

On February 18, David Doral of AERTEC Solutions, Jim Novack of Talent Swarm, and Raul Alarcon of the European Space Agency joined Paul Davies and me to discuss the status of Augmented Reality in their companies and client projects.

Each participant described the use cases and drivers for Augmented Reality adoption. For Boeing, the key metrics are reduction of errors and time to task completion. Use cases include training and work assistance. AERTEC Solutions, which works closely with Airbus, and Talent Swarm are both focusing on use cases where live video from a head-mounted camera can bring greater understanding of a technician’s context and questions, and permit more rapid analysis and resolution of issues.

The European Space Agency sees a variety of use cases on Earth and in space. Inspection and quality assurance, for example, could benefit from the use of Augmented Reality-assisted systems.

Turbulence Ahead 

During the discussion, webinar panelists explored the obstacles that continue to prevent full-scale adoption. In general, most barriers to adoption are technological in nature, but there are also significant obstacles stemming from human factors and business considerations. We also discussed the degree to which other industries may be able to apply lessons learned from aerospace.

To learn more about the state of AR in the aerospace industry, please watch the webinar archive.

Do you have use cases and projects that you would like to share with the AREA and our audiences? Please let us know in the comments of this post.

 




Efficiency Climbs Where Augmented Reality Meets Building Information Modeling

At Talent Swarm we envisage that by using pre-existing platforms and standards for technical communication, our customers will reach new and higher levels of efficiency. Our vision relies on video calling to make highly qualified remote experts available on demand, with data from Building Information Modeling (BIM) systems enhancing those live video communications using Augmented Reality.

Converging Worlds

There have been significant improvements in video calling and data sharing platforms and protocols since their introduction two decades ago. The technologies have expanded in terms of features and ability to support large groups simultaneously. Using H.264 and custom extensions, a platform or “communal space” permits people to interact seamlessly with remote presence tools.  The technology for these real time, parallel digital and physical worlds is already commonplace in online video gaming. 

But there are many differences between what gamers do at their consoles and enterprise employees do on job sites. As our professional workforce increasingly uses high-performance mobile devices and networks, these differences will decline. Protocols and platforms will connect a global, professionally certified talent pool to collaborate with their peers on-site. 

Enterprises also have the ability to log communications and activities in the physical world in a completely accurate, parallel digital world.

Growth with Lower Risk

We believe that introducing next generation Collaborative Work Environments (CWE) will empower managers in many large industries, such as engineering, construction, aviation and defense. They will begin tapping the significant infrastructure now available to address the needs of technical personnel, as well as scientific research and e-commerce challenges. When companies in these industries put the latest technologies to work for their projects, risks will decline.

Most IT groups in large-scale engineering and construction companies now have an exhaustive register of 3D models that describe every part of a project. These are developed individually and used from initial design through construction. But these have yet to be put to their full use. One reason is that they are costly to produce, and companies are not able to re-use models created by third parties. There are no codes or systems that help the companies’ IT departments determine origins of models or if the proposed model is accurate. The risks of relying on uncertified models, then learning that there is a shortcoming or the model is not available when needed, are too great.

Another barrier to our vision is that risk-averse industries and enterprises are slow to evaluate and adopt new hardware. Meanwhile, hardware evolves rapidly. In recent years, video conferencing has matured in parallel with faster processors and runs on many mobile platforms. Specialized glasses (such as ODG’s R-7, Atheer Air and, soon, Microsoft’s HoloLens), helmets (DAQRI’s Smart Helmet), real-time point-cloud scanners (such as those provided by Leica or DotProduct), or even tablets and cell phones can capture the physical world to generate “virtual environments.”

With enterprise-ready versions of these tools coupled with existing standards adopted for use in specific industries, the digital and physical worlds can be linked, with data flowing bi-directionally in real time. For example, a control room operator can see a local operator as an avatar in the digital world. By viewing the video streaming from a camera mounted on the local operator’s glasses, the remote operator can provide remote guidance in real time. 

Standards are Important Building Blocks

At Talent Swarm, we have undertaken a detailed analysis of the standards in the construction industry and explored how to leverage and extend these standards to build a large-scale, cloud-based repository for building design, construction and operation.

We’ve concluded that BIM standards are reaching a level of maturity that makes them well suited for developing the parallel digital world we envision. Such a repository of 3D models of standard parts and components will permit an industry, and eventually many disparate industries, to reduce significant barriers to efficiency. Engineers will not need to spend days or weeks developing the models they need to describe a buttress or other standard components.
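As a loose illustration of the repository idea, a shared catalog of certified component models could be as simple as a keyed lookup; every name and field below is hypothetical, not a real BIM API:

```python
# Hypothetical sketch of a shared component repository keyed by standard
# part identifiers. All identifiers, fields and values are illustrative.
from dataclasses import dataclass

@dataclass
class ComponentModel:
    part_id: str   # standard identifier, e.g. an IFC-style type code
    name: str
    source: str    # who certified and published the model
    mesh_uri: str  # where the 3D geometry lives

repository = {}

def publish(model: ComponentModel):
    """Register a certified model under its standard part identifier."""
    repository[model.part_id] = model

def lookup(part_id: str):
    """Reuse a certified model instead of remodeling it from scratch."""
    return repository.get(part_id)

publish(ComponentModel("BTR-100", "Concrete buttress", "VendorX",
                       "models/btr100.glb"))
found = lookup("BTR-100")
print(found.name if found else "not in repository")
```

The point of the sketch is the reuse pattern: once a model is published and certified under a shared identifier, any engineer can look it up rather than rebuild it.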

Partnerships are Essential

The project we have in mind is large and we are looking for qualified partners in the engineering, construction and oil and gas industries, and with government agencies, to begin developing initial repositories of 3D models of the physical world.

By structuring these repositories during the design phase, and maintaining and adding to this information in real time from on-site cameras, we will be able to refine and prove CWE concepts and get closer to delivering on the promise.

Gradually, throughout the assembly and construction phases, we will build a database that tracks the real world from cradle to grave. Analyzing these databases of objects and traces of physical-world changes with Big Data tools will yield improvement and maintenance insights previously impossible to extract from disjointed, incomplete records. We believe that such a collaborative project will pave the way towards self-repairing, sentient systems.

We look forward to hearing from those who are interested in testing the concepts in this post and collaborating towards the development of unprecedented Collaborative Work Environments.  




Advancing Toward Open and Interoperable Augmented Reality

Enterprise Augmented Reality engineers and content managers who published experiences created with Metaio’s software tools have encountered, or will soon encounter, a situation they didn’t anticipate: the publishing and delivery environments are unsupported and no longer evolving to take advantage of the latest enabling technologies.

Are you among this group? If so, you are not the only one to find yourself in this uncomfortable situation.

Customers of other AR software providers that are no longer supporting or advancing their platforms have hit the same roadblock when mandated to continue delivering the value of their AR experiences to end users. Prior to agreement on standards, they could not “port” their experiences to another AR platform. The only way forward was to evaluate and choose another proprietary AR technology platform, then invest in re-authoring, testing and re-deploying AR experiences based on their original designs.

Unfortunately, some of those reading this blog are in this awkward position today.

Successfully addressing the root causes of low AR experience “portability” and the inherent lack of integration and interoperability between AR authoring and publishing systems is an important, highly collaborative process. Different parts of the AR ecosystem must first agree that there are issues, and then agree on principles for collaboration. Then, based on shared conceptual frameworks, they must work together to implement those principles in their workflows and solutions.

Supporting that collaborative process is the reason I’ve been leading the grassroots community for open and interoperable Augmented Reality content and experiences since 2009.

Is There Really a Problem?

Interoperable Augmented Reality is not a high priority for most people. Only about a hundred people are consistently investing their time in advancing the principles of open and interoperable Augmented Reality. We know one another on a first name basis; many of us compare notes in person a few times per year. Another few hundred people know of such activities but don’t directly invest in meaningful ways.

For most companies, the investment in AR has not been great. A few tens of thousands of dollars to rebuild and deploy a half dozen carefully handcrafted AR experiences is minor by comparison to investments in other enterprise technologies. 

“There’s still too much innovation to begin working on standards,” is another commonly heard refrain. Those who repeat it clearly haven’t been reading the posts or listening to the presentations made by AREA member IEEE Standards Association, or by leaders of other standards development groups. There are many examples of standards that, when designed collaboratively and targeted at strategic interoperability points, spur innovation rather than stifle it.

There are other reasons why many turn a blind eye to these problems, and they are valid to different degrees for different people.

This is a Serious Problem

In my opinion, ignoring the lack of open and interoperable Augmented Reality solutions and services is doing everyone a disservice.

The fact that only a relatively small amount of money has been invested to date is a poor justification for investing yet more time and money in building experiences on another proprietary platform, only to face the same scenario again in a matter of months or years.

In fact, innovation in Augmented Reality is not what it should be today because many of the best AR developers are building a better mousetrap: smart engineers are working to solve, in a different way, problems that have for the most part already been solved by others. Whether to avoid encroaching on a third party’s patents or for other reasons, this effort is invested in highly integrated proprietary silos, at the expense of other problems that remain unaddressed.

There are three more serious problems with having only proprietary technology silos and very low use of widely agreed standards for Augmented Reality experiences. The first of these is that enterprises with assets that could be leveraged for AR experiences are unable to integrate production of AR experiences into their corporate workflows. This lack of integration between AR as a method of information delivery and other information delivery systems (e.g., web pages and mobile services without AR support) means we can’t seriously stand before a CIO and recommend they support the development of AR content. What we are recommending requires setting up another entirely separate and different content management system.

In the same vein, the second reason that enterprise CIOs and CFOs are justifiably reluctant to deepen their investment in AR projects is that they cannot deploy modular architectures in which multiple vendors can propose different components. In today’s landscape of offerings, it’s all or nothing. The customer can buy into provider A’s system or that offered by provider B. If provider C comes along with a better option, too bad.

The third reason the lack of standards is a serious problem worthy of your support is closely related to the other two. Deep collaboration between AR-enabling technology vendors and service providers is currently very difficult. They are not working to improve customer outcomes; they are working much more on competing with one another for attention and for the small investments that might be made.

Three serious enterprise AR obstacles that agreements on open and interoperable AR could reduce:

  1. Low or nonexistent portability of content and experiences between proprietary technology silos

  2. Strong customer aversion to the risk of vendor lock-in

  3. Little cooperation between competitors or ecosystem members to partner for the best customer outcomes

This situation with lack of interoperability and fear of vendor lock-in would be addressed if the vendors took a more serious look at possible open interfaces and standards within a larger framework. Conversely, vendors might study new approaches and establish some level of interoperability if they believed that customers would respond by increasing their budgets for Augmented Reality.

This is all very serious.

Another recent development is not helping: it’s clear that some internet and IT giants are paying a lot of attention to AR. The lack of visibility into what highly competitive and successful companies like Microsoft, Google, Apple and PTC will do about AR interoperability and integration has cast a very cold spell over enterprise AR adoption.

Their lack of support for standards and their unwillingness (to date) to publicly shed light on how they will cooperate, or how their proposed future systems will interoperate, is causing great uncertainty. No CIO or CFO should seriously invest in enterprise Augmented Reality until these companies’ plans with respect to integration and interoperability are clearer.

Progress is Being Made

We should be open to the possibility that 2016 will be different.

Thanks to the dedication of members of the grassroots community, the situation is not as bleak as it could be. A few weeks ago a few dozen members met in Seoul, Korea, to compare notes on progress. SK Telecom, a strong supporter of open and interoperable Augmented Reality, hosted two days of sessions. We heard status updates from four standards organizations that have highly relevant activities ongoing (Khronos Group, Open Geospatial Consortium, IEEE and ISO/IEC). We also received reports from AR developers who are working to advance their solutions to support standards.

The fact that the ISO/IEC JTC1 Joint Ad Hoc Group for Mixed and Augmented Reality Reference Model is nearing completion of its work is a major development, one I presented on in Seoul.

In the spirit of full disclosure: the community of people in support of open and interoperable AR was the environment in which this work began, and I have been a member of that ad hoc group since its formation. If you would like to obtain a draft of the Mixed and Augmented Reality Reference Model, please send me an email request.

We are also seeing increased interest from industry-centric groups. There is a German government-supported project that may propose standards for use in automotive industry AR. The results of an EU-funded project for AR models in manufacturing became the basis for the establishment of the IEEE P1589 AR Learning Experience Model working group (which I co-chair). In a recent meeting of oil and gas industry technologists, formation of a new group to work on requirements for hands-free display hardware was proposed.

These are all encouraging signs that some are thinking about open and interoperable Augmented Reality. If you want to monitor the activities of the grassroots community focusing on this topic, and to receive announcements of upcoming meetings, visit this page and register yourself for one or more of the mailing lists.

Have you seen other signs that there is increasing awareness of the problems? Do you know about any new standards that should be monitored by and presented during a future meeting of the grassroots community?




Enterprises Want to Use Wearables

Many workplace scenarios require use of both hands to manipulate physical world objects. Having a display on the wrist or head (or both), with a variety of sensors and optional cloud services, offers attractive alternatives to tablets for supporting access to real-time or contextual information.

According to a Gartner Group report shared at the Enterprise Wearable Technology Summit (EWTS), sales of head-mounted displays will be greater in enterprise than in consumer markets until at least 2020.


Unfortunately, the interest in enterprise wearable computing is not currently being addressed by consumer technology providers.

Connecting Those with Questions to Those with Experience

What are current enterprise customer requirements? What have enterprise wearable pioneers learned? What are enterprise customers’ best options today? These were among the questions that the EWTS organizer, BrainXchange, set out to answer.

BrainXchange chose Houston for its inaugural event on October 20-21, 2015. The city is a business center for the oil and gas industry, served by an international airport and easily reached from both coasts of the US.

Over 150 delegates from at least six countries gathered to hear from 60 speakers, including many veterans of the Google Glass Explorer program and vendors looking for new customers. The format offered plenty of networking in a convivial and relaxed atmosphere. 

AREA Members at EWTS

AREA Member                   Role
XMReality                     Sponsor
Augmate                       Speaker
EPRI                          Speaker
APX                           Delegate in attendance
Perey Research & Consulting   Delegate in attendance

Criteria for Enterprise Wearable Success

There is wide agreement with the simple guidance that Joe White, VP and GM of Enterprise Mobile Computing at Zebra Technologies, offered during his opening remarks. White recommends that enterprises focus on systems that are:

  • Technically sound
  • Socially acceptable
  • Focused on solving a real problem

These criteria sound simple, but adhering to them requires careful research and planning. Many delegates at the summit who are shopping for wearable technologies don’t feel that the current commercial technology options are sufficiently mature for most of their use cases. One person confided that everything his team has evaluated to date “feels like a science project.”

Weight, balance and resolution remain significant technical obstacles, but short battery life resulting from high power consumption continues to rank high on the list of technology barriers.

One test of wearable display technology reliability is how well it performs in a live demo on stage. There were more videos than live demos, but Rafael Grossman, a high-profile surgeon in the Google Glass Explorer program, successfully demonstrated Atheer Labs’ AiR platform for the audience.

Another criterion added to White’s list over the course of the first day was cost. If devices are expensive to purchase, operate or maintain, adoption and use will remain limited.

Regardless of the criteria and how firmly an organization wants to adhere to them, customers remain divided about what’s truly going to solve their problems. Some feel that their use cases require true Augmented Reality in the enterprise. Others are, at least for the present, finding the “simple” delivery of live information or images to a wearable display (as currently done by Google Glass or the Vuzix M-100) sufficient. In the opinion of those who use such information “snacking” devices, real-time registration and tracking of data in the real world are still expensive and technically difficult.

Connecting Remote Experts with those in the Field

Real time consultation between a remote expert and a person wearing a camera and display while performing difficult tasks is a highly compelling use case for most of the EWTS speakers. Although a few speakers mentioned their experience with AR-assisted remote assistance, the majority shared numerous and immediate benefits of having another “set of eyes” focused on a particular procedure.


For example, emergency medical technicians working on MedEx ambulances as part of the Google Glass Explorer program can transmit more information about injuries or patient conditions to emergency room staff ahead of their arrival at the hospital.

In another case study, a tradesperson working on a Rogers-O’Brien Construction job site can see and transmit the details of the job site and get guidance or feedback from an architect or supervisor in real time.

Some Industries Are Further Along

While the medical and construction industries were highly represented among the Enterprise Wearable Technology Summit speakers in Houston, some case studies and presentations highlighted the promise of wearable technology in the logistics industry. DHL and Ubimax described how they are working together to put their warehouse picking solution into production and conducting research on their next generation systems for pallet packing. 

Energy production and distribution were also frequently mentioned. John Simmins of the Electric Power Research Institute (EPRI), an AREA member, spoke of projects underway in some power generating facilities. Speakers from CenterPoint Energy and Sullivan Solar Power also attested they are actively exploring the use of wearables in their businesses.

Many Challenges Remain

An entire event could focus exclusively on expected and promised technology improvements. For example, uneven network coverage and issues preventing secure access to off-device content came up frequently. But, EWTS did not limit its scope to technology barriers.

Getting wearables into production requires companies in highly regulated industries such as healthcare and construction to educate decision makers and executives and to negotiate creation of many new policies. Those are both very lengthy and costly processes.

Compliance

Complex regulatory environments are but one item in the list of business challenges.

Lack of trust is another significant obstacle to adoption. Large enterprises are looking for vendors that are, on the one hand, nimble and responsive to special requirements and, on the other, endowed with the financial resources to quickly ramp up production for large orders.

Despite these and other challenges, wearables continue to hold enormous promise and will increasingly demand the attention of enterprise technology buyers and users. We can expect these to be on the agenda at future BrainXchange summits. The company announced that it will produce its next event in June 2016 on the East Coast, although details were not provided.

Are there events you plan to attend to learn about enterprise wearable technologies?




AREA Members Accelerating Success with Augmented Reality

Augmented Reality offers tremendous opportunity for organizations to improve workforce productivity and reduce human error through increased contextual awareness and guidance. Whether implemented on a head-mounted display, on a tablet or through a stationary system, AR can deliver and collect information for a myriad of applications including training, manufacturing, field service and warehouse logistics.

It is an exciting time to join and participate in the AR ecosystem. Many companies are jumping in. Some are making tremendous advances in wearable technology through miniaturization. Innovation at the silicon level is lowering power consumption while increasing processing power. Others are focusing on improvements in computer vision. Mobile systems including phones, tablets, watches and glasses are becoming more interconnected and integrated, and smart fabrics present the potential for a fully integrated, mobile augmented human.

Truths are Difficult to Accept

Progress is being made but significant challenges to the effective development and deployment of AR within the enterprise environment remain. And, unfortunately, the hype around AR and the initial example demonstrations (and concept videos) have created the perception that AR is ready to go and can be easily implemented and deployed.

In truth, many technical issues still need to be solved to enable successful implementation and widespread use of AR for extended periods of time. Organizational issues including culture, security and safety are other significant barriers that must be addressed. Most current AR examples are custom developed for specific, focused applications with highly controlled conditions. And, the AR tools and technology provider and developer ecosystems are still immature. The path to AR success is not obvious.

We Are Working Together

The AREA is here to address these issues among others, and to create an environment for organizations—large and small—to learn, share and accelerate the adoption of AR in the enterprise.

Within the AREA, member organizations from around the world have committed to sharing their experiences and challenges in a collegial atmosphere to solve complex technical and implementation problems. AREA members represent a unique blend of AR end users, systems integrators, content developers, and technology providers as well as not-for-profit research centers and academic organizations from multiple industries. Through a combined program of thought leadership, education and outreach, best practices development and communication, and technology and implementation research, AREA members are actively building the community and knowledge base that will ensure successful implementation of AR-enabled information technology environments across the enterprise.

Meetings Make Member Collaboration Tangible

By joining the AREA you will become part of a global AR ecosystem. Our shared vision for the potential of enterprise AR infuses our member meetings, like the one in Houston on October 22. We are learning and sharing best practices. We collaborate to define the best problem-solving research, and to support workforce development.

As President of the AREA and as a Sponsor Member, I am witnessing, firsthand, the level of knowledge sharing and exchange across member organizations. It is clear to me that the AREA is the only organization that provides this opportunity for AR technology providers, developers and customers.

If you didn’t get to our recent member meeting, then this website is the best place to learn more about enterprise Augmented Reality and the benefits of joining the AREA. I invite you to take the next step by contacting me or Christine Perey, AREA’s executive director, to discuss how you can contribute and participate.

We look forward to welcoming you and collaborating with you at a future meeting!

Carl Byers
AREA President
Chief Strategy Officer at Contextere




Augmented Reality’s Expanding Role in the Automotive Value Chain

 

Use Cases for the Factory Floor

With pilots and trials concluding successfully, Augmented Reality continues to move into areas where the overlay of virtual information promotes vehicle quality and helps employees work faster and better, but also where more experience with the technology is a prerequisite. In addition, the growing number of AR implementations puts greater technical and organizational demands on projects.

One key trend is the growing number of use cases for Augmented Reality in pre- and post-production processes in the automotive industry. Vehicle design and development, and then final verification after assembly are the most popular use cases.

Lina Longhitano of Mercedes-Benz Vans leads the transformation of advanced manufacturing facilities through the Van Technology Center and has a wealth of experience with digital transformation in manufacturing and the use of Mixed and Augmented Reality in vehicle development. The center provides high-end visualization and analysis for ergonomics and buildability of vehicles.

In particular, she mentioned three Mixed Reality use cases for engineering:

  • Out-of-position visualization and validation of flexible parts.
  • The overlay of digital crash simulation data on physical crash-test vehicles.
  • Digital assembly and disassembly simulations with collision testing.

Mercedes-Benz Vans uses Augmented Reality for factory floor layout and design, as well as for visually inspecting components to assess differences between virtual and physical objects.

In a similar vein, Hermann Gross of Opel is putting AR to use in pre-production processes, especially in vehicle development and component integration. Opel’s Augmented Reality-assisted systems also verify the quality of physical vehicle mockups. Gross provides a number of examples for these, such as verifying the final position of parts and optimizing cable positioning. He revealed a number of benefits of AR, including:

  • Shortening the duration of mockup builds and increasing their quality
  • Speeding up problem solving
  • Positively influencing data quality

On the other end of the production spectrum, Sebastian Rauh has in-depth knowledge of how Audi is using Augmented Reality for final assembly inspection. These inspections range from vehicle start-up to engine parameter optimization and calibration of control units and sensor parameters. On behalf of Hochschule Heilbronn, Mr. Rauh is also working with Audi to design post-production verification workflows and to equip personnel with Google Glass and the Epson Moverio BT-200 to execute tasks.

The Industrialization of Augmented Reality

Juergen Lumera of Bosch, an AREA sponsor member, is one of the first in automotive to move beyond simple AR prototypes into larger deployments involving greater numbers of users, departments, processes and tools. Taking a holistic approach to the human, technological, financial and organizational aspects of incorporating AR technology across an enterprise, he outlined ways to expand projects beyond pilots. Mr. Lumera emphasized that AR adoption is a journey whose destination, as well as roadmap, must be carefully planned in order to reduce risk and promote success.

Bosch’s Common Augmented Reality Platform (CAP) is an example of a system that integrates authoring and publishing of AR content across business units and technology silos, and can become part of a wider move towards the digital factory.

Matthias Ziegler of Accenture presented a framework for enterprise Augmented Reality adoption by Accenture’s clients and confirms the expanding interest in use of wearables that support AR for hands-free workplace performance. Accenture is expecting 212 billion devices and autonomously driven cars by 2020, with a doubling of IP traffic between 2013 and 2016. Bulky form factors will delay adoption by consumers, but Accenture sees enormous opportunity for hands-free AR-enabled displays in the enterprise space.

Their template, based on a number of pilot projects, compiles statistics and experiences and defines business value drivers and use cases, guiding investment in potential areas where AR can increase ROI. For example, if a company can quantify the length of time spent researching work instructions in paper documentation, and attribute a given number of errors to misinterpretations of drawings or procedures, then AR might promise higher returns.
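The business case described here reduces to simple arithmetic. A first-order sketch, with all figures and the function itself purely illustrative (not taken from Accenture’s template):

```python
def annual_ar_savings(workers: int,
                      lookups_per_day: int,
                      minutes_saved_per_lookup: float,
                      hourly_rate: float,
                      errors_avoided_per_year: int,
                      cost_per_error: float,
                      workdays: int = 220) -> float:
    """Rough first-order estimate of yearly savings from AR work instructions."""
    # Time saved: every instruction lookup is shorter with in-context AR guidance.
    time_savings = (workers * lookups_per_day * workdays
                    * minutes_saved_per_lookup / 60.0 * hourly_rate)
    # Error reduction: rework incidents avoided by clearer guidance.
    error_savings = errors_avoided_per_year * cost_per_error
    return time_savings + error_savings

# Hypothetical example: 50 technicians, 10 lookups/day, 2 minutes saved each,
# $40/hour labor, 30 rework incidents avoided at $500 each.
savings = annual_ar_savings(50, 10, 2.0, 40.0, 30, 500.0)  # roughly $162,000/year
```

Even with conservative inputs, a model like this lets a company compare projected AR returns against the cost of authoring and deploying the experiences.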

Augmented Reality and Customer Experiences

Ashutosh Tomar of Jaguar Land Rover says the company’s vision is to use AR to enhance the driver experience in its vehicles. Today’s typical car is packed with sensors and features; one vehicle model has over 70 onboard computers and 170 “smart features.”

Customers no longer judge automobile features as a selling point alone; they also expect a better customer experience. How can cars automatically change settings (e.g., music station, seat and mirror adjustments) based on who’s driving? How can cars communicate with drivers via other sensory inputs, such as haptics? JLR is making large investments in human factors research and in ways to increase driver safety via Augmented Reality, for example:

  • Visualization in the windshield of “ghost cars” driving ahead, to clearly demonstrate the safest way to make turns in a city.
  • The projection of cones in the windshield for training purposes.
  • “Transparent” B pillars that enhance a driver’s line of sight and situational awareness by making car walls see-through in certain situations, such as narrow turns in cities.
  • Haptic feedback in the seat behind the driver’s shoulder to warn of another vehicle passing in the blind spot.

Legal Implications

New features such as the projection of information and images in the driver’s windshield will require new regulatory regimes. Brian Wassom, intellectual property attorney at Honigman Miller Schwartz and Cohn LLP, described the current regulatory environment and spoke about the principles of the National Highway Traffic Safety Administration’s “Visual-Manual NHTSA Driver Distraction Guidelines for In-Vehicle Electronic Devices.”

  • Distractions in all forms, including cognitive and visual, should be recognized by designers and regulators.
  • Displays should be as near the driver’s forward line of sight as possible.
  • A number of distracting features should be avoided entirely: glare, social media interactions and text that scrolls or contains more than 30 characters.
  • Glances away from the road should last no more than 1.5 to 2 seconds.

The above principles apply to current systems (dashboard layouts with navigation and phone information), but might also be the basis of conversations about Augmented Reality safety and liability.
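The text-length and scrolling rules in these guidelines lend themselves to automated content checks before anything reaches the windshield. A minimal sketch (the limits come from the guidelines above; the function name and structure are illustrative, not from any NHTSA tool):

```python
def distraction_guideline_violations(text: str,
                                     scrolls: bool = False,
                                     is_social_media: bool = False) -> list:
    """Flag display content that breaks the in-vehicle guidance described above."""
    problems = []
    if len(text) > 30:
        problems.append("text longer than 30 characters")
    if scrolls:
        problems.append("scrolling text")
    if is_social_media:
        problems.append("social media interaction")
    return problems

# A short navigation prompt passes; a scrolling message does not.
ok = distraction_guideline_violations("Turn left in 200 m")
flagged = distraction_guideline_violations("Traffic ahead on I-10", scrolls=True)
```

Glance duration (the 1.5-to-2-second rule) cannot be checked statically; it would require eye-tracking measurements during user testing.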

In his presentation, Ashutosh Tomar had also emphasized the need to minimize the amount of information displayed to drivers to reduce distraction, as a basic tenet of safety.

Conclusions

In addition to those already mentioned, there were interesting presentations by Volkswagen, Ubimax, the German Research Center for Artificial Intelligence (DFKI), Feynsinn, the Fraunhofer Institute and others on topics ranging from showroom use cases to the latest research on AR user experiences.

Overall, it was encouraging to witness the depth of questions about Augmented Reality being asked by companies in automotive manufacturing, research, design and other fields, and to get a sense of its evolving acceptance in the enterprise, complete with growing pains and successes.