
Advancing Toward Open and Interoperable Augmented Reality

Enterprise Augmented Reality engineers and content managers who published experiences created with Metaio’s software tools have encountered, or will soon encounter, a situation they didn’t anticipate: the publishing and delivery environments are unsupported and not evolving to take advantage of the latest enabling technologies.

Are you among this group? If so, you are not the only one to find yourself in this uncomfortable situation.

Customers of other AR software providers that are no longer supporting or advancing their platforms hit the same roadblock when they had a mandate to continue delivering the value of their AR experiences to end users. Prior to agreement on standards, they could not “port” their experiences to another AR platform. The only way forward was to evaluate and choose another proprietary AR technology platform, and then to invest in re-authoring, testing and re-deploying AR experiences based on their original designs.

Unfortunately, some of those reading this blog are in this awkward position today.

Successfully addressing the root causes of low AR experience “portability” and the inherent lack of integration or interoperability between AR authoring and publishing systems is an important, highly collaborative process. Different parts of the AR ecosystem must first agree that there are issues, and then on principles for collaboration. Then, based on shared conceptual frameworks, they must work together to implement those principles in their workflows and solutions.

Supporting that collaborative process is the reason I’ve been leading the grassroots community for open and interoperable Augmented Reality content and experiences since 2009.

Is There Really a Problem?

Interoperable Augmented Reality is not a high priority for most people. Only about a hundred people are consistently investing their time in advancing the principles of open and interoperable Augmented Reality. We know one another on a first name basis; many of us compare notes in person a few times per year. Another few hundred people know of such activities but don’t directly invest in meaningful ways.

For most companies, the investment in AR to date has not been large. A few tens of thousands of dollars to rebuild and deploy a half dozen carefully handcrafted AR experiences is minor compared to investments in other enterprise technologies.

“There’s still too much innovation to begin working on standards,” is another commonly heard refrain. Clearly, those repeating it haven’t been reading the posts or listening to the presentations made by AREA member IEEE Standards Association, or by leaders of other standards development groups. When designed collaboratively and focused on interoperability in strategic places, standards have in many cases accelerated innovation rather than stifled it.

There are other reasons why many turn a blind eye to the problems, and these reasons are valid to different degrees for different people.

This is a Serious Problem

In my opinion, ignoring the lack of open and interoperable Augmented Reality solutions and services is doing everyone a disservice.

The fact that only a relatively small amount of money has been invested to date is a poor justification for investing yet more time and money in building experiences on another proprietary platform, only to face the same scenario again in a matter of months or years.

In fact, innovation in Augmented Reality is not what it should be today because many of the best AR developers are busy building a better mousetrap: smart engineers are working to solve, in a different way, problems that have for the most part already been solved by others. Whether to avoid encroaching on a third party’s patents or for other reasons, this effort is invested in highly integrated proprietary silos, at the expense of other problems that remain unaddressed.

There are three more serious problems with having only proprietary technology silos and very low use of widely agreed standards for Augmented Reality experiences. The first is that enterprises with assets that could be leveraged for AR experiences are unable to integrate the production of those experiences into their corporate workflows. This lack of integration between AR as a method of information delivery and other information delivery systems (e.g., web pages and mobile services without AR support) means we can’t seriously stand before a CIO and recommend supporting the development of AR content, because what we are recommending requires setting up an entirely separate and different content management system.

In the same vein, the second reason that enterprise CIOs and CFOs are justifiably reluctant to deepen their investment in AR projects is that they cannot deploy modular architectures in which multiple vendors can propose different components. In today’s landscape of offerings, it’s all or nothing. The customer can buy into provider A’s system or that offered by provider B. If provider C comes along with a better option, too bad.

The third reason the lack of standards is a serious problem worthy of your support is closely related to the other two. Deep collaboration between AR-enabling technology vendors and service providers is currently very difficult. They are not working to improve customer outcomes: they are working much more on competing with one another for attention and for the small investments that might be made.

Three serious enterprise AR obstacles that agreements about open and interoperable AR could reduce

  1. Low or nonexistent content and experience portability between proprietary technology silos

  2. Strong customer aversion to risks due to vendor lock-in

  3. Low cooperation between competitors or ecosystem members to partner for best customer outcomes

This situation with lack of interoperability and fear of vendor lock-in would be addressed if the vendors took a more serious look at possible open interfaces and standards within a larger framework. Conversely, vendors might study new approaches and establish some level of interoperability if they believed that customers would respond by increasing their budgets for Augmented Reality.

This is all very serious.

Another recent development is not helping: it’s clear that some internet and IT giants are paying a lot of attention to AR. The lack of visibility into what highly competitive and successful companies like Microsoft, Google, Apple and PTC will do about AR interoperability and integration has cast a chill over enterprise AR adoption.

Their lack of support for standards and their unwillingness (to date) to publicly shed light on how they will cooperate, or how their proposed (future) systems will interoperate, is causing considerable uncertainty. No CIO or CFO should seriously invest in enterprise Augmented Reality until these companies’ plans with respect to integration and interoperability are clearer.

Progress is Being Made

We should be open to the possibility that 2016 will be different.

Thanks to the dedication of members of the grassroots community, the situation is not as bleak as it could be. A few weeks ago a few dozen members met in Seoul, Korea, to compare notes on progress. SK Telecom, a strong supporter of open and interoperable Augmented Reality, hosted two days of sessions. We heard status updates from four standards organizations that have highly relevant activities ongoing (Khronos Group, Open Geospatial Consortium, IEEE and ISO/IEC). We also received reports from AR developers who are working to advance their solutions to support standards.

The fact that the ISO/IEC JTC1 Joint Adhoc Group for Mixed and Augmented Reality Reference Model is nearing completion of its work is a major development, and one on which I presented in Seoul.

In the spirit of full disclosure: the community of people in support of open and interoperable AR was the environment in which this work began, and I have been a member of that ad hoc group since its formation. If you would like to obtain a draft of the Mixed and Augmented Reality Reference Model, please send me an email request.

We are also seeing increased interest from industry-centric groups. A German government-supported project may propose standards for use in automotive industry AR. The results of an EU-funded project for AR models in manufacturing became the basis for the establishment of the IEEE P1589 AR Learning Experience Model working group (which I co-chair). In a recent meeting of oil and gas industry technologists, the formation of a new group to work on requirements for hands-free display hardware was proposed.

These are all encouraging signs that some are thinking about open and interoperable Augmented Reality. If you want to monitor the activities of the grassroots community focusing on this topic, and to receive announcements of upcoming meetings, visit this page and register yourself for one or more of the mailing lists.

Have you seen other signs of increasing awareness of these problems? Do you know of any new standards that the grassroots community should monitor and present during a future meeting?




Enterprises Want to Use Wearables

Many workplace scenarios require the use of both hands to manipulate physical world objects. Having a display on the wrist or head (or both), with a variety of sensors and optional cloud services, offers an attractive alternative to tablets for supporting access to real-time or contextual information.

According to a Gartner Group report shared at the Enterprise Wearable Technology Summit (EWTS), sales of head-mounted displays will be greater in enterprise than in consumer markets until at least 2020.


Unfortunately, this enterprise interest in wearable computing is not currently being addressed by consumer technology providers.

Connecting Those with Questions to Those with Experience

What are current enterprise customer requirements? What have enterprise wearable pioneers learned? What are enterprise customers’ best options today? These were among the questions that the EWTS organizer, BrainXchange, set out to answer.

BrainXchange chose Houston for its inaugural event on October 20-21, 2015. The city is a business center for the oil and gas industry, served by an international airport and easily reachable from both coasts of the US.

Over 150 delegates from at least six countries gathered to hear from 60 speakers, including many veterans of the Google Glass Explorer program and vendors looking for new customers. The format offered plenty of networking in a convivial and relaxed atmosphere. 

AREA Members at EWTS

AREA Member: Role
XMReality: Sponsor
Augmate: Speaker
EPRI: Speaker
APX Labs: Delegate in attendance
PEREY Research & Consulting: Delegate in attendance

Criteria for Enterprise Wearable Success

There is wide agreement with the simple guidance that Joe White, VP and GM of Enterprise Mobile Computing at Zebra Technologies, offered during his opening remarks. White recommends that enterprises focus on systems that are:

  • Technically sound
  • Socially acceptable
  • Focused on solving a real problem

These criteria sound simple, but adhering to them requires careful research and planning. Many delegates at the summit who are shopping for wearable technologies don’t feel that the current commercial technology options are sufficiently mature for most of their use cases. One person confided that everything his team has evaluated to date “feels like a science project.”

Weight, balance and resolution remain significant technical obstacles, but short battery life as a result of high power consumption continues to be high on the list of technology barriers.

One test of wearable display technology reliability is how well it performs in a live demo on stage. There were more videos than live demos, but Rafael Grossman, a widely publicized surgeon in the Google Glass Explorer program, successfully demonstrated Atheer Labs’ AiR platform for the audience.

Another criterion added to White’s list over the course of the first day was cost. If devices are expensive to purchase, operate or maintain, adoption and use will remain limited.

Regardless of the criteria and how firmly an organization wants to adhere to them, customers remain divided about what’s truly going to solve their problems. Some feel that their use cases require true Augmented Reality in the enterprise. Others are, at least for the present, finding the “simple” delivery of live information or images to a wearable display (as currently done by Google Glass or Vuzix M-100) sufficient. In the opinion of those who use such information “snacking” devices, real-time registration and tracking of data in the real world are still expensive and technically difficult.

Connecting Remote Experts with those in the Field

Real-time consultation between a remote expert and a person wearing a camera and display while performing difficult tasks is a highly compelling use case for most of the EWTS speakers. While a few speakers mentioned their experience with AR-assisted remote assistance, the majority focused on the numerous and immediate benefits of simply having another “set of eyes” on a particular procedure.


For example, emergency medical technicians working on MedEx ambulances as part of the Google Glass Explorer program can transmit more information about injuries or patient conditions to emergency room staff ahead of their arrival at the hospital.

In another case study, a tradesperson working on a Rogers-O’Brien Construction job site can see and transmit the details of the job site and get guidance or feedback from an architect or supervisor in real time.

Some Industries Are Further Along

While the medical and construction industries were highly represented among the Enterprise Wearable Technology Summit speakers in Houston, some case studies and presentations highlighted the promise of wearable technology in the logistics industry. DHL and Ubimax described how they are working together to put their warehouse picking solution into production and conducting research on their next generation systems for pallet packing. 

Energy production and distribution were also frequently mentioned. John Simmins of the Electric Power Research Institute (EPRI), an AREA member, spoke of projects underway in some power generating facilities. Speakers from CenterPoint Energy and Sullivan Solar Power also attested they are actively exploring the use of wearables in their businesses.

Many Challenges Remain

An entire event could focus exclusively on expected and promised technology improvements. For example, uneven network coverage and issues preventing secure access to off-device content came up frequently. But, EWTS did not limit its scope to technology barriers.

Getting wearables into production requires companies in highly regulated industries such as healthcare and construction to educate decision makers and executives and to negotiate creation of many new policies. Those are both very lengthy and costly processes.

Compliance

Complex regulatory environments are but one item in the list of business challenges.

Lack of trust is another significant obstacle to adoption. Large enterprises are looking for vendors that are, on the one hand, nimble and responsive to special requirements and, on the other, endowed with the financial resources to quickly ramp up production for large orders.

Despite these and other challenges, wearables continue to hold enormous promise and will increasingly demand the attention of enterprise technology buyers and users. We can expect these topics to be on the agenda at future BrainXchange summits. The company announced that it will produce its next event in June 2016 on the East Coast, although details were not provided.

Are there events you plan to attend to learn about enterprise wearable technologies?




Google Glass 2.0—Primed for the Enterprise: Foldable, Rugged and Waterproof

When it was introduced in February 2013, Google Glass 1.0 was far ahead of its time. Consumers and developers identified many issues that needed to be addressed and, although some have adopted the hardware, it was deemed unsuitable for widespread use by consumers or enterprise customers.

Over two years later, in early summer 2015, Google began showing key developers the next generation of Glass, code-named “Project Aura” and powered by Intel, and sharing with the media that the project was underway.


The new device is geared for professional users. Employees using the information provided via the wearable display will be able to perform tasks with fewer human errors while enhancing productivity and operational efficiency.

The new “ruggedized” Google Glass hardware design is said to be easy to fold and more durable in work environments. Some options include the ability to clip the tiny display unit onto existing eyewear.

Perhaps Google Glass 2.0 is primed to grow in many industries such as oil and gas, warehousing, manufacturing, agriculture and mining.  The likely impacts depend on the use cases and company readiness for change.

The Benefits of Hands-Free Displays in Warehousing Operations

 In April 2014, DHL published a report describing how logistics operations can be improved with the assistance of hands-free wearable devices. The use cases fell into four categories:

  • Warehouse operations
  • Transportation optimization
  • Last mile delivery
  • Service and repair and other enhanced services

The evidence to support the assertion that warehouse picking can be improved, the first use case identified in the DHL study, is mounting.

Google Glass can also be used for reducing the cost of warehouse redesign as well as factory planning but studies about metrics for these use cases are not available at this time.

The Future of Google Glass

Will Google Glass 2.0 address the issues seen in the first prototype?  This remains to be seen, but with several confirmed reports on the changes and improvements Google is making with Glass 2.0, it is evident that Google is all-in on changing the future of computing through wearables and, ultimately, with Augmented Reality.

Have you tested Google Glass 2.0? Share your thoughts and feedback below.




The Augmented Reality Provider Landscape Shifts, Again

Developers of Augmented Reality experiences select tools and technology for a project to match its use case requirements. If the use case involves a page in a book or the side of a package, 3D tracking is overkill. If the project accesses records in a company’s ERP, plug-ins or customization must be available. If the customer needs reports (e.g., number of objects recognized, user interactions, etc.), then the platform needs to support their production. If the target is a movie poster, the security considerations are entirely different than if the target involves a proprietary industrial process.

After five years of Metaio’s dominance of the AR software provider landscape, developers’ options are changing dramatically. This post reviews the recent changes in the provider landscape and how they impact developers, and suggests that those who license and purchase development tools could use this period of research and evaluation as an opportunity to communicate their project requirements more clearly to all the tool and technology vendors.

A Rapidly Changing Provider Landscape

In early 2015, Metaio’s ecosystem ranged from dedicated individuals producing one or two experiences, to Fortune 100 companies. Some were researchers designing prototypes; others were automotive industry giants like BMW and Audi who used Metaio’s robust tracking algorithms for precision engineering and design. Then, in mid-May 2015, a message appeared on Metaio’s website saying that it would stop selling licenses immediately, and that support for its Augmented Reality services and software technologies would end on December 15 of the same year. The mysterious announcement took the company’s global developer ecosystem by surprise.

Many, if not most, of the developers authoring experiences for enterprise and industrial projects were using Metaio’s software tools. Metaio’s change in direction put developers in an uncomfortable position. Many were furious. Others expressed frustration. To this day there remain many questions about the circumstances that led to the announcement. Regardless of what happened to a company that the developer ecosystem had grown to trust, serious business issues remain:

  • What will happen to the channels published in a platform operated by Metaio?
  • What will developers use in the place of Metaio’s tools?

Many developers are now doing what more of them could have done consistently over the previous years: investing resources to evaluate other potential tools and technologies. The best developers will resume proposing projects to their customers once they have thoroughly tested the alternatives.

Gaps for Enterprise Augmented Reality

While there are alternate enterprise Augmented Reality technology providers with solutions and services worthy of evaluation (see table below), none offer the breadth and maturity, the professional documentation and support that Metaio provided for its SDK, Creator, Suite, Cloud and Continuous Visual Search matching system.  

Enterprise AR authoring providers and products

Source: © 2014 – 2015
Company: Platform
DAQRI: 4D Studio and AR Toolkit
Wikitude: Wikitude SDK
Inglobe Technologies: AR Media (and other)
BuildAR: BuildAR
Catchoom: CraftAR (and other)
NGRAIN: Vergence (and other)
Diota: DiotaPlayer, DiotaConnect
EON Reality: EON Studio (and other)
Bitstars: Holobuilder
Fraunhofer IGD: Instant Reality
Kudan: Kudan SDK

Metaio’s dominance wasn’t limited to breadth of offering and AR developer mind share. Among its peers, it probably also generated the greatest revenue from licensing its software tools and providing services. To deliver value to customers and drive development of its technology suite, Metaio employed over 75 of the world’s most qualified and experienced enterprise AR engineers.

Those that can afford to have been furiously hiring engineers to write code and build out their teams and offerings, but breadth and depth like what Metaio offered doesn’t materialize in a matter of months.

Vuforia’s Focus on Consumer Use Cases

No one knows precisely how much of the Metaio developer ecosystem overlapped with that of Qualcomm Vuforia, but anecdotal evidence suggests that developers who had use for both leveraged their respective strengths for entirely different projects.

Vuforia is strongly optimized for delivery to consumers on smartphones: entertainment, cultural heritage, education and marketing use cases. For this reason, developers who explored its use for their enterprise or industrial projects did not place Vuforia’s current offerings at the top of their list of preferred enterprise-ready AR tools.

In an October 12 press release, PTC, a global provider of enterprise platforms and solutions for creating, operating, and servicing connected objects, announced that it had reached an agreement to acquire the Vuforia technology, and its developer ecosystem, from Qualcomm Connected Experiences, Inc., a subsidiary of Qualcomm Incorporated.

The acquisition of Vuforia by PTC suggests that while Metaio technology is probably being integrated into a platform and tools for consumer-facing solutions, the tools most popular for consumer-facing AR experiences (i.e., the Vuforia SDK) will evolve to better meet the needs of developers seeking to address enterprise use cases.

The Landscape Continues to Evolve

The reversal of relative positions of the two popular Augmented Reality SDKs with respect to their target markets and one another is one of several trends.

First, the list of developer options is expanding. Firms that were previously quiet have the opportunity to engage with developers who are now more interested in learning about their offerings. Google is getting closer to its Glass at Work 2.0 release. Microsoft is showing HoloLens and the tools it has designed for authoring (aka “HoloLens Studio”) to more developers. Some firms with significant experience and investments in enterprise Augmented Reality are becoming more attractive, or at least more visible. For example, Diotasoft, a French technology provider with loyal enterprise customers including Renault, PSA Peugeot Citroën, Total and Dassault Aviation, announced a rebranding (the company is now called “Diota”) and launched a new platform for enterprise Augmented Reality.

Another trend is a shift in positioning. PTC and Vuforia’s statements in their October 12 press release emphasize where they see the greatest potential for impact. They draw a direct line between Augmented Reality and the need for people to visualize the data stored in and managed by PTC’s Internet of Things-oriented systems. This echoes the suggestion made by Gerry Kim, professor at Korea University, at a meeting of the AR Community on October 6: Augmented Reality is the human interface for IoT.

As the number of options increases, so does the potential cost of integration. In a highly fragmented market one large enterprise could easily end up with solutions addressing different use cases based on multiple different and incompatible SDKs.


An Opportunity to Mandate Open Solutions

A unique opportunity lies in the middle of the increasing fragmentation and investment in new technology providers.

What if, instead of accepting the status quo of many competing and incompatible AR platforms, large enterprise customers and their developers were to clearly demonstrate their need for open systems?

Developers can seize the next few weeks and months to prepare a campaign describing new or existing systems with which they would prefer to create and manage enterprise content. They can document the barriers to interoperability and mount pressure on enabling technology providers. What if, prior to a purchase or licensing decision, the provider of an AR authoring platform were required to demonstrate interoperability with content generated from Metaio’s SDK?

Openness does not mean Open Source. Openness is a condition that is based on explicit or implied agreements between vendors. Providers of technologies must agree upon common data formats, and provide interfaces and APIs that are well documented and designed for interoperability with solutions of potential competitors.

Without a clear mandate for AR technology providers to support a greater level of integration and interoperability with enterprise IT systems, developers should not be surprised if their options remain highly rigid and difficult to integrate. Unless forward-thinking people take action, developers and their large enterprise customers must be prepared to face many more years of investing in brittle transcoding systems or other approaches to “work around” the lack of openness and interoperability.

How are you going to respond to this rapidly shifting AR technology provider landscape? Are you taking this opportunity to share your requirements with new vendors? 




Starting the Enterprise Augmented Reality Conversation

Have you asked any IT professionals or business managers what they’re doing with Augmented Reality? A small fraction can share how they’ve considered using AR for improving their workplace processes, but most inquiries about how companies are using AR begin with a blank stare and end in frustration.  

The AREA and its members are developing high-quality content that can be the basis of more precise and fruitful dialog than we often have today. Once there is a shared conceptual foundation, we’ll be able to discuss the concrete benefits as well as the risks of introducing Augmented Reality in the enterprise with our audiences.

Explore the Audience Knowledge Level

Casual discussion between acquaintances, or between a supplier and a potential customer, can’t evolve gracefully if it must begin with deep explanations or clarifications of confusing terminology. Don’t start with a dry definition. Focus first on a known or shared challenge or potential benefit, and make sure you can casually work a few terms in during the first minutes.

“Isn’t it frustrating that we can’t significantly increase our productivity?” you can inquire. Be specific about the use case, if you can. You can substitute “increasing productivity” with other metrics such as reducing errors, reducing risk or increasing safety. Drop in some keywords to make sure they understand that you feel new technologies could help. Avoid buzzwords such as wearables, IoT, Augmented Reality or Virtual Reality in the first five minutes. Try to avoid bringing up Hollywood movies or popular science fiction books that feature Augmented Reality.

Then you can say that you’ve heard or that you’re exploring how this new technology could play a role by overlaying digital information on the real world. Let your prospective customer or partner, or whomever you’re speaking to, be the first to mention wearables or AR.

When asked if they’ve heard of it and what they’re doing or planning to do with Augmented Reality, an IT professional will respond in one of several ways. The younger the person, the more likely they are to have heard of the technology and understood its potential. That said, they may not have thought to apply it to their job.

“That’s technology for your smartphone. I’ve seen it used in a museum, once,” they might say. Then they either describe how the AR experience failed or explain that it just didn’t bring value to them. Such conversations often conclude with the person dismissing the whole idea.

“It’s probably good for entertainment, but we’re not that kind of company,” is not an uncommon conclusion.

A more knowledgeable audience may remember Virtual Reality and the promises it held but didn’t deliver. In that case you will need to help them understand the differences.

Others will have had no exposure at all to Augmented Reality.

Light Bulb Moment

Once you’ve decided if the conversation is worthy of continuing investment, you’re going to aim for a “light bulb” moment: a look in their eye that shows that the person with whom you’re meeting has had a breakthrough in understanding.

To get to that moment of realization may take several steps. As already suggested, if you’re in conversation with an IT professional or line manager with a lot of engineering experience, you will get there more quickly.

Begin by building upon something very familiar. Everyone has seen, and almost all have personally used, video conferencing. AREA member David Doral, Director of AERTEC Solutions, begins his education process by suggesting that, when trying to understand a problem at a remote location, it would be valuable to be able to see things as if through another’s eyes.

“We suggest to the customer that we support the technician in the field or on the shop floor with an expert who is somewhere else,” explains Doral. He doesn’t say where that expert is, but makes it perfectly clear that they are the key to solving a problem and that there’s no time for the expert to personally fly to the location. In AR, this use case is known as the “remote expert,” but the term doesn’t need to be introduced.

“Then, if they like this concept, we can suggest that the expert could draw arrows, point or otherwise indicate steps with animations,” continues Doral. “Imagine that the person who is in the field or on the shop floor is providing the remote hands, performing tasks as directed and under the supervision of the expert.”


Up Close and Personal

Another approach to reaching a light bulb moment is to demonstrate an Augmented Reality experience right away. Sometimes this can be done using a tablet and an object that you’ve brought with you. Choose an object that is professional and slightly complex in nature but has a very simple user interface, such as a pocket projector. A virtual interface can then appear with Augmented Reality to help the user with configuration and operation.

Three-dimensional objects are nice and have a big “wow” factor, but a photo will also work well and may deliver better tracking performance. Lighting, and reflections on a glossy surface, can have a big impact on your ability to track the target, so test your sample photo or object well before using it. Be sure to give the other person the device to hold and move around, so they can interact with the content in the experience.

Often people try to simulate this effect, and reduce the risk of failure, by showing a recording of an AR experience, but your audience will assign lower credibility to a video because they understand that special effects like those seen in the movies are now commonplace. Hasn’t everyone seen Minority Report and Iron Man?

From a shared understanding of the benefits of Augmented Reality, you might be able to progress to talking about a project and the potential of implementing AR in a few use cases.

What techniques have you used to successfully start a conversation about enterprise Augmented Reality?  Share your methods with others in the comments below.




Measuring Impacts of Enterprise Augmented Reality

The single most compelling reason to invest in enterprise Augmented Reality is to improve the productivity of people while on the job. Workplace performance is a broad concept that can be measured in many ways, and the impacts of a new user interface providing contextually sensitive information are unfamiliar to most technology managers.

During the ARise ’15 conference, Matt Kammerait, Vice President of Products at DAQRI, presented some of the methodologies currently in use for gathering metrics and assessing the full impact of introducing Augmented Reality. This post builds upon experiences shared in the presentation and offers project leaders suggestions for how to quantify the impacts of enterprise AR.

Consider the Organizational Attitudes and Experience with Innovation

Before selecting the metrics for an AR project, consider the attitudes and experience of the various stakeholders in your organization. Aligning the project objectives with stakeholder goals and adapting available best practices or existing metrics from other projects can boost the chance of success. It will help when you want to expand the project beyond the pilot stage.

In parallel, those defining the parameters of an internal study need to take into account industry-wide metrics or constraints. Time savings is very important, but may not be the most important or best metric for all industries. In industries where workers’ lives are at risk, safety is almost surely a more important organizational metric. If an industry is heavily regulated, then compliance is likely to be at or near the top of the list.

It’s important to prioritize potentially valuable metrics. Take into account the time requirement and cost of studying each type of metric as an organizational constraint during the design phase.

If capturing detailed or complex metrics will increase the need for specialized staff and otherwise exceed the study resources, compromises based on the original set of priorities will need to be made.


Detect the Broad Patterns

Human-computer interfaces are key to the digital economy. Screens, keyboards, speakers, cameras and microphones provide faster and more accurate ways to acquire information or knowledge or, conversely, to quickly develop and communicate it. Knowledge workers build value with computers and networks by manipulating pixels that form meaningful symbols such as numbers, letters, lines and even virtual 3D objects.

In contrast to the tools in use by knowledge workers today, Augmented Reality is most useful when and where people interact with, or perform tasks with objects in the physical world. In the use cases that guide workers, Augmented Reality-assisted visualization provides individuals who lack information or encounter obstacles a means to retrieve and use digital symbols directly, without losing focus on the physical world.

Before any Augmented Reality introduction project begins, the objects with which people interact, and the interactions themselves, can be inventoried. The most frequently manipulated objects are going to be the most familiar and the least likely to benefit from an AR-assisted interface. Conversely, it is the processes or objects that are infrequently encountered and yet complex where the greatest potential for Augmented Reality can be tapped.

Which tasks or objects frequently present obstacles in terms of inexperience, limited human cognition or memory lapses? For example, in a warehouse, nearly every order or the contents of every truck is unique. There’s very little that past experience or strength can do to help humans perform the job better, but a digital guide to where to find an object or how to pack it on a pallet can reduce the need for search, trial and error.
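
To make this prioritization concrete, here is a minimal sketch in Python of one way a team might score candidate tasks using the “infrequent but complex” rule described above. The field names, scales and scoring function are hypothetical illustrations, not an established AREA or vendor methodology.

```python
# Illustrative only: rank candidate tasks for AR support by scoring tasks that
# are complex but performed infrequently (and therefore unfamiliar) highest.
from dataclasses import dataclass


@dataclass
class Task:
    name: str
    frequency: float   # 0.0 = rarely performed, 1.0 = performed constantly
    complexity: float  # 0.0 = trivial, 1.0 = many steps, unique every time


def ar_potential(task: Task) -> float:
    """Score rises with complexity and falls with familiarity (frequency)."""
    return task.complexity * (1.0 - task.frequency)


tasks = [
    Task("Pack pallet for a unique order", frequency=0.2, complexity=0.9),
    Task("Scan barcode at receiving dock", frequency=0.9, complexity=0.1),
    Task("Diagnose field-service part with unknown history", frequency=0.3, complexity=0.8),
]

for t in sorted(tasks, key=ar_potential, reverse=True):
    print(f"{t.name}: AR potential {ar_potential(t):.2f}")
```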

In a field service scenario, every part has had a unique life history with respect to use, environmental conditions and other factors. Workers can better use past experience for rapid diagnostics when the track record of the part is rapidly and clearly available. 

Learning or training organizations are often good partners for project managers who want to document patterns across a workforce when performing key processes. These groups have a unique perspective on the tasks that are the most difficult to teach or retain, and may also have well established methods for measuring performance in the lab and on the ground.  

Capture Ground Truth

Prior to introducing Augmented Reality, perform a systematic measurement of the un-assisted process. Interviews with those who will participate in the project are always beneficial to assess attitudes about the new technology, but the documentation of ground truth must include actual task observations.

Accompanying a person and observing all their activities is one way to document their existing processes but this is likely to introduce a variety of errors in the data. If possible, automatic measuring tools that are completely invisible to the subject and do not interfere in any way with the normal flow of tasks should be explored.

Frequently, when a person needs assistance and cannot easily find the information in a manual, there is a need to consult another worker. Since AR could reduce the need for a person to seek assistance from others, the impacts on other workers’ productivity should also be considered.

A representative sample of people with different training or experience levels is key to getting good ground truth data. By observing the methods of a novice as well as a highly skilled journeyman, and people between the two ends of the spectrum, it may be possible to narrow down to a limited number of steps that are most likely to benefit from AR support.

Build Recording and Capture Tools into the System

When designing AR experiences, it may be possible to record the achievement of specific steps or the entire session of use. This may require mounting an independent camera into a workspace, or adding components to the AR delivery platform.  To get the highest fidelity recording may require adjustments to ambient lighting since the AR experience setting may not be suitable for the camera that is recording activities. If using a mobile platform to capture activities, the additional task of recording interactions will impact battery life and, if network-based storage is part of the design, the communication needs (in terms of coverage and bandwidth) will certainly be different than those of the AR-assisted application itself.
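
As one illustration of what building recording into the system can mean in practice, here is a minimal sketch that assumes the AR delivery platform can invoke a hook each time a guided step is completed. The class, field names and JSON-lines log format are hypothetical, not any particular vendor’s API; writing locally and uploading later is one way to keep logging from competing with the AR application for network bandwidth.

```python
# Illustrative sketch: record step-completion events for later impact analysis.
import json
import time


class SessionRecorder:
    """Appends one JSON object per completed step to a local log file."""

    def __init__(self, path: str, worker_id: str):
        self.path = path
        self.worker_id = worker_id
        self.start = time.time()

    def log_step(self, step_id: str, outcome: str = "completed") -> None:
        event = {
            "worker": self.worker_id,
            "step": step_id,
            "outcome": outcome,
            "elapsed_s": round(time.time() - self.start, 2),
        }
        # Append locally; a separate process can upload the file after the shift.
        with open(self.path, "a") as f:
            f.write(json.dumps(event) + "\n")


recorder = SessionRecorder("session_log.jsonl", worker_id="W-042")
recorder.log_step("align-bracket")
recorder.log_step("torque-fasteners", outcome="error-detected")
```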

Another key requirement when recording users’ interactions is obtaining permission in advance to use the recordings in documenting the impacts of the AR-enabled system. Usually a simple release is adequate, but in a unionized work setting, having the cooperation and support of the union might be necessary for successfully documenting the impacts.

Be Flexible

In some projects, the additional information provided by an Augmented Reality-assisted system introduces new opportunities to save resources, to catch undiscovered errors or even to document entirely new methods of completing a task. It may also enhance the work experience of the employee, which may be an important “soft metric” that is difficult to assess. In general, these are all important factors to consider, even if they are difficult to quantify.

In order to reduce the likelihood of overlooking qualitative (as well as unanticipated quantitative) impacts, projects should include an in-depth exit interview. This can be conducted either online or face-to-face. In the interview with study participants, invite discussion and feedback on all aspects of the experience. The conversation is likely to shed light on the metrics collected, as well as on other obstacles to, or drivers of, adoption.

Recommendations

Every organization has a unique approach to new technology introduction and different industries place emphasis on different performance metrics, but there are some basic best practices to follow, based on past experience of AREA members. When designing an AR introduction impact measurement system, project leaders can apply these best practices:

  1. Consider the business setting and management priorities in order to design metrics that matter most to those making the final decisions.
  2. Collaborate with different stakeholders and groups to identify and thoroughly document characteristics of bottlenecks or pain points that are common or similar across diverse professional skill sets, tasks, groups, products or facilities in the organization (e.g., transit time, down time, assembly errors and inspections).
  3. Capture existing processes that are part of the proposed AR introduction use case, as performed by both novices and senior members of the workforce, without the assistance of new technology.
  4. Build in or set up recording systems that do not interfere with or impact the user’s performance or the AR experience delivery.
  5. Perform an exit interview and keep an open mind about impacts that the user may have perceived that were not originally part of the study’s measured parameters.

How have you designed your pilot to capture metrics, and have the measurements helped to estimate the impact the introduction of Augmented Reality will have in your organization? Please leave your feedback below.

Want to hear more? Watch this video…

 




Augmented Reality Use-cases at Newport News Shipbuilding

Shipbuilding has been the perfect environment for industrial innovation for hundreds of years. Sails to steam, wood to iron, rivets to welds, blueprints to CAD, stick-built to modular construction–all major innovations to building extraordinarily complex vehicles. At Newport News Shipbuilding, we constantly seek new innovations to improve our safety, quality, cost, and schedules. Since 2007, we have explored Augmented Reality as a means to shift away from paper-based documentation in our work.

Since we began looking into AR for construction, operation, and maintenance workflows, we’ve come up with hundreds of use-cases to improve tasks or processes. These range from assisting shipbuilders in painting, ship-fitting, electrical installation, pipefitting, and more, across new construction ships, ship overhaul, facility maintenance, and decommissioning. Every use-case improves our ability to deliver nuclear aircraft carriers and submarines, but to different degrees.

We’re always adding new use-cases to the list, and we’ve needed to devise an adaptable framework for organizing and categorizing existing, proven uses and prioritizing future, potential use-cases.

Genesis of a Use Case

Augmented Reality should be employed first in places where it creates the most value – and that can be subjective. Sometimes this means helping people become more efficient and work more quickly, sometimes it is about helping to reduce errors and rework, and sometimes it is all about improving safety. At Newport News Shipbuilding, a dedicated team of AR professionals helps determine where AR is best suited, whether the technology is ready for the use-case, and how best to implement and scale a solution.

The first step in defining a use-case is performed by an AR industrial engineer, who determines where AR brings value in a workflow. She first meets with a skilled craftsman to understand their challenges and needs. The industrial engineer identifies pain points in processes, such as when and where shipbuilders must consult paper documentation to complete a task. She must also consider human factors and always balance the needs of the craftsman against the capability of the AR solution as it can be delivered today.

Then, the AR engineer works with an AR designer and an AR developer to deliver a product. The AR designer determines the available data, components, interfaces and models for the system to satisfy requirements. Once the use-case is fully defined and the data is assembled, an AR developer implements software solutions, tests the system, and ensures reliable and adaptable development tools. At the end of the process, a new use-case is addressed, and a high-value product is delivered to the skilled craftsman.

A Classification Scheme

Over the years we’ve devised hundreds of use-cases and needed a way to understand and prioritize them. We started by categorizing them into a taxonomy that we think of as general, though we admit it might be specific to our business. We call these our seven use-case categories.

  • Inspection (quality assurance): An inspector determines how well a component or part conforms to defined requirements.
  • Work instruction: Guides a person or otherwise provides information useful for task execution.
  • Training: AR as a new medium for training skilled craftspeople, especially on complex and/or expensive systems.
  • Workflow management: Helps a supervisor plan and execute workflows for a team.
  • Operational: Use-cases for visualizing data about ongoing operations or system states (energy in a circuit breaker, flow rate in a pipe, etc.).
  • Safety: Enhance situational awareness for craftspeople.
  • Logistics: Helps a craftsman or supervisor understand where people and things are in space.

These seven categories are then applied across three additional axes. Together, these variables create a volume of exploration, or “trade space,” for each use-case (a brief illustrative sketch of this structure follows the list below). The three application axes are as follows.

  • Product line: Ship types such as aircraft carriers, submarines, etc., are differentiated and determine the content available for a use-case; for example, what type of 3D CAD models, if any, are available. Products without 3D CAD can still benefit from AR, but require laser scanning, data collation, and other methods to create effective AR uses. Also, industrial processes for one product may be different from the process for another, and these differences may make AR valuable on one product and unnecessary on another.
  • Product life cycle: Represents phases of a ship’s life cycle, such as new construction, operation, overhaul and inactivation. Understanding the life cycle provides purpose and scope for the content, and also defines the type of AR consumer – shipbuilder, sailor, engineering maintainer, etc.
  • Trade skill: Workshop roles such as welders, pipefitters, electricians, etc., which determine AR needs, personal protective equipment, user factors, and in many cases, content and tolerance requirements.
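
To make the classification concrete, here is a minimal sketch of the trade space as a data structure: the categories echo the list above, and each use-case is one point defined by a category plus values along the three application axes. This is an illustration only, not Newport News Shipbuilding’s actual tooling.

```python
# Illustrative sketch of the use-case "trade space": seven categories crossed
# with product line, product life cycle and trade skill.
from dataclasses import dataclass
from enum import Enum


class Category(Enum):
    INSPECTION = "Inspection (quality assurance)"
    WORK_INSTRUCTION = "Work instruction"
    TRAINING = "Training"
    WORKFLOW_MANAGEMENT = "Workflow management"
    OPERATIONAL = "Operational"
    SAFETY = "Safety"
    LOGISTICS = "Logistics"


@dataclass
class UseCase:
    """One point in the trade space: a category applied along the three axes."""
    category: Category
    product_line: str   # e.g., "aircraft carrier", "submarine"
    life_cycle: str     # e.g., "new construction", "operation", "overhaul", "inactivation"
    trade_skill: str    # e.g., "welder", "pipefitter", "electrician"


example = UseCase(
    category=Category.WORK_INSTRUCTION,
    product_line="aircraft carrier",
    life_cycle="new construction",
    trade_skill="pipefitter",
)
print(example)
```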

Return on Investment

When investing in new technology, it’s important to find those areas offering the highest return on investment (ROI) for every dollar spent. At the same time, there are potentially high value use-cases that are simply not conducive to an AR solution today. As a professional AR team, we pride ourselves on understanding when we can have an impact, when we can have a really big impact, and when AR technology simply isn’t yet up to the challenge. We primarily focus on advancing the seven use-case categories, and use the three variable axes to ensure we are maximizing customer value and ROI. As our expertise has grown, and as the technology matures, we have steadily increased value and readiness of AR throughout the entire trade space.

Today, we assess the highest potential ROI and use that as a metric for scaling priority. Our model shows the greatest ROI in use-cases for inspection, work instruction, and training. Our focus there is now on scalability. We also know that ROI is tied directly to the technology readiness level (TRL) of AR for each use-case. While we are certain there will be benefit, maybe even higher ROI, in workflow management, operations, safety, and logistics, the readiness levels of AR for those use-cases within our trade space simply aren’t as high (today) as for the first three mentioned. You can’t scale what doesn’t yet work. For the latter four uses, therefore, the investment isn’t in scalability, but rather in improving the TRL.
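
The paragraph above amounts to a simple decision rule: scale what is both high-ROI and technically ready, and invest in maturing the TRL of what is promising but not yet ready. A minimal sketch of that logic, with entirely hypothetical scores and thresholds, might look like this.

```python
# Illustrative decision rule only; the scores and thresholds are hypothetical.
def investment_focus(roi_estimate: float, trl: int) -> str:
    """roi_estimate is a relative score in [0, 1]; TRL uses the usual 1-9 scale."""
    if roi_estimate >= 0.7 and trl >= 7:
        return "invest in scaling the solution"
    if roi_estimate >= 0.7:
        return "invest in improving the TRL"
    return "monitor and revisit as the technology matures"


for name, roi, trl in [
    ("work instruction", 0.9, 8),
    ("safety", 0.8, 5),
    ("logistics", 0.6, 4),
]:
    print(f"{name}: {investment_focus(roi, trl)}")
```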

As Augmented Reality technology becomes more capable and less expensive to implement, enterprises will find ever-increasing uses. We’d like to learn how others in different industries have been developing theirs. Please share your comments and experiences with us.




Augmented Reality Puts a New User Interface on Smart, Connected Products

Data is the glue that connects customers, products and departments—the living tissues—of an enterprise. Without data and new methods of producing, collecting, storing and using life-giving data, companies and markets shut down. And, for the past decade we’ve been hearing how some companies transform themselves and their industries with more and better data, and how systems that leverage enormous amounts of data—Big Data—continue to receive huge investment.

Some of those who were successful in introducing Big Data are now surrounding themselves, and building new businesses (or new opportunities for old businesses) with “smart, connected products.” You might’ve read about early versions of such products. These are physical objects built with connectivity and embedded sensors that pump out and ingest real time data for a specific purpose. Bruce Sterling coined the term “Spimes” to capture how these physically real objects also have a strong sense of their place and time. They are fundamentally important to a generation of 21st century businesses that see a future based on Big Data, but they are hard to make and use.

A Framework for Answering Big Questions

Few question the need for smart, connected products. The big questions for which many managers would like simple and clear answers are how to design the best smart, connected products and how to develop new businesses (or better, more efficient business processes) around these.

In my opinion, few business leaders have been able to better communicate the necessary ingredients and steps for designing, building and using smart, connected products for business transformation than Michael E. Porter, faculty member at Harvard Business School, and James E. Heppelmann, president and CEO of PTC.

Originally published in Harvard Business Review in November 2014, their first article on smart, connected products defines the domain and the new technology stack upon which it is based. The new technology stack Porter and Heppelmann define is composed of:

  • New product hardware
  • Embedded software
  • Connectivity
  • A product cloud consisting of software running on remote servers
  • A suite of security tools
  • A gateway for external information sources
  • Integration with enterprise business systems

The authors then explain how these smart, connected products are exerting pressure on businesses by changing the competitive landscape for those companies who adopt and deploy them, and those who don’t. Essentially, the focus of the article is on how to use smart, connected products to manage or change the competitive landscape.

Building a Bridge between Smart, Connected Products and People

In the October 2015 issue of Harvard Business Review, another article by the same authors provides insights into the internal use of smart, connected products. It focuses on their use in transforming businesses and their value chains.  For those of us involved in the introduction and deployment of Augmented Reality in enterprise, this is a highly useful guide and conceptual resource to study and have handy.

“How Smart, Connected Products Are Transforming Companies” adds concepts that utilize and build upon the previously defined technology stack and the original framework, while also examining the human side of smart product introduction.

Porter and Heppelmann explain that smart, connected products require a new design discipline. The use of Augmented Reality is one of the ways that changes in design are transforming the value of limited resources, primarily by reducing human task execution times through the display of specialized knowledge in the worker’s field of view.

Augmented Reality is also making other processes more efficient. From configuration of unfamiliar instruments to after-sales services, the authors repeatedly illustrate how having data accessible and visible in context is reducing the time and the errors that can increase the cost of complex processes.

Many industries will be transformed by smart, connected products using new, contextually sensitive user interfaces. Although they never use the buzzword “Industry 4.0,” the authors give a wide range of examples and conclude that manufacturing companies (or manufacturing departments within other industries) are the first to tap this potential in a meaningful way.

People Remain a Limited and Valuable Resource

The adoption of Augmented Reality in enterprise fundamentally builds upon the successful introduction of smart, connected products. It will not be able to deliver on its potential without parallel investments in other enterprise IT systems, in particular those that produce, collect and store data in the new products.  

Investments in technology are required but not sufficient. One of the take-home messages of the second installment of Porter and Heppelmann’s smart, connected products series is that people trained in the new design disciplines, and in the development of experiences and systems built upon them, are rare.

New expertise needed for smart product design and use is in desperately short supply. Until those with the expertise are available in numbers sufficient to meet future demand, business leaders need to develop new cultures that place value on collaboration between people in different product and service life cycle phases. New incentive models will be developed to reward error-free productivity and higher compliance levels than have ever been possible.

In the end, or at least for the next transformation of business, people using data in better and faster ways remain more important than simply producing and storing more data.

Has your organization defined roles suitable for the next transformation of business?




Augmented Reality Can Increase Productivity

Technological and cultural shifts that result in enhancements in manufacturing tend to increase complexity in products and processes. In turn, this complexity increases requirements in manufacturing and puts added pressure on organizations to squeeze out inefficiencies and lower costs where and when feasible.

This trend is acute in aerospace, where complexity, quality and safety require a large portion of final assembly to be done by humans. Corporations like AREA member Boeing are finding ways to improve assembly workflows by making tasks easier and faster to perform with fewer errors.

At ARise ’15, Paul Davies of Boeing presented a wing assembly study conducted in collaboration with Iowa State University, showing dramatic differences in performance when complex tasks are performed following 2D work instructions versus Augmented Reality instructions.

A Study in Efficiency

In the study, three groups of participants were asked to assemble part of a wing, a task requiring over 50 steps and nearly 30 different parts. Each group performed the task using one of three modes of work instruction:

  • A desktop computer screen displaying a work instruction PDF file. The computer was immobile and sat in the corner of the room away from the assembly area.
  • A mobile tablet displaying a work instruction PDF file, which participants could carry with them.
  • A mobile tablet displaying Augmented Reality software showing the work instructions as guided steps with graphical overlays. A four-camera infrared tracking system provided high-precision motion tracking for accurate alignment of the AR models with the real world.

Subjects assembled the wing twice; during the first attempt, observers measured first time quality (see below) before disassembling the wing and having participants reassemble it to measure the effectiveness of instructions on the learning curve.

Participants’ movements and activities were recorded using four webcams positioned around the work cell. In addition, they wore a plastic helmet with reflective tracker balls that allowed optical tracking of head position and orientation in order for researchers to visualize data about how tasks were fulfilled. Tracker balls were also attached to the tablet (in both AR and non-AR modes).

First Time Quality

To evaluate the ability of a novice trainee with little or no experience to perform an operation the first time (“first time quality”), errors are counted and categorized. The study revealed that tablet mode yielded significantly fewer errors (on average) than desktop mode.

In the diagram above, the blue bar represents the first assembly attempt and the green bar is the second. The diagram also shows that subjects using Augmented Reality mode made zero errors on average per person, indicating the potential of AR to improve first time quality for assembly tasks.


Rapid Assembly


This diagram shows the time taken to complete tasks in each mode, for both the first and second attempts. AR-assisted participants completed tasks faster the first time than participants using the other modes.

Conclusions

Overall, the study found an almost 90% improvement in first time quality between desktop and Augmented Reality modes, with AR reducing the time to build the wing by around 30%. Researchers also found that when instructions are presented with Augmented Reality, people gain an understanding faster and need less convincing of the correctness of their work.

The bottom line is that this study shows and quantifies how complex tasks performed for the first time can benefit from Augmented Reality work instructions. If a task is done faster and with fewer errors, the impact on productivity is highly significant.

Where can Augmented Reality make an impact in your organization?




The Fourth Industrial Revolution

This article originally appeared in the AERTEC Solutions blog.

Contrary to what many people believe, the aeronautical industry is today heavily reliant on the human factor. Craftsmanship prevails in a process that produces large machines—namely aircraft—containing thousands of parts and involving disparate tasks that converge on the manufacturing of a few dozen units a month in the best of cases.


In reality, this figure is minuscule if we compare it to the automobile industry, where manufacturing plants churn out an average of 50 vehicles per hour. This production volume, and the larger number of parts and repetitive tasks it involves, allows for significant cost savings through automation.

The aeronautical industry is making great strides in adapting the best knowledge and experience gained in these manufacturing sectors for its own benefit, along with other, more innovative technologies, procedures and concepts.

This infographic shows some of these concepts, along with others that have already been in use for some time, illustrating what some call Industry 4.0 or the Factory of the Future. We also refer to this as the Augmented Factory due to upcoming human-machine interfaces that integrate human activities into the industrial internet of things.
