
Advancing Toward Open and Interoperable Augmented Reality

Enterprise Augmented Reality engineers and content managers who published experiences created with Metaio’s software tools have encountered, or will soon encounter, a situation they didn’t anticipate: the publishing and delivery environments are unsupported and are not evolving to take advantage of the latest enabling technologies.

Are you among this group? If so, you are not the only one to find yourself in this uncomfortable situation.

Customers of other AR software providers that stopped supporting or advancing their platforms faced the same roadblock when they needed to continue delivering the value of their AR experiences to end users. Prior to agreement on standards, they could not “port” their experiences to another AR platform. The only way forward was to evaluate and choose another proprietary AR technology platform, and then to invest in re-authoring, testing and re-deploying AR experiences based on the original designs.

Unfortunately, some of those reading this blog are in this awkward position today.

Successfully addressing the root causes of low AR experience “portability” and the inherent lack of integration or interoperability between AR authoring and publishing systems is an important, highly collaborative process. Different parts of the AR ecosystem must first agree that there are issues, and then agree on principles for collaboration. Then, based on shared conceptual frameworks, they must work together to implement those principles in their workflows and solutions.

Supporting that collaborative process is the reason I’ve been leading the grassroots community for open and interoperable Augmented Reality content and experiences since 2009.

Is There Really a Problem?

Interoperable Augmented Reality is not a high priority for most people. Only about a hundred people are consistently investing their time in advancing the principles of open and interoperable Augmented Reality. We know one another on a first name basis; many of us compare notes in person a few times per year. Another few hundred people know of such activities but don’t directly invest in meaningful ways.

For most companies, the investment in AR has not been great. A few tens of thousands of dollars to rebuild and deploy a half dozen carefully handcrafted AR experiences is minor by comparison to investments in other enterprise technologies. 

“There’s still too much innovation to begin working on standards,” is another commonly heard refrain. Those who repeat it clearly haven’t been reading the posts or listening to the presentations by AREA member IEEE Standards Association, or by leaders of other standards development groups. There are many examples of standards that, when designed collaboratively and targeted at strategic points of interoperability, accelerate innovation rather than stifle it.

There are other reasons why many turn a blind eye to the problems. Each is valid to a different degree for different people.

This is a Serious Problem

In my opinion, ignoring the lack of open and interoperable Augmented Reality solutions and services is doing everyone a disservice.

The fact that relatively little money has been invested to date is a poor justification for investing yet more time and money in building experiences with another proprietary platform, only to face the same scenario again in a matter of months or years.

In fact, innovation in Augmented Reality is not what it should be today because many of the best AR developers are building a better mousetrap: smart engineers are working to solve, in a different way, problems that have for the most part already been solved by others. Whether to avoid encroaching on a third party’s patents or for other reasons, this effort is invested in highly integrated proprietary silos, at the expense of solving other problems that remain unaddressed.

There are three more serious problems with having only proprietary technology silos and very low use of widely agreed standards for Augmented Reality experiences. The first is that enterprises with assets that could be leveraged for AR experiences are unable to integrate the production of AR experiences into their corporate workflows. This lack of integration between AR as a method of information delivery and other information delivery systems (e.g., web pages and mobile services without AR support) means we cannot seriously stand before a CIO and recommend supporting the development of AR content, because what we are recommending requires setting up an entirely separate content management system.

In the same vein, the second reason that enterprise CIOs and CFOs are justifiably reluctant to deepen their investment in AR projects is that they cannot deploy modular architectures in which multiple vendors can propose different components. In today’s landscape of offerings, it’s all or nothing. The customer can buy into provider A’s system or that offered by provider B. If provider C comes along with a better option, too bad.

The third reason the lack of standards is a serious problem worthy of your support is closely related to the other two. Deep collaboration between AR-enabling technology vendors and service providers is currently very difficult. They are not working to improve customer outcomes: they are working much more on competing with one another for attention and for the small investments that might be made.

Three serious enterprise AR obstacles that agreements about open and interoperable AR could reduce

  1. Little or no portability of content or experiences between proprietary technology silos

  2. Strong customer aversion to the risks of vendor lock-in

  3. Low cooperation between competitors or ecosystem members to partner for the best customer outcomes

This situation with lack of interoperability and fear of vendor lock-in would be addressed if the vendors took a more serious look at possible open interfaces and standards within a larger framework. Conversely, vendors might study new approaches and establish some level of interoperability if they believed that customers would respond by increasing their budgets for Augmented Reality.

This is all very serious.

Another recent development is not helping: it’s clear that some internet and IT giants are paying a lot of attention to AR. The lack of visibility into what highly competitive and successful companies like Microsoft, Google, Apple and PTC will do about AR interoperability and integration has cast a very cold spell over enterprise AR adoption.

Their lack of support for standards and their unwillingness (to date) to shed light publicly on how they will cooperate, or how their proposed (future) systems will interoperate, is causing a great deal of uncertainty. No CIO or CFO should seriously invest in enterprise Augmented Reality until these companies’ plans with respect to integration and interoperability are clearer.

Progress is Being Made

We should be open to the possibility that 2016 will be different.

Thanks to the dedication of members of the grassroots community, the situation is not as bleak as it could be. A few weeks ago a few dozen members met in Seoul, Korea, to compare notes on progress. SK Telecom, a strong supporter of open and interoperable Augmented Reality, hosted two days of sessions. We heard status updates from four standards organizations that have highly relevant activities ongoing (Khronos Group, Open Geospatial Consortium, IEEE and ISO/IEC). We also received reports from AR developers who are working to advance their solutions to support standards.

The fact that the ISO/IEC JTC1 Joint Adhoc Group for Mixed and Augmented Reality Reference Model is nearing completion of its work is a major development, and one I presented on in Seoul.

In the spirit of full disclosure: the community of people in support of open and interoperable AR was the environment in which this work began, and I have been a member of that ad hoc group since its formation. If you would like to obtain a draft of the Mixed and Augmented Reality Reference Model, please send me an email request.

We are also seeing increased interest from industry-centric groups. There is a German government supported project that may propose standards for use in automotive industry AR. The results of an EU-funded project for AR models in manufacturing became the basis for the establishment of the IEEE P1589 AR Learning Experience Model working group (which I co-chair). In a recent meeting of oil and gas industry technologists, formation of a new group to work on requirements for hands-free display hardware was proposed.

These are all encouraging signs that some are thinking about open and interoperable Augmented Reality. If you want to monitor the activities of the grassroots community focusing on this topic, and to receive announcements of upcoming meetings, visit this page and register yourself for one or more of the mailing lists.

Have you seen other signs that there is increasing awareness of the problems? Do you know about any new standards that should be monitored by and presented during a future meeting of the grassroots community?




Enterprises Want to Use Wearables

Many workplace scenarios require use of both hands to manipulate physical world objects. Having a display on the wrist or head (or both), with a variety of sensors and optional cloud services, offers an attractive alternative to tablets for supporting access to real-time or contextual information.

According to a Gartner Group report shared at the Enterprise Wearable Technology Summit (EWTS), sales of head-mounted displays will be greater in enterprise than in consumer markets until at least 2020.


Unfortunately, enterprise demand for wearable computing is not currently being addressed by consumer technology providers.

Connecting Those with Questions to Those with Experience

What are current enterprise customer requirements? What have enterprise wearable pioneers learned? What are enterprise customers’ best options today? These were among the questions that the EWTS organizer, BrainXchange, set out to answer.

BrainXchange chose Houston for its inaugural event on October 20-21, 2015. The city is a business center for the oil and gas industry, has an international airport, and is easily reached from both coasts of the US.

Over 150 delegates from at least six countries gathered to hear from 60 speakers, including many veterans of the Google Glass Explorer program and vendors looking for new customers. The format offered plenty of networking in a convivial and relaxed atmosphere. 

AREA Members at EWTS

  • XMReality – Sponsor
  • Augmate – Speaker
  • EPRI – Speaker
  • APX – Delegate in attendance
  • Perey – Delegate in attendance

Criteria for Enterprise Wearable Success

There is wide agreement with the simple guidance that Joe White, VP and GM of Enterprise Mobile Computing at Zebra Technologies, offered during his opening remarks. White recommends that enterprises focus on systems that are:

  • Technically sound
  • Socially acceptable
  • Focused on solving a real problem

These criteria sound simple, but adhering to them requires careful research and planning. Many delegates at the summit who are shopping for wearable technologies don’t feel that the current commercial technology options are sufficiently mature for most of their use cases. One person confided that everything his team has evaluated to date “feels like a science project.”

Weight, balance and resolution remain significant technical obstacles, but short battery life, a result of high power consumption, continues to be high on the list of technology barriers.

One test of wearable display technology reliability is how well it performs in a live demo on stage. There were more videos than live demos, but Rafael Grossman, a prominent surgeon in the Google Glass Explorer program, successfully demonstrated Atheer Labs’ AiR platform for the audience.

Another criterion, added to White’s list over the course of the first day, was cost. If devices are expensive to purchase and to operate or maintain, adoption and use will remain limited.

Regardless of the criteria and how firmly an organization wants to adhere to them, customers remain divided about what’s truly going to solve their problems. Some feel that their use cases require true Augmented Reality in the enterprise. Others are, at least for the present, finding the “simple” delivery of live information or images to a wearable display (as currently done by Google Glass or Vuzix M-100) sufficient. In the opinion of those who use such information “snacking” devices, real-time registration and tracking of data in the real world are still expensive and technically difficult.

Connecting Remote Experts with those in the Field

Real-time consultation between a remote expert and a person wearing a camera and display while performing difficult tasks is a highly compelling use case for most of the EWTS speakers. While only a few speakers mentioned experience with AR-assisted remote assistance, the majority shared the numerous and immediate benefits of having another “set of eyes” focused on a particular procedure.


For example, emergency medical technicians working on MedEx ambulances as part of the Google Glass Explorer program can transmit more information about injuries or patient conditions to emergency room staff ahead of their arrival at the hospital.

In another case study, a tradesperson working on a Rogers-O’Brien Construction job site can see and transmit the details of the job site and get guidance or feedback from an architect or supervisor in real time.

Some Industries Are Further Along

While the medical and construction industries were highly represented among the Enterprise Wearable Technology Summit speakers in Houston, some case studies and presentations highlighted the promise of wearable technology in the logistics industry. DHL and Ubimax described how they are working together to put their warehouse picking solution into production and conducting research on their next generation systems for pallet packing. 

Energy production and distribution were also frequently mentioned. John Simmins of the Electric Power Research Institute (EPRI), an AREA member, spoke of projects underway in some power generating facilities. Speakers from CenterPoint Energy and Sullivan Solar Power also attested that they are actively exploring the use of wearables in their businesses.

Many Challenges Remain

An entire event could focus exclusively on expected and promised technology improvements. For example, uneven network coverage and issues preventing secure access to off-device content came up frequently. But, EWTS did not limit its scope to technology barriers.

Getting wearables into production requires companies in highly regulated industries such as healthcare and construction to educate decision makers and executives and to negotiate creation of many new policies. Those are both very lengthy and costly processes.

Compliance

Complex regulatory environments are but one item in the list of business challenges.

Lack of trust is another significant obstacle to adoption. Large enterprises are looking for vendors that are on the one hand nimble and responsive to special requirements while on the other endowed with the financial resources to quickly ramp up production for large orders.

Despite these and other challenges, wearables continue to hold enormous promise and will increasingly demand the attention of enterprise technology buyers and users. We can expect these to be on the agenda at future BrainXchange summits. The company announced that it will produce its next event in June 2016 on the East Coast, although details were not provided.

Are there events you plan to attend to learn about enterprise wearable technologies?




AREA Members Accelerating Success with Augmented Reality

Augmented Reality offers tremendous opportunity for organizations to improve workforce productivity and reduce human error through increased contextual awareness and guidance. Whether implemented on a head-mounted display, on a tablet or through a stationary system, AR can deliver and collect information for a myriad of applications including training, manufacturing, field service and warehouse logistics.

It is an exciting time to join and participate in the AR ecosystem. Many companies are jumping in. Some are making tremendous advancements in wearable technology through miniaturization. Innovation at the silicon level is lowering power consumption and increasing processing power. Others are focusing on improvements in computer vision. Mobile systems including phones, tablets, watches and glasses are becoming more interconnected and integrated, and smart fabrics present the potential for a fully integrated mobile augmented human.

Truths are Difficult to Accept

Progress is being made but significant challenges to the effective development and deployment of AR within the enterprise environment remain. And, unfortunately, the hype around AR and the initial example demonstrations (and concept videos) have created the perception that AR is ready to go and can be easily implemented and deployed.

In truth, many technical issues still need to be solved to enable successful implementation and widespread use of AR for extended periods of time. Organizational issues including culture, security and safety are other significant barriers that must be addressed. Most current AR examples are custom developed for specific, focused applications with highly controlled conditions. And, the AR tools and technology provider and developer ecosystems are still immature. The path to AR success is not obvious.

We Are Working Together

The AREA is here to address these issues among others, and to create an environment for organizations—large and small—to learn, share and accelerate the adoption of AR in the enterprise.

Within the AREA, member organizations from around the world have committed to sharing their experiences and challenges in a collegial atmosphere to solve complex technical and implementation problems. AREA members represent a unique blend of AR end users, systems integrators, content developers, and technology providers as well as not-for-profit research centers and academic organizations from multiple industries. Through a combined program of thought leadership, education and outreach, best practices development and communication, and technology and implementation research, AREA members are actively building the community and knowledge base that will ensure successful implementation of AR-enabled information technology environments across the enterprise.

Meetings Make Member Collaboration Tangible

By joining the AREA you will become part of a global AR ecosystem. Our shared vision for the potential of enterprise AR infuses our member meetings, like the one in Houston on October 22. We are learning and sharing best practices. We collaborate to define the best problem-solving research, and to support workforce development.

As President of the AREA and as a Sponsor Member, I am witnessing, firsthand, the level of knowledge sharing and exchange across member organizations. It is clear to me that the AREA is the only organization that provides this opportunity for AR technology providers, developers and customers.

If you didn’t get to our recent member meeting, then this website is the best place to learn more about enterprise Augmented Reality and the benefits of joining the AREA. I invite you to take the next step by contacting me or Christine Perey, AREA’s executive director, to discuss how you can contribute and participate.

We look forward to welcoming you and collaborating with you at a future meeting!

Carl Byers
AREA President
Chief Strategy Officer at Contextere




Augmented Reality Developer Options after Metaio

This post originally was published in French on augmented-reality.fr.

Just before summer, we launched a survey to better understand the strategies of Augmented Reality developers following Metaio’s sudden change in circumstances. This blog post presents the results of our survey and our interpretations.

 

AR Dev Options After Metaio 1

We launched the survey in mid-June and left it open over the summer of 2015. There was no specific respondent selection, so we cannot claim a statistically representative sample. However, with 63 responses, approximately 30 to 50% of them from English speakers, we decided the dataset was sufficient to draw indicative conclusions.
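To give a rough sense of what 63 responses can and cannot tell us, the classical margin of error for a survey proportion can be sketched as follows (assuming, purely for illustration, a simple random sample at 95% confidence, which this open online survey was not):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error for a sample proportion.

    n: sample size; p: observed proportion (0.5 is the worst case);
    z: z-score for the confidence level (1.96 for 95%).
    """
    return z * math.sqrt(p * (1 - p) / n)

# With 63 respondents, any reported percentage carries roughly
# a +/-12 point margin, even under ideal sampling assumptions.
print(f"±{margin_of_error(63):.1%}")
```

Even under these idealized assumptions, every percentage that follows carries a margin of roughly twelve points, which is one more reason to read the results as indicative rather than precise.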

First, we present the results of the survey. We then offer our interpretations.

Metaio Product Distribution

Options 8

 

Respondents were mainly users of Metaio’s SDK, and slightly more than half were users of Metaio Creator. The Continuous Visual Search (CVS) tool is used relatively little by our sample. Although it is not easy to fully know respondents’ use of Metaio tools, we can assume that the majority of respondents work in or near development because only 2 of the 63 respondents exclusively use Metaio Creator.

The Impact on Business

 

AR Dev Options After Metaio 2

 

 

AR Dev Options After Metaio 3

The impact of Metaio’s discontinuation of its products on developers’ businesses is significant, even though 16% of respondents report seeing no effects. While 40% of respondents said they have alternatives to Metaio products, 34% said they do not.

Open Source Solutions

 

AR Dev Options After Metaio 4

Sentiment toward using an open source alternative to avoid a repeat of the current situation is mixed. Although the survey was not specific about the capabilities of such an offering, sixty percent of respondents thought they would consider an open source option, while a quarter remained uncertain.

Software Development Kits

 

AR Dev Options After Metaio 5

Not surprisingly, developers responded that, alone or in combination, Vuforia and Wikitude were the best alternatives to the Metaio SDK. Other proposed alternatives included ARToolkit, Catchoom and ARmedia. However, it is important to note that the third most common answer among respondents was “I don’t know.”

Metaio Creator

 

AR Dev Options After Metaio 6

Presently it seems that the vast majority of users have not found an alternative to Metaio Creator. Wikitude Studio is popular, but Layar Creator, though popular one or two years ago, no longer seems a viable alternative. It is surprising not to find Aurasma among the options considered by survey respondents.

Metaio Continuous Visual Search

 

AR Dev Options After Metaio 7

The results concerning Metaio CVS proved difficult to interpret as few people use it. Although Vuforia Cloud Recognition gained slightly more traction than other proposed alternatives, CVS users are much more divided on alternatives overall.

Open Comments from the Survey

Comments we received from respondents raise a few salient points. In particular, Metaio’s technical expertise and advanced solutions were noted. Even though Wikitude and Vuforia offer comparable capabilities, respondents feel there is currently no product in Metaio’s class.

We also see bitterness against Apple as well as an awareness of the potential fragility of other alternatives.

General Remarks

Today there is no obvious miracle solution to take Metaio’s place. The impact of the company’s change in circumstances on developers clearly demonstrates the overall fragility of the global Augmented Reality ecosystem. It is rather surprising to me that a third of respondents have no viable alternatives to Metaio technology. Rumors of Vuforia’s sale by Qualcomm may make the situation even more complicated in the coming months.

Paradoxically, these uncertainties do not help in the establishment of an Open Source solution. Although half of respondents believe this would be a good thing, a quarter remain uncertain. After discussions with several companies specializing in Augmented Reality, I sensed a certain reluctance to support an open source system, primarily due to fear of losing an advantage in terms of technical prowess. There is much to say about this and I plan to prepare a more complete article in the coming weeks. In fact, RA’pro will soon issue an invitation to a web-conference debate on this topic.

Returning to alternative tools, there is not a lot of surprise in seeing mention of the major market players: Vuforia, Wikitude, ARToolkit, ARmedia, Catchoom, etc. I am personally amazed at how few mentions Layar received, as it seems to be a relatively major player in the AR print arena. However, it is true that the absence of a freemium model does not facilitate adoption by small businesses. The total absence of Aurasma and Total Immersion in the responses was also surprising.

As a final note, no one really knows if Metaio’s place can be taken since Apple has made no statement on the future of the product. We can however, presume that Metaio technology will be integrated in future products and will, therefore, lose the cross-platform nature that made Metaio products successful.

What do you think? Please leave your comments below.




Augmented Reality’s Expanding Role in the Automotive Value Chain

 

Use Cases for the Factory Floor

With pilots and trials concluding successfully, Augmented Reality continues to move into areas where the overlay of virtual information promotes vehicle quality and helps employees work faster and better, but also where more experience with the technology is a prerequisite. In addition, higher numbers of AR implementations put greater technical and organizational demands on projects.

One key trend is the growing number of use cases for Augmented Reality in pre- and post-production processes in the automotive industry. Vehicle design and development, and then final verification after assembly are the most popular use cases.

Lina Longhitano of Mercedes-Benz Vans leads the transformation of advanced manufacturing facilities through the Van Technology Center and has a wealth of experience with digital transformation in manufacturing and the use of Mixed and Augmented Reality in vehicle development. The center provides high-end visualization and analysis for ergonomics and buildability of vehicles.

In particular, she mentioned three Mixed Reality use cases for engineering:

  • Visualization of out-of-position conditions and validation of flexible parts.
  • Overlay of digital crash simulation data on physical crash vehicles.
  • Digital assembly and disassembly simulations with collision testing.

Mercedes-Benz Vans uses Augmented Reality for factory floor layout and design, as well as for visually inspecting components to assess differences between virtual and physical objects.

In a similar vein, Hermann Gross of Opel is putting AR to use in pre-production processes, especially in vehicle development and component integration. Opel’s Augmented Reality-assisted systems also verify the quality of physical vehicle mockups. Gross provided a number of examples, such as verifying the final position of parts and optimizing cable positioning, and cited a number of benefits of AR, including:

  • Shortening the duration of mockup builds and increasing their quality
  • Speeding up problem solving
  • Positively influencing data quality

On the other end of the production spectrum, Sebastian Rauh has in-depth knowledge of how Audi is using Augmented Reality for final assembly inspection. These inspections range from vehicle start-up to engine parameter optimization and calibration of control units and sensor parameters. On behalf of Hochschule Heilbronn, Mr. Rauh is also working with Audi to design post-production verification workflows and to equip personnel with Google Glass and the Epson Moverio BT-200 to execute tasks.

The Industrialization of Augmented Reality

Juergen Lumera of Bosch, an AREA sponsor member, is one of the first in automotive who is moving beyond simple AR prototypes and into larger deployments involving greater numbers of users, departments, processes and tools. Taking a holistic approach to the human, technological, financial and organizational aspects of incorporating AR technology across an enterprise, he outlined ways to expand projects beyond pilots. Mr. Lumera emphasized that AR adoption is a journey whose destination, as well as roadmap, has to be carefully planned in order to reduce risk and promote success.

Bosch’s Common Augmented Reality Platform (CAP) is an example of a system that integrates authoring and publishing of AR content across business units and technology silos, and can become part of a wider move towards the digital factory.

Matthias Ziegler of Accenture presented a framework for enterprise Augmented Reality adoption by Accenture’s clients and confirms the expanding interest in use of wearables that support AR for hands-free workplace performance. Accenture is expecting 212 billion devices and autonomously driven cars by 2020, with a doubling of IP traffic between 2013 and 2016. Bulky form factors will delay adoption by consumers, but Accenture sees enormous opportunity for hands-free AR-enabled displays in the enterprise space.

Their template, based on a number of pilot projects, compiles statistics and experiences and defines business value drivers and use cases, guiding investment in potential areas where AR can increase ROI. For example, if a company can quantify the length of time spent researching work instructions in paper documentation, and attribute a given number of errors to misinterpretations of drawings or procedures, then AR might promise higher returns.
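A back-of-the-envelope version of the calculation this framework encourages might look like the following sketch; every figure is a hypothetical placeholder, not data from Accenture’s presentation:

```python
# Hypothetical ROI sketch for an AR work-instruction deployment.
# Every number below is an illustrative assumption, not reported data.
minutes_saved_per_task = 5       # time no longer spent searching paper docs
tasks_per_day = 20
workers = 50
working_days_per_year = 220
hourly_rate = 40.0               # fully loaded labor cost, USD

errors_avoided_per_year = 120    # misread drawings/procedures prevented
cost_per_error = 500.0           # rework and scrap per error, USD

time_savings = (minutes_saved_per_task / 60) * tasks_per_day \
    * workers * working_days_per_year * hourly_rate
error_savings = errors_avoided_per_year * cost_per_error
annual_benefit = time_savings + error_savings

deployment_cost = 400_000.0      # hardware, software, integration
payback_months = deployment_cost / annual_benefit * 12

print(f"Annual benefit: ${annual_benefit:,.0f}")
print(f"Payback: {payback_months:.1f} months")
```

Even with placeholder numbers, the structure shows the point: once the time lost to paper documentation and the cost of misinterpretation errors are quantified, the payback period of an AR deployment can be estimated and compared against other investments.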

Augmented Reality and Customer Experiences

Ashutosh Tomar of Jaguar Land Rover says the company’s vision is to use AR for enhancing the driver experience in their vehicles. Today’s typical car is packed with sensors and features—one type of vehicle having over 70 onboard computers and 170 “smart features.”

Customers are no longer judging automobile features as a selling point alone, but also expect a better customer experience. How can cars automatically change settings (e.g., music station, seat and mirror adjustments, etc.) based on who’s driving? How can cars communicate with drivers via other sensory inputs such as haptics? JLR is making large investments in human factors research and in ways to increase driver safety via Augmented Reality, for example:

  • Visualization in the windshield of “ghost cars” driving ahead, to demonstrate the safest way to make turns in a city.
  • The projection of cones in windshields for training purposes.
  • B pillars that turn “transparent” in certain situations, such as narrow turns in cities, enhancing the driver’s line of sight and situational awareness.
  • Haptic feedback in the seat behind a driver’s shoulder to alert them to another vehicle passing in their blind spot.

Legal Implications

New features such as the projection of information and images in the driver’s windshield will require new regulatory regimes. Brian Wassom, intellectual property attorney at Honigman Miller Schwartz and Cohn LLP, described the current regulatory environment and spoke about the principles of the National Highway Traffic Safety Administration’s “Visual-Manual NHTSA Driver Distraction Guidelines for In-Vehicle Electronic Devices.”

  • Distractions in all forms, including cognitive and visual, should be recognized by designers and regulators.
  • Displays should be as near the driver’s forward line of sight as possible.
  • A number of distracting features should be avoided entirely: glare, social media interactions and text that scrolls or contains more than 30 characters.
  • Glances away from the road should last no more than 1.5 to 2 seconds.

The above principles apply to current systems (dashboard layouts with navigation and phone information), but might also be the basis of conversations about Augmented Reality safety and liability.

In his presentation, Ashutosh Tomar had also emphasized the need to minimize the amount of information displayed to drivers to reduce distraction, as a basic tenet of safety.

Conclusions

In addition to those already mentioned, there were interesting presentations by Volkswagen, Ubimax, the German Research Center for Artificial Intelligence (DFKI), Feynsinn, Fraunhofer Institute and others on topics ranging from showroom use cases to the latest research on AR user experiences.

Overall it was encouraging to witness the depth of questions about Augmented Reality being asked by companies in automotive manufacturing, research, design and others, and to get the sense of its evolving acceptance in enterprise, complete with growing pains and successes.




Google Glass 2.0—Primed for the Enterprise: Foldable, Rugged and Waterproof

When it was introduced in February 2013, Google Glass 1.0 was far ahead of its time. Consumers and developers identified many issues that needed to be addressed and, although some adopted the hardware, it was deemed unsuitable for widespread use by consumers or enterprise customers.

Over two years later, in early summer 2015, Google began showing key developers the next generation of Glass, code-named “Project Aura” and powered by Intel, and confirming to the media that it was in development.

Google Glass Ready

The new device is geared for professional users. Employees using the information provided via the wearable display will be able to perform tasks with fewer human errors while enhancing productivity and operational efficiency.

The new “ruggedized” Google Glass hardware design is said to fold easily and to be more durable in work environments. One option is the ability to clip the tiny display unit onto existing eyewear.

Perhaps Google Glass 2.0 is primed to grow in many industries such as oil and gas, warehousing, manufacturing, agriculture and mining.  The likely impacts depend on the use cases and company readiness for change.

The Benefits of Hands-Free Displays in Warehousing Operations

In April 2014, DHL published a report describing how logistics operations can be improved with the assistance of hands-free wearable devices. The use cases fell into four categories:

  • Warehouse operations
  • Transportation optimization
  • Last mile delivery
  • Service and repair and other enhanced services

The evidence to support the assertion that warehouse picking can be improved, the first use case identified in the DHL study, is mounting.

Google Glass can also be used to reduce the cost of warehouse redesign as well as factory planning, but studies quantifying these use cases are not yet available.

The Future of Google Glass

Will Google Glass 2.0 address the issues seen in the first prototype?  This remains to be seen, but with several confirmed reports on the changes and improvements Google is making with Glass 2.0, it is evident that Google is all-in on changing the future of computing through wearables and, ultimately, with Augmented Reality.

Have you tested Google Glass 2.0? Share your thoughts and feedback below.




Technical Communicators Must Evolve to Support Augmented Reality

As other AREA blog posts and pages on this website attest, Augmented Reality can be very beneficial but it doesn’t happen by itself. The preparation and delivery of AR experiences in professional settings involves the cooperation of many groups and investments from diverse points in a larger corporate information value chain. One of those groups is responsible for technical documentation.

As a professional technical communicator, I believe that introducing AR will also be rewarding to those people and organizations delivering their content in new, contextually driven systems. However, the development and delivery of AR-enriched content also comes with a new set of challenges.

From Topic-Based Content to Experiences

Changes in technologies, skills, priorities and procedures will be necessary. Accepting responsibility for and producing AR-enriched content will involve a shift in the mindset of technical communicators who, like most of their customers, are accustomed to developing traditional, topic-based or video content. In other words, technical communicators will have to embrace a more holistic view of content: experiences.

This means that, in addition to performing their traditional information development tasks, technical communicators will need to begin designing and supporting the delivery of content that changes in real time, based on the user’s context.


We Need New Approaches

When content is destined for use on AR-enabled systems, our technologies will need to change. We’ll also need to adopt new approaches designed to:

  • Position and format the experience content so that it doesn’t obstruct the viewer’s line of sight to the real-world target or to other objects whose obstruction could introduce risk or errors.
  • Anticipate and correct error conditions in real time, under constantly changing light and environmental conditions.
  • Design overlay information so that it doesn’t overload the user’s ability to process and use the information effectively.
  • Leverage sophisticated software that produces and manages 3D models, and reduce current reliance on traditional 2D graphics and illustrations.
  • Take into consideration the higher processing power required to render digital models, graphics or other supplementary data over the real world in real time, and its impact on battery life.
  • Plan both for the user’s device to access high-performance networks (especially when the content is in 3D format and stored on corporate servers) and for when those connections have high latency or are interrupted.
  • Work with the strengths and limitations of new end user hardware such as smart glasses or helmets, watches and other wearable sensors, and design for the new, rapidly evolving software tools unique to these devices.
  • Adopt still other types of new hardware and software to capture target objects and to develop, view and test experiences while they are under development.
  • Design to comply with new, yet-to-be-defined policies and tools for certification, data security and encryption.
  • Notify users when their actions are being captured and recorded, control this capture, and manage the changing acceptance of (or resistance to) these technologies.
  • Manage the use of cameras in restricted environments in order to reduce risk of confidential information being exposed and pirated.
  • Measure benefits gained from, and additional costs and complexity associated with the delivery of AR experiences.
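One of the items above, planning for interrupted connections to corporate servers, lends itself to a simple prototype. The following is a minimal sketch, not taken from any real AR toolkit; the class, file layout and asset names are all illustrative. It shows an offline-first cache that lets previously fetched content keep working when the network drops:

```python
import os
import tempfile

class ARAssetCache:
    """Hypothetical offline-first cache for AR content (illustrative only)."""

    def __init__(self, cache_dir):
        self.cache_dir = cache_dir
        os.makedirs(cache_dir, exist_ok=True)

    def fetch(self, asset_id, download):
        """Return cached bytes if present; otherwise download and cache them."""
        path = os.path.join(self.cache_dir, asset_id)
        if os.path.exists(path):
            with open(path, "rb") as f:
                return f.read()
        data = download(asset_id)  # may raise when the network is down
        with open(path, "wb") as f:
            f.write(data)
        return data

def network_down(_asset_id):
    raise ConnectionError("network interrupted")

# The first call downloads and caches; the second survives a simulated outage.
cache = ARAssetCache(os.path.join(tempfile.mkdtemp(), "ar_assets"))
online = cache.fetch("turbine.glb", lambda _id: b"<3d model bytes>")
offline = cache.fetch("turbine.glb", network_down)
assert online == offline
```

A production system would add cache invalidation and versioning, but even this much illustrates the design question technical communicators face: which content must remain available when connectivity fails.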

All of these changes and new skills associated with AR-enriched content development will require many years of testing, some of it by trial and error. Eventually refinement will lead to mature and widely accepted best practices.

New Standards in Augmented Reality

I believe that these skills and best practices must also be accompanied by the development of formal standards for technical communicators to follow in AR design and development. I’m co-chairing the OASIS AR Information Products Technical Committee in order to study what’s needed for the wider adoption of AR technology and associated experience development methods by technical communicators. Over time the committee members will also work together to develop standards that will guide technical communicators and improve their ability to deliver content in AR experiences. Then, the suggested benefits of using AR-assisted systems will be achievable across a great many industries.




The Augmented Reality Provider Landscape Shifts, Again

Developers of Augmented Reality experiences select tools and technology for a project to match use case requirements. If the use case involves a page in a book or the side of a package, then 3D tracking is overkill. If the project accesses records in a company’s ERP system, there must be plug-ins or a customization. If the customer needs reports (e.g., number of objects recognized, user interactions, etc.), then the platform needs to support their production. If the target is a movie poster, the security considerations are entirely different than if the target involves a proprietary industrial process.
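The matching process described above can be thought of as a coverage check of project requirements against platform capabilities. The sketch below is purely illustrative; the vendor names and capability flags are invented, not descriptions of real products:

```python
# Requirements for a hypothetical project: 2D image tracking, an ERP
# plug-in and usage reports (all flags are invented for illustration).
REQUIREMENTS = {"2d_image_tracking", "erp_plugin", "usage_reports"}

# Hypothetical candidate platforms and the capabilities each advertises.
PLATFORMS = {
    "VendorA": {"2d_image_tracking", "3d_tracking", "usage_reports"},
    "VendorB": {"2d_image_tracking", "erp_plugin", "usage_reports"},
}

def best_match(requirements, platforms):
    """Pick the platform that covers the most project requirements."""
    return max(platforms, key=lambda name: len(requirements & platforms[name]))

assert best_match(REQUIREMENTS, PLATFORMS) == "VendorB"
```

In practice requirements carry weights (a missing security feature may disqualify a platform outright), but a simple checklist like this keeps the evaluation explicit rather than anecdotal.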

After five years of Metaio’s dominance of the AR software provider landscape, developers’ options are changing dramatically. This post reviews recent changes in the provider landscape, examines how they impact developers, and suggests that those who license and purchase development tools use this period of research and evaluation as an opportunity to communicate their project requirements more clearly to all tool and technology vendors.

A Rapidly Changing Provider Landscape

In early 2015, Metaio’s ecosystem ranged from dedicated individuals producing one or two experiences, to Fortune 100 companies. Some were researchers designing prototypes; others were automotive industry giants like BMW and Audi who used Metaio’s robust tracking algorithms for precision engineering and design. Then, in mid-May 2015, a message appeared on Metaio’s website saying that it would stop selling licenses immediately, and that support for its Augmented Reality services and software technologies would end on December 15 of the same year. The mysterious announcement took the company’s global developer ecosystem by surprise.

Many, if not most, of the developers authoring experiences for enterprise and industrial projects were using Metaio’s software tools. Metaio’s change in direction put developers in an uncomfortable position. Many were furious. Others expressed frustration. To this day there remain many questions about the circumstances that led to the announcement. Regardless of the changes to a company that the developer ecosystem had grown to trust, serious business issues remain:

  • What will happen to the channels published in a platform operated by Metaio?
  • What will developers use in the place of Metaio’s tools?

Many developers are now doing what more of them could have done consistently over the previous years: investing resources to evaluate other potential tools and technologies. The best developers will resume proposing projects to their customers once they have thoroughly tested the alternatives.

Gaps for Enterprise Augmented Reality

While there are alternative enterprise Augmented Reality technology providers with solutions and services worthy of evaluation (see table below), none offers the breadth, maturity, professional documentation and support that Metaio provided for its SDK, Creator, Suite, Cloud and Continuous Visual Search matching system.

Table 1. Enterprise AR authoring providers and their products

Company                  Platform
DAQRI                    4D Studio and AR Toolkit
Wikitude                 Wikitude SDK
Inglobe Technologies     AR Media (and other)
BuildAR                  BuildAR
Catchoom                 CraftAR (and other)
NGRAIN                   Vergence (and other)
Diota                    DiotaPlayer, DiotaConnect
EON Reality              EON Studio (and other)
Bitstars                 Holobuilder
Fraunhofer IGD           Instant Reality
Kudan                    Kudan SDK

Source: © 2014 – 2015

Metaio’s dominance wasn’t limited to breadth of offering and AR developer mind share. Among its peers, it probably also generated the greatest revenue from licensing its software tools and providing services. To deliver value to customers and drive development of its technology suite, Metaio employed over 75 of the world’s most qualified and experienced enterprise AR engineers.

Those that can are furiously hiring engineers to write code and build out their teams and offerings, but breadth and depth like what Metaio offered don’t materialize in a matter of months.

Vuforia’s Focus on Consumer Use Cases

No one knows precisely how much of the Metaio developer ecosystem overlapped with that of Qualcomm Vuforia, but anecdotal evidence suggests that developers who had use for both leveraged their respective strengths for entirely different projects.

Vuforia is strongly optimized for delivery to consumers on smartphones: entertainment, cultural heritage, education and marketing use cases. For this reason, developers who explored its use for their enterprise or industrial projects did not place Vuforia’s current offerings at the top of their list of preferred enterprise-ready AR tools.

In an October 12 press release, PTC, a global provider of enterprise platforms and solutions for creating, operating, and servicing connected objects, announced that it had reached an agreement to acquire the Vuforia technology, and its developer ecosystem, from Qualcomm Connected Experiences, Inc., a subsidiary of Qualcomm Incorporated.

The acquisition of Vuforia by PTC suggests that while Metaio technology is probably being integrated into a platform and tools for consumer-facing solutions, the tools most popular for consumer-facing AR experiences (i.e., the Vuforia SDK) will evolve to better meet the needs of developers seeking to address enterprise use cases.

The Landscape Continues to Evolve

The reversal of the two popular Augmented Reality SDKs’ relative positions with respect to their target markets and one another is one of several trends.

First, the list of developer options is expanding. Firms that were previously quiet have the opportunity to engage with developers who are more interested in learning of their offerings. Google is getting closer to its Glass at Work 2.0 release. Microsoft is showing HoloLens and the tools it has designed for authoring (aka “HoloLens Studio”) to more developers. Some firms with significant experience and investments in enterprise Augmented Reality are becoming more attractive, or at least more visible. For example, Diotasoft, a French technology provider with loyal enterprise customers including Renault, PSA Peugeot Citroën, Total and Dassault Aviation, announced a rebranding (the company is now called “Diota”) and launched a new platform for enterprise Augmented Reality.

Another trend is a shift in positioning. PTC and Vuforia’s statements in their October 12 press release emphasize where they see the greatest potential for impact. They draw a line between Augmented Reality and the need for people to visualize data stored in and managed by PTC’s Internet of Things-oriented systems. This echoes the suggestion made by Gerry Kim, professor at Korea University, in a meeting of the AR Community on October 6: Augmented Reality is the human interface for IoT.

As the number of options increases, so does the potential cost of integration. In a highly fragmented market one large enterprise could easily end up with solutions addressing different use cases based on multiple different and incompatible SDKs.


An Opportunity to Mandate Open Solutions

A unique opportunity lies in the middle of the increasing fragmentation and investment in new technology providers.

What if, instead of accepting the status quo of many competing and incompatible AR platforms, large enterprise customers and their developers were to clearly demonstrate their need for open systems?

Developers can seize the next few weeks and months to prepare a campaign describing new or existing systems with which they would prefer to create and manage enterprise content. They can document the barriers to interoperability and mount pressure on enabling technology providers. What if, prior to a purchase or licensing decision, the provider of an AR authoring platform were required to demonstrate interoperability with content generated from Metaio’s SDK?

Openness does not mean Open Source. Openness is a condition that is based on explicit or implied agreements between vendors. Providers of technologies must agree upon common data formats, and provide interfaces and APIs that are well documented and designed for interoperability with solutions of potential competitors.
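To make the idea of a common data format concrete, here is a deliberately simplified sketch. No such interchange format existed at the time of writing; the field names and structure below are invented for illustration only:

```python
import json

# Hypothetical, illustrative interchange manifest -- not a real standard.
manifest = {
    "format": "ar-experience-interchange",
    "version": "0.1",
    "target": {"type": "image", "uri": "targets/pump_panel.png"},
    "overlays": [
        {"type": "model", "uri": "models/valve.glb", "anchor": [0.0, 0.1, 0.0]},
        {"type": "text", "content": "Open valve 3 half a turn", "anchor": [0.0, 0.3, 0.0]},
    ],
}

# Fields any receiving platform would need before it could import the content.
REQUIRED = {"format", "version", "target", "overlays"}

def is_interchangeable(doc):
    """Check that a manifest carries the agreed-upon shared fields."""
    return REQUIRED <= set(doc)

serialized = json.dumps(manifest)
assert is_interchangeable(json.loads(serialized))
```

The point is not this particular schema, but that once vendors agree on required fields and publish them, a customer can move targets and overlays between authoring platforms instead of re-authoring from scratch.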

Without issuing a clear mandate for AR technology providers to support a greater level of integration and interoperability with enterprise IT systems, developers should not be surprised if their options remain highly rigid and difficult to integrate. Unless forward-thinking people take action, developers and their large enterprise customers must be prepared to spend many more years investing in brittle transcoding systems or other approaches to “work around” the lack of openness and interoperability.

How are you going to respond to this rapidly shifting AR technology provider landscape? Are you taking this opportunity to share your requirements with new vendors? 




Starting the Enterprise Augmented Reality Conversation

Have you asked any IT professionals or business managers what they’re doing with Augmented Reality? A small fraction can share how they’ve considered using AR for improving their workplace processes, but most inquiries about how companies are using AR begin with a blank stare and end in frustration.  

The AREA and its members are developing high-quality content that can be the basis of more precise and fruitful dialog than we often have today. Once there is a shared conceptual foundation, we’ll be able to discuss the concrete benefits as well as the risks of introducing Augmented Reality in the enterprise with our audiences.

Explore the Audience Knowledge Level

Casual discussion between acquaintances or between a supplier and a potential customer can’t evolve gracefully if it must begin with deep explanations or clarifications of confusing terminology. Don’t start with a dry definition. Focus first on a known or shared challenge or potential benefit, and make sure you can casually squeeze in a few terms during the first minutes.

“Isn’t it frustrating that we can’t significantly increase our productivity?” you can inquire. Be specific about the use case, if you can. You can substitute “increasing productivity” with other goals such as reducing errors, reducing risk or increasing safety. Drop in some keywords to make sure they understand that you feel new technologies could help. Avoid buzzwords such as wearables, IoT, Augmented Reality or Virtual Reality in the first five minutes. Try to avoid bringing up Hollywood movies or popular science fiction books that feature Augmented Reality.

Then you can say that you’ve heard or that you’re exploring how this new technology could play a role by overlaying digital information on the real world. Let your prospective customer or partner, or whomever you’re speaking to, be the first to mention wearables or AR.

When asked if they’ve heard of Augmented Reality and what they’re doing or planning to do with it, an IT professional will respond in one of a few ways. The younger the person, the more likely they are to have heard of it and understood its potential. That said, they may not have thought to apply it to their job.

“That’s technology for your smartphone. I’ve seen it used in a museum, once” they might say. Then they either describe how the AR experience failed or just didn’t bring value to them.  Such conversations often conclude with the person dismissing the whole idea.

“It’s probably good for entertainment, but we’re not that kind of company,” is not an uncommon conclusion.

A more knowledgeable audience may remember Virtual Reality and the promises it held but didn’t deliver. Then you will need to reprogram them to understand the differences. 

Others will have had no exposure at all to Augmented Reality.

Light Bulb Moment

Once you’ve decided if the conversation is worthy of continuing investment, you’re going to aim for a “light bulb” moment: a look in their eye that shows that the person with whom you’re meeting has had a breakthrough in understanding.

To get to that moment of realization may take several steps. As already suggested, if you’re in conversation with an IT professional or line manager with a lot of engineering experience, you will get there more quickly.

Begin by building upon something very familiar. Everyone has seen, and almost all have personally used, video conferencing. AREA member David Doral, Director of AERTEC Solutions, begins his education process by suggesting that when trying to understand a problem at a remote location, it would be valuable to be able to see things as if through another’s eyes.

“We suggest to the customer that we support the technician in the field or on the shop floor with an expert who is somewhere else,” explains Doral. He doesn’t say where that expert is, but makes it perfectly clear that they are the key to solving a problem and that there’s no time for the expert to personally fly to the location. In AR, this use case is known as the “remote expert,” but this term doesn’t need to be introduced.

“Then, if they like this concept, we can suggest that the expert could draw arrows, point or otherwise indicate steps with animations,” continues Doral. “Imagine that the person who is in the field or on the shop floor is providing the remote hands, performing tasks as directed and under the supervision of the expert.”


Up Close and Personal

Another approach to reach a light bulb moment is to demonstrate an Augmented Reality experience right away. Sometimes, this can be performed using a tablet and an object that you’ve brought with you. Choose an object that is likely to be professional and slightly complex in nature but with a very simple user interface, such as a pocket projector. A virtual interface can appear with Augmented Reality to help the user with configuration and operation.

Three-dimensional objects are nice and have a big “wow” factor, but a photo will also work well and may perform better. Lighting, and reflections on a glossy surface, can have a big impact on your ability to track the target, so test your sample photo or object well before using it. Be sure to give the other person the device to hold and move around, so they can interact with the content in the experience.

Often people try to simulate this effect, and reduce the risk of failure, by showing a recording of an AR experience, but your audience will assign lower credibility to a video because they understand that special effects like those seen in the movies are now commonplace. Hasn’t everyone seen Minority Report and Iron Man?

From a shared understanding of the benefits of Augmented Reality, you might be able to progress to talking about a project and the potential of implementing AR in a few use cases.

What techniques have you used to successfully start a conversation about enterprise Augmented Reality?  Share your methods with others in the comments below.




ESA Puts Augmented Reality Through the Paces

There’s a lot of attention currently focused on how NASA is planning to send Microsoft HoloLens hardware to space to help astronauts perform tasks. According to a post published on the Trove blog in June 2015, the first use case being tested will permit NASA professionals on Earth to see what the astronauts see on the International Space Station (ISS). In Remote Expert Mode, HoloLens will be valuable when the astronaut encounters undocumented situations. It will also be possible for HoloLens to provide procedural guidance, for example, to retrieve objects or to put objects away in their correct place after use.

Tests of HoloLens, both on the ground and in underwater laboratories simulating space, will certainly validate the latest technology components Microsoft provides but will not be the first tests of Augmented Reality in space.

ESA Columbus module

A First Use Case for Augmented Reality in Space

According to David Martinez, a simulation and visualization engineer and member of the European Space Agency (ESA) Software Systems Division, Augmented Reality was first evaluated by ESA for space use in a project beginning in 2006. Using the ESA-designed Wearable Augmented Reality (WEAR) system, Augmented Reality was tested on Earth and, eventually, on the ISS in 2009. The use case was for an astronaut to inspect and, if needed, to service ISS air quality system components. Before examining and changing filters on the air quality system, an astronaut had to remove a panel on the floor. Then cables and hoses needed to be repositioned. Once the filter was accessible, the color of an indicator had to be examined. 

“We learned a lot about what was and wasn’t possible with the technology at that point in time,” recalls Martinez.

Exploring Guidance and Remote Expert Assistance

ESA works with payloads designed for a wide variety of purposes. Some of the payloads end up on the International Space Station. As astronauts on the ISS cannot be trained on all possible payloads in advance, ESA would like clear and compact Augmented Reality-assisted systems that ensure astronauts conduct experiments consistently and correctly, even without training on them before going into space.

In 2014 the ESA team collaborated with the Delft University of Technology to explore the use of hands-free, head-mounted Augmented Reality displays to provide remote expert assistance for performing experiments. The study used a payload representative of those on Columbus, a science laboratory module that is one of the European Space Agency’s most important contributions to the ISS.

“We demonstrated that the remote expert was able to support the hands-on use of the various dials, buttons and knobs,” explains Mikael Wolff, a senior software manager who manages several projects in the domain of crew informatics. 

“The remote expert could speak to the user and also annotate the object in the astronaut’s field of view with arrows and text messages that would remain in place with respect to the payload,” clarifies ESA engineer Sérgio Agostinho.

Technologies are continually advancing and ESA is testing systems for their ability to track targets in 3D with far greater flexibility than earlier generations. “We’re not using fiducial markers on any of our current projects,” assures Agostinho. He feels that if a system is to be deployed on the ISS, it can’t rely on markers. “We’re aiming for the Iron Man quality of experience,” he says enthusiastically.


Long List of Use Cases

“We know that there are many ways Augmented Reality may bring value to projects and people on the ground and in space,” reports Wolff. “We’re always coming up with new ideas.”

In collaboration with partners in industry and academia, ESA is currently focused on several use cases it considers to be relatively low hanging fruit. One of these is support for complex product assembly, integration and testing on Earth. ESA and European aerospace industry engineers are routinely involved in, support or perform the final assembly and integration of parts procured from aerospace industry suppliers. Components include everything from printed circuit boards to large payload systems and harnesses that eventually go into space.

Augmented Reality could assist technicians during the assembly of telecommunication satellites. Currently the manual procedures take days or weeks to complete. By highlighting for users the steps directly on the parts of the satellite with Augmented Reality, the assembly, integration and testing processes could be performed with fewer errors and more quickly.

Barriers Remain

The ESA team has segmented its current and potential future Augmented Reality projects into those that could provide value when engineers perform tasks on Earth and those that could lead to AR being deployed in space for use by astronauts. This is because systems or components that meet requirements on Earth are not immediately ready to go to the ISS. Not only is hardware certification required for custom-built and commercial off-the-shelf devices, but software conflicts or bugs simply aren’t tolerated in space.

Before anything is sent to the ISS, it must undergo extremely rigorous testing and validation. “This means that almost everything on ISS is at least one generation behind what’s available on Earth, in terms of technology maturity,” explains Martinez.

“We also have real challenges with lack of interoperability,” says Wolff. “As an industry and as a public agency we can’t rely on a single supplier for any technology component. The Augmented Reality ecosystem needs to expand and different vendors need to provide components that are comparable or else we could put the agency or a mission at risk.”

Despite delays and the complex testing environments, ESA engineers continue to study AR use cases and to evaluate the latest technologies. As commercial solutions mature and pass required reliability and accuracy thresholds, having them in use on the ISS and on complex space assembly and integration projects on Earth could become commonplace.