
Putting the ‘work’ into ‘AR Workshop’

Deep in the snow of a wintry Chicago, the annual AREA/DMDII workshop was a hotbed of activity!

The sessions attracted around 120 attendees comprising speakers, exhibitors, academics and those representing both commercial AR technology providers and companies using or looking to use AR within their business. Given the rarity of having such a collection of AR practitioners in one place, Glen Oliver (Lockheed Martin) and I wanted to harness this collective brainpower! Together, we represented the AREA Requirements Committee whose remit is to develop a set of industry requirements and use cases to help promote the adoption of AR.

The AREA Requirements Committee strongly believes that, in order to benefit the entire ecosystem, we need to articulate effectively how AR technology can be applied to business problems, what capabilities are needed within AR solutions and, perhaps most importantly, what business value these tools deliver. This will help both vendors and users of AR.

So, with three hours allotted from a precious agenda, how best to use this time? The approach taken was to introduce the importance of developing a linked, connected schema of needs and then move into group activities. Here’s how it unfolded:

Backdrop

We began with a summary of the requirements capture already started at the previous AREA/DMDII workshop. At that session, we captured 96 requirements, split roughly equally between hardware and software. Whilst this was a great start, the outcome was a list of requirements with little context, structure or priority, and limited scope for the community to contribute to them. At the same time, the AREA has collected a number of great use cases which have value to companies wishing to investigate where AR may be applied, but the current use cases need more detail to be actionable and to be linked to derived requirements. More needed to be done!

So, we presented a proposed ‘AREA Schema of Needs’, as shown below.

The idea is quite simple. We need to build a hierarchically linked set of needs, in various technology areas, that have bi-directional linkages to the use cases which incorporate the requirements. In turn, the use cases are linked to scenarios which define an end-to-end business process.

These scenarios occur in various settings (including engineering, manufacturing, field service, user operation, etc.) and, ultimately, are relevant in one or more industries (automotive, health care, industrial equipment and other industry verticals).
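To make the linkage concrete, here is a minimal sketch of how such a schema might be modelled in code. The class and field names are illustrative assumptions, not the committee’s actual schema definitions.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch only: class and field names are assumptions,
# not the AREA's published schema.

@dataclass
class Requirement:
    description: str          # e.g. "display must be readable in sunlight"
    technology_area: str      # e.g. "hardware" or "software"

@dataclass
class UseCase:
    name: str                                   # e.g. "remote expert annotation"
    requirements: List[Requirement] = field(default_factory=list)

@dataclass
class Scenario:
    name: str                 # an end-to-end business process
    setting: str              # e.g. "field service", "manufacturing"
    industries: List[str] = field(default_factory=list)   # e.g. ["automotive"]
    use_cases: List[UseCase] = field(default_factory=list)
```

With bi-directional links added (from each requirement back to every use case that derives it), both questions become answerable: “what does this scenario need?” and “which scenarios would this capability unlock?”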

In order to set the scene, the presenters walked through examples of each of the taxonomy fields. For example, a sample field service scenario was provided as follows:

A field service technician arrives at the site of an industrial generator. They use their portable device to connect to a live stream of IoT data from the generator and view a set of diagnostics and the service history of the generator.

Using the AR device and app, they are able to pinpoint the spatial location of the reported error code on the generator. The AR service app suggests a number of procedures to perform. One of the procedures requires a minor disassembly.

The technician is presented with a set of step-by-step instructions, each of which provides an in-context 3D display of the step.

With a subsequent procedure, there is an anomaly which neither the technician nor the app is able to diagnose. The technician makes an interactive call to a remote subject matter expert who connects into the live session. Following a discussion, the SME annotates visual locations over the shared display, resulting in a successful repair.

The job requires approximately one hour to perform. The device should allow for uninterrupted working during the task.

With the job finished, the technician completes the digital paperwork and marks the job complete (which is duly stored in the on-line service record of the generator).

In this example, the items in blue are links to underlying use cases which need to be supported in order to enable this scenario. Similarly, examples were presented for use cases and requirements/needs.

We also introduced the notion of “Levels of Maturity.” This is a useful concept as it enables both users and suppliers of technology solutions to identify roadmap progression, with an eye on future, richer adoption or delivery. Alternatively, not all users of the technology need the most advanced solution now, but they can identify what might make business sense to them in the shorter term.
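As a purely hypothetical illustration (the AREA’s actual maturity model was still being defined at this point), a maturity ladder for a work-instruction use case might be expressed like this:

```python
from enum import IntEnum

# Hypothetical maturity ladder for a "work instructions" use case;
# the levels and names here are assumptions for illustration only.

class InstructionMaturity(IntEnum):
    STATIC_DOCS = 1         # digital documents viewed on a portable device
    CONTEXT_LINKED = 2      # instructions retrieved by identifying the asset
    REGISTERED_OVERLAY = 3  # 3D steps anchored in-context on the equipment
    ADAPTIVE = 4            # live IoT data adapts the procedure as it runs
```

A user who only needs Level 2 today can still see what a move to Level 3 would demand of their content and hardware roadmap.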

Group Exercise

With the backdrop complete, we moved into the interactive portion of the session. The audience was split into 17 table groups, each with a mix of people from industrial users, commercial suppliers and academics. The idea was to get a blend of perspectives for the group activity.

Delegates hard at work!

Armed with a set of templates furnished by Glen, the 17 teams were set the following exercise:

  1. Choose your industry and setting
  2. Provide a written definition of the scenario
  3. Highlight the “use case” chunks that form the scenario
  4. Describe at least three of the supporting use cases
  5. Capture some of the derived requirements/needs
  6. Construct a maturity model
  7. BONUS: Describe the value proposition of using AR in this scenario

Whilst each team was given a high-level scenario (e.g. “manufacturing operation” or “design review”), they were free to choose their own, if they wished.

It was great to see the level of discussion taking place across all of the tables! One of our objectives for the exercise was to use the outputs from the teams as further content to help populate a future database. However, the primary point of the exercise was to mix the attendees and have them focus on articulating scenarios, use cases and requirements in a structured way that can also be tied back to business value.

At the end of the session, a spokesperson for each team stood up and summarised the results of their work.

Outcome

Each team duly handed in their handwritten efforts, which were transcribed into a more usable digital form and are now available to AREA members via the link below, which opens the transcription of the groups’ outputs:

Augmented Reality Functional Requirements

So, what did we learn?

The teams supplied an impressive number of ideas, which are summarised in the PDF. One unfortunate aspect is that we were unable to capture the clearly detailed and illuminating discussions taking place across all of the tables. In some ways, the ability to discuss these topics openly was perhaps more valuable to the teams than what was written down.

The scenarios discussed included (but were not limited to) the following:

  • Remote design review
  • City building planning
  • Factory floor – optics manufacturing
  • Optimising manufacturing operations
  • Onsite field service task
  • New product training – customer
  • New equipment commissioning
  • Domestic owner repair procedure
  • Assembly assistance
  • Maintenance for new staff
  • Collaborative landing gear inspection
  • ‘Unusual’ field service tasks
  • Construction design change optimisation
  • Multi-stakeholder design review

Additionally, these scenarios were described within a number of industries and settings.

Furthermore, we received some very positive anecdotal feedback from the delegates. One person stated, “This exercise was worth the trip in itself!”

One of the aims of the AREA Requirements Committee is to develop an online database to enable community participation in defining these needs and use cases. This exercise was a great incremental step in that journey. We look forward to building out this model for the benefit of the AR ecosystem and encouraging all to participate.

Acknowledgements

Thanks to the DMDII team for onsite support and to all of the workshop delegates for making this a highly productive exercise.




Huawei’s Farhad Patel on Taking AR Beyond the Pilot


Since its founding 31 years ago, Huawei has grown to become the largest telecommunications equipment manufacturer in the world and the world’s second largest smartphone manufacturer. With $75 billion in revenue, the global giant supports R&D operations in 21 countries. After Huawei recently rejoined the AREA, we had a chance to talk with Farhad Patel, a Technical Communication Manager based in Plano, Texas.

AREA: What has driven Huawei’s interest in augmented reality to date?

FARHAD PATEL: I work in the Innovation and Best Practices group in Huawei’s technical communication department. We’re responsible for developing and delivering the technical information and communication that goes out to our customers in the form of user guides, technical manuals, and instructional guides. A couple of years ago, as part of our innovation objectives, Rhonda Truitt, Director of Huawei’s Innovation and Best Practices in Technical Communication group, decided to research and pilot AR because she thought it could be very useful for information delivery. During our research we discovered that quite a few AR experiences involved showing someone how to perform a task, and that’s what technical communicators do. We knew that technical communicators were the best people to deliver AR content and saw this as hugely beneficial to our audience. The thing that really appealed to us was that equipment could be automatically identified and content could be automatically displayed without searching. Contextually, this is what a person is looking at, and this is what he or she probably needs. So, it would save our customers time not to have to search for and locate information.

AREA: Are your customers consumers or businesses?

FARHAD PATEL: Both. Huawei develops a full range of telecommunications equipment: switches, routers, servers, software, and the cell phone itself. We sell to telecommunications providers as well as consumers.

AREA: How would you describe the state of Huawei’s internal adoption of AR today?

FARHAD PATEL: I can’t speak for the rest of Huawei but in the documentation area we have done several very small, very targeted AR projects both in the US and in China. It’s not something that we’re doing corporate-wide for documentation, and it’s not something that we’re even doing product line-wide. So, if there is a certain product, like a power converter or a server, that we want to publicize to our customers, we may turn to AR or use AR as one of the channels to show its features and what it can do. We’ve also used AR at tradeshows but only in terms of certain products. So, we have done multiple small projects, but we haven’t scaled up to include a full product line since not all content is suitable for AR delivery.

AREA: Do you have a timetable in place for more widespread deployment?

FARHAD PATEL: It’s probably not on the near horizon because AR still has significant challenges that need to be addressed. And one of them is not even related to AR but more the content itself: how do we minimize the content to fit onto the smaller screen of a tablet or smartphone? And we have long procedures, filled with images and tables. That is a challenge. Another challenge is that the tracking and recognition technology is still not where it could be. And ideally, with AR, you could work hands-free. Just put on your glasses and you’d be able to see work instructions superimposed right next to the equipment. We’re still waiting for smart glasses technology to improve before large-scale adoption. So, those are the challenges that we’re struggling with as we try to ramp up the adoption of AR.

AREA: Among those eight pilots was a field technician installation instruction application with HyperIndustry, correct?

FARHAD PATEL: Yes, we’ve done three or four pilots with Inglobe Technologies. The other company that we’ve used is 3D Studio Blomberg, an AREA member. And we have developed AR experiences internally in China. We have also used another AR company, EasyAR.

AREA: What have you learned from the AR pilots that you’ve conducted so far?

FARHAD PATEL: We’ve improved our knowledge about AR technology – what it can do, what it can’t do, its limitations, its challenges. We’ve come a long way from where we started out. We have also increased the awareness of AR technology and expertise within the company itself. Now, we have many more product lines aware of AR technology and how powerful and successful it can be, because we’ve had a few customer pilots and we’ve demonstrated AR in these pilots, and the user groups and customers have been very appreciative of this new delivery channel for information. But at the same time, we’ve also identified what we cannot do with AR. So, we’re in a wait-and-see mode to see how best to proceed to take AR-enabled information corporate-wide or even product line-wide for appropriate content.

AREA: How are you hoping to benefit from your membership in the AREA?

FARHAD PATEL: The main thing we want to get from the AREA is knowledge. We’re hoping to be able to share what we have learned. And I hope that other AREA members will reciprocate and tell us what they have learned. I would hope to learn more about their best practices, their challenges, what works for them, what doesn’t work for them. For example, if an AREA member has figured out how to minimize the content so that it can be visible on smaller screens, I’m hoping that they share that information.

Networking is another benefit. You know, we learned about 3D Studio Blomberg from the AREA, so networking helps us get in touch with other like-minded people and work with their technology. And of course, there are the best practices, white papers, and webinars that the AREA puts out. For example, recently the AREA developed an ROI calculator. Useful information like that will go a long way in validating our membership fees to the AREA.

AREA: What AREA activities do you expect to participate in?

FARHAD PATEL: The Research Committee is of great value and we try to participate in AREA programs and activities as much as possible.




3 Things You Need to Know About the New Atheer/Toshiba AR Partnership


Last week, AREA member Atheer announced a new partnership with Toshiba, pairing Atheer’s AiR Enterprise solution with Toshiba’s new dynaEdge smart glasses. We checked in with Atheer’s Director of Marketing Communications, Geof Wheelwright, to get the story behind the story.

AREA: What was it about the Toshiba dynaEdge smart glasses that made Atheer want to pursue this partnership?

Wheelwright: We know from talking with customers that no one type of smart glasses will meet all the diverse needs of a given enterprise. It is inevitable that the average organization’s AR ecosystem will consist of a variety of smart glasses, devices and OS platforms. Atheer’s strategy is to be ubiquitous across this diversity of customer needs. So when Toshiba approached us last summer about being their software partner for a new range of smart glasses, we jumped at the opportunity to partner with them. Toshiba is, of course, one of the world’s mobile computing leaders and its entry into the Enterprise Augmented Reality market is a pivotal moment in validating the potential and promise of the technology. Toshiba’s vast experience in working with enterprise customers on mobile solutions – and its keen understanding of their needs – makes it a great partner in delivering our flagship AiR™ Enterprise solution to Windows 10 users on Toshiba’s new dynaEdge™ AR Smart Glasses. Together, we believe that we can bring a much stronger awareness of what AR solutions can achieve in industrial enterprise roles such as field service, dealer service, manufacturing and repair operations, assembly line management, technician and expert training, warehouse picking, asset inspection and repair, as well as remote visualization and support.


AREA: What does the pairing of AiR Enterprise and dynaEdge smart glasses bring to the enterprise AR market that it didn’t have before?

Wheelwright: There’s a new set of capabilities represented in both the Toshiba dynaEdge AR Smart Glasses – and Atheer’s implementation of AiR Enterprise on them. To begin with, these are the first AR smart glasses designed for Windows 10. Then you have the fact that Toshiba’s innovative dynaEdge Mobile Mini PC – a fully functioning Windows 10 PC in a tiny form factor – is part of the solution and ensures that users have access to all their standard Windows enterprise applications while using the dynaEdge AR Smart Glasses. When you add AiR Enterprise to the mix, you get a fully-featured, Windows-based, enterprise-grade AR solution that provides touchless, gesture-based interaction, remote subject matter expert calling, contextual documentation, barcode scanning and step-by-step task flows.

AREA: What are the plans for this partnership beyond this announcement?

Wheelwright: This announcement is just the first step of our partnership. The next step is that Toshiba and Atheer are taking this combined solution to a number of Early Access Partners (EAPs) that represent some of Toshiba’s most strategic industrial customers. These and other customer engagements will guide how we continue to work closely together to drive even greater integration with relevant, Windows-based infrastructure and key enterprise applications. As we see how customers use the solution, we will gain essential insights into the specific needs of enterprise users who want a Windows 10-based AR solution – and be able to apply those insights to our development roadmap.




Leading Water Utility in Wales Turns to AR to Reduce Errors and Improve Service

Dŵr Cymru Welsh Water (DCWW) supplies drinking water and wastewater services to most of Wales and parts of western England. A public utility with 3,000 employees serving 1.4 million homes and businesses, DCWW maintains 26,500km of water mains and 838 sewage treatment works. DCWW recently launched a pilot project to develop a mobile solution with AR capabilities to replace thousands of pages of operations and maintenance manuals. The AREA spoke recently with DCWW’s Gary Smith and Glen Peek to learn more about the solution, which they call the Interactive Work Operations Manual (IWOM).

AREA: What problem were you trying to solve with this solution?

DCWW: We need to provide our operational teams with comprehensive information on how to operate and maintain our assets. Traditionally, that information was delivered in the form of an operations and maintenance manual, which could run to thousands of printed pages. We wanted a solution that could deliver that complex information in a robust device, utilising sensing technology and augmented and virtual reality. We wanted something that could tell users their location on an asset, allow items of equipment to be interrogated via a readable code or tag, and provide information including running state and maintenance requirements. We felt a solution like that could help users make better-informed, accurate decisions, helping to reduce errors and risk and improve customer service.

AREA: What is the IWOM?

DCWW: The Interactive Work Operations Manual is a smart, tablet-based electronic manual that can be worn or held by users, delivering complex technical information including schematics, process flows, electrical drawings, maintenance records and service requirements in a simple-to-use, intuitive device. The IWOM uses near field communication (NFC) and QR codes to identify equipment and present users with information about it. The IWOM is intrinsically safe, so it can be used in hazardous, outdoor, wet and dusty environments. It uses AR to overlay process information and instructions, such as directional information for valves and switches, or ideal operating ranges for gauges and dials. This information hovers in front of users without obstructing their view, minimising risk while enhancing what they see, helping users make the right decisions and record operating information.

The IWOM delivers this information in an interactive and visually attractive way. Site inductions include videos highlighting areas of high hazard and daily dynamic risks. Users sign on the device, and these records interface with SAP so that they can be retrieved either on a site-by-site or per-user basis to help prove compliance.
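As a rough sketch of the tag-based identification flow DCWW describes (the payload format, lookup table and field names here are assumptions for illustration, not DCWW’s implementation):

```python
# Minimal sketch: map a decoded QR/NFC payload to the overlay content
# for that asset. All identifiers and fields here are hypothetical.

EQUIPMENT_DB = {
    "PUMP-0042": {
        "running_state": "duty",
        "ideal_range_bar": (2.5, 4.0),     # shown as an overlay on the gauge
        "maintenance_due": "2018-04-01",
        "schematic": "drawings/pump-0042.pdf",
    },
}

def on_tag_scanned(payload: str) -> dict:
    """Return the information to display for a scanned equipment tag."""
    asset_id = payload.strip().upper()
    record = EQUIPMENT_DB.get(asset_id)
    if record is None:
        raise KeyError(f"unknown asset tag: {asset_id}")
    return record

print(on_tag_scanned("pump-0042")["running_state"])   # -> duty
```

In a production system the lookup would of course hit a central asset register (DCWW mention SAP integration) rather than an in-memory table, but the scan-identify-display loop is the same.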

AREA: How is the IWOM integrated with “lean” operating principles?

DCWW: The device automates the delivery of lean rounds and prompts for user action at the required timeframes. Users undertake rounds and record findings, which are automatically and seamlessly synched with the central lean records. This removes the need for paper records and multiple handling of the same data, thereby saving time, driving efficiency and reducing the likelihood of operator error in data transfer.

AREA: How were you able to ensure user acceptance?

DCWW: Several ways. First, we engaged with users from the start. Our development team have worked side by side with the operational teams to ensure that they developed what users want. User feedback on the initial proof of concept has been excellent and there is a real hunger now to widen the pilot scope and show what we can achieve with the smart application of the technology, driving efficiencies and increasing safety and reliability.

We also engaged with industry and emerging technology developers. The development team realised they were leading the way in the water and sewage sector in the application and use of this technology. The team have engaged with industrial and commercial sectors to share their ideas with universities and other groups, including the AREA, the British Computer Society, and the UK Wearable Technology Development Group.

During the later stages of the pilot device development, additional teams were drawn into the assessment of beta test versions and the ideas were showcased at Works Management Board, Directors Briefings, team meetings and innovation groups to measure its acceptability and usability. Without this involvement, the programme would not have been a success.

Our development team is now developing a similar version of the IWOM for the Waste Water Treatment process, and a second pilot version is under development and will be delivered before the end of March 2018.

AREA: How long did it take to develop the IWOM?

DCWW: The development progressed from an idea generated and showcased in the spring of 2017 to a live working application in nine months. It is now installed on a ruggedized handheld tablet or wearable device, allowing users to access the content from any location in a safe environment, with the option of either hands-free or handheld operation.

AREA: Does the IWOM include a Digital Twin?

DCWW: Yes, the IWOM’s digital twin is a replica of our physical assets, processes and systems. In the pilot program, we digitally mapped the inside and outside of the pilot site with 3D point cloud laser-scanning equipment and these data points were used to create a digital image. It is possible to virtually walk or fly through this digital model, allowing users to view any area of the pilot site from the tablet or remotely. One significant advantage of this process is that it enables a user to view the building, structure or an item in a plant from any location and direct an operator on the site without physically being present in the same location.

AREA: How does this tie into the Internet of Things?

DCWW: The IWOM is a driving force in Welsh Water’s IoT strategy. We are working to connect all the devices in our environment that are embedded with electronics, sensors, and actuators so that they can exchange data. The IWOM leads the way in how the IoT is being developed in the water and sewage sector.

AREA: What makes the IWOM unique?

DCWW: The development team have been able to take the best of the existing paper documentation and merge it with cutting-edge technology at a very low unit cost to deliver an intuitive product to operators in the field. For example, site inductions and dynamic risk assessments are delivered interactively and are crucial to helping reduce risk and ensure employees and visitors return home safely at the end of their working day. The IWOM is also one of a very few industrial AR applications that is entirely self-contained in a handheld device.

AREA: What benefits has the IWOM delivered?

DCWW: The IWOM removes the need to reproduce complex operating manuals in a paper format. Updates to the operational manuals are presently delivered by a team of CAD technicians who transcribe the design data into a rendered 3D digital format rather than stripping out detail to produce a simple 2D printed image. It’s much more efficient. Also, by eliminating paper manual updates, we’ve saved the operators the trouble and errors associated with manually removing and replacing pages. The IWOM is updated centrally and an electronic copy is pushed to users, so they always have the most up-to-date version of the manual at the point they need it.

AREA: How easy is it to replicate the IWOM?

DCWW: The IWOM has not been resource intensive. Much of the development has been done in-house, by enthusiastic, skilled amateurs in the team’s own time in the evenings and at weekends. Welsh Water purchased a tablet computer to display the output and secured a “two for one” deal on the wearable device, so we have kept the total hardware costs down. A development partner was secured by tender to help us design the bespoke software, and they were so committed to the project that they contributed to the development costs. Companywide implementation would require a bespoke package to be developed for each site. We believe we can develop these in-house as our skill set grows, and rather than place the software package on a bespoke device at each site, we believe we can host it on our own servers. That would mean that the information on any asset where an IWOM had been created would be available to any user with the right credentials and user access.

AREA: How can our readers learn more about your work?

DCWW: Welsh Water is presenting at the April 2018 Welsh Innovation Event, the STEM event in February, Smart Water 2018, the Wearable Technology Conference, and at a hosted seminar for the British Computer Society in the autumn of this year.

AREA: What would you like to say to AREA members and the broader AR ecosystem?

DCWW: The IWOM development team would like to share their ideas with others. We want to continue to explore what other industries are doing in this area and share best practices. In particular, we would like to hear from other developers in the AR, IoT and haptic fields of expertise.

Gary Smith is Head of IMS and Asset Information, Glen Peek is WOMS Manager, and Ben Cale is the Data Analyst at Dŵr Cymru Welsh Water.




How AREA Research Projects are Furthering the Adoption of Enterprise AR


With the recent publication of our Measuring AR ROI Calculator and Best Practices Report, we at the AREA are demonstrating our commitment to addressing obstacles that enterprises face when introducing and expanding their AR initiatives.

The report, calculator and case study are the output of the second in an ongoing series of research projects aimed at addressing the critical questions faced by enterprises seeking to launch and expand AR initiatives.

“The ROI report and the detailed case study prepared by Strategy Analytics for the AREA offer the most detailed explanations of the factors that must be considered when preparing a complete ROI analysis on AR and help to pinpoint where impacts will be greatest,” said Christine Perey, PEREY Research & Consulting, and the chair of the AREA Research Committee. “The calculator with instructions is the first tool of its kind and can be used immediately by business planners and AR project managers.”

Selected and funded directly by members, the AREA research projects offer tangible value, not only to enterprises developing their AR strategies, but also to AR solution providers and non-commercial members. Member-exclusive research results include:

For more information, please email [email protected].




Recapping the AREA/DMDII 2nd Enterprise AR Workshop

The Augmented Reality Enterprise Alliance (AREA) and the Digital Manufacturing and Design Innovation Institute (DMDII), a UI LABS collaboration, recently hosted the 2nd Enterprise AR Workshop at UI LABS in Chicago. The event drew over 110 attendees, from enterprises that have purchased and are deploying AR solutions, to providers offering leading-edge AR solutions, to non-commercial organisations such as universities and government agencies.

“The goal of the workshop is to bring together practitioners of Enterprise AR to enable open and wide conversation on the state of the ecosystem and to identify and solve barriers to adoption,” commented Mark Sage, the Executive Director of the AREA.

Hosted at the excellent UI LABS and supported by the AREA members, the attendees enjoyed two days of discussions, networking, and interactive sessions.

Here’s a brief video summary capturing some of the highlights.

Introduction from the Event Sponsors

Sponsored by Boeing and Upskill, the workshop was kicked off by Paul Davies, Associate Technical Fellow at Boeing and the AREA President. His introduction focused on the status of the Enterprise AR ecosystem, highlighting the benefits gained from AR and some of the challenges that need to be addressed.

Summary of AR Benefits

Mr Davies added, “We at Boeing are pleased to be Gold sponsors of this workshop. It was great to listen to and interact with other companies who are working on AR solutions. The ability to discuss in detail the issues and potential solutions allows Boeing and the ecosystem to learn quickly.”

Developing the Enterprise AR Requirements Schema

The rest of the day focused on brainstorming and developing a set of use cases that the AREA will build on to create the AREA requirements/needs database, which will ultimately be added to the AREA Marketplace. The session was led by Glen Oliver, Research Engineer from AREA member Lockheed Martin, and Dr. Michael Rygol, Managing Director of Chrysalisforge.

The attendees were organized into 17 teams and presented with an AR use case (based on the use cases documented by the AREA). The teams were asked to add more detail to the use case and define a scenario (a definition of how to solve the business problems often containing a number of use cases and technologies).

The following example was provided:

  • A field service technician arrives at the site of an industrial generator. They use their portable device to connect to a live stream of IoT data from the generator and view a set of diagnostics and the service history of the generator.
  • Using the AR device and app, they are able to pinpoint the spatial location of the reported error code on the generator. The AR service app suggests a number of procedures to perform. One of the procedures requires a minor disassembly.
  • The technician is presented with a set of step-by-step instructions, each of which provides an in-context 3D display of the step.
  • With a subsequent procedure, there is an anomaly which neither the technician nor the app is able to diagnose. The technician makes an interactive call to a remote subject matter expert who connects into the live session. Following a discussion, the SME annotates visual locations over the shared display, resulting in a successful repair.
  • The job requires approximately one hour to perform, meaning the portable device should function without interruption throughout the task.
  • With the job complete, the technician completes the digital paperwork and marks the job complete (which is duly stored in the on-line service record of the generator).

*blue = use case

The tables were buzzing with debate and discussion, producing a lot of excellent output. The use of a maturity model to highlight how scenarios progress proved a very useful tool. At the end of the session, the table leaders were asked to present feedback on how useful the conversations had been.

Technology Showcase and Networking Session

The day ended with a networking session where the following companies provided demos of their solutions:

Day 2: Focus on Barriers to AR Adoption

The second day of the workshop started with an insightful talk from Jay Kim, Chief Strategy Officer at Upskill (event Silver sponsors), who outlined the benefits of Enterprise AR and how to avoid “pilot purgatory” (i.e., the continual cycle of delivering pilots with limited industrialisation of the solution).

Next, Lars Bergstrom of Mozilla Research provided a look into how enterprises will soon be able to deliver AR experiences to any AR device via a web browser. Attendees found the session very useful for understanding the potential of WebAR and how it might benefit their organisations.

Barriers to Enterprise AR Adoption – Safety and Security

The next two sessions generated discussion and debate on two of the key barriers to adoption of Enterprise AR. Expertly moderated by the AREA Committee chairs for:

  • Security – Tony Hodgson, Bob Labelle and Frank Cohee of Brainwaive LLC
  • Safety – Dr. Brian Laughlin, Technical Fellow at Boeing

Both sessions provided an overview of the potential issues for enterprises deploying AR and providers building AR solutions. Again, many attendees offered contributions on the issues, possible solutions and best practices in these fields.

The AREA will document the feedback and share the content with the attendees, as well as using it to help inform the AREA committees dedicated to providing insight, research and solutions to these barriers.

Barriers to Enterprise AR Adoption – Change Management

Everyone was brought back together to participate in a panel session focusing on change management, from both an organisational and a human perspective.

Chaired by Mark Sage, the panel included thought leaders and practitioners:

  • Paul Davies – Associate Technical Fellow at Boeing
  • Mimi Hsu – Corporate Digital Manufacturing lead at Lockheed Martin
  • Beth Scicchitano – Project Manager for the AR Team at Newport News Shipbuilding
  • Jay Kim – Chief Strategy Officer at Upskill
  • Carl Byers – Chief Strategy Officer at Contextere

After a short introduction, the questions focused on whether AR should be a topic discussed at the CEO level or by IT/innovation teams. Following insightful comments from the panel, the audience was asked to provide their input.

Questions then focused on how to convince the workforce to embrace AR. Boeing, Newport News Shipbuilding and Lockheed Martin provided practical and useful examples.

A range of questions from the audience followed, with panel members sharing how their organisations have overcome some of the change management challenges of implementing AR solutions.

Final Thoughts

The general feedback on the two days was excellent. The ability to share, debate and discuss the potential and challenges of Enterprise AR was useful for all attendees.

The AREA is the only global, membership-funded, non-profit alliance dedicated to helping accelerate the adoption of Enterprise AR by supporting the growth of a comprehensive ecosystem. It helps its members develop thought-leadership content, works to reduce the barriers to adoption, and runs workshops to help enterprises effectively implement Augmented Reality technology and create long-term benefits.

The AREA will continue to work with the Digital Manufacturing and Design Innovation Institute (DMDII), where innovative manufacturers go to forge their futures. In partnership with UI LABS and the Department of Defense, DMDII equips U.S. factories with the digital tools and expertise they need to begin building every part better than the last. As a result, more than 300 partners increase their productivity and win more business.

If you are interested in AREA membership, please contact Mark Sage, Executive Director.

To inquire about DMDII membership, please contact Liz Stuck, Director of Membership Engagement.




Take the AREA 2018 Enterprise AR Ecosystem Survey Now

The Augmented Reality (AR) marketplace is evolving so rapidly, it’s a challenge to gauge the current state of market education, enterprise adoption, provider investment, and more. What are the greatest barriers to growth? How quickly are companies taking pilots into production? Where should the industry be focusing its efforts?

To answer these and other questions and better measure trends and momentum, we at the AREA are pleased to launch our second annual ecosystem survey.

The survey takes only five minutes to complete. Submissions will be accepted through February 16, 2018. We’ll compile the responses and share the results as soon as they’re available.

Take the survey here.

Make sure your thoughts and observations are captured so our survey will be as comprehensive and meaningful as possible. Thank you!




Insights on Enterprise AR from CES 2018

2017 was the year that Augmented Reality emerged from the trough of disillusionment. Enterprise AR, with nuts-and-bolts use cases and revenue, became the fastest-growing category within the AR/VR universe. According to ARtillry Intelligence, hardware and software spending in 2017 was $3 billion, more than triple that of 2016.

Some of the most compelling Enterprise AR products and business strategies were on display at CES 2018. The world’s largest consumer electronics convention was an excellent opportunity for companies to exhibit in the public eye. At the forefront this year was the convergence of trends enabling the next stage of AR: hardware development and miniaturization, user-centric design, and business model innovation.

Enterprise AR’s primary focus is on using visual data to increase the capabilities of workers. It has been found that, of the five senses, sight accounts for 83% of the information processed by the brain. The value proposition for augmenting visual information is real; workers are 30% more productive with AR information delivered in context, according to Jim Heppelmann and Mike Campbell of PTC. Hence most applications gaining traction at the moment revolve around the delivery or production of visual information.

CES 2018 also revealed some of the gaps that need to be filled for the AR movement to accelerate. First, “the world is seriously devoid of AR talent,” as Jim Heppelmann noted. Secondly, the nature of spatially-based visuals requires complex, high-resolution objects to be delivered to the user. These are generally too large and dynamic to be contained within static apps on a local client and thus need to be web streamed live. The developer community needs to establish protocols for real-time AR asset streams as it has done for web VR in the past.

Wearable displays present a different paradigm for interaction and control. A killer app may be lacking the killer interaction method. Currently the most prevalent input methods are voice, swiping, and RGB and IR camera-based gesture recognition. These will leave you wanting in adverse physical environments and when performing complex tasks such as web navigation and emails. One possibility would be leveraging micro-movements as input in the same way game controllers respond to millimeter actions: small actions allow for high ergonomic efficiency and bandwidth. This type of work is being pursued by Pison and other human computer interface firms. Other firms are experimenting with multimodal combinations of brain, eye, voice, and bioactivity signals to enable context awareness.

At CES 2018, the following companies demonstrated compelling lessons for how to find an edge in a rapidly ascendant industry:

Realwear – dominate with a differentiated product

Realwear’s industrial headset has high-quality voice recognition and works in environments with over 95 dB of noise. The headset performed well on the loud CES conference floor with no false positives, even with whispered commands. The basic set of voice commands is processed on-board for smoother operation compared to internet-enabled engines like Alexa. The HMT-1 launched in October 2017 and has seen such rapid uptake that it is on track to be one of the top three AR headsets in use in 2018. Over 200 customers and 75 solution partners are using the HMT-1 already.

In the crowded field of headset companies, Realwear has been able to achieve quick adoption and growth rates by catering to a specific user base. The company primarily serves rugged industries where using hands to do work is critically important. Matt Firlik, Head of Marketing & Business Operations, says, “The number one application is remote mentor, which gives field workers access to experts located on the other side of the site, or other side of the world.” With that use case, remote experts can annotate what users see on their micro-display, and coach them through complex maintenance or assembly procedures. Other use cases enable users to complete work orders, view documents like complex schematics, and engage with IoT data. According to Firlik, “the HMT-1 gives workers in the field a voice that keeps them connected to their colleagues, the back office, and the work they have to do since they will never have to pick up a tablet or clipboard to do their job again.”

The company’s success is a testament to the power of product differentiation and strong focus. Realwear’s technology bets on the interaction method of on-board voice as a competitive advantage. There is no dependence on technologies utilized by other headsets, such as head tracking, swiping, hand grasping, and cloud-processed voice. Even so, voice recognition as a whole faces a difficult journey to become a robust, standalone modality. A significant portion of users find the experience frustrating, especially older workers or those with thick accents. The challenge for Realwear will be to expand rapidly and become deeply entrenched in enterprise workflows before competitors are able to catch up in voice recognition quality. (Realwear is an AREA member and participates in many of the committees seeking to reduce barriers to AR adoption.)

Augmen.tv – how AR can leverage existing huge markets

Augmen.tv is the first camera-vision and AR streaming app for TV augmentation. Content is detected on the TV and synchronized to the millisecond. Key to the user experience is the wealth of interactive AR content that extends and enriches the viewing experience. The comprehensive demo included characters jumping out of the scene, sports players and statistics displayed around the TV, and placing the viewer in an immersive 360 scene.

The company stands to build upon a successful debut on European TV and test in the US on preprogrammed as well as live shows. The app was number one in the App Store in Germany with nearly a million downloads. Users were incredibly driven to experience the tech despite the app having a massive download size; the launch iOS version was 1.1 GB! The next generation will offload content to cloud and edge servers for lighter storage on user devices. The ability to call these assets in real-time will be a major technological innovation for the entire AR industry.

The business pathway for Augmen.tv could be akin to that of Amazon Web Services (AWS). Amazon built AWS out of necessity to scale internal computing capability up and down throughout the year. The excess capacity during down times presented an opportunity to sell processing power to enterprises as a standalone offering. The challenge for a young company like Augmen.tv is to manage content creation while building first-class camera vision and asset streaming capabilities. If the company achieves the balance of being both a media and tech company, then it stands to benefit from two huge markets.

Proglove – the simplest products can be highly lucrative

Proglove is a company that surprises with both its technological simplicity and its rate of success. The entire product consists of a simple bar code scanner that is worn on the back of a partial glove with an in-palm trigger. The scanner is used mainly at car manufacturing plants and package shipping warehouses. Some use cases pair it with smart glasses for assistive reality. After just two years, Proglove is already being used in every European BMW factory. The wearable, available as a complete system for $3,000, shaves three seconds off every task. At car plants where each worker performs one task thousands of times a day for 300 days a year, the ROI is highly significant.
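The back-of-the-envelope arithmetic bears that out. Here is a minimal sketch using the figures quoted above; the labour rate is an assumed placeholder, not a Proglove or BMW figure:

```python
# ROI arithmetic from the figures quoted above; the labour rate is an
# assumed placeholder for illustration, not a Proglove or BMW figure.

seconds_saved_per_task = 3
tasks_per_day = 1_000            # "thousands of times a day" (lower bound)
working_days_per_year = 300
system_cost_usd = 3_000

hours_saved = (seconds_saved_per_task * tasks_per_day
               * working_days_per_year) / 3600
print(f"{hours_saved:.0f} hours saved per worker per year")  # -> 250

assumed_labour_rate_usd = 40     # per hour, illustrative only
payback_years = system_cost_usd / (hours_saved * assumed_labour_rate_usd)
print(f"payback in about {payback_years:.2f} years")         # -> 0.30
```

Even at a conservative one thousand tasks a day, the system pays for itself within months.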

The minimalist functionality of the product was the result of paying attention to customer feedback. Proglove originally sought to develop a glove with a range of features including RFID, bending sensors, motion tracking, and displays. “We found out most of the customers would be happy with a bar code scanner. That was our MVP. Simple to use. In industry, they need time to adopt. If you have something really radical, then it might kill you as a startup until you see first revenue,” explains founder Paul Gunther. It is impressive how Proglove found a way to charge a high price for ubiquitous technology. The challenge for them will be to avoid competition leading to pricing pressure. One tactic to mitigate this is to become the official supplier for clients, as the company has done with BMW.

Mira – holistic design is key

Mira has developed a technologically simple yet extremely well-designed iPhone-based headset. It is highly comfortable for the user and also very socially friendly due to its transparent and open display. Others can easily see the user’s face as well as the contents being displayed. Seeing the headset in action from the outside conveys a feeling of curiosity versus enclosed VR and AR headsets. Reinforcing this feeling is the lighthearted nature of the company’s demos, which are focused on entertainment and social collaboration games. The content is app-based and allows experiences to be shared simultaneously on both the headsets and phones.

The most critical aspect of Mira’s innovation may be the design of the holistic user experience. By presenting easy-to-use, full-experience tech in a non-geeky manner, Mira has created a beautiful product that could greatly accelerate the wide scale adoption of AR. It is easy to imagine the next social AR hit like Pokemon Go being played on a Mira headset. The challenge for a company with a technologically minimalist product like theirs is to build a competitive moat around the full stack ecosystem and software environment and find ways to enforce use of their headset versus the eventual knockoffs.

It is important for the Enterprise AR community to recognize the collective value in developing and validating solutions for AR’s current shortcomings. In an industry experiencing triple-digit growth each year, there is impetus to join the rising tide or risk being left behind. As Jay Samit, Deloitte Independent Vice Chairman, said, “In 2018 you will see a bifurcation of businesses that embrace AR and those that cease to exist.” The companies that actively learn from shared resources and membership organizations stand to gain the most from the AR movement.

___

Dexter Ang is CEO of Pison, a company building the future of human computer interaction and bioelectric control. The company develops full-stack wearable technology for intuitive and powerful gesture control of augmented reality smart glasses. Pison has developed and patented electroneurography, or ENG, the first process for sensing peripheral nerve firings on the surface of the skin. Vertically-integrated solutions combine hardware, software, machine learning, and UI for AR industries. Investors and partners include Oculus, MIT, Draper, National Science Foundation, and HHS.




CES 2018 Recap: Atheer on the Flex AR Reference Design

One of the highlights of CES 2018 earlier this month was the introduction of an enterprise AR reference design from Flex. We spoke about it recently with Geof Wheelwright, director of marketing communications for Atheer, an AREA member and a partner in the Flex announcement.

AREA: What is the purpose of the Flex enterprise AR reference design unveiled at CES?

WHEELWRIGHT: The purpose of the Flex AR reference design is to reduce time to market for companies making AR devices for enterprise and consumer applications. It includes a complete product specification, including a head-mounted display (HMD), an external processing unit (EPU), a gesture-based software platform (developed with Atheer) to manage interaction, and pre-installed Enterprise AR software. By customizing the rugged, stable and high-quality Flex AR reference design versus developing their own AR hardware, companies can significantly reduce product development costs and quickly scale manufacturing.

AREA: What is the significance of this announcement to the enterprise AR market?

WHEELWRIGHT: The significance of this announcement is that it provides a new standard for AR hardware and interaction – and a very real path to a much broader range of participants in the enterprise AR hardware market. It also goes beyond a mere hardware specification by including an interaction model that is multi-modal (i.e., it supports head motion, voice control and gestures) and a 30-day trial of Atheer AiR™ Enterprise. That means customers can immediately start using remote expert collaboration (“see what I see”) and authoring and delivering workflows and step-by-step task guidance for their unique needs. In addition, Flex will provide a full software development kit (SDK) to customers who are building on Android Nougat. The sum of all those parts means that OEMs have access to an AR offering that can provide real value to enterprise customers right out of the box.

Flex-designed augmented reality headset and belt pack reference design (PRNewsfoto/Flex)

AREA: Can you give us an example of how the reference design reduces time to market?

WHEELWRIGHT: A typical hardware development cycle would involve bringing together a number of key standardized components (including operating system, processor, specialized hardware) around a particular design for a particular purpose. Hardware designers would then build and test prototypes, refine those prototypes (and then retest them as they add new components), field-test and debug the prototype. They would then have to figure out how they would manufacture the device. And all of that is before you run a single piece of third-party software on your new device.

Manufacturers using the Flex AR reference design get the advantage of a pre-designed system that is already tested and already works – cutting out a lot of the time typically involved in new hardware development. It includes cutting-edge technology from partners, including the Snapdragon 835 mobile platform from Qualcomm, designed to deliver full-color, 1080p augmented reality experiences. The Snapdragon 835 draws 25 percent less power than previous models, using an advanced 10-nanometer design.

AREA: What is Atheer’s role in the reference design?

WHEELWRIGHT: Atheer came to this project with unique experience in having designed our own smart glasses (the well-received Atheer AiR glasses) and was able to bring that to bear on helping Flex create the Flex AR reference design. Specifically, Atheer contributed our standardized multi-modal interaction model. “We know the challenge of designing a cutting-edge platform that can be mass produced,” said Soulaiman Itani, Chief Executive Officer and founder of Atheer, in his comments on the Flex announcement. “Through our work with Flex, we’ve seen their capabilities, and we’re pleased to help provide a UI system that supports gestures, voice, head motion and Bluetooth wearables for hands-free operation. We are looking forward to Flex enterprise customers being able to experience the out-of-the-box Augmented Reality tools in Atheer’s AiR Enterprise™ productivity solution for augmented reality.”

AREA: Why has Atheer partnered with Flex?

WHEELWRIGHT: Flex has the global reach, experience and respect in the electronics hardware manufacturing industry to help make our interaction model an industry standard – and bring enterprise users the real and immediate safety and productivity benefits of our flagship Atheer AiR™ Enterprise software.

AREA: Does this represent a change or an evolution in the Atheer business strategy?

WHEELWRIGHT: It represents an evolution. In 2012, Atheer was founded on a belief that AR technology could make a significant and measurable difference in how workers at industrial enterprises do their work. In the company’s initial stages, the Atheer team explored the ideal hardware needed to create impactful enterprise AR applications. It also affirmed the idea that, in order to be really useful, AR hardware would need to be based on popular, well-supported mobile operating system platforms (starting with Android).

That work led initially to the development of Atheer AiR Glasses, which later became the foundation for a reference design platform called AiR Experience that Atheer now sells (combined with a multi-modal interaction platform and access to Atheer’s partner engineering team) and is a key element of the work with Flex. The company now offers AiR Experience alongside its flagship Atheer AiR™ Enterprise software, which provides real and immediate benefit for customers such as Porsche Cars North America, Inc. (PCNA). PCNA announced late last year the introduction of “Tech Live Look,” an AR technology designed to improve technical services at Porsche dealerships in the United States. “Tech Live Look” uses AiR Enterprise™ in conjunction with lightweight smart glasses.

AREA: Can we expect other similar partnerships to be announced in the near future?

WHEELWRIGHT: We are continually evaluating other partnership opportunities to help grow the market for AR solutions in the enterprise that leverage our experience and help bolster the development of key interaction standards for the AR industry.

AREA: How will this and other partnerships accelerate the adoption of AR in the enterprise?

WHEELWRIGHT: Enterprises want measurable value, power, interaction standards that make sense – as well as proven enterprise-grade applications using hardware from manufacturers they trust on operating systems they know. Our platform delivers all of those elements and helps to significantly lower barriers to adoption in a way that should move customers from limited, line of business-driven “proof of concept” lab trials to serious IT-supported evaluations that can be rolled out broadly throughout an enterprise.




Behind the UK’s £33 Million Investment in AR/VR

When the British government published its Industrial Strategy White Paper last November, one of the report’s major announcements was a £33m investment in a challenge designed to “bring creative businesses, researchers and technologists together to create striking new experiences that are accessible to the general public” using immersive technologies, such as AR and VR. The goal is to “create the next generation of products, services and experiences that will capture the world’s attention and position the UK as the global leader in immersive technologies.”

One of the people guiding the effort is Tom Fiddian, Innovation Lead at Innovate UK, with whom the AREA spoke recently. Innovate UK is the UK’s innovation agency, a non-governmental public body that seeks to “drive productivity and growth by supporting businesses to realize the potential of new technologies, develop ideas and make them a commercial success.”

“The general fund is looking at challenges that can be solved by innovation,” explained Fiddian. “My job is to look after the creative industries: publishing, art, culture, film, and music. Not often do we have such an opportunity, where an emerging technology is going to disrupt the market across so many different creative sectors.”

The government investment is being allocated to several areas.

“The vast majority of the money will be available for businesses to apply for under different headings, running large-scale demonstrations in the creative industries,” said Fiddian. “We’re also looking at lowering the cost of creating content to help grow the market.”

While the £33 million is earmarked for the creative industries rather than Enterprise AR, the fact that the UK government is investing so significantly in AR and VR is a testament to how important the nation’s leaders consider these technologies to Britain’s future economic position in the world.

“This is all about expanding the general AR/VR market,” said Fiddian. “I have no doubt that, even though we are focused on the creative industries, the overspill of new technologies and new methodologies will benefit the enterprise AR market, as well.”

Proposals and business cases will be evaluated over the next several months. Fiddian expects specific announcements of projects being funded will be coming in April of this year.

Tom Fiddian noted that while Innovate UK has sponsored other projects that were more broadly focused on immersive technologies for the enterprise, this new challenge is capturing significant interest.

“With the size of the investment, it’s definitely putting AR and VR on the map,” he said.