
“Urgency” Will Drive AR Adoption in the COVID-19 Era

In addition to being CMO and President of RE’FLEKT, Dirk Schart is an industry expert who has been involved with Augmented Reality (AR) to one extent or another for a decade. Recently, he drew upon this perspective to explore what he sees as an inflection point in the continuing history of AR. His thesis, as detailed in his personal blog, is that a newfound urgency driven by COVID-19 is leading to an imminent upswing in AR adoption. We spoke to Dirk to discuss his ideas further. 

AREA: What is different about this moment in the history of AR? What has changed?

Schart: Two things have changed. One is that we’re not focusing on technology anymore, but rather on solutions for enterprises. Second, we’re seeing a higher level of urgency that we hadn’t seen before. These factors, to use the term from Geoffrey Moore’s 30-year-old classic book, suggest that AR is “Crossing the Chasm.” We’re moving from experimental early adopters to a more mainstream market that expects a ready-to-use, ready-to-integrate solution. COVID-19 is acting as the accelerator. COVID is driving people to use AR in real scenarios in their daily work lives. So now we have real users coming back with more specific requirements for their solutions. We’re not crossing the chasm yet, but we’re coming closer to it.

AREA: Is the pandemic accelerating certain use cases more than others?

Schart: In the past, when we talked about use cases, we talked about maintenance, operations, and training – a very high-level description of use cases. Now, we’re talking about, for example, onboarding new employees for a product launch. It’s much more concrete. We’re seeing use cases that are driven by the travel bans caused by the pandemic. The most popular one is remote support. You don’t need to have any specific hardware; you just take your phone, call, and you get guidance immediately. There are others, as well: as I mentioned, onboarding of new operators, as well as production line changeovers, as companies move production from one facility to another.

AREA: If a working vaccine is developed for COVID-19, hopefully in the near future, do you think the adoption of AR will continue at the same pace, or will companies go back to the way they did things before?

Schart: It’s a fair question. A major factor is human behavior – and humans don’t like change. But people are seeing now that they can handle things more easily with AR. And all of the managers at these companies are realizing, “We don’t need all that travel. We can save a lot of money by not traveling.” Now they realize they have the tools to do it without traveling. But it will take time. This year I expect remote support will be the catalyst for everything. But by 2021 or 2022, I think you’ll have more use cases with AR as it starts to deliver more value than existing tools. There’s also a big focus now on making more tasks digital and automated by leveraging AR and AI. That will have a big impact. At RE’FLEKT, we’ve seen a 300% increase in our monthly active users, even hitting a peak of 600% – and they’re still using it. That gives me the confidence that we’re finally showing the value. Of course, there are still problems to solve; content creation has to be easier, smart glasses are not ready, but I’m confident that we will see new use cases next year.

To read the full text of Dirk Schart’s article, please visit his blog page.




Augmented Reality in Medical & Pharma: Industry challenges in medical device manufacturing and how to tackle them with AR solutions

This editorial has been developed as part of the AREA Thought Leaders Network content, in collaboration with selected AREA members.


Corporations in the medical and pharmaceutical industry need to adhere to the highest standards of quality, with accuracy and precision being the keys to success. If organizations experience equipment errors or healthcare workers make mistakes, they not only put human life at risk but also incur significant consequences for payers, including financial and credibility loss. To reduce error rates and increase quality, businesses across the industry are turning to the latest technologies – including Augmented Reality (AR).

AR and VR technology is already in use in medical device manufacturing, where it is significantly improving processes. This editorial discusses two major use cases in which AR solutions simplify workflows to reduce human error:

  1. Medical device assembly
  2. Production line changeover

Let’s take a closer look at exactly how AR technology can tackle key challenges in both cases, with the help of some first-hand insights from William Harding of industry leader Medtronic, recently interviewed by RE’FLEKT.

Key challenges in medical device manufacturing

Medical device manufacturing typically involves a variety of manual, semi-automatic and automatic processes, which makes production particularly vulnerable to error – especially as large manufacturers need to employ the same processes across multiple facilities, often without standardization of production data. In addition, a lack of operator training increases the risk of mistakes during the manual tasks of assembling and configuring medical equipment.

William Harding, Distinguished Fellow at Medtronic, reveals which factors medical device manufacturers need to consider when introducing changes on the production floor:

“If I add a new process to a production line, many questions need to be addressed: How do I get the process to integrate seamlessly (e.g., communication protocols, data aggregation, and data transformation)? How do I accomplish that without using paper-based systems? The goal is to speed up efficiencies and reduce scrap while also reducing human error. When we create a new process in lean manufacturing, we need to establish the most ergonomic way for an operator to perform their tasks within a sterile environment. We also want them to complete these tasks in the most efficient way possible, while delivering a high-quality product. There are many factors to be considered.”

Prior to introducing a new manufacturing process, operators need to be trained on how to perform each step to ensure maximum efficiency and minimum error rates during production. William further explains how Medtronic originally used a cardboard replica of their manufacturing line for training purposes and what challenges came along with it:

“It used to take us two and a half weeks to build a cardboard set-up with five process stations. For one training session, we also needed at least eight to ten people off the production floor, who then weren’t engaged in manufacturing products while they were in training. It would cost us about $30,000 for one training effort with the cardboard set-up. We usually require five sessions in total to get everything right, and by the time we decide that everything is ready, we’re making changes five minutes later.”

Simplified training and operations with AR solutions

With AR technology, medical device manufacturers like Medtronic can not only manage the challenges listed above, but also benefit from significant operational improvements, as the following two use cases reveal.

1. Enhanced AR Training for device assembly and set-up 

Training around medical device set-up and configuration is traditionally based on Operating Procedure (OP) documentation that is not user-friendly. Extensive manuals, full of complicated 2D diagrams and text-based instructions, make it challenging for device operators to find the right information quickly. Consequently, onboarding is time-consuming, and devices may be set up incorrectly and/or not used to their full potential.

Many leaders in the medical sector, including Medtronic, are turning to AR to train employees to set up and assemble their equipment. With results showing a 90% decrease in human errors and 60% shorter training times (see this white paper for further info), the reasons are obvious. AR training solutions allow device operators to visualize complicated OP documentation in a simple way, with the right mixture of videos, text, and images appearing directly in context with the real object. This ensures that device operators always have training content instantly available on their mobile devices, tablets, or smart glasses, and therefore experience fewer errors during device assembly and set-up.

William from Medtronic shares how AR training guides have replaced the cardboard replica during operator training at Medtronic:

“With content creation platforms like REFLEKT ONE, we can now create AR applications that allow operators to learn a new process by walking through engaging training guides on a tablet instead of using our cardboard model.”

2. Lean production line changeovers with AR-based procedures

When switching the production line from one product to the next, every minute of changeover time comes at the cost of missed revenue: production is down while teams rearrange, set up, and configure the equipment for the next production cycle. Lean manufacturing strategies can help shorten this downtime and increase final output.

AR guidance during changeover procedures results in 40% fewer errors and a 25% faster changeover speed (see this white paper for further info). The interactive guides show operators the ideal state of the task at hand in AR next to the actual state. This way operators can always see what needs to be done as they are working. As augmented instructions guide operators through each step, the risk of error is ultimately reduced for all manual stages of the changeover.

This digitalized process is faster and more reliable as William confirms from his own experience creating AR solutions at Medtronic:

“Recently I created a solution to train operators on a manufacturing process for our Linq II battery bond (an implantable 2 lead EKG data recorder for patients). I made the content available to them online, where they could walk through it themselves and learn how to perform the process using gestures in AR. It’s a very fast and effective way of training because it saves resources and is so close to the real manufacturing environment.”

Outlook: The future of XR technology in the medical sector

These two use cases are great examples of how AR technology is already making a measurable difference in tackling key challenges in training and operations within medical device manufacturing. For the future, William forecasts a growing adoption of AR and ultimately Mixed Reality solutions at Medtronic as well as across the industry:

“Through the use of this technology in the future, I know that Medtronic will be able to more quickly understand the needs of patients, healthcare professionals, and payer’s needs, such that the lifecycles of innovation are reduced in addressing those needs. That same point can be made within the medical device manufacturing industry, specifically as it relates to product and process transfers as well as in the training of the individuals responsible for completing the assembly of those devices. However, it is my belief that AR and eventually MR technology will make the use of VR less important because users will prefer the more relatable MR environments.”




Equipping the AR workforce of tomorrow

As part of the AREA’s mission to help accelerate the adoption of Enterprise Augmented Reality (AR) by supporting the growth of a comprehensive ecosystem, we are further engaging with academic institutions to provide feedback on how they can help equip the graduates of tomorrow with the AR skills needed to positively contribute to the workforce.

The AREA, together with our academic partners, has created a very short survey to capture your perspectives on educational needs for future graduates.

We would greatly appreciate your completing this survey – it should take no longer than 10 minutes.

The survey is available HERE and runs until July 31st 2020. All contributors will receive a report summarising the findings. If you do have any questions, please contact [email protected].

Thank you for helping shape the educational future of our workforce.

The AREA Team




Progress Report on AREA 3D Asset Reuse Research Project

Researcher Eric Lyman of 3XR has provided an update on the progress of the AREA’s 7th research project. Eric and his team are tasked with examining barriers to, and recommending approaches for, using existing enterprise 3D assets in AR experiences. The project will also test the ingestion and use of enterprise 3D assets in a set of limited but representative environments. 

Research began in April, when all enterprise AREA members were contacted to provide sample 3D models for testing and to participate in an interview with Eric. The interviews, designed to help ascertain the most popular tools, the most compelling 3D AR use cases, and the most important 3D optimization criteria, have been conducted with representatives of Boeing, Newport News Shipbuilding, Merck, and the AEC Hackathon. Eric has also interviewed AR providers Theorem, ARVizio, InstaLOD, and Hexagon, as well as NIST and MakeSEA/Catapult. 

Five organizations have generously contributed 3D CAD files to the project: Boeing, DIME Lab, Medtronic, Newport News Shipbuilding, and NIST. 

The following AR tools will be used to test reuse of the 3D CAD files: Rapid Compact DGG, InstaLOD, Simplygon, PiXYZ, Meshlab, and possibly ARVizio. 

When completed, the research project is intended to reveal: 

  • The most popular AR execution and rendering engines and frameworks that support dynamic 3D asset ingestion 
  • The key toolchains being used to generate 3D assets for AR applications 
  • Which formats (inputs and outputs) the toolchains and frameworks support 
  • Which standards are supported by the 3D and AR toolchains and frameworks
  • Any failures or incompatibilities that arise when using a subset of toolchains and delivering the final models to a few specific AR devices used in enterprise. 

The final research project report, for AREA members only, will deliver an overview of the optimal conversion processes for bringing 3D assets into AR platforms. This will include: 

  • A full overview of steps required, while illustrating the degree of success with each process and format tested; 
  • A table graph that clearly illustrates the advantages / disadvantages of these processes both from the perspective of conversion ease, and final usability; 
  • An analysis of pre-existing commercial platforms, and the creation of a table graph illustrating the pros / cons of each. 

Eric expects the work to be completed by the end of July. 




COVID-19: How Augmented Reality is helping mitigate business impact

This editorial has been developed as part of the AREA Thought Leaders Network content, in collaboration with selected AREA members.


Short of time? Listen to the accompanying podcast (~10 minutes) available here.

An imperative to overcome limitations

The COVID-19 pandemic has had an unprecedented impact across the global business landscape. Over recent months, many countries have implemented various forms of lockdown, severely limiting the ways that companies can do business and, in many cases, causing operations to cease. This crisis is likely to have an ongoing impact in the months ahead as we transition to a “new normal” and beyond.

This editorial discusses ways in which Augmented Reality (AR) can help mitigate the societal and business impact while supporting business continuity through the pandemic.

The restrictions placed upon both individuals and organizations have resulted in an upsurge in the use of digital technologies to support a variety of activities, including online shopping, digital and contactless payments, remote working, online education, telehealth, and entertainment. The ability to support these activities is heavily reliant upon the availability of “digital-ready” infrastructure and services.

Enterprise AR builds upon this digital infrastructure by offering the ability to superimpose digital content on a live view of the physical world to support business processes. So how can AR help?

First, let’s examine the impacts that COVID-19 and subsequent responses have had upon business and society:

  1. Social distancing measures hinder our ability to have traditional face-to-face interactions in addition to often limiting the size of groups able to gather.
  2. The inability to travel and prevalence of key staff working from home are viewed as impacting the ability to conduct business, manage effective team operations, and provide local expertise where it is needed, amongst others.
  3. Reduced on-site staffing due to illness, self-isolation and financial restrictions impedes an organization’s ability to continue operations “as before.”
  4. A lack of classroom and hands-on training makes it difficult to quickly upskill new staff or train existing staff on products and processes.
  5. Disrupted supply chains are requiring manufacturing and sourcing processes to become more flexible to help ensure continuity of production.
  6. The potential for virus transmission has caused a reluctance among workers to touch surfaces and objects that may have been touched by others.

Clearly, new or enhanced tools and ways of working are required to help address these challenges. At the AREA, we believe that AR can play an effective role in mitigating a number of these obstacles while, at the same time, opening up opportunities for long-term business improvements.

AR can help address COVID-19 restrictions with remote assistance

A key use case of Enterprise AR is in the realm of remote assistance. AR-enhanced remote assistance provides a live video-sharing experience between two or more people. It differs from traditional videoconferencing in that such tools use computer vision technology to “track” the movements of the device’s camera across the scene. This enables participants to add annotations (such as redlining or other simple graphics) that “stick” to elements in the scene and therefore remain in the same place in the physical world as viewed by the users. Such applications support highly effective collaboration between, for example, a person attending to a faulty machine and a remote expert, who may be working from home. This use case helps mitigate the impacts of travel reduction, reduced staffing, and, of course, social distancing.
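The “sticking” behaviour described above can be sketched in simplified form: a 2D tap on the expert’s screen is unprojected into a 3D world-space anchor using the camera’s pose and intrinsics, and on every later frame the anchor is reprojected with the current pose, so the annotation stays fixed to the physical object. This is a minimal, hypothetical pinhole-camera illustration, not any vendor’s actual API:

```python
import numpy as np

def unproject(pixel, depth, K, cam_to_world):
    """Lift a 2D tap (pixel) at a known depth into a 3D world-space anchor."""
    u, v = pixel
    # Back-project through the pinhole model into camera coordinates.
    x = (u - K[0, 2]) / K[0, 0] * depth
    y = (v - K[1, 2]) / K[1, 1] * depth
    p_cam = np.array([x, y, depth, 1.0])
    return (cam_to_world @ p_cam)[:3]

def project(anchor, K, world_to_cam):
    """Reproject the world-space anchor into the current camera frame."""
    p = world_to_cam @ np.append(anchor, 1.0)
    return np.array([K[0, 0] * p[0] / p[2] + K[0, 2],
                     K[1, 1] * p[1] / p[2] + K[1, 2]])

# Camera intrinsics: 500 px focal length, principal point at image centre.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])

# Frame 1: camera at the world origin; the expert taps the image centre
# on a machine part 2 m away, creating a world-space anchor.
anchor = unproject((320.0, 240.0), depth=2.0, K=K, cam_to_world=np.eye(4))

# Frame 2: the technician's camera has moved 0.1 m to the right, so the
# annotation must shift left in the image to stay on the same physical point.
world_to_cam = np.eye(4)
world_to_cam[0, 3] = -0.1
pixel = project(anchor, K, world_to_cam)
print(pixel)  # [295. 240.] - drawn left of centre, still "stuck" to the part
```

Real products delegate the pose and depth estimation to a tracking framework; the point here is only that the annotation lives in world coordinates, not screen coordinates.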

 

AR-enhanced remote assistance for medical equipment procedures (YouTube movie). Image and movie courtesy of RE’FLEKT.

 

Sarah Reynolds, Vice President of Marketing, PTC comments, “As organizations look to maintain business continuity in this new normal, they are embracing AR to address travel restrictions, social distancing measures, and other challenges impacting their front-line workers’ ability to go on-site and operate, maintain, and repair machines of all kinds. Even when equipment or product experts can’t address technical issues in person, AR-enhanced remote assistance enables them to connect with on-site employees and even end customers to offer them contextualized information and expert guidance, helping them resolve these issues quickly and ultimately reduce downtime. AR-enabled remote assistance marries the physical and the digital worlds – allowing experts and front-line workers to digitally annotate the physical world around them to improve the clarity, precision, and accuracy of their communication and collaboration.”

AR-enhanced remote assistance enables business continuity for machine operations, servicing and repair. Image courtesy of PTC.

AR enables no-touch product interaction via virtual interfaces

A key capability of AR is the ability to superimpose a digital virtual user interface on physical equipment that may have a limited or non-existent user interface of its own. Depending upon the technology used, the user can select actions by tapping on the screen of the device, or can interact with the equipment through hand gestures or verbal commands via the AR-rendered “proxy” user interface. Providing such abstracted interactions is key to reducing how much users must touch physical objects that may be handled by numerous people.
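At its core, such a “proxy” interface is a mapping from touch-free inputs (gestures, voice) to the commands the equipment already understands, so no shared surface is ever touched. The following sketch is purely illustrative; all names and commands are hypothetical, and a real implementation would sit on an AR framework and the device’s own control protocol:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualPanel:
    """Maps touch-free AR inputs (gestures, voice) to equipment commands."""
    bindings: dict = field(default_factory=dict)
    log: list = field(default_factory=list)

    def bind(self, trigger, command):
        # e.g. bind the "pinch" gesture or the spoken word "stop" to a command
        self.bindings[trigger] = command

    def handle(self, trigger):
        command = self.bindings.get(trigger)
        if command is None:
            return None  # unrecognized gesture/phrase: do nothing
        # A real system would send this over the device's control protocol;
        # here we simply record what would be sent.
        self.log.append(command)
        return command

panel = VirtualPanel()
panel.bind("gesture:pinch", "PUMP_START")
panel.bind("voice:stop", "PUMP_STOP")

print(panel.handle("gesture:pinch"))  # PUMP_START
print(panel.handle("voice:stop"))     # PUMP_STOP
```

The same dispatch table can back a tablet tap, a gesture, or a voice command, which is what makes the interaction “abstracted” from the physical controls.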

There are many ways in which such AR capabilities can help medical professionals carry out their duties during the current pandemic and beyond. The BBC has reported on one such application that helps reduce the amount of physical contact between doctor and patient, while still enabling them to communicate with colleagues outside of the COVID-19 treatment area. Here, a doctor wearing a Mixed Reality headset is able to interact with medical content such as x-rays, scans or test results using hand gestures while others are able to participate in the consultation from a safe location. The article points out that this way of working also reduces the need for Personal Protective Equipment (PPE) as colleagues are able to participate from a safe distance.

Example of a virtual user interface projected into the physical world. Image courtesy of Augumenta.

 

Eve Lindroth, Marketing and Communications at Augumenta, comments, “Today, the devices and applications can be controlled hands-free. This also addresses the problem of being able to work hygienically. You do not need to touch anything to get data in front of your eyes, control processes, or to document things. You can simply use gestures or voice to tell the device what to do. Tap air, not a keyboard.”

AR can help medical equipment training

AR can also be used to help assist medical professionals by providing highly efficient and interactive training methods that can streamline the process of learning new equipment and other necessary procedures. This is critical when experienced staff are unwell and replacements need to be trained as quickly as possible.

Harry Hulme, Marketing and Communications Manager at RE’FLEKT, comments, “We’re seeing that AR is a key tool for healthcare workers during these testing times. For medical training and equipment changeovers, AR solutions substantially reduce the risk of human error while significantly reducing training and onboarding times. Moreover, the time-critical process of equipment changeover is accelerated with AR-enhanced methods.”

 

AR-based training with REFLEKT ONE and Microsoft HoloLens in medical and healthcare. Image courtesy of RE’FLEKT.

 

AR supports remote collaboration

The remote assistance use case can be generalized further to include remote collaboration. AR enables users who are physically separated to “inhabit” a shared virtual space distributed by the AR application. This supports numerous use cases, including shared design reviews, in which multiple users see the 3D product models and supporting information projected onto their view of the physical world (and from their relative position) via their AR-enabled devices.

Different design alternatives can be presented and viewed in real time by all participants, each of whom can move within their physical space to view a particular aspect of the digital rendition. Further, users can annotate and redline the shared environment, providing immediate visual feedback to all others. Such capabilities are key factors in mitigating the restrictions imposed upon travel, the forming of groups, and close-proximity human-to-human interaction.
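The essential mechanism behind such shared sessions is simple: content and annotations live in one common world coordinate frame, and each participant’s device transforms that shared content into its own camera frame. A toy sketch of the idea (hypothetical class and poses, not a real collaboration SDK):

```python
import numpy as np

class SharedSession:
    """Toy model of a shared AR design review: content in a common world
    frame, rendered per participant from that participant's own pose."""

    def __init__(self):
        self.annotations = []   # world-space points, visible to everyone
        self.poses = {}         # participant name -> 4x4 world-to-camera

    def join(self, name, world_to_cam):
        self.poses[name] = world_to_cam

    def annotate(self, point_world):
        # A redline added by one user is broadcast to every participant.
        self.annotations.append(np.asarray(point_world, dtype=float))

    def view(self, name):
        # Same world-space content, transformed into this user's camera frame.
        w2c = self.poses[name]
        return [(w2c @ np.append(p, 1.0))[:3] for p in self.annotations]

session = SharedSession()

# Alice stands at the world origin; Bob stands 2 m along x, facing the same way.
session.join("alice", np.eye(4))
bob = np.eye(4)
bob[0, 3] = -2.0                # world -> Bob's camera: shift by -2 m
session.join("bob", bob)

# Alice redlines a point on the shared model, 1 m in front of her.
session.annotate([0.0, 0.0, 1.0])

print(session.view("alice")[0])  # [0. 0. 1.]
print(session.view("bob")[0])    # [-2.  0.  1.]  (same point, his frame)
```

Each participant thus sees the same annotation anchored to the same spot, but from their own vantage point, which is what makes a shared design review feel co-located.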

 

Immersive collaboration: A design review of a motorbike in 1:1 scale with a remote team. Image courtesy of Masters of Pie.

 

Karl Maddix, CEO of Masters of Pie, comments: “Video conferencing solves the immediate need to bring people together whereas collaboration, as enabled by Masters of Pie, is built for industry to bring both people and 3D data together. Real-time access to the underlying 3D data is imperative for effective collaboration and business continuity purposes.”

AR supports remote sales activities

AR is also proving an effective sales tool, enabling the all-important sales process to continue during the pandemic. Home shoppers can examine digital renditions of home appliances, furniture, and other products presented within their own physical space, for example. Moreover, rich and interactive sales demonstrations facilitated by AR allow a potential buyer to understand the form, fit and function of a product without the need for travel, touch, or close interaction with a salesperson.

AR enriches the remote shopping experience, allowing buyers to place and interact with products in their own physical environment. Image courtesy of PTC.

 

Sarah Reynolds of PTC comments, “AR experiences improve the end-to-end customer experience, improve purchase confidence, and ultimately streamline sales cycles, especially when customers are not able to shop in person.”

Take the next steps

In this editorial we’ve discussed a number of ways in which AR technology can help ensure business and healthcare continuity by mitigating the impacts of the various restrictions placed on the way we work. Recognizing this, many AREA member companies have introduced special offers and services to help industry during the pandemic and we applaud their support. Learn more about them here.

We invite you to discover more about how Enterprise AR is helping industry improve its business processes at The AREA.




SAS Institute is Bringing “Intelligent Realities” to the Enterprise

The SAS Institute has been a world leader in analytics software for more than four decades. Today, the privately-held North Carolina-based company is expanding its reach into Augmented Reality (AR). We recently spoke with Michael Thomas, SAS Systems Architect, to learn more about his company’s approach to AR and what he refers to as “Intelligent Realities.”

AREA: What is driving the SAS Institute’s interest in Augmented Reality?

MICHAEL THOMAS: We’ve always sought to deliver our data and analytics capabilities via new devices and user interfaces as they’ve become available. In the ’80s, we brought them to the PC. In the ’90s, we brought them to the Web, and then tablets. And now we’re on to this new user interface that’s penetrating the enterprise. It’s the next place for us to provide our Artificial Intelligence (AI) and analytical data value. As a Systems Architect, I’ve been looking at these emerging technologies to figure out, at an architectural level, how they fit. As part of that, I’ve been developing AR and VR for commercial use cases.

AREA: Can you tell us about some of the use cases you’ve been involved in?

MICHAEL THOMAS: One topical use case we’re tracking involves using AR for germ-fighting, along with the Internet of Things (IoT) and AI. IoT sensors are used to detect areas meriting closer scrutiny due to germ-spreading behavior, such as coughing. Custodians assigned to keeping those areas clean can then focus their efforts by using either headset AR or a spatial AR approach. Another example is in manufacturing – being able to use AR combined with IoT data and AI to give technicians the ability to more rapidly repair and proactively address issues to keep manufacturing equipment available and online. That also involves tying in remote experts. But while many remote expertise use cases are built around the idea that the expert sees the video that the proximate user is gathering with their headset, we go beyond that to take the IoT data from that piece of equipment, analyze it in real time, and give the most pertinent information to that remote expert. They can then use VR technology to better advise the remote technician.

AREA: In one of your blog posts, you argue that enterprises should not fixate on head-mounted AR devices and rather think more in broader terms of what you call “intelligent realities.” What do you mean by that?

MICHAEL THOMAS: Intelligent realities for workers means you improve work by making the reality of work better. SAS is not an AR vendor so, rather than thinking in terms of devices, we look at what form factor will enable us to manifest our value and make the customer better. It’s wide open. Does a tablet do what you need to do? If so, that’s great. We’ve had customers who have experimented with head-mounted devices and been disappointed. So they’ve shifted to pursuing other ways to make those realities more intelligent. That gets them into spatial AR, but also more pedestrian things like using transparent LED screens or projected light. As headsets get better, we expect some of that resistance to go away. But we’re just taking a broader perspective on how to make that reality better that isn’t just the latest technology.

AREA: What do you see as the next significant milestone in the adoption of AR?

MICHAEL THOMAS: I think this year will be a good year for headsets. We’re getting to a second generation of Mixed Reality headsets with a form factor where you can actually expect people to wear them for a long time. And then from there, as we get focused on commercial AR, we at SAS have the technology and the ability to give you the content that’s going to improve your reality right now. That’s our piece. And it’s going to be very exciting to see that new growth develop.

Michael Thomas has authored several thought leadership publications on Intelligent Realities that we would like to share with AREA readers. They include:

 




AREA Members Offer Pandemic Support

As organizations throughout the world cope with the quarantining and work-from-home restrictions necessitated by the global coronavirus pandemic, AREA members are springing into action to help. Many of them have launched special offers that enable organizations to use their AR tools to overcome limitations to collaboration and business continuity. 

Here are some of the offers AREA member companies have told us about: 

  • Atheer has offered free licenses of its Atheer AR platform until the end of June 2020. All licenses, onboarding, and support will be provided by Atheer for free with no commitment of any type required. 
  • Augmentir has announced it is offering free use of its Remote Assist tool for the remainder of 2020. Remote Assist provides a remote collaboration and support solution that can be adopted in less than 60 minutes. 
  • Iristick is offering its smart glasses with three months of free software use (remote assistance). In addition, AREA members can receive a 10% discount on the company’s Z1.Essential and Z1.Premium products. 
  • PTC is making its remote assistance product, Vuforia Chalk, available for free so employees can collaborate in operation, maintenance, and repair. 
  • Theorem Solutions will provide free CAD translation services to any organization that has switched to producing ventilators and has found itself working with incompatible data formats.
  • The Advanced Manufacturing Research Centre (AMRC) Design and Prototyping Group have responded to Britain’s call to produce more Personal Protective Equipment for healthcare workers by using technologies such as 3D printing and laser cutting to make up to 1,000 face shields per week. The face shields are being distributed to area hospitals. 
  • Scope AR has created a Quick Start Program that supports organizations limiting travel by connecting technical experts to hands-on workers. The program leverages visual remote assistance to enable diagnoses, repairs, and upgrades, as well as bringing training to remote employees and clients via AR. 
  • XMReality now offers a free premium version of Remote Guidance, making it possible for anyone to try the product. The new offering gives businesses an easy, free way to see how remote guidance can improve service functions.

 We applaud these companies for their efforts and will continue to share additional AREA member company offers as we hear about them. 




Enterprise Augmented Reality Solutions – Build or Buy?

Short of time? Listen to the accompanying podcast (~8 minutes) available here.

This is a question with no simple answers. As enterprises contemplate deploying AR solutions, one of the first questions to confront them is a fundamental one: should we build or buy? This AREA editorial explores the factors that may help organizations answer this critical question.

The build-or-buy decision essentially boils down to determining the relative priorities of cost, control, solution availability and time to market. In traditional solution deployments, the advantages and disadvantages of each approach can be summarised in the table below.

Considerations for Enterprise Augmented Reality

When it comes to enterprise AR deployment, the build-or-buy deliberations need to take into account additional considerations. These may include some or all of the following:

  • Implementations on novel or unfamiliar hardware
  • Development of advanced computer vision capabilities
  • Application development based upon AR toolkits
  • Data processing, protection and optimisation (e.g. for 3D models)
  • Integration into enterprise business systems
  • Development of custom content for new methods of deployment and user interaction
  • Customisation of the base solution to meet specific needs

A previous AREA editorial explored how AR should be considered within the scope of a technology strategy. For the purposes of this editorial, we shall omit custom hardware development from the discussion and focus instead on the software build-or-buy decision. There may also be wider implications if significant customisations are needed or if content must be created by internal or external personnel.

Here’s a typical set of steps leading to the decision-making phase:

  1. Identify business use case, perform investment analysis and secure budget.
  2. Define weighted requirements for the solution to the identified business problems or opportunities.
  3. Identify potential vendors and their commercial solutions.
  4. Perform a gap analysis between commercial offerings and solution requirements.
  5. Identify whether gaps can be closed with customisation or custom development.
  6. Perform cost analysis of internal/external development versus commercial solution.
  7. Evaluate options and make strategy decision.

Target use cases are an important factor

It’s important to understand that enterprise AR solution needs may vary significantly according to the target use case. For example, the table below shows how aspects of the solution needs vary across four example business applications of AR:

Such factors may have an important influence on the build-or-buy process. Take the AR-enhanced product demonstrator (sales) use case above, for example. The low level of integration with business systems and data that this solution requires, coupled with other factors such as time criticality and reduced longevity needs, may make it appropriate to subcontract all software development and content creation to a third party.

If your use case is unusual, then you may need to consider purchasing an AR platform that allows custom development (whether via drag’n’drop authoring, coding or other mechanisms).

Example checklists

Typical questions to consider when making the build-or-buy decision are as follows:

  • Have you identified the business applications (or problems to be solved)?
  • Have you developed the requirements needed to address the business problem?
  • Are there commercial offerings claiming to provide a solution for your use cases?
  • Are you confident that the solution meets your functional requirements?
  • Would more than one commercial product be needed to provide the solution?
  • Are you confident of the solution provider’s financial viability?
  • Are there gaps between the commercial solution and your requirements? Are these gaps important and/or able to be closed? Are there other edge cases to be considered?
  • Do you have the required skills in-house? Alternatively, are there vendors who can supply the skills within budget?
  • What toolkits are available that can help provide the underpinnings of a custom solution?
  • Is complete control and ownership of the solution important to your business (for reasons of market differentiation, security or others)?

The following table offers some additional considerations more specific to an AR-based solution:

Choose wisely – and consult experts

This editorial has explored a number of considerations that are important when seeking to adopt AR in an enterprise setting. Companies may be tempted to develop prototype applications when first investigating AR, perhaps using one or more of the commercially available toolkits. However, there are clearly a number of important aspects to consider in reaching a build-or-buy decision.

It is unlikely that an industrial company will develop an in-house AR application from the ground up, as this requires significant expertise in numerous areas, including computer vision, 3D computer graphics, mobile device management, etc. If your use case is truly unique and there are no commercial products that support the use case, then your only option may be to develop the solution this way.

Far more likely, however, is the decision to purchase a commercial-off-the-shelf solution. As we’ve discussed, and depending upon your target use case, there may be significant requirements on systems integration, data processing, content creation and other forms of customisation required prior to considering a deployable solution.

As discussed above, the decision is often driven by requirements of cost, control and timing. If cost and timing are a higher priority, then a commercial offering is likely the more appropriate solution. If control is most important, then it is perhaps better to pursue internal development or, more likely, contracting the work to a third party.

Ultimately, the decision is yours. However, prior to making that decision, we recommend that you look at the offerings of the AREA solution provider members who will be happy to discuss and hopefully meet your requirements.




Mixing and Matching Standards to Ease AR Integration within Factories

AREA member Bill Bernstein of the National Institute of Standards and Technology (NIST) shares his organization’s early work to improve AR interoperability.  

Today, most industrial Augmented Reality (AR) implementations are based on prototypes built in testbeds designed to determine whether some AR components are sufficiently mature to solve real-world challenges. Since manufacturing is a mature industry, there are widely accepted principles and best practices. In the real world, however, companies “grow” their factories organically. There is a vast mixing and matching of domain-specific models (e.g., machining performance models, digital solid models, and user manuals) tightly coupled with domain-agnostic interfaces (e.g., rendering modules, presentation modalities, and, in a few cases, AR engines).  

As a result, after organizations have spent years developing their own one-off installations, integrating AR for visualizing these models is still largely a pipedream. Using standards could ease the challenges of integration, but experience with tying them all together in a practical solution is severely lacking.  

To address the needs of engineers facing an array of different technologies under one roof, standards development organizations (SDOs), such as the Institute of Electrical and Electronics Engineers (IEEE), the Open Geospatial Consortium (OGC), and the Khronos Group, have proposed standard representations, modules, and languages. Since the experts of one SDO are often isolated from the experts in another domain or SDO when developing their specifications, the results are not easily implemented in the real world, where there is a mixture of pre-existing and new standards. The problem of poor communication between SDOs during standard development is especially acute for domain-agnostic groups (e.g., the World Wide Web Consortium (W3C) and the Khronos Group) communicating with domain-heavy groups (e.g., the American Society of Mechanical Engineers, the MTConnect Institute, and the Open Platform Communications (OPC) Foundation).  

However, both perspectives – domain-specific thinking (e.g., for manufacturing or field maintenance) and AR-specific and domain-agnostic concerns (e.g., real-world capture, tracking, or scene rendering) – are vital for successfully introducing and producing long term value from AR.  

Smart Manufacturing Environments 

In the case of smart manufacturing systems (SMS), SMS-specific standards (e.g., MTConnect and OPC Unified Architecture) provide the necessary semantic and syntactic descriptions of concepts, such as information about devices, people, and materials. Figure 1 showcases the current state of an industrial AR prototype with examples of standards to inform processes.  

 

Figure 1: General workflow for generating industrial AR prototypes. The dotted purple lines signify flows that are currently achieved through significant human labor and expertise.  

From a high-level view, the AR community is focused on two separate efforts: 

  • Digitizing real-world information (shown on the left of Figure 1); 
  • Rendering and presenting AR scenes to the appropriate visualization modalities (shown on the right of Figure 1).  

To produce successful and meaningful AR experiences, it is vital to connect domain-specific models with domain-neutral technologies. In the current state of AR development, where few or no standards have been implemented by vendors, this task is expert-driven and requires many iterations, human hours, and experience. There are significant opportunities for improvement if these transformations (indicated by the purple dashed lines in Figure 1) could be automated.  

In the Product Lifecycle Data Exploration and Visualization (PLDEV) project at NIST, we are experimenting with the idea of leveraging standards developed in two separate worlds: geospatial and smart manufacturing (Industry 4.0). One project, shown in Figure 2, integrates both IndoorGML, a standard to support indoor navigation, and CityGML, a much more detailed and expressive standard that can be used for contextually describing objects in buildings, with MTConnect, a standard that semantically defines manufacturing technologies, such as machine tools. All of these standards have broad support in their separate communities. Seemingly every day, supporting tools that interface directly with these representations are pushed to public repositories.  

Figure 2: One instance of combining disparate standards for quick AR prototype deployment for situational awareness and indoor navigation in smart manufacturing systems.  

In Figure 2, we show the use of IndoorGML and CityGML in a machine shop that has previously been digitalized according to the MTConnect standard. In doing so, we leverage existing AR visualization tools to render the scene. We then connect to the streaming data from the shop to indicate whether a machine is available (green), unavailable (yellow), or in-use (red). Though this is a simple example, it showcases that when standards are appropriately implemented and deployed, developers can acquire capabilities “for free.” In other words, we can leverage domain-specific and -agnostic tools that are already built to support existing standards, helping realize a more interoperable AR prototyping workflow.  
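The status-to-colour mapping described above can be sketched in a few lines. The XML below is a simplified, hypothetical stand-in for an MTConnect-style agent response, not the real MTConnect schema; the colour coding follows the prototype's convention of available (green), unavailable (yellow), in-use (red):

```python
# Toy sketch of mapping streamed machine states to AR overlay colours,
# as in Figure 2. The XML layout is an illustrative stand-in, not the
# actual MTConnect response schema.
import xml.etree.ElementTree as ET

SAMPLE_STREAM = """
<MachineStates>
  <Machine id="mill-1" availability="AVAILABLE" execution="READY"/>
  <Machine id="mill-2" availability="AVAILABLE" execution="ACTIVE"/>
  <Machine id="lathe-1" availability="UNAVAILABLE" execution="UNAVAILABLE"/>
</MachineStates>
"""

def overlay_colour(availability, execution):
    """Colour coding used in the prototype: available (green),
    unavailable (yellow), in-use (red)."""
    if availability != "AVAILABLE":
        return "yellow"
    if execution == "ACTIVE":
        return "red"
    return "green"

def machine_colours(xml_text):
    """Map each machine in the stream snapshot to its overlay colour."""
    root = ET.fromstring(xml_text)
    return {m.get("id"): overlay_colour(m.get("availability"), m.get("execution"))
            for m in root.iter("Machine")}

print(machine_colours(SAMPLE_STREAM))
# -> {'mill-1': 'green', 'mill-2': 'red', 'lathe-1': 'yellow'}
```

Because the machine semantics come from the standard rather than from bespoke integration code, a renderer built against this mapping works for any shop digitalized the same way, which is the "for free" capability the article describes.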

Future Research Directions 

This project has also demonstrated significant future research opportunities in sensor fusion for more precise geospatial alignment between the digital and real worlds. One example is fusing onboard sensors from automated guided vehicles (AGVs) with more contextually defined, static geospatial models described using the OGC standards IndoorGML and CityGML.  

Moving forward, we will focus on enhancing geospatial representations with additional context.  For example, (1) leveraging such context for AGVs to treat task-specific obstacles (like worktables) differently than disruptive ones (like walls and columns) and (2) helping avoid safety hazards for human operators equipped with wearables by more intelligent rendering of digital objects.  We are currently collaborating with the Measurement Science for Manufacturing Robotics program at NIST to investigate these ideas.  

If successfully integrated, we will be able to demonstrate what we encourage others to practice: adoption of standards for faster and lower cost integrations as well as safer equipment installations and factory environments. Stay tuned for the next episode in this mashup of standards!  

Disclaimer 

No endorsement of any commercial product by NIST is intended.  Commercial materials are identified in this report to facilitate better understanding.  Such identification does not imply endorsement by NIST nor does it imply the materials identified are necessarily the best for the purpose. 




Masters of Pie Fulfills a Growing Need for Immersive Collaboration

As its website proclaims, new AREA member Masters of Pie offers “the only industry-ready solution that provides heavy-duty immersive collaboration with end-to-end encrypted sharing of real-time data, across all devices, for all the team.” We spoke recently with the London-based company’s co-founder and CEO Karl Maddix to learn more.  

AREA: Tell us how Masters of Pie got started.  

KARL MADDIX: Matthew Ratcliffe and I founded Masters of Pie in 2011. We both have backgrounds in 3D real-time technologies. Matt was working in real-time visualization for architecture, whereas I was doing animation and character art for games and short films. We met in 2009 at a London agency that had a contract for what was a very early digital twin prototype project for a water treatment plant. Matt and I made basically a digital twin of the physical site using laser-scan data, which was plumbed into streamed sensor data from the plant itself and could then be viewed and interacted with. It was really ahead of its time, and we pioneered a lot of the processes and techniques to make it viable for industry.  

Masters of Pie was spawned from that project. The concept was simply to apply our expertise in the real-time world to the enterprise. We started as a service provider, doing R&D using game engine technology for interactive applications, prototypes, and products. We did things like making interactive CAD portfolios for engineering companies who have big industrial presses that they wanted to interact with. We were also careful not to just build shallow self-contained apps; we always tried to drive them with actual industry data. We were learning how to make industry data play nice with real-time engines. Masters of Pie did some early showcases for Siemens around interactive data sets and this introduced us to the engineering world and got us exposure among Siemens end users, such as Volkswagen, Ford, and Rolls Royce. That was when we started to identify the big problems that we wanted to tackle with our own products when we made the switch from service to product.  

When the Oculus Rift DK1 appeared on Kickstarter, we immediately saw its value for what we were doing, which was putting big CAD models into 3D real-time engines. Luckily for us, the DK1 arrived a couple of weeks before we were due to go to Germany to meet with Siemens about a mobile-based project. So, just a few weeks after the DK1 was released worldwide, we were in Siemens offices showing them something impressive with it, something they had never seen before. That was a pivotal moment when we saw the excitement generated from that meeting.  

With access to their customers like Volkswagen, we were able to test out ideas for our own product. Before, I’d been able to show them a full-sized car, but it was apparent they had no way of getting their data into that application without a great deal of pain. The VR element was nice and all, but it was the complexity of this data which was stuck in silos that was the real issue. We explored that concept. How does data get from where it is created in a CAD package or in a Product Lifecycle Management system so it can be shared across different teams, efficiently and quickly, while it is still live data and not outdated by two weeks because it was sent offshore to be refactored or reformatted in some way? We wanted to enable the sharing of actual live or real time data among disparate teams.  

That is the core problem statement that Masters of Pie decided to tackle. Our approach to address this challenge was to develop a fully extensible and modular software framework called Radical to integrate deep into where the live data resides. The decision to take this direction was made in 2016, when we turned off the tap of our service work and became a software product company. All of our previous profits were ploughed into building the first generation of our “Radical” platform.  

AREA: That was a leap of faith.  

KARL MADDIX: Yes. We’ve always been like that. Our real motivators are solving big industry problems such as enabling real-time collaboration on large and complex 3D data. Because we also had such great access to industry leaders, such as Rolls Royce, we had very good feedback and indicators that we were on the right track. They told us this was a real problem for them and nobody was even trying to solve it.  

AREA: What made you think that an SDK or software framework was the way to productize what you needed to do?   

KARL MADDIX: One of the early prototypes we made was based on using the open API from a CAD software package and integrating the Radical software ourselves. The result was like having a Radical button within the host software. When you clicked it, it instantly brought the CAD data into our environment. More importantly, it was still bidirectionally linked to the CAD package and so all the associated metadata was available and enabled powerful functionality such as the ability in VR to complete accurate measurement.  

The large manufacturing customers immediately saw the value and wanted to proceed; however, they would prefer the software to be fully integrated via an established and entrenched technology partner. This feedback was critical in pivoting the business model from building out a large direct sales team to an indirect sales process. Masters of Pie would instead concentrate on the technology as an extensible software framework to license to the companies that build the host packages, such as CAD providers, who sell directly to the target end customer. Siemens was the first OEM partner to integrate the software and has been delivering Radical-enabled immersive functionality to their installed base since 2017.  

Masters of Pie software is not just about the CAD/PLM market. Any company offering software that generates really complex data or holds complex 3D data is a potential target customer. We built Radical to be flexible enough to work with multiple data types. We are certainly not building it as just a CAD solution; Radical doesn’t care what data type is pushed into it. We are just as happy with other formats such as point clouds, MRI scans or any other complex data. Instead, what we are building is what we call a “collaborative thread framework.” The concept is that we will be the connective tissue between the multiple pieces of the ecosystem that are starting to bubble up. People will soon want to work freely across factory floors or in the field using AR, VR or mobile devices; however, this is not enabled by any one group. It will be a complex landscape of offerings. But it all starts with getting the live data.  

Masters of Pie secures data access by being integrated with the CAD or PLM packages, but we also want to be integrated with the IIoT platforms and pull IoT data that you can then surface in our environment, alongside the CAD. We are talking to cloud service providers so we can start looking to connect teams in larger spaces such as factories, and have spatial anchoring support from Microsoft, for example, so you can walk around the factory and know spatially exactly where you are. It’s this concept of, okay, you’ve got 5G coming, you’ve got cloud service providers wanting to stream to multiple devices, viable AR is coming pretty soon, VR is fairly established. All of these little pieces we are looking to tie together with our singular platform pushing live data to connected teams. That’s why we call it a collaborative thread. 

 AREA: You make it sound very easy. But hasn’t it been a big problem to pull data from all these different sources and to do it so quickly? What’s your secret?  

KARL MADDIX: I’ll be honest, the easier component of what we do is the technology. We’ve got a very highly skilled team and we are pretty good at what we do. We had a good insight early on, which I think gave us a good head start over the rest of the market. And we’ve got some very established relationships which help with some of these big players. The more difficult part is the business side; for example, securing an OEM agreement, bringing in technology partners, and building strategic partnerships with key industry leaders. We’re talking to people at Microsoft, AWS, Nvidia, Ericsson, Vodafone and we’ve just closed a funding round which included Bosch and Williams Advanced Engineering.  

AREA: As we evolve more toward integrating all of these different pieces – Augmented Reality, Artificial Intelligence, the Internet of Things – it seems as if you’re in a good spot to be the glue that pulls it all together.   

KARL MADDIX: What Matt and I realized was that there is no real clear killer VR or AR application that is going to change the world yet. How do we mitigate our risk given that? The approach of an extensible software framework product does help us. Our customers don’t necessarily need to know what that killer application is yet. All they need to know is that by adopting our platform, they gain the ability to build products quickly, integrate it into their current portfolio, and be ready as and when these use cases appear. We don’t need to worry about what exact AR device our customer is going to be using in five years’ time. All we need to know is that, in order for that to happen, you need a holistic architectural approach, like Radical, to get the data flowing, pulling people together and connecting these moving parts. Industry needs that infrastructure now. Large Industry software providers, such as Siemens, want to be ready for the next generation of products they are going to be putting out whilst upgrading their existing products so that they can stay connected and relevant. That is basically the value Masters of Pie provides to these software providers – the confidence to enable immersive collaborative products today while ensuring the approach will adapt to meet the challenges of tomorrow. We are providing them with the building blocks to prepare for the next wave of products and features, right now with the Radical software framework.  

AREA: As the current coronavirus pandemic has made very clear, organizations need tools that help disparate, dispersed teams collaborate. How does Radical support that kind of collaboration?  

KARL MADDIX: Yes indeed. Although it is obviously a terrible time for the world right now, it does highlight how unfit for purpose the traditional collaboration infrastructure is within enterprise. Webex and Teams are fine for connecting people in real time but not their data. If there ever was a time to show industry the way forward, then a global pandemic is it, even though I feel very guilty about saying it out loud. I think that once this virus starts to recede and people are going back to work, the first item on the agenda will be how to better prepare for the future should this threat appear again. That for us will be our golden hour as there really is not anything else as robust and flexible as Radical out there that can be adopted quickly and used wholesale across disparate software products within a portfolio. Unlike other solutions that may make a lot more noise than we do in the market, we are not vapourware or a shiny proof of concept, we are in-market right now with real product, trusted by industry and delivering value.   

AREA: Tell us why you joined the AREA and what you hope to get from your membership?   

KARL MADDIX: It is more on the technology side. The drive came from Matt Ratcliffe, our co-founder and Chief Product Officer. What we are looking to do is to get more direct access to end customers. We are striving to get better and more accurate, direct feedback from end users. Matt and the team felt that the AREA would be a good way to get our message out there, to start talking about our vision for the collaborative thread as Masters of Pie, and try to get more insight on whether we are doing the right things from the AREA members.