
Augumenta’s Eve Lindroth on Shop Floor AR, Taiwan and the Future


When AREA member Augumenta participated in a recent AREA webinar about implementing AR on factory shop floors, we thought it would be worth catching up on the company and its activities. So we spoke with Eve Lindroth, the company’s head of Marketing Communications. Here’s our conversation.

AREA: Augumenta has distinguished itself as a leader in industrial shop floor uses of AR. To what do you attribute your success so far?

Lindroth: We have a large number of big and well-known industrial companies as our clients, and within these projects, our solutions have been adopted with very few changes. That tells us we are taking the right approach to developing solutions for the industry. Our clients also praise the ease of use of our applications and appreciate that there is no steep learning curve; quite the opposite, the apps are considered easy to learn.

AREA: What’s a typical Augumenta client?

Lindroth: Most of our business is outside Finland. We have many manufacturing customers in France and Germany, such as Siemens. We also have a presence in Japan and Taiwan, which is important considering our focus on the Asian markets and the key customer projects we have ongoing there.

A typical client is a larger industrial company that is active in developing their operations – or during the pandemic, companies that are simply looking for the most efficient and practical ways to keep operating.

AREA: Speaking of that, in October, you announced a partnership with IISI of Taiwan. Tell us about the partnership, its goals, and its progress to date.

Lindroth: IISI is a system integrator and they have a very strong customer base in the fields of manufacturing and government. In our partnership, Augumenta acts as a technology/applications provider and the IISI experts do the final customization and integration with the end customer’s backend systems. Both companies can focus on their key strengths: we on the cutting-edge AR technology, and IISI on developing and managing the overall systems.

We started working together in the springtime and we have finalized all the customization needed for the end customer, a major semiconductor factory in Taiwan. We continue working in close cooperation with IISI and believe we are in a good position to advance enterprise AR in Taiwan together with them.

AREA: What do you see as the most significant barriers to AR adoption, and what is Augumenta doing to overcome them?

Lindroth: We have seen in many pilot projects that the organization has identified the problem it wants to solve with a pilot, but has difficulty defining the current status with an accurate number. Take downtime, for example: how much is there, and which factors exactly are causing it? Those figures can be hard to come by. Another issue is user acceptance, but that can often be tackled by involving people in planning the solutions from an early stage.

At Augumenta, we’re working to address those issues. For industrial pilots, for example, we created a simple checklist, just to remind the project managers and team leaders responsible for the pilot to consider the factors we have learned to be essential for an AR pilot’s success. These are related to things like target setting, planning together with your people and getting them involved throughout the process, or measuring the results. The checklist is available on our website.

AREA: What can we expect from Augumenta in 2021?

Lindroth: In the future, we believe that discrete industrial AR applications will become more integrated solutions. That means, for example, that there won’t be separate apps for alerting a user and for guiding a user through tasks. There will be one solution that can do all of this – without the end user even noticing that many use cases are included in the app. At some point, technologies like AI will make the end user’s job even easier, for example by automatically guiding them to the right data or the right expert.

A key success factor in such a solution is usability. Apps have to integrate seamlessly and be simple and intuitive to use independent of the use case at hand.

The pandemic has meant growth in demand for our services, along with our clients’ need to find new ways to do things. In 2021, you’ll see closer integration of our apps. We’re working on new app features that enable efficient and sustainable working methods in the new normal. We’ll keep you posted on the latest developments during 2021.

AREA: Finally, how has Augumenta benefitted from its membership in the AREA?

Lindroth: The AREA has provided us with access to research, and there have been some great and very interesting research projects completed. We have also made many new contacts within the ecosystem via the AREA, and it’s always great to see and hear what’s going on with other ecosystem members. The AREA updates its social media channels very actively, and we appreciate the visibility they provide us.




Podcast – Getting Started in Enterprise Augmented Reality – Insights from Theorem Solutions

In this latest AREA Thought Leaders Podcast, AREA Executive Director Mark Sage puts a wide range of questions to Stuart Thurlby, CEO of Theorem Solutions, a UK-based company that has provided solutions to the world’s leading engineering and manufacturing companies for more than 25 years. As the leader of a firm that helps companies extract greater value from their 3D CAD assets, Mr. Thurlby understands the biggest challenges companies must overcome to deploy AR/XR successfully.

Don’t miss this must-listen, 15-minute conversation filled with insights into how to get started in Enterprise XR.

You can listen to the Getting Started in Enterprise AR podcast here.

View The AREA’s other podcasts, videos and webinars here.




Wanda Manoth-Niemoller on KLM’s AR Venture, NUVEON

KLM, the flag carrier airline of the Netherlands, traces its history back to 1919, when Queen Wilhelmina gave it her royal stamp, making KLM one of the world’s first commercial airlines. Today, KLM’s fleet of 116 aircraft flies to 145 destinations worldwide, generating more than €10 billion in revenues. In June of last year, KLM Engineering & Maintenance and Royal Netherlands Aerospace Centre (NLR) officially launched a joint venture, NUVEON, for the development of new AR products for MRO (maintenance, repair & overhaul).

NUVEON’s initial solutions address training needs by using Microsoft HoloLens to bring a complete virtual aircraft into the classroom. As part of our Thought Leaders Network program, we spoke recently with Wanda Manoth-Niemoller, Director of Commercial Development for NUVEON, to learn how the new venture is progressing.

 AREA: What was the motivation for starting NUVEON?

Manoth-Niemoller: Around 2016, the KLM training department wanted to know if AR was mature enough that we could use it. We started off with a proof of concept to see if we could benefit from AR and we chose NLR as our development partner. We built and tested one module and determined it was mature enough to use in training. Our learning experts felt it was especially useful for explaining system behavior, which is very difficult to do in a classroom setting. Showing how a system works is much more effective than simply reviewing a schematic. We built two more modules and saw the potential to do more with AR, so that’s when we decided to start NUVEON.

AREA: When you were looking into AR, why did you feel that you had to develop your own AR software product?

Manoth-Niemoller: Because there was no existing AR product that we could use, and we wanted to commercialize whatever we created and make it available to other companies.

AREA: You launched NUVEON last June. Where do things stand now?

Manoth-Niemoller: The original proof of concept has been accepted for training by EASA, the European Aviation Safety Agency, which is the European equivalent of the FAA. That means our software can now be used to sign off on practical tasks for MRO training. We now offer several solutions for training on the Boeing 777 and 787. And there are more products on the way.

AREA: What have been your biggest challenges so far?

Manoth-Niemoller: The biggest initial challenge was the time it took to develop the product, because it had to be an exact copy of reality for EASA to approve it.

AREA: What’s the next big hurdle for NUVEON?

Manoth-Niemoller: The next hurdle is to extend the range of use cases we support. Our current applications are now in day-to-day use in training, and we plan to support more systems and also extend beyond training into other use cases in MRO.

AREA: What kinds of reactions have you had from the users?

Manoth-Niemoller: It depends. The reaction often has to do with what the person is accustomed to. Some people first refuse to use it until they put it on their heads. We’ve introduced it to engineers who had already been doing the job for many, many years and were not used to innovative tools like this. They don’t see the advantage of it right away – until they put it on their heads, see what it can do, and then say, “Whoa!” and they can’t stop. They want to try everything. They see that, with AR, they can do much more than they could ever do with an operational aircraft. It actually delivers a deeper level of training. It’s effective because we enable several engineers to share a single image. They then have to solve a task together, as they would do in real life, but they’re able to get into much more detail than they’d be able to do in real life.

AREA: How are you making the NUVEON solution available to other companies?

Manoth-Niemoller: We can do it in several ways. We can conduct a training course for them, because we can use the system on location. We can also sell them the tool. Or we can develop something customized to their individual specifications.

To learn more about NUVEON’s solutions – including videos – please visit the NUVEON website.




“Urgency” Will Drive AR Adoption in the COVID-19 Era

In addition to being CMO and President of RE’FLEKT, Dirk Schart is an industry expert who has been involved with Augmented Reality (AR) to one extent or another for a decade. Recently, he drew upon this perspective to explore what he sees as an inflection point in the continuing history of AR. His thesis, as detailed in his personal blog, is that a newfound urgency driven by COVID-19 is leading to an imminent upswing in AR adoption. We spoke to Dirk to discuss his ideas further. 

AREA: What is different about this moment in the history of AR? What has changed?

Schart: Two things have changed. One is that we’re not focusing on technology anymore, but rather on solutions for enterprises. Second, we’re seeing a higher level of urgency that we hadn’t seen before. These factors, to use the term from Geoffrey Moore’s 30-year-old classic book, suggest that AR is “Crossing the Chasm.” We’re moving from experimental early adopters to a more mainstream market that expects a ready-to-use, ready-to-integrate solution. COVID-19 is acting as the accelerator. COVID is driving people to use AR in real scenarios in their daily work lives. So now we have real users coming back with more specific requirements for their solutions. We’re not crossing the chasm yet, but we’re coming closer to it.

AREA: Is the pandemic accelerating certain use cases more than others?

Schart: In the past, when we talked about use cases, we talked about maintenance, operations, and training – a very high-level description of use cases. Now, we’re talking about, for example, onboarding new employees for a product launch. It’s much more concrete. We’re seeing use cases that are driven by the travel bans caused by the pandemic. The most popular one is remote support. You don’t need to have any specific hardware; you just take your phone, call, and you get guidance immediately. There are others, as well: as I mentioned, onboarding of new operators, as well as production line changeovers, as companies move production from one facility to another.

AREA: If a working vaccine is developed for COVID-19, hopefully in the near future, do you think the adoption of AR will continue at the same pace, or will companies go back to the way they did things before?

Schart: It’s a fair question. A major factor is human behavior – and humans don’t like change. But people are seeing now that they can handle things more easily with AR. And all of the managers at these companies are realizing, “We don’t need all that travel. We can save a lot of money by not traveling.” Now they realize they have the tools to do it without traveling. But it will take time. This year I expect remote support will be the catalyst for everything. But by 2021 or 2022, I think you’ll have more use cases with AR as it starts to deliver more value than existing tools. There’s also a big focus now on making more tasks digital and automated by leveraging AR and AI. That will have a big impact. At RE’FLEKT, we’ve seen a 300% increase in our monthly active users, even hitting a peak of 600% – and they’re still using it. That gives me the confidence that we’re finally showing the value. Of course, there are still problems to solve; content creation has to be easier, smart glasses are not ready, but I’m confident that we will see new use cases next year.

To read the full text of Dirk Schart’s article, please visit his blog page.




Augmented Reality in Medical & Pharma: Industry challenges in medical device manufacturing and how to tackle them with AR solutions

This editorial has been developed as part of the AREA Thought Leaders Network content, in collaboration with selected AREA members.


Corporations in the medical and pharmaceutical industry need to adhere to the highest standards of quality, with accuracy and precision being the keys to success. If organizations experience equipment errors or healthcare workers make mistakes, they not only put human life at risk but also incur significant consequences for payers, including financial and credibility loss. To reduce error rates and increase quality, businesses across the industry are turning to the latest technologies – including Augmented Reality (AR).

AR and VR technology is already being used in medical device manufacturing and is significantly improving processes there. This editorial discusses two major use cases in which AR solutions simplify workflows to reduce human error:

  1. Medical device assembly
  2. Production line changeover

Let’s take a closer look at exactly how AR technology can tackle key challenges in both cases, with the help of some first-hand insights from William Harding of industry leader Medtronic, recently interviewed by RE’FLEKT.

Key challenges in medical device manufacturing

Medical device manufacturing typically involves a variety of manual, semi-automatic, and automatic processes, which makes production particularly vulnerable to error – especially as large manufacturers need to employ the same processes across multiple facilities, often without standardized production data. In addition, a lack of operator training can increase the risk of mistakes during the manual tasks in which medical equipment is assembled and configured.

William Harding, Distinguished Fellow at Medtronic, reveals which factors medical device manufacturers need to consider when introducing changes on the production floor:

“If I add a new process to a production line, many questions need to be addressed: How do I get the process to integrate seamlessly (e.g., communication protocols, data aggregation, and data transformation)? How do I accomplish that without using paper-based systems? The goal is to speed up efficiencies and reduce scrap while also reducing human error. When we create a new process in lean manufacturing, we need to establish the most ergonomic way for an operator to perform their tasks within a sterile environment. We also want them to complete these tasks in the most efficient way possible, while delivering a high-quality product. There are many factors to be considered.”

Prior to introducing a new manufacturing process, operators need to be trained on how to perform each step to ensure maximum efficiency and minimum error rates during production. William further explains how Medtronic originally used a cardboard replica of their manufacturing line for training purposes and what challenges came along with it:

“It used to take us two and a half weeks to build a cardboard set-up with five process stations. For one training session, we also needed at least eight to ten people off the production floor, who then weren’t engaged in manufacturing products while they were in training. It would cost us about $30,000 for one training effort with the cardboard set-up. We usually require five sessions in total to get everything right, and by the time we decide that everything is ready, we’re making changes five minutes later.”

Simplified training and operations with AR solutions

With AR technology, medical device manufacturers like Medtronic can not only manage the challenges listed above, but also benefit from significant operational improvements, as the following two use cases reveal.

1. Enhanced AR Training for device assembly and set-up 

Training around medical device set-up and configuration is traditionally based on Operating Procedure (OP) documentation that is not user-friendly. Extensive manuals, full of complicated 2D diagrams and text-based instructions, make it challenging for device operators to find the right information quickly. Consequently, onboarding is time consuming, and devices may be set up incorrectly and/or not used to their full potential.

Many leaders in the medical sector, including Medtronic, are turning to AR to train employees to set up and assemble their equipment. With results showing a 90% decrease in human errors and a 60% improvement in training times (see this white paper for further info), the reasons are obvious. AR training solutions allow device operators to visualize complicated OP documentation in a simple way, with the right mixture of videos, text, and images appearing directly in context with the real object. This ensures that device operators always have training content available instantly on their mobile devices, tablets, or smart glasses, and therefore make fewer errors during device assembly and set-up.

William shares how AR training guides have replaced the cardboard replica during operator training at Medtronic:

“With content creation platforms like REFLEKT ONE, we can now create AR applications that allow operators to learn a new process by walking through engaging training guides on a tablet instead of using our cardboard model.”

2. Lean production line changeovers with AR-based procedures

When switching the production line from one product to the next, every minute of changeover time comes at the cost of missed revenue, as production is down while teams rearrange, set up, and configure the equipment for the next production cycle. Lean manufacturing strategies can help shorten that downtime and increase the final output.

AR guidance during changeover procedures results in 40% fewer errors and a 25% faster changeover speed (see this white paper for further info). The interactive guides show operators the ideal state of the task at hand in AR next to the actual state. This way operators can always see what needs to be done as they are working. As augmented instructions guide operators through each step, the risk of error is ultimately reduced for all manual stages of the changeover.

This digitalized process is faster and more reliable as William confirms from his own experience creating AR solutions at Medtronic:

“Recently I created a solution to train operators on a manufacturing process for our Linq II battery bond (an implantable 2 lead EKG data recorder for patients). I made the content available to them online, where they could walk through it themselves and learn how to perform the process using gestures in AR. It’s a very fast and effective way of training because it saves resources and is so close to the real manufacturing environment.”

Outlook: The future of XR technology in the medical sector

These two use cases are great examples of how AR technology is already making a measurable difference in tackling key challenges in training and operations within medical device manufacturing. For the future, William forecasts a growing adoption of AR and ultimately Mixed Reality solutions at Medtronic as well as across the industry:

“Through the use of this technology in the future, I know that Medtronic will be able to more quickly understand the needs of patients, healthcare professionals, and payer’s needs, such that the lifecycles of innovation are reduced in addressing those needs. That same point can be made within the medical device manufacturing industry, specifically as it relates to product and process transfers as well as in the training of the individuals responsible for completing the assembly of those devices. However, it is my belief that AR and eventually MR technology will make the use of VR less important because users will prefer the more relatable MR environments.”




COVID-19: How Augmented Reality is helping mitigate business impact

This editorial has been developed as part of the AREA Thought Leaders Network content, in collaboration with selected AREA members.


Short of time? Listen to the accompanying podcast (~10 minutes) available here.

An imperative to overcome limitations

The COVID-19 pandemic has unleashed an unprecedented impact across the global business landscape. Over recent months, many countries have implemented various forms of lockdown, severely limiting the ways that companies can do business and, in many cases, causing operations to cease. This crisis is likely to have an ongoing impact in the months ahead as we transition to a “new normal” and beyond.

This editorial discusses ways in which Augmented Reality (AR) can help mitigate the societal and business impact while supporting business continuity through the pandemic.

The restrictions placed upon both individuals and organizations have resulted in an upsurge in the use of digital technologies to support a variety of activities, including online shopping, digital and contactless payments, remote working, online education, telehealth, and entertainment. The ability to support these activities is heavily reliant upon the availability of “digital-ready” infrastructure and services.

Enterprise AR builds upon this digital infrastructure by offering the ability to juxtapose digital content over a live view of the physical world to support business processes. So how can AR help?

First, let’s examine the impacts that COVID-19 and subsequent responses have had upon business and society:

  1. Social distancing measures hinder our ability to have traditional face-to-face interactions in addition to often limiting the size of groups able to gather.
  2. The inability to travel and the prevalence of key staff working from home impact the ability to conduct business, manage effective team operations, and provide local expertise where it is needed, among other things.
  3. Reduced on-site staffing due to illness, self-isolation, and financial restrictions impedes an organization’s ability to continue operations “as before.”
  4. A lack of classroom and hands-on training makes it difficult to quickly upskill new staff or train existing staff on products and processes.
  5. Disrupted supply chains are requiring manufacturing and sourcing processes to become more flexible to help ensure continuity of production.
  6. The potential for virus transmission has caused a reluctance among workers to touch surfaces and objects that may have been touched by others.

Clearly, new or enhanced tools and ways of working are required to address these challenges. At the AREA, we believe that AR can play an effective role in mitigating a number of these obstacles while, at the same time, offering new opportunities for long-term business improvement.

AR can help address COVID-19 restrictions with remote assistance

A key use case of Enterprise AR is in the realm of remote assistance.  AR-enhanced remote assistance provides a live video-sharing experience between two or more people. This differs from traditional videoconferencing in that such tools use computer vision technology to “track” the movements of the device’s camera across the scene. This enables the participants to add annotations (such as redlining or other simple graphics) that “stick” onto elements in the scene and therefore remain in the same place in the physical world as viewed by the users. Such applications support highly effective collaboration between, for example, a person attending a faulty machine and a remote expert, who may be working from home. This use case helps mitigate impacts of travel reduction, reduced staffing, and, of course, social distancing.
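The “sticking” behavior described above comes down to a simple geometric idea: the annotation is stored as a point in world coordinates and re-projected into every new camera frame using the pose reported by the device’s tracker. The sketch below is purely illustrative (it is not taken from any vendor’s SDK; the intrinsics, poses, and anchor position are assumed values) and shows that re-projection step with a basic pinhole camera model.

```python
import numpy as np

# Illustrative camera intrinsics (focal lengths and principal point);
# real devices report calibrated values through their AR frameworks.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(anchor_world, R_wc, t_wc):
    """Project a 3D world-space anchor into pixel coordinates for a camera
    whose pose is a world-to-camera rotation R_wc and translation t_wc."""
    p_cam = R_wc @ anchor_world + t_wc   # world -> camera coordinates
    u, v, w = K @ p_cam                  # camera -> image plane (homogeneous)
    return np.array([u / w, v / w])      # pixel position

# An annotation anchored once, e.g. on a faulty valve 2 m in front of the user.
anchor = np.array([0.0, 0.0, 2.0])

# Frame 1: camera at the world origin looking down +Z; annotation sits near the image center.
print(project(anchor, np.eye(3), np.zeros(3)))                  # ~[320. 240.]

# Frame 2: the user steps 0.3 m to the right; the tracker reports the new pose
# (world-to-camera translation of -0.3 m in X), and re-projection shifts the
# annotation left in the image so it appears glued to the same physical spot.
print(project(anchor, np.eye(3), np.array([-0.3, 0.0, 0.0])))   # ~[200. 240.]
```

Production systems add depth estimation (to create the anchor from a screen tap in the first place), drift correction, and network synchronization between participants, but this projection step is the core of why the annotation appears fixed in the physical world.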

 

AR-enhanced remote assistance for medical equipment procedures (YouTube video). Image and video courtesy of RE’FLEKT.

 

Sarah Reynolds, Vice President of Marketing, PTC comments, “As organizations look to maintain business continuity in this new normal, they are embracing AR to address travel restrictions, social distancing measures, and other challenges impacting their front-line workers’ ability to go on-site and operate, maintain, and repair machines of all kinds. Even when equipment or product experts can’t address technical issues in person, AR-enhanced remote assistance enables them to connect with on-site employees and even end customers to offer them contextualized information and expert guidance, helping them resolve these issues quickly and ultimately reduce downtime. AR-enabled remote assistance marries the physical and the digital worlds – allowing experts and front-line workers to digitally annotate the physical world around them to improve the clarity, precision, and accuracy of their communication and collaboration.”

AR-enhanced remote assistance enables business continuity for machine operations, servicing and repair. Image courtesy of PTC.

AR enables no-touch product interaction via virtual interfaces

A key capability of AR is the ability to superimpose a digital virtual user interface on physical equipment that may have a limited or non-existent user interface of its own. Depending upon the technology used, the user can select actions by tapping on the device’s screen or, alternatively, use hand gestures or voice commands to interact with the equipment via the AR-rendered “proxy” user interface. Providing such abstracted interaction is key to reducing how much people need to touch physical objects that may be handled by numerous others.

There are many ways in which such AR capabilities can help medical professionals carry out their duties during the current pandemic and beyond. The BBC has reported on one such application that helps reduce the amount of physical contact between doctor and patient, while still enabling the doctor to communicate with colleagues outside of the COVID-19 treatment area. Here, a doctor wearing a Mixed Reality headset is able to interact with medical content such as x-rays, scans or test results using hand gestures, while others are able to participate in the consultation from a safe location. The article points out that this way of working also reduces the need for Personal Protective Equipment (PPE), as colleagues are able to participate from a safe distance.

Example of a virtual user interface projected into the physical world. Image courtesy of Augumenta.

 

Eve Lindroth, Marketing and Communications at Augumenta, comments, “Today, the devices and applications can be controlled hands-free. This also addresses the problem of being able to work hygienically. You do not need to touch anything to get data in front of your eyes, control processes, or to document things. You can simply use gestures or voice to tell the device what to do. Tap air, not a keyboard.”

AR can help medical equipment training

AR can also be used to help assist medical professionals by providing highly efficient and interactive training methods that can streamline the process of learning new equipment and other necessary procedures. This is critical when experienced staff are unwell and replacements need to be trained as quickly as possible.

Harry Hulme, Marketing and Communications Manager at RE’FLEKT, comments, “We’re seeing that AR is a key tool for healthcare workers during these testing times. For medical training and equipment changeovers, AR solutions substantially reduce the risk of human error while significantly reducing training and onboarding times. Moreover, the time-critical process of equipment changeover is accelerated with AR-enhanced methods.”

 

AR-based training with REFLEKT ONE and Microsoft HoloLens in medical and healthcare. Image courtesy of RE’FLEKT.

 

AR supports remote collaboration

The remote assistance use case can be generalized further to include remote collaboration. AR enables users who are physically separated to “inhabit” a shared virtual space distributed by the AR application. This supports numerous use cases, including shared design reviews, in which multiple users see the 3D product models and supporting information projected onto their view (and from their relative position) of the physical world via their AR-enabled devices.

Different design alternatives can be presented and viewed in real time by all participants, each of whom can position themselves in their physical space to obtain a particular view of the digital rendition. Further, users can annotate and redline the shared environment, providing immediate visual feedback to all others. Such capabilities are key to mitigating the restrictions imposed on travel, the forming of groups, and close-proximity human-to-human interaction.

 

Immersive collaboration: A design review of a motorbike in 1:1 scale with a remote team. Image courtesy of Masters of Pie.

 

Karl Maddix, CEO of Masters of Pie, comments: “Video conferencing solves the immediate need to bring people together whereas collaboration, as enabled by Masters of Pie, is built for industry to bring both people and 3D data together. Real-time access to the underlying 3D data is imperative for effective collaboration and business continuity purposes.”

AR supports remote sales activities

AR is also proving an effective sales tool, enabling the all-important sales process to continue during the pandemic. Home shoppers can examine digital renditions of home appliances, furniture, etc. presented within their own physical space, for example. Moreover, the use of rich and interactive sales demonstrations facilitated by AR allow the potential buyer to understand the form, fit and function of a product without the need for travel, touch or close interaction with a salesperson.

AR enriches the remote shopping experience, allowing buyers to place and interact with products in their own physical environment. Image courtesy of PTC.

 

Sarah Reynolds of PTC comments, “AR experiences improve the end-to-end customer experience, improve purchase confidence, and ultimately streamline sales cycles, especially when customers are not able to shop in person.”

Take the next steps

In this editorial we’ve discussed a number of ways in which AR technology can help ensure business and healthcare continuity by mitigating the impacts of the various restrictions placed on the way we work. Recognizing this, many AREA member companies have introduced special offers and services to help industry during the pandemic and we applaud their support. Learn more about them here.

We invite you to discover more about how Enterprise AR is helping industry improve its business processes at The AREA.




AREA Members Offer Pandemic Support

As organizations throughout the world cope with the quarantining and work-from-home restrictions necessitated by the global coronavirus pandemic, AREA members are springing into action to help. Many of them have launched special offers that enable organizations to use their AR tools to overcome limitations to collaboration and business continuance. 

Here are some of the offers AREA member companies have told us about: 

  • Atheer has offered free licenses of its Atheer AR platform until the end of June 2020. All licenses, onboarding, and support will be provided by Atheer for free with no commitment of any type required. 
  • Augmentir has announced it is offering free use of its Remote Assist tool for the remainder of 2020. Remote Assist provides a remote collaboration and support solution that can be adopted in less than 60 minutes. 
  • Iristick is offering its smart glasses with three months of free software use (remote assistance). In addition, AREA members can receive a 10% discount on the company’s Z1 Essential and Z1 Premium products.
  • PTC is making its remote assistance product, Vuforia Chalk, available for free so employees can collaborate in operation, maintenance, and repair. 
  • Theorem Solutions will provide free CAD translation services to any organization that has switched to producing ventilators and has found itself working with incompatible data formats.
  • The Advanced Manufacturing Research Centre (AMRC) Design and Prototyping Group have responded to Britain’s call to produce more Personal Protective Equipment for healthcare workers by using technologies such as 3D printing and laser cutting to make up to 1,000 face shields per week. The face shields are being distributed to area hospitals. 
  • Scope AR has created a Quick Start Program that supports organizations limiting travel by connecting technical experts to hands-on workers. The program leverages visual remote assistance to enable diagnoses, repairs, and upgrades, as well as bringing training to remote employees and clients via AR. 
  • XMReality now offers a free premium version of Remote Guidance, making it possible for anyone to try out the product. The new offering gives businesses an easy, free way to see how remote guidance can improve their service functions.

 We applaud these companies for their efforts and will continue to share additional AREA member company offers as we hear about them. 




Mixing and Matching Standards to Ease AR Integration within Factories

AREA member Bill Bernstein of the National Institute of Standards and Technology (NIST) shares his organization’s early work to improve AR interoperability.  

Today, most industrial Augmented Reality (AR) implementations are based on prototypes built in testbeds designed to determine whether certain AR components are sufficiently mature to solve real-world challenges. Since manufacturing is a mature industry, there are widely accepted principles and best practices. In the real world, however, companies “grow” their factories organically. The result is a vast mixing and matching of domain-specific models (e.g., machining performance models, digital solid models, and user manuals) tightly coupled with domain-agnostic interfaces (e.g., rendering modules, presentation modalities, and, in a few cases, AR engines).

As a result, after organizations have spent years developing their own one-off installations, integrating AR to visualize these models is still largely a pipe dream. Using standards could ease the challenges of integration, but experience with tying them all together in a practical solution is severely lacking.

To address the needs of engineers facing an array of different technologies under one roof, standards development organizations such as the Institute of Electrical and Electronics Engineers (IEEE), the Open Geospatial Consortium (OGC), and the Khronos Group have proposed standard representations, modules, and languages. Since the experts of one standards development organization (SDO) are often isolated from the experts in another domain or SDO when developing their specifications, the results are not easily implemented in the real world, where there is a mixture of pre-existing and new standards. The problem of poor communication between SDOs during standards development is especially acute for domain-agnostic groups (e.g., the World Wide Web Consortium (W3C) and the Khronos Group) communicating with domain-heavy groups (e.g., the American Society of Mechanical Engineers, the MTConnect Institute, and the Open Platform Communications (OPC) Foundation).

However, both perspectives – domain-specific thinking (e.g., for manufacturing or field maintenance) and AR-specific, domain-agnostic concerns (e.g., real-world capture, tracking, or scene rendering) – are vital for successfully introducing AR and producing long-term value from it.

Smart Manufacturing Environments 

In the case of smart manufacturing systems (SMS), SMS-specific standards (e.g., MTConnect and OPC-Unified Architecture) provide the necessary semantic and syntactic descriptions of concepts, such as information about devices, people, and materials. Figure 1 showcases the current state of an industrial AR prototype with examples of standards to inform processes.  

 

Figure 1: General workflow for generating industrial AR prototypes. The dotted purple lines signify flows that are currently achieved through significant human labor and expertise.  

From a high-level view, the AR community is focused on two separate efforts: 

  • Digitizing real-world information (shown on the left of Figure 1); 
  • Rendering and presenting AR scenes to the appropriate visualization modalities (shown on the right of Figure 1).  

To produce successful and meaningful AR experiences, it is vital to connect domain-specific models with domain-neutral technologies. In the current state of AR development, where few or no standards have been implemented by vendors, this task is expert-driven and requires many iterations, human hours, and experience. There are significant opportunities for improvement if these transformations (indicated by the purple dashed lines in Figure 1) could be automated.

In the Product Lifecycle Data Exploration and Visualization (PLDEV) project at NIST, we are experimenting with the idea of leveraging standards developed in two separate worlds: geospatial and smart manufacturing (Industry 4.0). One project, shown in Figure 2, integrates both IndoorGML, a standard to support indoor navigation, and CityGML, a much more detailed and expressive standard that can be used for contextually describing objects in buildings, with MTConnect, a standard that semantically defines manufacturing technologies such as machine tools. All these standards have broad support in their separate communities. Seemingly every day, supporting tools that interface directly with these representations are pushed to public repositories.

Figure 2: One instance of combining disparate standards for quick AR prototype deployment for situational awareness and indoor navigation in smart manufacturing systems.  

In Figure 2, we show the use of IndoorGML and CityGML in a machine shop that has previously been digitalized according to the MTConnect standard. In doing so, we leverage existing AR visualization tools to render the scene. We then connect to the streaming data from the shop to indicate whether a machine is available (green), unavailable (yellow), or in-use (red). Though this is a simple example, it showcases that when standards are appropriately implemented and deployed, developers can acquire capabilities “for free.” In other words, we can leverage domain-specific and -agnostic tools that are already built to support existing standards, helping realize a more interoperable AR prototyping workflow.  
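To make the “for free” point concrete, the hedged sketch below polls an MTConnect agent’s standard /current endpoint and maps each device’s Availability and Execution events onto the green/yellow/red overlay colors used in Figure 2. This is not the PLDEV project’s code: the agent URL is a placeholder, and a real implementation would handle XML namespaces, errors, and data-item subtleties more carefully.

```python
import urllib.request
import xml.etree.ElementTree as ET

AGENT_URL = "http://mtconnect-agent.example.com:5000/current"  # hypothetical agent

def local_name(tag):
    """Strip the XML namespace, e.g. '{urn:...:MTConnectStreams:1.3}Execution' -> 'Execution'."""
    return tag.rsplit("}", 1)[-1]

def machine_states(agent_url=AGENT_URL):
    """Return {device_name: (availability, execution)} parsed from a /current response."""
    with urllib.request.urlopen(agent_url, timeout=5) as resp:
        root = ET.fromstring(resp.read())
    states = {}
    for device in root.iter():
        if local_name(device.tag) != "DeviceStream":
            continue
        name = device.get("name", "unknown")
        availability = execution = None
        for item in device.iter():
            if local_name(item.tag) == "Availability":
                availability = (item.text or "").strip()
            elif local_name(item.tag) == "Execution":
                execution = (item.text or "").strip()
        states[name] = (availability, execution)
    return states

def overlay_color(availability, execution):
    """Map MTConnect events to the color scheme described for Figure 2."""
    if availability != "AVAILABLE":
        return "yellow"   # machine unavailable
    if execution == "ACTIVE":
        return "red"      # machine in use
    return "green"        # machine available

if __name__ == "__main__":
    for machine, (avail, execu) in machine_states().items():
        print(f"{machine}: {overlay_color(avail, execu)}")
```

The resulting colors would then be handed to whatever rendering layer the prototype uses (a game engine, a WebXR scene, and so on), which is roughly the kind of hand-off the purple dashed lines in Figure 1 represent.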

Future Research Directions 

This project has also demonstrated significant future research opportunities in sensor fusion for more precise geospatial alignment between the digital and real worlds. One example is combining onboard sensors from automated guided vehicles (AGVs) with more contextually defined, static geospatial models described using the OGC standards IndoorGML and CityGML.

Moving forward, we will focus on enhancing geospatial representations with additional context.  For example, (1) leveraging such context for AGVs to treat task-specific obstacles (like worktables) differently than disruptive ones (like walls and columns) and (2) helping avoid safety hazards for human operators equipped with wearables by more intelligent rendering of digital objects.  We are currently collaborating with the Measurement Science for Manufacturing Robotics program at NIST to investigate these ideas.  

If successfully integrated, we will be able to demonstrate what we encourage others to practice: adoption of standards for faster and lower cost integrations as well as safer equipment installations and factory environments. Stay tuned for the next episode in this mashup of standards!  

Disclaimer 

No endorsement of any commercial product by NIST is intended.  Commercial materials are identified in this report to facilitate better understanding.  Such identification does not imply endorsement by NIST nor does it imply the materials identified are necessarily the best for the purpose. 




Masters of Pie Fulfills a Growing Need for Immersive Collaboration

As its website proclaims, new AREA member Masters of Pie offers “the only industry-ready solution that provides heavy-duty immersive collaboration with end-to-end encrypted sharing of real-time data, across all devices, for all the team.” We spoke recently with the London-based company’s co-founder and CEO Karl Maddix to learn more.  

AREA: Tell us how Masters of Pie got started.  

KARL MADDIX: Matthew Ratcliffe and I founded Masters of Pie in 2011. We both have backgrounds in 3D real-time technologies. Matt was working in real-time visualization for architecture, whereas I was doing animation and character art for games and short films. We met in 2009 at a London agency that had a contract for what was a very early digital twin prototype project for a water treatment plant. Matt and I basically made a digital twin of the physical site using laser-scan data, plumbed into streamed sensor data from the plant itself, so the site could be seen and interacted with. It was really ahead of its time, and we pioneered a lot of the processes and techniques to make it viable for industry.

Masters of Pie was spawned from that project. The concept was simply to apply our expertise in the real-time world to the enterprise. We started as a service provider, doing R&D using game engine technology for interactive applications, prototypes, and products. We did things like making interactive CAD portfolios for engineering companies that have big industrial presses they wanted to interact with. We were also careful not to just build shallow, self-contained apps; we always tried to drive them with actual industry data. We were learning how to make that data play nice with real-time engines. Masters of Pie did some early showcases for Siemens around interactive data sets, and this introduced us to the engineering world and got us exposure among Siemens end users, such as Volkswagen, Ford, and Rolls Royce. That was when we started to identify the big problems we wanted to tackle with our own products once we made the switch from service to product.

When the Oculus Rift DK1 appeared on Kickstarter, we immediately saw its value for what we were doing, which was putting big CAD models into 3D real-time engines. Luckily for us, the DK1 arrived a couple of weeks before we were due to go to Germany to meet with Siemens about a mobile-based project. So, just a few weeks after the DK1 was released worldwide, we were in Siemens offices showing them something impressive with it, something they had never seen before. That was a pivotal moment when we saw the excitement generated from that meeting.  

With access to their customers like Volkswagen, we were able to test out ideas for our own product. Before, I’d been able to show them a full-sized car, but it was apparent they had no way of getting their data into that application without a great deal of pain. The VR element was nice and all, but it was the complexity of this data which was stuck in silos that was the real issue. We explored that concept. How does data get from where it is created in a CAD package or in a Product Lifecycle Management system so it can be shared across different teams, efficiently and quickly, while it is still live data and not outdated by two weeks because it was sent offshore to be refactored or reformatted in some way? We wanted to enable the sharing of actual live or real-time data among disparate teams.

That is the core problem statement that Masters of Pie decided to tackle. Our approach to address this challenge was to develop a fully extensible and modular software framework called Radical to integrate deep into where the live data resides. The decision to take this direction was made in 2016, when we turned off the tap of our service work and became a software product company. All of our previous profits were ploughed into building the first generation of our “Radical” platform.  

AREA: That was a leap of faith.  

KARL MADDIX: Yes. We’ve always been like that. Our real motivators are solving big industry problems, such as enabling real-time collaboration on large and complex 3D data. Because we also had such great access to industry leaders, such as Rolls Royce, we had very good feedback and indicators that we were on the right track. They told us this was a real problem for them and nobody was even trying to solve it.

AREA: What made you think that an SDK or software framework was the way to productize what you needed to do?   

KARL MADDIX: One of the early prototypes we made was based on using the open API from a CAD software package and integrating the Radical software ourselves. The result was like having a Radical button within the host software. When you clicked it, it instantly brought the CAD data into our environment. More importantly, it was still bidirectionally linked to the CAD package and so all the associated metadata was available and enabled powerful functionality such as the ability in VR to complete accurate measurement.  

The large manufacturing customers automatically saw the value and wanted to proceed; however, they preferred that the software be fully integrated via an established and entrenched technology partner. This feedback was critical in pivoting the business model toward an indirect sales process rather than building out a large direct sales team. Masters of Pie would instead concentrate on the technology as an extensible software framework to license to the companies that build the host packages, such as CAD providers, who sell directly to the target end customer. Siemens was the first OEM partner to integrate the software and has been delivering Radical-enabled immersive functionality to its installed base since 2017.

Masters of Pie software is not just about the CAD/PLM market. Any company offering software that generates really complex data or holds complex 3D data is a potential target customer. We built Radical to be flexible enough to work with multiple data types. We are certainly not building it as just a CAD solution. Radical doesn’t care what data type is pushed into it. We are just as happy with other formats such as point clouds, MRI scans or any other complex data. Instead, what we are building is what we call a “collaborative thread framework.” The concept is that we will be the connective tissue between multiple pieces of the ecosystem that are starting to bubble up. People will soon want to work freely across factory floors or in the field using AR, VR or mobile devices – however, this is not enabled by any one group. It will be a complex landscape of offerings. But it all starts with getting the live data.

Masters of Pie secures data access by being integrated with the CAD or PLM packages, but we also want to be integrated with the IIoT platforms and pull IoT data that you can then surface in our environment, alongside the CAD. We are talking to cloud service providers so we can start looking to connect teams in larger spaces such as factories, and we have spatial anchoring support from Microsoft, for example, so you can walk around the factory and know spatially exactly where you are. It’s this concept of, okay, you’ve got 5G coming, you’ve got cloud service providers wanting to stream to multiple devices, viable AR is coming pretty soon, and VR is fairly established. All of these little pieces, we are looking to tie together with our singular platform pushing live data to connected teams. That’s why we call it a collaborative thread.

 AREA: You make it sound very easy. But hasn’t it been a big problem to pull data from all these different sources and to do it so quickly? What’s your secret?  

KARL MADDIX: I’ll be honest, the easier component of what we do is the technology. We’ve got a very highly skilled team and we are pretty good at what we do. We had a good insight early on, which I think gave us a good head start over the rest of the market. And we’ve got some very established relationships which help with some of these big players. The more difficult part is the business side; for example, securing an OEM agreement, bringing in technology partners, and building strategic partnerships with key industry leaders. We’re talking to people at Microsoft, AWS, Nvidia, Ericsson, Vodafone and we’ve just closed a funding round which included Bosch and Williams Advanced Engineering.  

AREA: As we evolve more toward integrating all of these different pieces – Augmented Reality, Artificial Intelligence, the Internet of Things – it seems as if you’re in a good spot to be the glue that pulls it all together.   

KARL MADDIX: What Matt and I realized was that there is no real clear killer VR or AR application that is going to change the world yet. How do we mitigate our risk given that? The approach of an extensible software framework product does help us. Our customers don’t necessarily need to know what that killer application is yet. All they need to know is that by adopting our platform, they gain the ability to build products quickly, integrate it into their current portfolio, and be ready as and when these use cases appear. We don’t need to worry about what exact AR device our customer is going to be using in five years’ time. All we need to know is that, in order for that to happen, you need a holistic architectural approach, like Radical, to get the data flowing, pulling people together and connecting these moving parts. Industry needs that infrastructure now. Large Industry software providers, such as Siemens, want to be ready for the next generation of products they are going to be putting out whilst upgrading their existing products so that they can stay connected and relevant. That is basically the value Masters of Pie provides to these software providers – the confidence to enable immersive collaborative products today while ensuring the approach will adapt to meet the challenges of tomorrow. We are providing them with the building blocks to prepare for the next wave of products and features, right now with the Radical software framework.  

AREA: As the current coronavirus pandemic has made very clear, organizations need tools that help disparate, dispersed teams collaborate. How does Radical support that kind of collaboration?  

KARL MADDIX: Yes indeed. Although it is obviously a terrible time for the world right now, it does highlight how unfit for purpose the traditional collaboration infrastructure is within enterprise. Webex and Teams are fine for connecting people in real time but not their data. If there ever was a time to show industry the way forward, then a global pandemic is it, even though I feel very guilty about saying it out loud. I think that once this virus starts to recede and people are going back to work, the first item on the agenda will be how to better prepare for the future should this threat appear again. That for us will be our golden hour as there really is not anything else as robust and flexible as Radical out there that can be adopted quickly and used wholesale across disparate software products within a portfolio. Unlike other solutions that may make a lot more noise than we do in the market, we are not vapourware or a shiny proof of concept, we are in-market right now with real product, trusted by industry and delivering value.   

AREA: Tell us why you joined the AREA and what you hope to get from your membership.

KARL MADDIX: It is more on the technology side. The drive came from Matt Ratcliffe, our co-founder and Chief Product Officer. What we are looking to do is to get more direct access to end customers. We are striving to get better and more accurate, direct feedback from end users. Matt and the team felt that the AREA would be a good way to get our message out there, to start talking about our vision for the collaborative thread as Masters of Pie, and try to get more insight on whether we are doing the right things from the AREA members. 




AREA Member Augmentir Offers Free Remote Assist Tool During Pandemic

As organizations everywhere cope with the travel restrictions and work-from-home policies put in place to combat COVID-19, AREA member Augmentir is stepping up to help ensure business continuance and support employee health and safety.  

Augmentir has announced it is offering free use of its Remote Assist tool for the remainder of 2020. Augmentir’s Remote Assist tool provides a remote collaboration and support solution that can be adopted in less than 60 minutes, so that workers, technicians, and customers can get the support they need to do their jobs without compromising health, safety, or productivity.  

To learn more about Augmentir’s Remote Assist tool and how to get started for free, please visit the Augmentir blog page.