
Blippar brings AR content creation and collaboration to Microsoft Teams

LONDON, UK – 14 June 2022 – Blippar, one of the leading technology and content platforms specializing in augmented reality (AR), has announced the integration of Blippbuilder, its no-code AR creation tool, into Microsoft Teams.

Blippbuilder, the company’s no-code AR platform, is the first of its type to combine drag-and-drop functionality with SLAM (simultaneous localization and mapping), allowing creators at any level to build realistic, immersive AR experiences. Absolute beginners can drop objects into a project, which, when published, will stay firmly in place using Blippar’s proprietary surface detection. These experiences will serve as the foundation of the interactive content that will make up the metaverse.

Blippbuilder includes access to tutorials and best practice guides to familiarise users with AR creation, taking them from concept to content. Experiences are built to be engaged with via the browser – known as WebAR – removing the friction of, and reliance on, dedicated apps or hardware. WebAR experiences can be accessed through a wide range of platforms, including Facebook, Snapchat, TikTok, WeChat, and WhatsApp, alongside conventional web and mobile browsers.

Teams users can integrate Blippbuilder directly into their existing workflow. Designed with creators and collaborators in mind – whether product managers, designers, creative agencies, clients, or colleagues – it lets organisations unite their approach and implementation, all within Teams. The functionality of adaptive cards, single sign-on, and notifications, alongside real-time feedback and approvals, provides immediate transparency and seamless integration from inception to distribution. The addition of tooltips, support features, and starter projects also allows teams to begin creating straightaway.
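To illustrate the kind of Teams workflow described above, here is a minimal, hypothetical sketch of an Adaptive Card payload that a review bot might post when an AR experience is ready for approval. The project name, URL, and field values are invented for illustration and are not Blippar’s actual payload or API.

```typescript
// Hypothetical Adaptive Card for an "AR experience ready for review" notification
// in a Teams channel. All values are placeholders, not Blippar's integration.
const reviewCard = {
  type: "AdaptiveCard",
  $schema: "http://adaptivecards.io/schemas/adaptive-card.json",
  version: "1.4",
  body: [
    { type: "TextBlock", text: "AR experience ready for review", weight: "Bolder", size: "Medium" },
    { type: "TextBlock", text: "Project: Spring packaging campaign (placeholder)", wrap: true },
  ],
  actions: [
    { type: "Action.OpenUrl", title: "Open in Blippbuilder", url: "https://example.com/projects/123" },
    { type: "Action.Submit", title: "Approve", data: { verdict: "approve" } },
    { type: "Action.Submit", title: "Request changes", data: { verdict: "changes" } },
  ],
};

export default reviewCard;
```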

“The existing process for creating and publishing AR for businesses, agencies, and brands is splintered. Companies are forced to use multiple tools and services to support collaboration, feedback, reviews, updates, approvals, and finalization of projects,” said Faisal Galaria, CEO at Blippar. “By introducing Blippbuilder to Microsoft Teams workstreams, including team channels and group chats, we’re making it easier than ever before for people to collaborate, create and share amazing AR experiences with our partners at Teams”.

By utilizing the powerful storytelling and immersive capabilities of AR, everyday topics, objects, and content – from packaging, virtual products, adverts, and e-commerce to clothing and artworks – can be ‘digitally activated’ and transformed into creative, engaging, and interactive three-dimensional opportunities.

Real-life examples include:

  •  Bring educational content to life, enabling collaborative, immersive learning
  •  Visualise and discuss architectural models and plans with clients
  •  Allow product try-ons and 3D visualization in e-commerce stores
  •  Create immersive onboarding and training content
  •  Present and discuss interior design and event ideas
  •  Bring print media and product packaging to life
  •  Redefine three-dimensional artwork for artists and illustrators

In today’s environment of increasingly sophisticated user experiences, customers are looking to move their technologies forward efficiently and collaboratively. Having access to a comprehensive AR creation platform is a feature that will keep Microsoft Teams users at the forefront of their industries. Blippbuilder in Teams is the type of solution that will help customers improve the quality and efficiency of their AR building process.

Blippar also offers a developer creation tool, its WebAR SDK. While Blippbuilder for Teams is designed to be an accessible and time-efficient entry point for millions of new users, organisations that have validated AR through it can progress to building experiences with Blippar’s SDK. The enterprise platform boasts the most advanced implementation of SLAM and marker tracking, alongside integrations with key 3D frameworks, including A-Frame, PlayCanvas, and Babylon.js.
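For orientation, here is a minimal sketch of what a browser-based (WebAR-style) scene can look like using Babylon.js, one of the frameworks named above. It only shows a generic WebXR AR session; the Blippar SDK’s own SLAM and marker-tracking hooks are not shown and would follow Blippar’s documentation.

```typescript
// Minimal Babylon.js WebXR sketch (generic WebAR-style setup, not the Blippar SDK
// itself). Assumes a <canvas id="renderCanvas"> element exists on the page.
import { Engine, Scene, FreeCamera, HemisphericLight, MeshBuilder, Vector3 } from "@babylonjs/core";

async function start(): Promise<void> {
  const canvas = document.getElementById("renderCanvas") as HTMLCanvasElement;
  const engine = new Engine(canvas, true);
  const scene = new Scene(engine);

  // Fallback camera and light for previewing in a normal browser tab.
  const camera = new FreeCamera("camera", new Vector3(0, 1.6, -1), scene);
  camera.attachControl(canvas, true);
  new HemisphericLight("light", new Vector3(0, 1, 0), scene);

  // A placeholder 3D object standing in for published AR content.
  const box = MeshBuilder.CreateBox("product", { size: 0.2 }, scene);
  box.position = new Vector3(0, 1.2, 1);

  // Request an immersive AR session so the scene renders over the camera feed.
  await scene.createDefaultXRExperienceAsync({ uiOptions: { sessionMode: "immersive-ar" } });

  engine.runRenderLoop(() => scene.render());
}

start();
```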




Factory Layout Experience – Theorem Solutions

Optimize designs in immersive XR

The Factory Layout Experience enables a planning or layout engineer, working independently or with a group of colleagues, locally or in remote locations, to optimize factory layouts through the immersive experience of eXtended Reality (XR) technologies. Seeing your data at full scale, in context, instantly reveals the clashes, access issues and missing items that a CAD screen cannot show.

On the shop floor there are literally thousands of pieces of equipment – much of it bought in and designed externally. Building designs may only exist as scans or in architectural CAD systems, and robot cells may be designed in specialist CAD systems. There will be libraries of hand tools, storage racks and stillage equipment designed in a range of CAD systems, and product data designed in-house in mechanical CAD. To understand the factory and assess changes, all of that has to be brought together to get a full picture of where a new line, robot cell or workstation will fit.

A catalogue of 3D resources can be snapped to existing 2D factory layouts to quickly realize a rich 3D layout. Advanced positioning makes it very easy to move, snap and align 3D data. Widely used plant and equipment is readily available, so there is no need to design it from scratch for every new layout. Simplified layout tools enable you to position, align and snap layout objects quickly, and they can be used by non-CAD experts, enabling all stakeholders to be involved in the process and improving communication.
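As a rough illustration of the snapping behaviour described above (not Theorem’s implementation), the sketch below snaps a dragged asset’s floor-plane position to a grid derived from the 2D layout; the 0.5 m grid pitch is an assumed value.

```typescript
// Hypothetical snap-to-grid helper: aligns a dragged 3D asset to a grid derived
// from the 2D factory layout. The 0.5 m pitch is an assumption for illustration.
interface Vec3 { x: number; y: number; z: number; }

function snapToLayoutGrid(position: Vec3, gridPitch = 0.5): Vec3 {
  const snap = (v: number) => Math.round(v / gridPitch) * gridPitch;
  // Snap only the floor-plane coordinates; keep the authored height.
  return { x: snap(position.x), y: position.y, z: snap(position.z) };
}

// Example: a storage rack dropped at (3.27, 0, 8.91) lands cleanly on the grid.
console.log(snapToLayoutGrid({ x: 3.27, y: 0, z: 8.91 })); // { x: 3.5, y: 0, z: 9 }
```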

Testing Design and Operational Factors

Human-centred operations can be analysed using mannequins that can be switched to match different characteristics. You can test design and operational aspects of a variety of human factors to determine reachability, access and injury-risk situations, ensuring compliance with safety and ergonomic standards.

The Factory Layout Experience also enables companies to avoid costly layout redesign by allowing all parties involved to review the layout collaboratively, make or recommend changes, and capture those decisions for later review by staff who could not attend the session.




Building an immersive pharma experience with XR technology

In the world of pharma manufacturing, precision is key. To execute flawlessly, pharmaceutical scientists and operators need the proper training and tools to accomplish the task. User-friendly augmented reality (AR) and extended reality (XR) technology that can provide workflow guidance to operators is invaluable, helping name-brand companies get drugs, vaccines, and advanced therapies to patients faster.

AR has been a cost-effective way to improve training, knowledge transfers, and process execution in the lab during drug discovery and in the manufacturing suite during product commercialization. Apprentice’s AR Research Department is now seeing greater demand within the pharma industry for XR software capabilities that allow life science teams to use 3D holograms to accomplish tasks.

For example, operators are able to map out an entire biomanufacturing suite in 3D using XR technology. This allows them to consume instructional data while they work with both hands, or better understand equipment layouts. They can see and touch virtual objects within their environment, providing better context and a much more in-depth experience than AR provides.

Users can even suspend metadata in a 3D space, such as at the entrance to a room, so that they can interact with their environment in a much more complete way, with equipment, objects and instruments tethered to physical space. Notifications regarding gowning requirements or biohazard warnings, for example, will automatically pop up as the operator walks in, enriching the environment with information that’s useful to them.
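A simple way to picture this spatial tethering (a hypothetical sketch, not Apprentice’s code) is metadata anchored to a world-space point with a trigger radius, so the note surfaces when the operator walks close enough.

```typescript
// Hypothetical sketch: metadata tethered to a point in the suite, surfaced when
// the operator is within a trigger radius. Names and values are placeholders.
interface Vec3 { x: number; y: number; z: number; }

interface SpatialNote {
  anchor: Vec3;          // world-space position, e.g. the room entrance
  radiusMeters: number;  // how close the operator must be to see it
  message: string;       // e.g. a gowning requirement or biohazard warning
}

const distance = (a: Vec3, b: Vec3): number =>
  Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

function notesToShow(operatorPosition: Vec3, notes: SpatialNote[]): string[] {
  return notes
    .filter(n => distance(operatorPosition, n.anchor) <= n.radiusMeters)
    .map(n => n.message);
}

// Example: a gowning reminder anchored within 2 m of the suite entrance.
const notes: SpatialNote[] = [
  { anchor: { x: 0, y: 1.5, z: 0 }, radiusMeters: 2, message: "Gowning required beyond this point" },
];
console.log(notesToShow({ x: 0.5, y: 1.6, z: 1 }, notes)); // ["Gowning required beyond this point"]
```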

“It’s all about enhancing the user experience,” said Linas Ozeratis, Mixed Reality Engineer at Apprentice.io. “At Apprentice, our AR/XR Research Team has designed pharma-specific mixed-reality software for the HoloLens device that will offer our customers an easier, more immersive experience in the lab and suite.”

Apprentice’s XR/AR Research Team is currently experimenting with new menu design components for the HoloLens device that will reshape the future of XR user experiences, making it easier for users to interact with menus using just their fingers.

Apprentice’s “finger menu” feature allows users to trigger an action or step by ‘snapping’ together the thumb and individual fingers of the same hand. Each finger contains a different action button that can be triggered at any time during an operator’s workflow.
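Conceptually, the finger menu can be thought of as a mapping from each finger to an action, invoked when the hand-tracking layer reports a thumb-to-finger pinch. The sketch below is hypothetical; the action names and the onPinch callback are illustrative assumptions, not Apprentice’s API.

```typescript
// Hypothetical "finger menu" sketch: each finger of one hand carries an action,
// triggered when the thumb pinches that finger. The pinch event source and the
// actions shown here are illustrative assumptions, not Apprentice's software.
type Finger = "index" | "middle" | "ring" | "pinky";

const fingerActions: Record<Finger, () => void> = {
  index:  () => console.log("Confirm current step"),
  middle: () => console.log("Advance to next step"),
  ring:   () => console.log("Flag a deviation"),
  pinky:  () => console.log("Call a remote expert"),
};

// Called by the hand-tracking layer whenever a thumb-to-finger pinch is detected.
function onPinch(finger: Finger): void {
  fingerActions[finger]();
}

onPinch("middle"); // the operator advances the workflow without occluding the data
```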

“Through our research, we’ve determined that the fingers are an ideal location for attaching AR buttons, because it allows users to trigger next steps without their arm or hand blocking the data they need,” Ozeratis added. “It’s quite literally technology at your fingertips.”

Why does the pharma industry want technology like this? Aside from the demand, there are situations where tools like voice commands are simply not feasible. The AR Research Team also learned that interactive finger menus feel more natural to users and can be mastered quickly. Life science teams are able to enhance training capabilities, improve execution reliability and expand the types of supporting devices they can apply within their various environments.

“Introducing these exciting and highly anticipated XR capabilities is just one stop on our roadmap,” Ozeratis adds. “There are bigger and bolder things ahead that we look forward to sharing as the pharma industry continues to demand more modern, intelligent technologies that improve efficiency and speed.”




A Talk with Christine Perey About the AREA Interoperability & Standards Program

 

AREA: How long have you been involved in standardization activities?

Perey: My role in standardization activities began in 1994 when I joined the ITU-T committee standardizing video conferencing. Seeing needs for interoperability in AR as early as 2010, I formed and led a grassroots community advocating for the development of AR standards. I have chaired dozens of meetings and workshops, and given dozens of webinars on projects and standards that could contribute to the advancement and adoption of open interfaces and AR interoperability. I work directly with a wide range of standards development organizations (SDOs). As a member, a working group chair or co-chair, or an invited expert, I currently contribute to nearly 20 standards. Outreach and coordination between SDOs is another passion of mine. On October 4, 2021, I chaired a tutorial coordinated with Khronos Group and ETSI ISG ARF about AR interoperability and standards in the context of the ISMAR 2021 conference. I encourage people interested in this topic and seeking to better understand what’s available to explore the tutorial website.

AREA: Tell us more about the AREA Interoperability & Standards program.

Perey: Through the Interoperability & Standards program, the AREA seeks to increase knowledge about the benefits of, and approaches to, achieving interoperability and to advance the development of standards or other approaches to interoperability. That entails: informing AREA members and the enterprise AR ecosystem about existing standards for interoperable AR solutions through the development of thought leadership content; supporting the identification of interoperability requirements in customer organizations; supporting the identification of interfaces in AR components that, through implementations, provide interoperability in enterprise AR solutions and services; engaging with organizations and members, including those dedicated to standards development and promotion, to provide requirements; and building a base of AR professionals who are well versed in the implementation of existing standards for AR, while promoting the development and adoption of extensions to existing standards as well as new standards.

AREA: Why are standards so important to enterprise AR adoption?

Perey: The motivations for adopting standards depend on the segment of the ecosystem to which a company belongs. Let’s take the customer segment, because when technology buyers are successful, so are their partners and providers. Today, when companies begin evaluating enterprise AR use cases, they do so with isolated projects (not integrated with enterprise systems), using products from one or a few technology providers. In companies that are advanced in their study of AR, there can be partial or full testbeds of multiple AR technology providers, but they are often isolated from other AR projects and are not integrated with enterprise systems.

A company seeking to maintain and expand its testing within a specific technology segment (e.g., comparing multiple providers or models of hardware) or to implement at scale in their enterprise confronts significant obstacles. It has been demonstrated in other industries that when standards or open source interfaces and guidelines have been widely accepted and implemented across an ecosystem, higher technology interoperability can: reduce barriers to deployment of multivendor or multi-product solutions (also known as “integration”); lower costs of ownership; reduce risks of vendor lock-in; and increase innovation and opportunities for new sales through provider specialization. Barriers are removed and everyone benefits.

AREA: What’s on the horizon for the AREA Interoperability & Standards program?

Perey: We will continue to develop thought leadership content, through hosted webinars, white papers, and blog articles, as well as participation in relevant conferences and events. As the awareness of interoperability as a key to success rises, we will work with large enterprises deploying AR to develop their interoperability requirements and integration needs and bring them to the attention of SDOs and the AR technology providers. We will act as a conduit from SDOs to AREA member companies – providers as well as customer segment members – to share SDO draft specifications and gather and deliver feedback to them. And, where there are implementations and testing suites, we will work to support the testing of products and services that comply with international standards in real-world settings.

AREA: Why should AREA members consider participating in the Interoperability & Standards program?

Perey: This is a program that can only thrive when AR customers are actively sharing their requirements and real-world experiences. So we’re looking for AREA members to contribute to the program by preparing blog posts on topics that will share their thought leadership and raise awareness about specific or general challenges. Topics could include: key interoperability and standards requirements for enterprise AR; developing best practices for safety, human factors, and more; sharing their experiences in standards development; and recounting their experiences implementing one or more standards in specific use cases or products. AR component and solution providers will increasingly be able to showcase interoperability through AREA programs to advance interoperability such as plug-fests and testbeds. Now is the time, while AR standards are under development, to make sure your voice is heard, your needs are being considered, and your experiences are being shared.

If you’re an AREA member and would like more information about participating in the AREA Interoperability & Standards program, please contact Christine Perey. If you’re not yet an AREA member but want to see an AR ecosystem that derives the full benefit of standardization and interoperability efforts, please consider joining us. You can find membership information here.

 

 




Using Theorem-XR and HoloLens 2 for Engineering Reviews

You can watch the full webinar on Using Theorem-XR and HoloLens 2 for Engineering Reviews.

Key highlights

Theorem has also picked out five key benefits of using the Microsoft HoloLens 2 in engineering, which appear in full detail on their blog:

  •  Visualize your models at full scale.
  •  Work collaboratively with other engineers.
  •  Make better factory planning decisions.
  •  Work with large datasets using Azure Remote Rendering.
  •  Continue to work with colleagues who are using other XR technologies, or none at all.




New AR deal to help steel industry protect vital skills and move towards net zero

The project will initially use Vuforia Studio technology to overlay live data – taken from the ThingWorx® industrial platform – onto various points of the facility, so that operators moving around will be able to make informed decisions on changes to casting and melting lines or troubleshoot issues before they happen.
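For context, live values like these are typically read from ThingWorx over its REST API and then bound to AR widgets authored in Vuforia Studio. The sketch below is an assumption-laden illustration: the host, application key, Thing name, and property name are placeholders, and the response handling is a guess at the usual InfoTable layout rather than project code.

```typescript
// Hypothetical sketch of reading a live property value from ThingWorx over REST,
// the kind of value an AR overlay authored in Vuforia Studio could display.
// Host, appKey, Thing and property names are placeholders; the response handling
// assumes the typical InfoTable-style JSON and may differ per configuration.
const THINGWORX_HOST = "https://thingworx.example.com"; // placeholder
const APP_KEY = "<application-key>";                    // placeholder credential

async function readProperty(thing: string, property: string): Promise<unknown> {
  const res = await fetch(
    `${THINGWORX_HOST}/Thingworx/Things/${thing}/Properties/${property}`,
    { headers: { appKey: APP_KEY, Accept: "application/json" } }
  );
  if (!res.ok) throw new Error(`ThingWorx request failed: ${res.status}`);
  const body = await res.json();
  // Assumed shape: an InfoTable whose rows carry the property value by name.
  return body.rows?.[0]?.[property];
}

readProperty("CasterLine1", "LadleTemperature").then(console.log);
```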

It is anticipated that Augmented Reality will make it easier for staff to have the right information at exactly the point they need it, whilst the use of HoloLens and RealWear glasses will mean the individual has both hands free to complete tasks.

This project will contribute to the sector’s longer-term desire to move towards a net zero steel works by 2050 and is part of the £22m PRISM steel and metals sector research and innovation programme being delivered by the Materials Processing Institute with funding provided through Innovate UK, part of UK Research and Innovation.

“The successful implementation of digital technologies has the potential to save tens of millions of pounds every year,” explained Chris Oswin, Group Manager of Digital Technologies at the Materials Processing Institute.

“We are taking responsibility for exploring IIoT platforms and AR and working out how we can get the most out of them in a live steel plant, learning from testing and trials to identify best use cases.”

He continued: “This means we absorb a lot of the time and remove the initial expenditure that could act as a barrier to entry for companies in our industry, hopefully encouraging digital adoption as we will have proved it works and how it can be applied to businesses.

“PRISM is guided by a team of industry leaders on our Industrial Advisory Board, including the Aluminium Federation, British Manufacturing Plant Constructors’ Association, British Steel, Celsa Steel, Liberty Steel, Outokumpu Stainless Steel, Sheffield Forgemasters, Swansea University, Tata Steel and the UK Metals Council.”

The Materials Processing Institute has a long-term relationship with PTC, with the latest project following on from the introduction of ThingWorx as part of the £10m programme to explore how digital technologies can be implemented in brownfield manufacturing sites.

In addition to optimising processes and introducing new efficiency improvements, Augmented Reality will also be used to capture some of the traditional skills in the sector that could be lost if the knowledge of older workers is not retained before they retire.

This will be achieved by using PTC’s Vuforia® software, with Vuforia Expert Capture allowing operators and technicians to film their daily tasks as step-by-step instructions, in situ, exactly when and where they do their work.

This content will be uploaded to the cloud, where it can then be accessed by new starters or people switching roles, using HoloLens or RealWear to get a real hands-on experience, or via other devices such as mobiles, tablets or desktop computers.

Furthermore, for problem resolution and live ‘on the job’ support, there is Vuforia Chalk. Whether using mobile devices, digital eyewear or sitting at a desk, experts can connect with on- and off-site employees and customers and collaborate in real time. It combines live video, audio and the ability for remote and local participants to annotate their live shared view and mark up the real-world environment.

“If we don’t act soon, we stand to lose so much knowledge from the industry and AR gives us a cost effective and easy way to retain skills and experience in a virtual library for generations to come,” added Chris.

“Working closely with PTC’s experts, we can tailor how we capture information, footage and skills in what is a very demanding and intense environment. We believe we’ve got the initial framework to start the roll-out and will continue to adapt the processes as we understand more about how digital technologies can play a role.”

David Grammer, general manager for UKI at PTC, went on to add: “Covid-19 has definitely thrust the digital thread into the spotlight, but there is still a resistance to adoption due to a lack of awareness of how it will deliver a genuine business benefit.

“This project with the Materials Processing Institute gives an entire sector the opportunity to explore how AR can be applied and developed in a real live steel plant without the potential disruption and cost of trying it in their own facilities.

“Businesses will be involved in the roll-out and informing some of the test cases and our team will be on hand to support experts at the Institute to get the most out of our technology and software.

“The end goal is that we will have proven business cases on how steel and metals companies can optimise processes using Augmented Reality and live data, not to mention protecting vital skills for the steel workers of the future.”

PTC, which has bases in the UK and Ireland, provides a host of technology solutions to help industrial companies create value for themselves and the rest of the world.

This is achieved through a combination of Augmented Reality, Industrial IoT, Product Lifecycle Management and CAD solutions.




PTC expands spatial computing capabilities with Vuforia Engine Area Targets

Through the use of Area Targets, industrial organisations can create AR interfaces within their facilities to enable employees to better engage with machinery and understand how the environment is being utilised.

More information can be found at https://library.vuforia.com/features/environments/area-targets.html

PTC says that with support from Matterport and Leica 3D scanners, along with NavVis’s indoor mobile mapping systems, Area Targets users can generate “photorealistic, survey-grade digital twins, empowering them to create digital canvases of spaces such as factories, malls, or offices for advanced spatial computing applications”.

As one of the leading emerging technologies, spatial computing powers digital twin renderings to support the activities of machines and people, as well as the environments in which they operate.

“When deployed across the industrial enterprise, spatial computing enables seamless interactions between employees through AR, enabling companies to close the loop on performance management, improve machine learning capabilities with spatial analytics, and optimise design and factory floor operations,” notes PTC.

“Vuforia Engine Area Targets is a one-of-a-kind solution for large, persistent AR experiences,” said Mike Campbell, Executive Vice President and General Manager of Augmented Reality, PTC.

“Whether users are looking to add navigation to their office building or view in-context data on a factory floor, Area Targets is the answer. We’re pleased to be expanding such a key capability and component of PTC’s spatial computing vision.”

The release of Vuforia Engine Area Targets marks the second Vuforia offering to deploy spatial computing in the form of area targets within the industrial setting, the first being the Vuforia Spatial Toolbox platform.

Combined with the Vuforia Chalk, Vuforia Expert Capture, and Vuforia Studio AR products, the Vuforia AR Enterprise Platform provides what PTC says is a “robust set of offerings that enables users to increase workforce safety and efficiency, improve customer experiences, and reduce costs”.

 




Augumenta’s Eve Lindroth on Shop Floor AR, Taiwan and the Future


When AREA member Augumenta participated in an AREA webinar about implementing AR on factory shop floors recently, we thought it would be worth catching up on the company and its activities. So we spoke with Eve Lindroth, the company’s head of Marketing Communications. Here’s our conversation.

AREA: Augumenta has distinguished itself as a leader in industrial shop floor uses of AR. To what do you attribute your success so far?

Lindroth: We have a large number of big and well-known industrial companies as our clients, and within these projects, our solutions have been adopted with very few changes. That tells us that we are taking the right approach to developing solutions for the industry. Our clients also praise the ease-of-use of our applications, and appreciate that there is no steep learning curve to start using them. Quite the opposite, they are considered easy to learn.

AREA: What’s a typical Augumenta client?

Lindroth: Most of our business is outside Finland. We have many manufacturing customers in France and Germany, for example, such as Siemens. We also have a presence in Japan and Taiwan, which is important considering our focus on the Asian markets and the key customer projects we have ongoing there.

A typical client is a larger industrial company that is active in developing their operations – or during the pandemic, companies that are simply looking for the most efficient and practical ways to keep operating.

AREA: Speaking of that, in October, you announced a partnership with IISI of Taiwan. Tell us about the partnership, its goals, and its progress to date.

Lindroth: IISI is a system integrator and they have a very strong customer base in the fields of manufacturing and government. In our partnership, Augumenta acts as a technology/applications provider and the IISI experts do the final customization and integration with the end customer’s backend systems. Both companies can focus on their key strengths: we on the cutting-edge AR technology, and IISI on developing and managing the overall systems.

We started working together in the springtime and we have finalized all the customization needed for the end customer, a major semiconductor factory in Taiwan. We continue working in close cooperation with IISI and believe we are in a good position to advance enterprise AR in Taiwan together with them.

AREA: What do you see as the most significant barriers to AR adoption, and what is Augumenta doing to overcome them?

Lindroth: We have seen in many pilot projects that the organization has identified the problem it is looking to solve with a pilot, but there are often difficulties in defining the current status with an accurate number. Take downtime, for example – how much is there, and which factors exactly are causing it? That figure can be hard to come by. Another issue is user acceptance, but that can often be tackled by involving people in planning the solutions from an early stage.

At Augumenta, we’re working to address those issues. For industrial pilots, for example, we created a simple checklist, just to remind the project managers and team leaders responsible for the pilot to consider the factors we have learned to be essential for an AR pilot’s success. These are related to things like target setting, planning together with your people and getting them involved throughout the process, or measuring the results. The checklist is available on our website.

AREA: What can we expect from Augumenta in 2021?

Lindroth: In the future, we believe that discrete industrial AR applications will become more integrated solutions. That means, for example, that there aren’t separate apps for alerting a user and guiding a user in tasks. There will be one solution that can do all of this – without the end user even noticing that there are many use cases included in the app. At some point, things like AI will make the end user’s job even easier by guiding them to the right data or expert automatically, for example.

A key success factor in such a solution is usability. Apps have to integrate seamlessly and be simple and intuitive to use independent of the use case at hand.

The pandemic has meant growth in demand for our services along with our clients’ need to find new ways to do things. In 2021, you’ll see closer integration of our apps. We’re working with new app features that are enabling efficient and sustainable working methods in the new normal. We’ll keep you posted with the latest developments during 2021.

AREA: Finally, how has Augumenta benefitted from its membership in the AREA?

Lindroth: The AREA has provided us with access to research, and there have been some great and very interesting research projects completed. We have also made many new contacts within the ecosystem via the AREA, and it’s always great to see and hear what’s going on with other ecosystem members. The AREA updates its social media channels very actively, and we appreciate the visibility they provide us.




Podcast – Getting Started in Enterprise Augmented Reality – Insights from Theorem Solutions

In this latest AREA Thought Leaders Podcast, AREA Executive Director Mark Sage poses these questions and more to Stuart Thurlby, CEO of Theorem Solutions, a UK-based company that has provided solutions to the world’s leading engineering and manufacturing companies for more than 25 years. As the leader of a firm that helps companies extract greater value from their 3D CAD assets, Mr. Thurlby understands the biggest challenges companies must overcome to deploy AR/XR successfully.

Don’t miss this must-listen, 15-minute conversation filled with insights into how to get started in Enterprise XR.

You can listen to the Getting started in Enterprise AR podcast here.

View The AREA’s other podcasts, videos and webinars here.




Wanda Manoth-Niemoller on KLM’s AR Venture, NUVEON

KLM, the flag carrier airline of the Netherlands, traces its history back to 1919, when Queen Wilhelmina gave it her royal stamp, making KLM one of the world’s first commercial airlines. Today, KLM’s fleet of 116 aircraft flies to 145 destinations worldwide, generating more than €10 billion in revenues. In June of last year, KLM Engineering & Maintenance and Royal Netherlands Aerospace Centre (NLR) officially launched a joint venture, NUVEON, for the development of new AR products for MRO (maintenance, repair & overhaul).

NUVEON’s initial solutions address training needs by using Microsoft HoloLens to bring a complete virtual aircraft into the classroom. As part of our Thought Leaders Network program, we spoke recently with Wanda Manoth-Niemoller, Director of Commercial Development for NUVEON, to learn how the new venture is progressing.

 AREA: What was the motivation for starting NUVEON?

Manoth-Niemoller: Around 2016, the KLM training department wanted to know if AR was mature enough that we could use it. We started off with a proof of concept to see if we could benefit from AR and we chose NLR as our development partner. We built and tested one module and determined it was mature enough to use in training. Our learning experts felt it was especially useful for explaining system behavior, which is very difficult to do in a classroom setting. Showing how a system works is much more effective than simply reviewing a schematic. We built two more modules and saw the potential to do more with AR, so that’s when we decided to start NUVEON.

AREA: When you were looking into AR, why did you feel that you had to develop your own AR software product?

Manoth-Niemoller: Because there was no existing AR product that we could use, and we wanted to commercialize whatever we created and make it available to other companies.

AREA: You launched NUVEON last June. Where do things stand now?

Manoth-Niemoller: The original proof of concept has been accepted for training by EASA, the European Aviation Safety Agency, which is the European equivalent of the FAA. That means our software can now be used to sign off on practical tasks for MRO training. We now offer several solutions for training on the Boeing 777 and 787. And there are more products on the way.

AREA: What have been your biggest challenges so far?

Manoth-Niemoller: The biggest initial challenge was the time it took to develop the product, because it had to be an exact copy of reality for the EASA to approve it.

AREA: What’s the next big hurdle for NUVEON?

Manoth-Niemoller: The next hurdle is to extend the range of use cases we support. Our current applications are now in day-to-day use in training, and we plan to support more systems and also extend beyond training into other use cases in MRO.

AREA: What kinds of reactions have you had from the users?

Manoth-Niemoller: It depends. The reaction often has to do with what the person is accustomed to. Some people first refuse to use it until they put it on their heads. We’ve introduced it to engineers who had already been doing the job for many, many years and were not used to innovative tools like this. They don’t see the advantage of it right away – until they put it on their heads, see what it can do, and then say, “Whoa!” and they can’t stop. They want to try everything. They see that, with AR, they can do much more than they could ever do with an operational aircraft. It actually delivers a deeper level of training. It’s effective because we enable several engineers to share a single image. They then have to solve a task together, as they would do in real life, but they’re able to get into much more detail than they’d be able to do in real life.

AREA: How are you making the NUVEON solution available to other companies?

Manoth-Niemoller: We can do it in several ways. We can conduct a training course for them, because we can use the system on location. We can also sell them the tool. Or we can develop something customized to their individual specifications.

To learn more about NUVEON’s solutions – including videos – please visit the NUVEON website.