Augmented Reality and Virtual Reality Improve Project Delivery

Before 2020, technologies designed for remote work played an important but mostly supporting role. Project teams used simple video conferencing tools to meet over long distances. Digital twin technologies were evolving, guiding teams towards a deeper understanding of their facilities. Project teams who wanted to squeeze more value from their BIM data were turning towards emerging AR and VR applications. No one doubted the momentum behind these advances, but few could have predicted what came next.

As COVID-19 swept the globe, these digital technologies went from nice-to-have to mission-critical almost overnight. That’s when AR and VR started having their moment—a moment that’s quickly become the new way of working.

The article explores what that “new way” looks like, with a particular focus on:

  • How BIM data and digital twin technologies drive immersive AR and VR experiences
  • How VR drives efficiencies by immersing users in a virtual environment
  • How AR supports smart decision-making by integrating real and digital environments
  • The future of VR and AR technologies in lean project delivery

Explanations with examples are given on:

  • Augmented reality, virtual reality and extended reality
  • Relevant tools including digital twin, smart glasses and remote assistance
  • The context of Industry 4.0 and Pharma 4.0

The article goes on to explain how teams build interactive, immersive digital models, walks through AR in action and outlines many of the benefits AR brings to industry.

You can read the original article here: Augmented Reality and Virtual Reality Improve Project Delivery.


Digital twinning use cases strengthen with AR, VR

In a nutshell, digital twinning is the process of creating a highly realistic model of a device, system, process or product to use for development, testing and validation. Augmented reality (AR) and virtual reality (VR) come into play as well. For example, AR can show a digital twin on top of a physical machine and provide information a technician wouldn’t otherwise see, while technologists can step into a VR representation of a digital twin to simulate various scenarios.

Many folks associate the use of digital twins solely with manufacturing. While it is true that manufacturing has pioneered the use of digital twinning, use cases exist in every industry. Additionally, there are digital twinning use cases in cross-industry applications such as infrastructure and automation.

To better understand the potential uses of digital twinning, AR and VR, take a look at the use cases in a handful of industries.


Aerospace, automotive and general-purpose manufacturing

Aerospace, automotive and general-purpose manufacturing firms use digital twins as part of overall product development. Here are some common uses:

  • creating design mock-ups to show how a finished product will work;
  • fine-tuning product features and capabilities;
  • defining requirements to provide guidance to component suppliers on component specifications, such as bolt size, shape and strength;
  • testing and quality assurance;
  • creating customer-requested modifications and other design personalization;
  • optimizing operations and performance; and
  • predicting future failure modes so maintenance can be scheduled preemptively, along with other predictive maintenance goals.

Healthcare, retail and other human-centric industries

Companies that interact with customers can also benefit from digital twinning, which enables them to optimize patient care and customer service. Some examples include the following:

  • improving operational efficiency, such as using digital twinning to optimize the flow of patients or customers through a facility;
  • improving user experience by using AR and VR focus groups to test how customers or patients experience a physical facility;
  • improving layout and design of facilities; and
  • refining products and services for optimal appeal to customers and patients.

Supply chain and logistics industry

Companies that are heavily reliant on their supply chain or logistics functions can see particular benefits from digital twinning. Examples include the following:
  • pre-testing the performance of packaging and packaging materials;
  • optimizing routes and delivery processes; and
  • improving handoffs between stages in a supply chain.

General applications of digital twins, AR and VR

One often-overlooked use case for digital twins, AR and VR applies to almost every industry: infrastructure performance and automation.

Digital twins of routers, servers, storage appliances and virtual machines can serve as testing grounds to explore performance or security vulnerabilities. The same goes for testing automated processes.

For example, let’s say an organization is rolling out a new automated process for operating system updates and patch management for infrastructure devices. Network engineers can program the automation tool to roll out the update process on the twinned environment first. In this way, they can gather configuration and performance data and share that with technology operations specialists who can revise the process if it results in unforeseen impacts.

Facilities and technology professionals can also use digital twinning to model the physical environments of systems and appliances. For example, if an organization is building out its own data center, a digital twin can model power, heating and cooling systems against rack layout. Facilities professionals can use this to check for hot spots and make sure that all control panels are easily accessible by facility personnel.
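As a rough illustration of the hot-spot check described above, the sketch below sums the heat load per cooling zone in a twinned rack layout and flags any zone whose load exceeds its cooling capacity. All rack names, power draws and capacities are invented for the example, not taken from the article.

```python
# Illustrative hot-spot check against a digital twin of a rack layout.
# Power draws (kW) and zone cooling capacities (kW) are hypothetical.
racks = {
    "A1": {"power_kw": 8.0, "zone": "north"},
    "A2": {"power_kw": 12.5, "zone": "north"},
    "B1": {"power_kw": 6.0, "zone": "south"},
}
cooling_capacity_kw = {"north": 18.0, "south": 10.0}

# Sum the heat load per cooling zone.
load = {}
for rack in racks.values():
    load[rack["zone"]] = load.get(rack["zone"], 0.0) + rack["power_kw"]

# Flag any zone whose heat load exceeds its cooling capacity.
hot_spots = [z for z, kw in load.items() if kw > cooling_capacity_kw[z]]
print("hot spots:", hot_spots)  # → hot spots: ['north']
```

A real deployment would pull these figures from the twin’s model rather than hard-coded dictionaries, but the check itself is the same comparison.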


Finally, organizations can use the digital twin to optimize human operational processes, such as mapping how technicians walk through the data center.

While these applications may seem futuristic, enterprise organizations are deploying them today. The infrastructure lead at a large financial services firm recently told Nemertes Research that the single most important initiative infrastructure engineers can engage in is implementing AR/VR and digital twinning in their infrastructure environments and automation testing.

A digital twin checklist for infrastructure automation

For enterprise technologists looking to get started with infrastructure digital twinning, what are the next steps? The outline below refers to a network digital twin, but the same recommendations apply to other infrastructure components such as servers, VMs, containers and storage appliances.

  1. Virtualize and automate all changes to the production network; that is, make sure all changes happen through scripts and APIs. Stakeholders should allow no manual configuration.
  2. Using this automation, capture all network configurations in a version-controlled repository, such as Git.
  3. Use a tool such as Jenkins to build a continuous integration/continuous delivery (CI/CD) workflow: fork the repository, propose changes and have the tool pull down the modifications and apply them to the digital twin network, which consists of the configurations in the version-controlled repository. The workflow pushes the changes out, tests that they succeed and notifies the network engineer of the result.
  4. Once the changes pass on the twin, use Jenkins to merge the repository and implement the changes on the production network.
  5. Review and sign off on the final state, or roll back the changes if needed.
  6. Consider conducting predictive maintenance on the digital twin network regularly to get early warning of any performance, security or other concerns. Where appropriate, deploy proactive patching and upgrades.
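As a tool-agnostic sketch of the checklist above (not Jenkins-specific), the snippet below rehearses a configuration change on the twin and promotes it to production only if validation passes. Every function and field name here is invented for illustration; a real pipeline would drive actual device APIs from a version-controlled repository.

```python
# Minimal sketch of a twin-first change workflow (illustrative only).

def apply_change(config: dict, change: dict) -> dict:
    """Return a new configuration with the change applied (no mutation)."""
    updated = dict(config)
    updated.update(change)
    return updated

def validate(config: dict) -> bool:
    """Stand-in for post-change tests (reachability, performance, security)."""
    return config.get("ntp_server") is not None and config.get("os_patch_level", 0) >= 7

production = {"ntp_server": "10.0.0.1", "os_patch_level": 6}
change = {"os_patch_level": 7}

# 1. Rehearse the change on the digital twin first.
twin = apply_change(production, change)

if validate(twin):
    # 2. Only promote to production after the twin passes validation.
    production = apply_change(production, change)
    print("change promoted:", production)
else:
    print("change rejected on twin; production untouched")
```

The key property is that production is touched only after the twin, built from the same version-controlled configurations, has absorbed and passed the change.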

TeamViewer and NSF Partner on EyeSucceed

The companies have partnered to pursue a joint goal: the accelerated growth of wearable software EyeSucceed, an augmented reality (AR) application based on TeamViewer’s enterprise AR platform Frontline.

EyeSucceed digitalizes processes and addresses critical challenges in the food industry. The commercial agreement leverages TeamViewer’s technological capabilities and NSF’s industry expertise and global customer base.

NSF has successfully integrated EyeSucceed into the daily operations of customers in the food and beverage industry to empower workers with AR-based workflows. For example, a global fast food restaurant chain has equipped its employees in more than 100 restaurants with the solution to ensure a global quality standard in training and onboarding of new employees.

Furthermore, the software is enhanced with artificial intelligence (AI) features for improving food safety — for example, to automatically detect if hygiene gloves are worn and changed during the food production process.

“TeamViewer’s AR specialists have been dedicated and collaborative partners of ours since 2015, when EyeSucceed was first created,” said John Rowley, vice president of the global food division at NSF International. “Together with TeamViewer, we will help food businesses around the world to reduce risk, improve compliance and strengthen their brands. This collaboration will define the standard for AR applications in the global food supply chain.”

Jan Junker, executive vice president solution delivery at TeamViewer, said, “The use of voice- and eye-controlled AR applications giving step-by-step instructions to workers on smart glasses is game-changing for the food industry. Companies can digitalize their workplaces while keeping their workers’ hands 100% free to perform their tasks faster and better and to stick to all hygiene regulations at the same time. Customers who optimize their processes with our Frontline solution confirm double-digit increases in efficiency and close-to-zero error rates. We are looking forward to teaming up with NSF International to bring these benefits to more customers in the food industry and beyond.”

Read AREA member NSF EyeSucceed’s member profile



Apprentice Raises $100 Million to Adapt Pharma Supply Chains to the Omicron Variant

The funding follows a $24 million Series B round raised in November 2020. Throughout 2020 and 2021, Apprentice helped drug makers keep production on track despite pandemic-imposed restrictions and transition their facilities to produce mRNA vaccines. mRNA vaccines are the first large-scale application of cutting-edge cell therapies, which require a profoundly different production process. Apprentice’s cloud-based manufacturing platform helped legacy facilities transform at a radical pace, directly supporting creation of an estimated 370 million COVID-19 vaccine doses and counting. Since the beginning of the pandemic, Apprentice has experienced unprecedented growth, including 12x annual recurring revenue growth, 516% net customer retention, and a 6x increase in employee headcount.

“We had the right technology at the right time to help our customers keep making critical medicine and vaccines in unprecedented conditions. As the world continues to navigate COVID-19, we will help our customers adapt to new challenges,” said Angelo Stracquatanio, CEO of Apprentice. “With the advent of cell and gene therapies, pharma is undergoing a transformation while coping with massive supply chain disruption. Our mission is to deliver modern technology that’s flexible, powerful, and regulatory-compliant so manufacturers can make these new therapies better, faster, and more reliably no matter the circumstance – to the benefit of patients everywhere.”

“As evidenced by their phenomenal growth, Apprentice alone has the modern cloud technology and mobile-first approach manufacturers need to keep their supply chains running during the pandemic,” said Abhi Arun, Managing Partner from Alkeon Capital. “We believe that they can bring life sciences into the modern era and help it transition to the therapeutics of the future. We couldn’t be more excited to partner with the team on this bold vision.”

Apprentice will use this capital to support pharma manufacturers for the duration of the pandemic, including scaling its operations, team, and platform to ensure supply chain adaptability. Already used by customers on five continents, the company will further expand in Europe and Asia. Apprentice will also launch the first cloud-native manufacturing execution system (MES), extending the success of its Tandem collaboration product, which gives remote colleagues a first-person perspective into what an operator is seeing and doing in-suite. The MES integrates augmented reality, voice recognition, and artificial intelligence into wearable, mobile, and desktop devices to offer a system that reduces human error and inefficiency across all three stages of production: pre-clinical, clinical, and commercial.

Differentiated by its modern cloud infrastructure, which enables a mobile-first approach, superior user experience, and no-code configuration, the MES has the flexibility to support the product and process development of the pre-clinical and clinical stages and the power to meet the scale and compliance of the commercial stage. With one manufacturing platform spanning the entire lifecycle, pharma manufacturers are better equipped to make next generation therapies through seamless transfer of process knowledge across teams and sites, real-time quality tracking, and reduction of production downtime.

About Apprentice
Apprentice’s disruptive technology helps pharma manufacturers get medicine to patients faster by providing one platform to turn molecules into medicine. The company’s intelligent cloud platform integrates augmented reality, voice recognition, and artificial intelligence into wearable, mobile, and desktop devices to offer a virtual collaboration application and a robust manufacturing and lab execution system that reduce human error and inefficiency in the drug production process. Learn how 15 of the top 20 US Pharma companies use Apprentice to accelerate high-quality production of drugs for diseases of all types, from COVID to cancer, at www.apprentice.io

Read Apprentice’s AREA member profile 

About Alkeon Capital
Alkeon Capital is a global investment firm that invests in private and public growth and technology companies and category definers. With more than two decades of experience focusing on People and Innovation, Alkeon works closely with disruptive private companies to help them expand their addressable market, scale efficiently, and seamlessly crossover to the public markets. Alkeon’s goal is to be a long-term and accretive partner to all its portfolio companies along their private and public journey.

Rockwell Automation wins Industrial IoT Solution of the Year award

Rockwell Automation is a multiple IoT Breakthrough Award winner, having won “Industrial IoT Innovator of the Year” in 2020 and “Industrial IoT Company of the Year” in 2021.

With FactoryTalk Edge Gateway, manufacturers can easily access, understand and leverage the data needed to make informed decisions. The solution simplifies and automates collection, contextualization and organization of industrial-equipment data across machines, devices and automation assets at the source itself—enabling high data integrity from the outset. It also provides the right foundation to drive edge-to-cloud IT/OT convergence at the enterprise level, resulting in informed decision-making.

FactoryTalk Edge Gateway software unifies data from industrial sources and control or automation systems. It integrates with a variety of cloud, IIoT and big-data applications. It also uses OPC-DA, the automation industry’s standard for interoperability, to access KEPServer Enterprise data for third-party connectivity. This maximizes operational insights and provides a 360-degree view of a business, simplifying and automating data ingestion in a single integration solution for IT applications.

“Manufacturing processes and machines create tremendous amounts of data that, in the right place, with the right context, and at the right time can unlock new sources of potential value from analytics, machine learning, connected worker experiences, digital twins, and much more,” said Brian Shepherd, vice president of software & control at Rockwell Automation. “FactoryTalk Edge Gateway software simplifies the collection, contextualization and organization of OT data in a way that builds a high-integrity digital foundation for decision making. That foundation and ability to uncover new insights is what can help manufacturers achieve their performance goals.”
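The “contextualization” Shepherd describes can be pictured as enriching a raw controller tag with plant-model metadata at the edge before it reaches IT systems. The sketch below is a generic illustration of that idea; the tag name, context fields and mapping are invented, and this is not FactoryTalk’s actual data model.

```python
import datetime

# Raw OT reading as it might arrive from a controller: terse tag, no context.
raw = {"tag": "N7:42", "value": 71.3}

# Edge-side context map: associates controller tags with plant-model metadata.
# (Hypothetical mapping for illustration.)
context = {
    "N7:42": {"asset": "mixer_3", "line": "line_B", "site": "plant_1",
              "measure": "temperature", "unit": "degC"},
}

def contextualize(reading: dict) -> dict:
    """Merge plant-model context into a raw reading and timestamp it."""
    enriched = dict(context[reading["tag"]])
    enriched["value"] = reading["value"]
    enriched["timestamp"] = datetime.datetime.now(datetime.timezone.utc).isoformat()
    return enriched

print(contextualize(raw))
```

Delivering the enriched record, rather than the bare tag, is what lets IT applications reason about assets and sites instead of opaque controller addresses.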

The mission of the IoT Breakthrough Awards program is to recognize the innovators, leaders and visionaries from around the globe in a range of IoT categories, including industrial and enterprise IoT, smart-city technology, connected home and home automation, connected car, and many more. This year’s program attracted more than 3,850 nominations from companies all over the world.

“Industrial enterprises struggle to aggregate operational data from heterogeneous sources and add relevant context from the source to the IT layer. This prevents them from uncovering potentially game-changing insights at the enterprise level,” said James Johnson, managing director at IoT Breakthrough. “FactoryTalk Edge Gateway not only enriches OT data with critical context where it matters the most—at the edge—but also delivers it in a flexible common information model to IT applications, so that industrial enterprises can derive critical insights for competitive advantage. That makes it our choice for ‘Industrial IoT Solution of the Year.’ Congratulations for the third year in a row to Rockwell Automation.”

Visit Rockwell Automation’s AREA member profile. 


Magic Leap grants healthcare startups access to its new AR headset ahead of mid-2022 release

Another company, Brainlab, wants to make its Mixed Reality Viewer software available on Magic Leap 2.

That Magic Leap is making its latest wearable available to digital healthcare startups first isn’t surprising; CEO Peggy Johnson said as much would happen last April. “Augmented reality may transform healthcare more than any other industry, at least in the near term,” she said at the time, noting also that the company would focus on enterprise customers at launch.

Read Magic Leap AREA member profile

Pandemic Drives New Use Cases for Assisted Reality Wearables

When an expert needed to inspect an oil rig or train someone to repair a vehicle, companies used to fly the expert in to do the task in person. More and more, though, companies are learning that when workers in the field are equipped with assisted reality wearables, the expert can help them from a remote location.

The “see what I see” capability of these devices, which combine a head-mounted camera and display, allows an expert in a remote location to see what someone else is seeing on site. The expert can also give the person wearing the device verbal instructions and, through the display, visual instructions. In addition, the person in the field can use the device hands-free.

This kind of “remote expert guidance” has been a common use case for RealWear equipment since the company’s founding five years ago, but it was accelerated when the pandemic prevented people from traveling. Now that more companies have experienced the alternative, and the savings in travel costs and experts’ time, many are likely to stick with it.

“We think this is the new normal, that not everybody needs to travel all the time,” Rama Oruganti, chief product officer at RealWear, told PYMNTS. “There are certain tasks that can be done remotely as long as you have the right point of view, you can see the things and you have the tools in place to make remote work possible in that way.”

Providing Information Without Distracting From Hands-On Work 

RealWear launched the latest generation of its industrial-strength wearables on Dec. 8. Assisted reality wearables like the new RealWear Navigator 500 incorporate the digital world but do not immerse the user in it or put it in the user’s field of vision. Instead, with assisted reality, the digital world sits just below the user’s field of view, so they need only look down to see it — just as the driver of a car would glance down at the dashboard, Oruganti explained. In industrial use cases, which often include hazardous environments, it’s important that the user’s field of view not be obstructed, he added.

“Industrial frontline workers are the people who can use most of the helpful things that might come out of the metaverse,” Oruganti said. “So, we are trying to take those and put it into the hands of the people who need it the most.”

The RealWear Navigator 500 is two-thirds the weight of its predecessor, so it’s easier to wear during an eight-hour shift. It’s also two-thirds the width of the earlier device, so the center of gravity sits closer to the user’s temple and the perceived weight is lower. It’s rugged enough to be dropped from a height of 2 meters without damage, and it’s modular, so the camera and display can be swapped in the future.

Enabling Digital Workflow and Visual Assist 

Another growing use case for these devices involves inspections in which the user is on site and doesn’t need remote assistance. This “digital workflow” use case applies the product’s ability to respond to voice commands and to record what the user sees. While inspecting equipment, for example, the user can mark off items on a checklist with verbal commands and can record what they’re seeing. Previously, this would be done by marking the items off on paper and later entering that information into a laptop.

“That was one of the big things we resolved,” Oruganti said. “One, you reduce paperwork errors; two, you save time; and three, you have visual documentation.”
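The digital-workflow pattern Oruganti describes, ticking off inspection items by voice instead of on paper, can be sketched as a small state machine. The commands and checklist items below are made up for illustration; a real device would feed recognized speech into something similar.

```python
# Toy inspection checklist driven by voice-style commands (illustrative).
checklist = {"check oil level": False,
             "inspect belt wear": False,
             "verify guard in place": False}

def handle_command(command: str) -> str:
    """Mark an item complete when the spoken command matches it."""
    if command in checklist:
        checklist[command] = True
        return f"marked complete: {command}"
    return f"unrecognized command: {command}"

# Simulate two recognized voice commands during an inspection.
for spoken in ["check oil level", "inspect belt wear"]:
    print(handle_command(spoken))

remaining = [item for item, done in checklist.items() if not done]
print("remaining:", remaining)  # → remaining: ['verify guard in place']
```

The same structure removes the paper-to-laptop transcription step: the checklist state is already digital the moment the command is recognized.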

A third growing use case for this equipment is “visual assist.” If someone is repairing a piece of equipment and needs to see the blueprint as they work, they can view it on the wearable’s display. What’s more, they can pan around the document and zoom in on the part they need.

“It’s like someone’s holding a seven-inch tablet at arm’s length with that information for you, so that’s very helpful,” Oruganti said.

Displaying IoT Data When It’s Needed 

Assisted reality wearables can also display information from the sensors on a piece of equipment so that the user can see if a machine is too hot or spinning too fast, for example, without having to read the dial. The Internet of Things (IoT) data is funneled into the display and the user can see it when they want it.
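Funneling IoT data to the display, as described above, amounts to filtering a sensor feed against operating limits and surfacing only what the wearer needs. The thresholds and readings below are invented for the example.

```python
# Illustrative sensor-to-display filter: surface readings only when out of range.
limits = {"temperature_c": (10, 80), "rpm": (0, 3000)}
readings = {"temperature_c": 92, "rpm": 2400}

alerts = []
for sensor, value in readings.items():
    low, high = limits[sensor]
    if not (low <= value <= high):
        alerts.append(f"{sensor}={value} outside {low}-{high}")

# Only out-of-range values would be pushed to the wearable display.
print(alerts)  # → ['temperature_c=92 outside 10-80']
```

Keeping in-range values off the display is what makes the data glanceable rather than distracting, in the spirit of the dashboard analogy earlier in the piece.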

“[Assisted reality] and [virtual reality] are going to be big — they’re already big and they’re growing fast,” Oruganti said. “The big immediate use case for a lot of people in the real world is going to be in industry. There are 100 million industrial frontline workers — the kind of people we are targeting — so we are trying to take the best in class of things that are applicable here without being distracting and without taking away their hands.”


How XR Can Help You Keep It Green By Theorem Solutions

In 2019, the European manufacturing sector was responsible for an annual total of 880 million tons of carbon dioxide, making it one of the largest single emitters of greenhouse gases in Europe. In the US, manufacturing accounts for almost a quarter (23%) of direct carbon emissions.

With the current global spotlight on climate change, countries and large corporations are having to agree to new climate regulations in a bid to reduce their carbon footprint. They are looking to innovative and complementary technologies that will help reduce emissions while future-proofing their operations, including the adoption and deployment of extended reality (XR) technologies.

What is XR?

XR (or eXtended Reality) is an umbrella term that encompasses the immersive Augmented (AR), Mixed (MR) and Virtual Reality (VR) technologies.

Understand the differences between AR, MR and VR

XR technologies allow data (for the purposes of this post, 3D CAD and PLM data) to be interacted with as a 3D digital representation, in context and at full scale.

How can XR make it green?

On the surface, using XR devices may not seem like they are making much of a difference to an organisation’s green credentials. But small changes in the early stages of product development can filter through along the product’s lifecycle, from the initial design stages to the finished product rolling off the production line.

So where to use it…

Design reviews and prototyping

The design review process often starts with teams looking at a CAD design on a computer screen. However, you never quite get the full picture through a 2D monitor: it can be hard to imagine the true shape and scale of a product, and whether the ergonomics of a design will translate to a physical product.

Usually the next step is to produce 3D models of the product, sometimes at full scale or sometimes as a smaller representation. But either way, additional materials such as clay or foam are required. If the modelling phase needs more than one iteration of a model, then the amount of material used (and the waste product) adds up. And what happens to the models when they are finished with? Do they just get thrown away or are the materials repurposed?

By using XR technology, design reviews can be conducted in context and at full scale, giving designers and engineers the chance to make changes without having to build additional models. The changes can be made in the CAD system and then re-checked in the XR device.

This saves on physical design iterations and therefore the need for modelling materials, which in return reduces the waste produced as excess.


Training

XR is great for training. By practicing a process (such as the assembly of a component) virtually, no materials are wasted and no machinery needs to be used if errors are made. The training process can be repeated as many times as necessary, so when the operator is experienced enough, the materials and machinery should only need to be used once.
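As a back-of-the-envelope illustration of that saving (all numbers invented), the material avoided scales with the number of practice runs moved into XR:

```python
# Hypothetical material savings from rehearsing an assembly in XR.
clay_per_mockup_kg = 15       # material consumed per physical practice run
physical_runs_without_xr = 6  # trainee attempts before proficiency
physical_runs_with_xr = 1     # only the final, for-real run uses material

saved_kg = clay_per_mockup_kg * (physical_runs_without_xr - physical_runs_with_xr)
print(f"material saved per trainee: {saved_kg} kg")  # → material saved per trainee: 75 kg
```

Multiply that by the number of trainees and iterations per year and the waste reduction, while modest per person, compounds quickly.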


Collaboration

Today’s products are rarely developed with the luxury of co-located teams. Globally distributed design is practically unavoidable, and increased home working further adds to the challenge of effective collaboration between design teams.

By using the collaboration capabilities of XR to connect with globally located teams for processes such as design reviews and factory layouts, previously required travel (including flights to visit various facilities) becomes redundant. With air travel being a major contributor to CO2 emissions, taking that need for travel away will make a difference to your green credentials (and your wallet!).

Starting small is still a start

Obviously there are changes that need to be made to manufacturing processes on a grander scale in the attempt to cut emissions and waste, but these projects could take years to implement and cost a lot of money. Switching to XR for some smaller elements, like those mentioned above, may not make a massive dent in the fight against climate change, but it’s a good place to start.

AR and VR in Manufacturing

Perhaps the most promising XR technology applications are found in manufacturing and industrial environments. Indeed, according to PwC research, the use of VR, MR and AR in product development could raise GDP by $360 billion by 2030.

When determining the optimal deployment strategy for these technologies, manufacturing and industrial business leaders should consider the following:

  1. XR has the potential to improve the product design process
  2. XR tech can aid in the process of production planning
  3. AR is suitable for assembly lines

Examples are given in the article about immersive XR use including Google Glass and Microsoft HoloLens with examples from workers at GE. The original article can be read here.

ThirdEye Announces Razor MR Glasses, Expands into Consumer Metaverse with New Lightweight Solution

With the Razor MR Glasses’ lightweight, all day wearable form factor, consumers can experience a total immersive metaverse solution. The applications available on the consumer MR glasses range from gaming and entertainment to telehealth and remote assistance. Game developers are creating multi-player metaverse apps for users wearing Razor MR Glasses, where they can view digital information overlaid onto a cityscape. Users can also watch movies or their favorite TV shows with spatial audio.

Repairs and appointments can be handled via the MR glasses as well. Consumers can use existing ThirdEye software, such as RemoteEye, to get real-time help from maintenance crews for fixing things at home, or to take an inventory of assets at home for insurance purposes. Consumers can also use ThirdEye’s RespondEye platform to communicate with their doctors or caregivers remotely, allowing the remote doctor to view the patient in real time with AR annotations.

“Through the feedback we’ve received from customers since we launched in 2016, we’ve found there to be a great desire to bring our lightweight solutions and user-friendly applications, like RemoteEye, for home use as well,” said Nick Cherukuri, Founder and CEO of ThirdEye. “For the Razor MR Glasses, we wanted to accommodate a variety of needs. For example, these mixed reality glasses are lightweight and myopia friendly, allowing nearsighted users to adjust the Razor MR Glasses from zero to negative five diopters with a single twist of a knob on the side of the glasses. Now, no one will need to attempt stacking multiple eyewear pieces – as is needed with VR solutions, making it extremely comfortable for daily use.”

In addition, the Razor MR Glasses already support many metaverse applications that users can access in ThirdEye’s app store, including RemoteEye for any remote assistance aid and HIPAA-certified RespondEye for telehealth. The Razor MR Glasses feature a refresh rate of 70 Hz and two noise-canceling microphones to prevent lag and enable clear communication. The Razor MR Glasses can connect with most Android and iOS devices, including all phones that support display port (DP) output, laptops and tablets with a USB-C port, and gaming consoles through HDMI adapters.

Foldable and lightweight at 85 grams, the Razor MR Glasses are comfortable to wear on the go or at home for extended periods of time. The glasses allow users to remain hands-free in a variety of activities, including interacting on social media, utilizing a multi-purpose assistant, exercising with a personal trainer via a heads-up display coach, and immersing themselves in mixed reality games. The Razor MR Glasses run on the Android 9.0 operating system, boast a 43-degree field of view (FOV) (equivalent to a 120-inch display), and have an 8-hour battery life. Additional features include voice control and a dual high-definition (HD) directional sound system.

The Razor MR Glasses have already received preorders from leading consumer and telecom companies.

The new Razor MR Glasses are currently in production and will be shipping later this year. Users can pre-order or receive more information at www.thirdeyegen.com or by contacting sales@thirdeyegen.com.

Read ThirdEye AREA member profile here https://thearea.org/area-members/thirdeye-gen/