
RealWear Launches Cloud Offering

RealWear Cloud is a new multi-purpose software offering for IT and business operations. Through the new dashboard, IT and business operations teams can remotely and securely streamline control of their RealWear device fleet. As companies grow their fleets of RealWear devices, RealWear Cloud allows for convenient low-touch, over-the-air firmware updates, keeping devices secure and company data protected. Working alongside organizations’ existing EMM or MDM software, such as Microsoft Endpoint Manager (Intune), the offering also provides teams with more real-time data and metrics to optimize operational efficiency. RealWear Cloud complements existing EMM/MDM solutions and enables device-specific control and configuration capabilities. It is also the only way to gain trusted and secure access to certified third-party apps designed for our product portfolio.

In addition, RealWear is introducing RealWear Cloud Assistance as part of the offering. RealWear Cloud Assistance provides real-time remote technical support and troubleshooting to frontline workers to quickly identify, diagnose, and fix device issues. Reducing device downtime through remote troubleshooting will have a growing impact on company bottom lines: according to VDC Research, each incident of device failure results in 72 minutes of lost or disrupted productivity for frontline workers. Remote support, firmware updates, and data analytics will not only increase productivity but will become necessary as businesses face ongoing talent shortages, which Gartner notes were exacerbated in 2021.

“As a deployment of RealWear devices grows across sites and countries, it’s critical that we provide great IT tools and real-time metrics for those ultimately responsible for the successful deployment of the devices in the field,” said Andrew Chrostowski, Chairman and CEO of RealWear. “We’re capturing data that will drive better decisions. It’s exciting to see RealWear transitioning from a device-centric company to a platform solution company with the introduction of our first software-as-a-service (SaaS) offering.”

RealWear’s previous lightweight device management tool will transition to RealWear Cloud. Current Cloud customers will automatically be enrolled in the Basic plan.

“Wearable technologies are becoming more and more mainstream in the enterprise, and making deployments simple and frictionless is one of our key goals,” continued Chrostowski. “Wearables are no longer viewed as a novelty but are now trusted by enterprises to bring value and solve real-world problems.”

About RealWear

As the pioneer of assisted reality wearable solutions, RealWear® works to engage, empower, and elevate the modern frontline industrial worker to perform work tasks more safely, efficiently, and precisely. Supporting over 65,000 devices, RealWear gives workers real-time access to information and expertise while keeping their hands and field of view free for work. Headquartered in Vancouver, Washington and used by 41 of the Fortune 100 companies, RealWear is field-proven in a wide range of industries with thousands of world-class customers, including Shell, Goodyear, Mars, Colgate-Palmolive, and BMW.




Magic Leap 2 – Pricing Released


Magic Leap 2 Base

$3,299 (US only)

Magic Leap 2 Base targets professionals and developers who wish to access one of the most advanced augmented reality devices available. Use in full commercial deployments and production environments is permitted. The device starts at an MSRP of $3,299 USD (US only) and includes a 1-year limited warranty.


Magic Leap 2 Developer Pro

$4,099 (US only)

Magic Leap 2 Developer Pro provides access to developer tools, sample projects, enterprise-grade features, and monthly early releases for development and test purposes. It is recommended only for internal use in the development and testing of applications; use in full commercial deployments and production environments is not permitted. Magic Leap 2 Developer Pro starts at an MSRP of $4,099 USD (US only) and includes a 1-year limited warranty.


Magic Leap 2 Enterprise

$4,999 (US only)

Magic Leap 2 Enterprise targets environments that require flexible, large-scale IT deployments and robust enterprise features. This tier includes quarterly software releases, fully manageable via enterprise UEM/MDM solutions. Use in full commercial deployments and production environments is permitted. Magic Leap 2 Enterprise comes with 2 years of access to enterprise features and updates, starts at an MSRP of $4,999 USD (US only), and includes an extended 2-year limited warranty.

Most Immersive

Magic Leap 2 is the most immersive AR device on the market. It features industry-leading optics with up to a 70° diagonal FOV, the world’s first dynamic dimming capability, and powerful computing in a lightweight, ergonomic design to elevate enterprise AR solutions.

Built for Enterprise

Magic Leap 2 delivers a full array of capabilities and features that enable rapid and secure enterprise deployment. With platform-level support for complete cloud autonomy, data privacy, and device management through leading MDM providers, Magic Leap 2 offers the security and flexibility that businesses demand.

Empowering Developers

Magic Leap 2’s open platform provides choice and ease-of-use with our AOSP-based OS and support for leading open software standards, including OpenGL and Vulkan, with OpenXR and WebXR coming in 2H 2022. Our platform also supports your choice of engines and tools and is cloud agnostic. Magic Leap 2’s robust developer portal provides the resources and tools needed to learn, build, and publish innovative solutions.
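The OpenXR and WebXR support called out above is defined by open standards, so it can be exercised with standard browser APIs. Below is a minimal, hedged sketch (not Magic Leap-specific code, and assuming WebXR type definitions such as @types/webxr are available) of how a web app could feature-detect immersive AR:

```typescript
// Minimal sketch: feature-detect the browser's WebXR immersive AR mode.
// Not Magic Leap-specific; assumes WebXR type definitions (e.g. @types/webxr)
// or an equivalent ambient declaration are available.
async function supportsImmersiveAr(): Promise<boolean> {
  const xr = (navigator as Navigator & { xr?: XRSystem }).xr;
  if (!xr) {
    console.log('WebXR is not available in this browser.');
    return false;
  }
  const supported = await xr.isSessionSupported('immersive-ar');
  console.log(`immersive-ar supported: ${supported}`);
  return supported;
}

supportsImmersiveAr().catch(console.error);
```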




Magic Leap and NavVis Announce Strategic Partnership to Enable 3D Mapping and Digital Twin Solutions in the Enterprise

Combining Magic Leap’s advanced spatial computing platform with NavVis’s mobile mapping systems and spatial data platform, the two companies aim to enhance the use of AR applications across key industries, including automotive, manufacturing, retail and the public sector.

As part of this strategic partnership, NavVis will bring its NavVis VLX mobile mapping system and NavVis IVION Enterprise spatial data platform to Magic Leap’s new and existing enterprise customers with an initial focus on manufacturing. Magic Leap customers will be able to leverage NavVis’s expansive visualization capabilities to generate photorealistic, accurate digital twins of their facilities at unprecedented speed and scale.

The market opportunity for digital twins and other forms of advanced visualization is significant – with demonstrated potential to transform the world of work as we know it. While attention around the potential of the metaverse has put a greater focus on all types of mixed reality technology, AR represents an immediate opportunity for businesses to enhance productivity and improve operational efficiency. Magic Leap’s open, interoperable platform will also enable the metaverse to scale for enterprise applications.

While the Magic Leap 2 platform offers cutting-edge scanning and localization capabilities in real-time on the device itself, NavVis’s technology will allow Magic Leap customers to pre-map and deploy digital twins in large, complex settings that can cover up to millions of square feet – including but not limited to warehouses, retail stores, offices and factories – for a variety of use cases, such as remote training, assistance and collaboration. Such applications will enable companies to reduce operational costs, enhance overall efficiency and democratize the manufacturing workforce of tomorrow.

“We are seeing significant demand for digital twin solutions from our enterprise customer base and are thrilled to partner with NavVis to make our shared vision for large-scale AR applications a reality,” said Peggy Johnson, CEO of Magic Leap. “Coupled with our Magic Leap 2 platform, NavVis’s advanced visualization capabilities will enable high-quality, large-scale and novel AR experiences that business users demand.”

The NavVis partnership is an essential component of Magic Leap’s strategy to cultivate an ecosystem of best-in-class technology partners that will deliver on the promise of enterprise AR, leveraging Magic Leap 2’s powerful, open platform. With a global customer base of more than 400 companies, including the likes of BMW, Volkswagen, Siemens and Audi, NavVis has a proven track record of delivering immediate and long-term value to enterprises looking to modernize their operations.

“Enterprise AR solutions for larger-scale activations will open the door for greater innovation in the workplace,” said Dr. Felix Reinshagen, CEO and co-founder of NavVis. “Our own experience shows that 3D mapping and digital twins are a fundamental foundation for large-scale persistent AR applications. We’re experiencing strong demand across many verticals with industrial manufacturing as a clear front runner. Magic Leap is a world leader in delivering impactful, innovative experiences in these verticals, and we are excited to collaborate with the company to advance this mission and further enable the future of work.”

About Magic Leap

Magic Leap, Inc.’s technology is designed to amplify human potential by delivering the most immersive Augmented Reality (AR) platform, so people can intuitively see, hear, and touch digital content in the physical world. Through the use of our advanced, enterprise-grade AR technologies, products, platforms, and services, we deliver innovative businesses a powerful tool for transformation.

Magic Leap, Inc. was founded in 2010 and is proudly headquartered in South Florida, with eight additional offices across the globe.

About NavVis

Bridging the gap between the physical and digital world, NavVis enables service providers and enterprises to capture and share the built environment as photorealistic digital twins. Their SLAM-based mobile mapping systems generate high-quality data with survey-grade accuracy at speed and scale. And with their digital factory solutions, users are equipped to make better operational decisions, boost productivity, streamline business processes, and improve profitability. Based in Munich, Germany, with offices in the United States and China, NavVis has customers worldwide in the surveying, AEC, and manufacturing industries.




Blippar brings AR content creation and collaboration to Microsoft Teams

LONDON, UK – 14 June 2022 – Blippar, one of the leading technology and content platforms specializing in augmented reality (AR), has announced the integration of Blippbuilder, its no-code AR creation tool, into Microsoft Teams.

Blippbuilder, the company’s no-code AR platform, is the first of its type to combine drag-and-drop functionality with SLAM, allowing creators at any level to build realistic, immersive AR experiences. Absolute beginners can drop objects into a project, and once published the objects stay firmly in place using Blippar’s proprietary surface detection. These experiences will serve as the foundation of the interactive content that will make up the metaverse.

Blippbuilder includes access to tutorials and best-practice guides to familiarise users with AR creation, taking them from concept to content. Experiences are built to be engaged with via the browser – known as WebAR – removing the friction of, and reliance on, dedicated apps or hardware. WebAR experiences can be accessed through a wide range of platforms, including Facebook, Snapchat, TikTok, WeChat, and WhatsApp, alongside conventional web and mobile browsers.
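Because WebAR runs in the browser, the entry point is the standard camera permission flow rather than an app install. The sketch below is illustrative only and assumes nothing about Blippar’s internal runtime; it simply shows the standard getUserMedia call a browser-based AR experience relies on to access the rear camera:

```typescript
// Minimal sketch of the camera permission step that browser-based AR
// (WebAR) relies on. Blippar's runtime handles this internally; this
// snippet only illustrates the underlying standard browser API.
async function startRearCamera(video: HTMLVideoElement): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'environment' }, // rear camera on mobile devices
    audio: false,
  });
  video.srcObject = stream;
  await video.play();
}
```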

Teams users can integrate Blippbuilder directly into their existing workflow. Designed with creators and collaborators in mind, whether they are product managers, designers, creative agencies, clients, or colleagues, it lets organisations unite their approach and implementation, all within Teams. Adaptive cards, single sign-on, and notifications, alongside real-time feedback and approvals, provide immediate transparency and seamless integration from inception to distribution. Tooltips, support features, and starter projects also allow teams to begin creating straight away.
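The adaptive cards mentioned above follow Microsoft’s open Adaptive Cards schema. As a hedged illustration (the project name, button text, and URL below are hypothetical, not Blippar’s actual payloads), a review notification posted into a Teams channel might look like this:

```typescript
// Hedged sketch of an Adaptive Card a Teams integration might post when an
// AR experience is ready for review. The schema is the standard Adaptive
// Cards format; the project name and URL are hypothetical placeholders.
const reviewCard = {
  $schema: 'http://adaptivecards.io/schemas/adaptive-card.json',
  type: 'AdaptiveCard',
  version: '1.4',
  body: [
    { type: 'TextBlock', text: 'AR experience ready for review', weight: 'Bolder', size: 'Medium' },
    { type: 'TextBlock', text: 'Project: Spring packaging campaign', wrap: true },
  ],
  actions: [
    { type: 'Action.OpenUrl', title: 'Open in Blippbuilder', url: 'https://example.com/project/123' },
  ],
};
```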

“The existing process for creating and publishing AR for businesses, agencies, and brands is splintered. Companies are forced to use multiple tools and services to support collaboration, feedback, reviews, updates, approvals, and finalization of projects,” said Faisal Galaria, CEO at Blippar. “By introducing Blippbuilder to Microsoft Teams workstreams, including team channels and group chats, we’re making it easier than ever before for people to collaborate, create and share amazing AR experiences with our partners at Teams”.

Utilizing the powerful storytelling and immersive capabilities of AR, everyday topics, objects, and content, from packaging, virtual products, adverts, and e-commerce, to clothing and artworks, can be ‘digitally activated’ and transformed into creative, engaging, and interactive three-dimensional opportunities.

Real-life examples include:

  •  Bringing educational content to life, enabling collaborative, immersive learning
  •  Visualising and discussing architectural models and plans with clients
  •  Allowing product try-ons and 3D visualisation in e-commerce stores
  •  Creating immersive onboarding and training content
  •  Presenting and discussing interior design and event ideas
  •  Bringing print media and product packaging to life
  •  Enabling artists and illustrators to redefine the meaning of three-dimensional artworks

In today’s environment of increasingly sophisticated user experiences, customers are looking to move their technologies forward efficiently and collaboratively. Having access to a comprehensive AR creation platform is a feature that will keep Microsoft Teams users at the forefront of their industries. Blippbuilder in Teams is the type of solution that will help customers improve the quality and efficiency of their AR building process.

Blippar also offers a developer creation tool, its WebAR SDK. While Blippbuilder for Teams is designed to be an accessible and time-efficient entry point for millions of new users, following this validation of AR, organisations can progress to building experiences with Blippar’s SDK. The enterprise platform boasts the most advanced implementation of SLAM and marker tracking, alongside integrations with the key 3D frameworks, including A-Frame, PlayCanvas, and Babylon.js.
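For context, the engines named above are open web 3D frameworks. The sketch below is a generic Babylon.js scene with the browser’s immersive-ar mode enabled; it does not use Blippar’s proprietary WebAR SDK API, which layers its SLAM and marker tracking on top of a scene like this:

```typescript
// Generic Babylon.js AR scene sketch (assumes the @babylonjs/core package
// and a WebXR-capable browser); Blippar's WebAR SDK is not shown here.
import { Engine, Scene, FreeCamera, HemisphericLight, MeshBuilder, Vector3 } from '@babylonjs/core';

async function createScene(canvas: HTMLCanvasElement): Promise<void> {
  const engine = new Engine(canvas, true);
  const scene = new Scene(engine);

  const camera = new FreeCamera('camera', new Vector3(0, 1.6, -2), scene);
  camera.attachControl(true);
  new HemisphericLight('light', new Vector3(0, 1, 0), scene);

  // A 10 cm cube standing in for a product model or other 3D asset.
  MeshBuilder.CreateBox('asset', { size: 0.1 }, scene);

  // Request the browser's immersive AR mode where available.
  await scene.createDefaultXRExperienceAsync({
    uiOptions: { sessionMode: 'immersive-ar' },
  });

  engine.runRenderLoop(() => scene.render());
}
```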




Factory layout Experience – Theorem Solutions

Optimize designs in immersive XR

The Factory Layout Experience enables a planning or layout engineer, working independently or with a group of colleagues, locally or in remote locations, to optimize factory layouts through the immersive experience of eXtended Reality (XR) technologies. Seeing your data at full scale and in context instantly reveals the clashes, access issues, and missing items that a CAD screen cannot show.

On the shop floor there are thousands of pieces of equipment, much of it bought in and designed externally. Building designs may only exist as scans or in architectural CAD systems, and robot cells may be designed in specialist CAD systems. There will be libraries of hand tools, storage racks, and stillage equipment designed in a range of CAD systems, and product data designed in-house in mechanical CAD. To understand the factory and assess changes, all of that has to be brought together to get a full picture of where a new line, robot cell, or workstation will fit.

A catalogue of 3D resources can be snapped to existing 2D factory layouts to quickly realize a rich 3D layout. Advanced positioning makes it easy to move, snap, and align 3D data. Widely used plant and equipment is readily available, so there is no need to design it from scratch for every new layout. Simplified layout tools enable non-CAD experts to position, align, and snap layout objects quickly, allowing all stakeholders to be involved in the process and improving communication.
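To make the “position, align and snap” workflow above concrete, here is an illustrative sketch (not Theorem’s actual API; the grid size and angle step are assumptions) of the kind of snapping a layout tool applies when a catalogue item is dropped onto a 2D floor plan:

```typescript
// Illustrative snapping logic for layout objects on a floor plan:
// positions are quantised to a grid and rotations to fixed increments.
interface Pose { x: number; y: number; rotationDeg: number; }

function snapPose(pose: Pose, gridSize = 0.5, angleStep = 15): Pose {
  const snap = (v: number, step: number) => Math.round(v / step) * step;
  return {
    x: snap(pose.x, gridSize),
    y: snap(pose.y, gridSize),
    rotationDeg: snap(pose.rotationDeg, angleStep),
  };
}

// Example: a rack dropped at (3.27 m, 4.91 m, 37°) lands at (3.5, 5.0, 30°).
console.log(snapPose({ x: 3.27, y: 4.91, rotationDeg: 37 }));
```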

Testing Design and Operational Factors

Human-centred operations can be analysed using mannequins that can be switched to match different worker characteristics. You can test design and operational aspects of a variety of human factors to determine reachability, access, and injury-risk situations, ensuring compliance with safety and ergonomic standards.

The experience helps companies avoid costly layout redesign by allowing all parties involved to review the layout collaboratively, make or recommend changes, and capture those decisions for later review by staff who could not attend the session.




Building an immersive pharma experience with XR technology

In the world of pharma manufacturing, precision is key. To execute flawlessly, pharmaceutical scientists and operators need the proper training and tools to accomplish the task. User-friendly augmented reality (AR) and extended reality (XR) technology that can provide workflow guidance to operators is invaluable, helping name-brand companies get drugs, vaccines, and advanced therapies to patients faster.

AR has been a cost-effective way to improve training, knowledge transfers, and process execution in the lab during drug discovery and in the manufacturing suite during product commercialization. Apprentice’s AR Research Department is now seeing greater demand within the pharma industry for XR software capabilities that allow life science teams to use 3D holograms to accomplish tasks.

For example, operators are able to map out an entire biomanufacturing suite in 3D using XR technology. This allows them to consume instructional data while they work with both hands, or better understand equipment layouts. They can see and touch virtual objects within their environment, providing better context and a much more in-depth experience than AR provides.

Users can even suspend metadata in a 3D space, such as at the entrance to a room, so that they can interact with their environment in a much more complete way, with equipment, objects, and instruments tethered to the space. Notifications, for example gowning requirements or biohazard warnings, will automatically pop up as the operator walks in, enriching the environment with information that’s useful to them.
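One common way to implement that kind of spatially anchored notification is a simple proximity check between the headset pose and an anchor pinned in the mapped suite. The following is a hedged sketch, not Apprentice’s implementation; the data shapes and trigger radius are assumptions:

```typescript
// Hedged sketch: return the spatially anchored notifications that should
// be shown for the operator's current headset position.
interface Vec3 { x: number; y: number; z: number; }

interface SpatialNote {
  anchor: Vec3;          // position pinned in the mapped suite
  radiusMeters: number;  // how close the operator must be to trigger it
  message: string;       // e.g. gowning or biohazard guidance
}

function activeNotes(headsetPos: Vec3, notes: SpatialNote[]): SpatialNote[] {
  const dist = (a: Vec3, b: Vec3) =>
    Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
  return notes.filter((n) => dist(headsetPos, n.anchor) <= n.radiusMeters);
}
```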

“It’s all about enhancing the user experience,” said Linas Ozeratis, Mixed Reality Engineer at Apprentice.io. “At Apprentice, our AR/XR Research Team has designed pharma-specific mixed-reality software for the HoloLens device that will offer our customers an easier, more immersive experience in the lab and suite.”

Apprentice’s XR/AR Research Team is currently experimenting with new menu design components for the HoloLens device that will reshape the future of XR user experiences, making it easier for users to interact with menus using just their fingers.

Apprentice’s “finger menu” feature allows users to trigger an action or step by ‘snapping’ together the thumb and individual fingers of the same hand. Each finger contains a different action button that can be triggered at any time during an operator’s workflow.

“Through our research, we’ve determined that the fingers are an ideal location for attaching AR buttons, because it allows users to trigger next steps without their arm or hand blocking the data they need,” Ozeratis added. “It’s quite literally technology at your fingertips.”
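As a rough illustration of the interaction model described above (not Apprentice’s HoloLens code; the joint data, pinch threshold, and action names are hypothetical), a finger menu amounts to mapping thumb-to-fingertip pinches onto workflow actions:

```typescript
// Illustrative sketch only: map thumb-to-fingertip "snaps" (pinches) to
// workflow actions. Joint positions would come from the device's hand
// tracking; the threshold and action names here are hypothetical.
type Finger = 'index' | 'middle' | 'ring' | 'pinky';
interface Point3 { x: number; y: number; z: number; }

const fingerActions: Record<Finger, () => void> = {
  index: () => console.log('Next step'),
  middle: () => console.log('Previous step'),
  ring: () => console.log('Capture photo'),
  pinky: () => console.log('Flag deviation'),
};

function detectPinch(
  thumbTip: Point3,
  fingerTips: Record<Finger, Point3>,
  thresholdMeters = 0.02,
): void {
  for (const finger of Object.keys(fingerTips) as Finger[]) {
    const tip = fingerTips[finger];
    const d = Math.hypot(thumbTip.x - tip.x, thumbTip.y - tip.y, thumbTip.z - tip.z);
    if (d < thresholdMeters) fingerActions[finger]();
  }
}
```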

Why does the pharma industry want technology like this? Aside from the demand, there are situations where tools like voice commands are simply not feasible. The AR Research Team also learned that interactive finger menus feel more natural to users and can be mastered quickly. Life science teams are able to enhance training capabilities, improve execution reliability and expand the types of supporting devices they can apply within their various environments.

“Introducing these exciting and highly anticipated XR capabilities is just one stop on our roadmap,” Ozeratis adds. “There are bigger and bolder things ahead that we look forward to sharing as the pharma industry continues to demand more modern, intelligent technologies that improve efficiency and speed.”




A Talk with Christine Perey About the AREA Interoperability & Standards Program

 

AREA: How long have you been involved in standardization activities?

Perey: My role in standardization activities began in 1994 when I joined the ITU-T committee standardizing video conferencing. Seeing needs for interoperability in AR as early as 2010, I formed and led a grassroots community advocating for development of standards for AR. I have chaired dozens of meetings and workshops, and given dozens of webinars on the topics of projects and/or standards that could contribute to the advancement and adoption of open interfaces and AR interoperability. I work directly with a wide range of standards development organizations (SDO). As a member, a working group chair or co-chair, or as an invited expert, I currently contribute to nearly 20 standards. Outreach and coordination between SDOs is another passion of mine. On October 4, 2021, I chaired a tutorial coordinated with Khronos Group and ETSI ISG ARF about AR interoperability and standards in the context of the ISMAR 2021 conference. I encourage people interested in this topic and seeking to better understand what’s available to explore the tutorial website.

AREA: Tell us more about the AREA Interoperability & Standards program.

Perey: Through the Interoperability & Standards program, the AREA seeks to increase knowledge about the benefits of and approaches to achieving interoperability, and to advance the development of standards or other approaches to interoperability. That entails:

  •  informing AREA members and the enterprise AR ecosystem about existing standards for interoperable AR solutions through the development of thought leadership content;
  •  supporting the identification of interoperability requirements in customer organizations;
  •  supporting the identification of interfaces in AR components that, through implementations, provide interoperability in enterprise AR solutions and services;
  •  engaging with organizations and members, including those dedicated to standards development and the promotion of standards, to provide requirements; and
  •  building a base of AR professionals who are well versed in the implementation of existing standards for AR, and promoting the development and adoption of extensions to existing standards as well as new standards.

AREA: Why are standards so important to enterprise AR adoption?

Perey: The motivations for adopting standards depend on the segment of the ecosystem to which a company belongs. Let’s take the customer segment, because when technology buyers are successful, so are their partners and providers. Today, when companies begin evaluating enterprise AR use cases they do so with isolated projects (products are not integrated with enterprise systems) and using products of one or a few technology providers. In companies that are advanced in their study of AR, there can be partial or full testbeds of multiple AR technology providers, but they are often isolated from other AR projects and are not integrated with enterprise systems.

A company seeking to maintain and expand its testing within a specific technology segment (e.g., comparing multiple providers or models of hardware), or to implement at scale in its enterprise, confronts significant obstacles. It has been demonstrated in other industries that when standards or open source interfaces and guidelines have been widely accepted and implemented across an ecosystem, higher technology interoperability can: reduce barriers to deployment of multivendor or multi-product solutions (also known as “integration”); lower costs of ownership; reduce risks of vendor lock-in; and increase innovation and opportunities for new sales through provider specialization. Barriers are removed and everyone benefits.

AREA: What’s on the horizon for the AREA Interoperability & Standards program?

Perey: We will continue to develop thought leadership content, through hosted webinars, white papers, and blog articles, as well as participation in relevant conferences and events. As the awareness of interoperability as a key to success rises, we will work with large enterprises deploying AR to develop their interoperability requirements and integration needs and bring them to the attention of SDOs and the AR technology providers. We will act as a conduit from SDOs to AREA member companies – providers as well as customer segment members – to share SDO draft specifications and gather and deliver feedback to them. And, where there are implementations and testing suites, we will work to support the testing of products and services that comply with international standards in real-world settings.

AREA: Why should AREA members consider participating in the Interoperability & Standards program?

Perey: This is a program that can only thrive when AR customers are actively sharing their requirements and real-world experiences. So we’re looking for AREA members to contribute to the program by preparing blog posts on topics that will share their thought leadership and raise awareness about specific or general challenges. Topics could include: key interoperability and standards requirements for enterprise AR; developing best practices for safety, human factors, and more; sharing their experiences in standards development; and recounting their experiences implementing one or more standards in specific use cases or products. AR component and solution providers will increasingly be able to showcase interoperability through AREA programs to advance interoperability such as plug-fests and testbeds. Now is the time, while AR standards are under development, to make sure your voice is heard, your needs are being considered, and your experiences are being shared.

If you’re an AREA member and would like more information about participating in the AREA Interoperability & Standards program, please contact Christine Perey. If you’re not yet an AREA member but want to see an AR ecosystem that derives the full benefit of standardization and interoperability efforts, please consider joining us. You can find membership information here.

 

 




Using Theorem-XR and HoloLens 2 for Engineering Reviews

You can watch the full webinar on Using Theorem-XR and HoloLens 2 for Engineering Reviews.

Key highlights

Theorem has also picked out 5 key benefits of using the Microsoft HoloLens 2 in Engineering which appear in full detail on their blog.

  •  Visualize your models at full scale.
  •  Work collaboratively with other engineers.
  •  Make better factory planning decisions.
  •  Work with large datasets using Azure Remote Rendering.
  •  Work with colleagues who are using other XR technologies, or none at all.




New AR deal to help steel industry protect vital skills and move towards net zero

The project will initially use Vuforia Studio technology to overlay live data – taken from the ThingWorx® industrial platform – onto various points of the facility, so that operators moving around will be able to make informed decisions on changes to casting and melting lines or troubleshoot issues before they happen.
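For a sense of what overlaying live data taken from the ThingWorx platform involves, the sketch below reads a single property over ThingWorx’s REST API before it is bound to an AR overlay. The server address, thing name, and property name are placeholders, and a production Vuforia Studio experience would use the platform’s own data bindings rather than hand-written fetches:

```typescript
// Hedged sketch (placeholder server, thing, and property names) of reading
// a live value from a ThingWorx server over its REST API. Assumes a
// Node-style environment variable holds the application key.
async function readCasterTemperature(): Promise<number> {
  const res = await fetch(
    'https://thingworx.example.com/Thingworx/Things/CasterLine1/Properties/MeltTemperature',
    {
      headers: {
        appKey: process.env.THINGWORX_APP_KEY ?? '',
        Accept: 'application/json',
      },
    },
  );
  if (!res.ok) throw new Error(`ThingWorx request failed: ${res.status}`);
  const data = await res.json();
  // ThingWorx returns an InfoTable; the property value sits in rows[0].
  return data.rows[0].MeltTemperature as number;
}
```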

It is anticipated that Augmented Reality will make it easier for staff to have the right information at exactly the point they need it, whilst the use of HoloLens and RealWear glasses will mean the individual has both hands free to complete tasks.

This project will contribute to the sector’s longer-term desire to move towards a net zero steel works by 2050 and is part of the £22m PRISM steel and metals sector research and innovation programme being delivered by the Materials Processing Institute with funding provided through Innovate UK, part of UK Research and Innovation.

“The successful implementation of digital technologies has the potential to save tens of millions of pounds every year,” explained Chris Oswin, Group Manager of Digital Technologies at the Materials Processing Institute.

“We are taking responsibility for exploring IIoT platforms and AR and working out how we can get the most out of them in a live steel plant, learning from testing and trials to identify best use cases.”

He continued: “This means we absorb a lot of the time and remove the initial expenditure that could act as a barrier to entry for companies in our industry, hopefully encouraging digital adoption as we will have proved it works and how it can be applied to businesses.

“PRISM is guided by a team of industry leaders on our Industrial Advisory Board, including the Aluminium Federation, British Manufacturing Plant Constructors’ Association, British Steel, Celsa Steel, Liberty Steel, Outokumpu Stainless Steel, Sheffield Forgemasters, Swansea University, Tata Steel and the UK Metals Council.”

The Materials Processing Institute has a long-term relationship with PTC, with the latest project following on from the introduction of ThingWorx as part of the £10m programme to explore how digital technologies can be implemented in brownfield manufacturing sites.

In addition to optimising processes and introducing new efficiency improvements, Augmented Reality will also be used to capture some of the traditional skills in the sector that could be lost if the knowledge of older workers is not retained before they retire.

This will be achieved by using PTC’s Vuforia® software, with Vuforia Expert Capture allowing operators and technicians to capture their daily tasks as step-by-step instructions, in situ, when and where they do their work.

This content will be uploaded to the cloud, where it can be accessed by new starters or people switching roles, using HoloLens or RealWear devices for a real hands-on experience, or on other devices such as mobiles, tablets, or desktop computers.

Furthermore, for problem resolution and live ‘on the job’ support, there is Vuforia Chalk. Using mobile devices, digital eyewear, or a desktop, experts can connect with on- and off-site employees and customers and collaborate in real time. It combines live video and audio with the ability for remote and local participants to annotate their live shared view and mark up the real-world environment.

“If we don’t act soon, we stand to lose so much knowledge from the industry and AR gives us a cost effective and easy way to retain skills and experience in a virtual library for generations to come,” added Chris.

“Working closely with PTC’s experts, we can tailor how we capture information, footage and skills in what is a very demanding and intense environment. We believe we’ve got the initial framework to start the roll-out and will continue to adapt the processes as we understand more about how digital technologies can play a role.”

David Grammer, general manager for UKI at PTC, went on to add: “Covid-19 has definitely thrust the digital thread into the spotlight, but there is still a resistance to adoption due to a lack of awareness of how it will deliver a genuine business benefit.

“This project with the Materials Processing Institute gives an entire sector the opportunity to explore how AR can be applied and developed in a real live steel plant without the potential disruption and cost of trying it in their own facilities.

“Businesses will be involved in the roll-out and informing some of the test cases and our team will be on hand to support experts at the Institute to get the most out of our technology and software.

“The end goal is that we will have proven business cases on how steel and metals companies can optimise processes using Augmented Reality and live data, not to mention protecting vital skills for the steel workers of the future.”

PTC, which has bases in the UK and Ireland, provides a host of technology solutions to help industrial companies create value for themselves and the rest of the world.

This is achieved through a combination of Augmented Reality, Industrial IoT, Product Lifecycle Management and CAD solutions.




PTC expands spatial computing capabilities with Vuforia Engine area targets

Through the use of Area Targets, industrial organisations can create AR interfaces within their facilities to enable employees to better engage with machinery and understand how the environment is being utilised.

More information can be found here: https://library.vuforia.com/features/environments/area-targets.html

PTC says that with support from Matterport and Leica 3D scanners, along with NavVis’s indoor mobile mapping systems, Area Targets users can generate “photorealistic, survey-grade digital twins, empowering them to create digital canvases of spaces such as factories, malls, or offices for advanced spatial computing applications”.

As one of the leading emerging technologies, spatial computing powers digital twin renderings to support the activities of machines and people, as well as the environments in which they operate.

“When deployed across the industrial enterprise, spatial computing enables seamless interactions between employees through AR, enabling companies to close the loop on performance management, improve machine learning capabilities with spatial analytics, and optimise design and factory floor operations,” notes PTC.

“Vuforia Engine Area Targets is a one-of-a-kind solution for large, persistent AR experiences,” said Mike Campbell, Executive Vice President and General Manager of Augmented Reality, PTC.

“Whether users are looking to add navigation to their office building or view in-context data on a factory floor, Area Targets is the answer. We’re pleased to be expanding such a key capability and component of PTC’s spatial computing vision.”

The release of Vuforia Engine Area Targets marks the second Vuforia offering to deploy spatial computing in the form of area targets within the industrial setting, the first being the Vuforia Spatial Toolbox platform.

Combined with the Vuforia Chalk, Vuforia Expert Capture, and Vuforia Studio AR products, the Vuforia AR Enterprise Platform provides what PTC says is a “robust set of offerings that enables users to increase workforce safety and efficiency, improve customer experiences, and reduce costs”.