
XR And Spatial Computing Were Everywhere At MWC 2023


This year’s Mobile World Congress, MWC 2023, was the first true Mobile World Congress since 2019, and it was quite apparent that the show was back in full swing. I attended the event last year, but it was considerably smaller—a shell of its former self—which is something I couldn’t say about this year’s show. The GSMA, which organizes the event, says that 88,000 people attended this year, up from 66,000 last year (albeit still down from 2019’s 109,000).

 





Augmented Reality for Enterprise Alliance Publishes Key Research

March 6, 2023 PR

3D models or point clouds can reduce the cost, time, and developer training needed to view an object or environment with AR information, such as instructions, warnings, or routes, overlaid on the physical world. Despite its relatively young presence in the enterprise sector, AR technology has rapidly evolved into a powerful tool with broad versatility and a thriving community of experts.

AR technology is already being leveraged with 3D mapping data to provide strategic tools for site planning, instructional guidance, or real-time navigation. As AR technology advances, so will its capabilities to leverage 3D mapping data.

“3D mapping technology has become pervasive throughout various industries to capture objects and environments in a digital format such as point clouds or 3D models,” said Mark Sage, Executive Director of the AREA. “It allows for rapid visualization, communication, and prototyping without the additional physical overhead. Our new report offers developers, business decision makers, and companies interested in AR information about 3D mapping technology and techniques to eliminate resistance to augmented reality (AR) adoption.”

“This research helps to inform enterprise on how 3D mapping technologies can be utilized to capture accurate, cost-effective digital representations of real-world environments, how this data can be leveraged in augmented reality applications, and why these concepts can be useful in industrial environments,” said Samuel Neblett, Senior AR/VR Software Developer and 3D Modeler, Boeing Research & Technology.

The new AREA research report provides steps companies can take to ensure accurate and successful capture of objects and environments. A supporting sample project demonstrates a real-world example that leverages 3D scan data for an AR-assisted use case.

“The AREA research project is very valuable for corporations looking to use AR technologies. It offers a good overview of available 3D mapping solutions (including our AR solutions), and outlines the advantages of each,” said Markus Meixner, CEO, ViewAR.

An executive summary of the 3D Mapping Solutions for Enterprise AR research report is available on the AREA website, along with executive summaries of other AREA resources and enterprise guidance.

 

About the AREA

The Augmented Reality for Enterprise Alliance (AREA) is the only global non-profit, member-based organization dedicated to adopting interoperable AR-enabled enterprise systems. Whether you view it as the next computing paradigm, the key to breakthroughs in manufacturing and service efficiencies, or the door to unimagined applications, AR will have an unprecedented impact on enterprises of all kinds. Visit https://thearea.org for more information.

Note to editors: AREA is a program of Object Management Group® (OMG®). See the listing of all OMG trademarks. All other trademarks are the property of their respective owners.

 




Iristick announces major capital increase enabling accelerated user adoption


Iristick’s smart glasses have been used by professionals across industries in Europe and the US since the company was founded in 2015. Various large-scale proof-of-concept projects are underway, which are expected to lead to accelerated adoption of smart glasses in 2023-2024.

Currently, more than 700 companies rely on Iristick to help improve operational processes, from remote assistance to inspection of mission-critical processes. Beyond the basic use case of remote assistance, Iristick develops tailored solutions in close partnership with major corporations in healthcare, oil & gas, crop inspection, and logistics. In addition, Iristick supports telemedicine projects in developing countries as part of its social impact initiative, Social In Motion.

Key customers include Bayer, Siemens Energy, JBT, Houston Methodist Hospital, HG Molenaar, and Aviapartner.

Through a partnership with global ATEX market leader ECOM, a Pepperl+Fuchs subsidiary, Iristick developed the breakthrough Visor-Ex® glasses for use in harsh and potentially hazardous environments, which are typical for ECOM’s oil & gas customers.

Iristick will make follow-on investments in platform development, user adoption, and sales & marketing in both Europe and the United States. The company has appointed Karel Goderis, a software industry veteran, to lead this expansion in close cooperation with the existing management team.

About Iristick

Iristick, based in Antwerp and New York, is a leading producer of smart glasses that enable hands-free communication and information sharing for the deskless workforce across industries. Iristick smart glasses are used globally for remote assistance, step-by-step workflow guidance, pick-by-vision, and video conferencing. Iristick also supports NGOs with telemedicine equipment in the most underserved parts of the world, enabling teleconsultation and remote expert guidance in rural areas. The award-winning Iristick smart glasses are the most balanced and lightweight on the market, and they seamlessly connect with both iOS and Android smartphones.

Published on Feb 22, 2023




Magic Leap 2 is now commercially available

Widespread availability of the Magic Leap 2 comes after a successful Early Access Program with companies like Cisco, SentiAR, NeuroSync, Heru, Taqtile, PTC and Brainlab. During this period, Magic Leap continued to refine and improve the device for training, communication, and remote assistance use cases in clinical settings, industrial environments, defense, and retail stores.

“The Magic Leap 2 is the smallest and lightest augmented reality device built for the enterprise,” said Peggy Johnson, CEO of Magic Leap. “After working with customers across industries like healthcare, manufacturing and the public sector, we’re proud to release a device that features innovative breakthroughs critical to driving widespread adoption, including Dynamic Dimming™ technology, the industry’s largest field of view, and unparalleled image quality and text legibility. Magic Leap 2 will take the current use cases to the next level, and we can’t wait to see what our customers create.”

Magic Leap 2 integrates new innovations to address the historical barriers that have prevented the widespread adoption of AR technology and are critical to making AR a valuable tool for daily use in the healthcare, manufacturing/light industrial, retail, and defense sectors.

Key features and innovations of Magic Leap 2 include:

  • An open platform that empowers enterprises and developers with flexibility, cloud autonomy, and data privacy
  • 20% lighter and 50% smaller in volume than Magic Leap 1
  • Proprietary optics breakthroughs that enable best-in-class image quality, color fidelity, and text legibility
  • Largest field of view (up to 70° diagonal), compared to similar, currently available AR devices
  • Dynamic Dimming™ technology, a first-to-market innovation that enables Magic Leap 2 to be used more effectively in brightly lit settings with greater image solidity

Each of these advancements is designed to increase utility, comfort, and sustained use, in order to deliver what the enterprise market has been asking for — a device that can provide an immediate return on investment and can be worn for extended periods of time.

Three Commercially Available Editions

Magic Leap 2 is available in three editions:

Magic Leap 2 Base edition is best for standalone use by professionals and developers who wish to access the most immersive augmented reality device available.

Magic Leap 2 Developer Pro provides access to developer tools, sample projects, enterprise-grade features, and monthly early releases for development and test purposes. It is intended only for internal use in developing and testing applications; use in commercial deployments and production environments is not permitted.

Magic Leap 2 Enterprise is designed for environments that require flexible, large-scale IT deployments and robust enterprise features. This tier includes quarterly software releases fully manageable via enterprise UEM/MDM solutions. Use in commercial deployments and production environments is permitted. Magic Leap 2 Enterprise comes with 2 years of access to enterprise features and updates.




Theorem Solutions – Placing models using QR codes in Augmented & Mixed Reality

How to Use QR Codes in HoloLens 2 Mixed Reality

Video: Using the QR Code Offset tool in Microsoft HoloLens 2

The QR code offset feature, which uses QR Code Detection in Microsoft HoloLens 2, allows a QR code to be used as an origin point when visualizing 3D models in MR. In Theorem Solutions’ Visualization Pipeline, users can set where the digital model will appear in relation to a QR code; from then on, any time a QR code is used to load the model, it will appear in the same place.

This helps put models in context and allows users to see if something will fit in a certain location. For example, when seeing if parts would fit within an automotive setup, a QR code can be used to set the origin in the center of a car and digital models of parts can be positioned using the offset feature. This allows users to be more exact with the placement of their models when working with physical objects and digital models together.

Additionally, provided the QR code isn’t moved, this feature allows users to load a model in the same place every time. This gives users greater flexibility to their work process, allowing users to look at multiple models in succession, and then revisit a previous model with the assurance that the model will remain exactly where it needs to be.
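The mechanics behind such an offset can be sketched as composing rigid transforms: the saved offset is defined relative to the QR code, so re-detecting the code recovers the same world placement every time. This is an illustrative sketch with hypothetical function names, not Theorem's actual implementation:

```python
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def model_world_pose(qr_world_pose, stored_offset):
    """The model's world pose is the freshly detected QR pose composed with the saved offset."""
    return qr_world_pose @ stored_offset

# The QR code is detected 2 m in front of the camera, unrotated ...
qr_pose = make_pose(np.eye(3), np.array([0.0, 0.0, 2.0]))
# ... and the user previously saved the model 0.5 m to the right of the code.
offset = make_pose(np.eye(3), np.array([0.5, 0.0, 0.0]))

print(model_world_pose(qr_pose, offset)[:3, 3])  # model position: [0.5, 0.0, 2.0]
```

Because only the offset is stored, the model follows wherever the code is re-detected, which is exactly why the placement survives across sessions as long as the code itself is not moved.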

Using QR Codes in Augmented Reality

For example, the Image Tracking feature in Theorem-AR can be used to load a large factory layout or production line and position it over a QR code. This is ideal when you are looking to visualize large designs on a table top and see all the data at once. The ability to use this on a handheld device makes XR technology far more accessible for such use cases.

Use Cases

QR codes give users flexibility when working with digital objects that interact with a physical environment, and the feature is designed to be adaptable to a wide variety of use cases. Here are some examples of use cases that are enhanced by QR codes.

Precise Placement in MR – QR codes are particularly useful in XR when consistent precision is required. If you needed to line up two holes for a bolt to go through, for instance, you could use the QR Code offset feature to position the digital model correctly and ensure everything lines up.

Scaling large data in AR – Additionally, the ability to set a point of origin in the real world is useful when visualizing large data in AR. Having a positionable point of origin makes scaling much easier within AR, particularly when it comes to larger datasets. With QR codes you can scale the model larger or smaller, but the point of origin will remain the same.
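Keeping the origin fixed while scaling reduces to scaling each point's displacement from the anchor. A minimal sketch of that idea, with hypothetical names rather than product code:

```python
import numpy as np

def scale_about_anchor(points, anchor, factor):
    """Scale points about a fixed anchor so the anchor itself never moves."""
    return anchor + factor * (points - anchor)

anchor = np.array([1.0, 0.0, 2.0])   # where the QR code sits
vertex = np.array([2.0, 0.0, 2.0])   # a model point 1 m from the anchor

print(scale_about_anchor(vertex, anchor, 2.0))  # now 2 m from the anchor
print(scale_about_anchor(anchor, anchor, 2.0))  # the anchor itself stays put
```

Whatever the scale factor, the anchor maps to itself, which is why the QR-code origin stays stable as the model grows or shrinks around it.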

In Summary

QR codes can be used in Augmented and Mixed Reality in a variety of different ways depending on the use case. They allow users to set a point of origin in the real world using a QR code, and with Microsoft HoloLens 2 users can position models in relation to the QR code.

With QR codes you:

• Can load 3D models into the same position multiple times
• Gain more precision when arranging digital models alongside physical objects
• Are able to scale models more easily in AR
• Have greater flexibility with how your digital models are positioned in relation to the real world




Are AR and VR in Commercial Aviation Taking Off?

Which Aviation Groups are Providing AR and VR Solutions?

Celebi Aviation Holding is setting up an aviation academy in Turkey. The Celebi Aviation Academy has been certified by the International Air Transport Association (IATA) under its Training Validation Program (TVP), recognising the academy as an official Center of Excellence in Training and Development. Its VR training lets students virtually sit airside alongside a range of Airbus and Boeing planes, covering various elements of commercial aviation from pre-arrival to post-departure inspection. The scenarios give aviation students a safe environment in which to find solutions to identified faults, and can simulate difficult conditions such as adverse environmental conditions, LVO (low visibility operations) and night-time operations.

Japan Airlines is teaming with Asia’s largest manufacturer Tecknotrov and Qatar Airways to invest in more autonomous training for pilots and engineers. Aviation training and AR/VR have always had a strong relationship, both recently and throughout previous generations of the technology, so it comes as no surprise, given the well-respected foundation between the mixed reality space and aviation, that the AR and VR aviation market is predicted to grow to more than $1,372 million by the end of 2025.

The practical strides come where AR and VR fit into a worker’s day-to-day routine, with solutions for almost every team member involved in the flyer’s journey. AR offers flight attendants and handlers a paperless workflow, which also helps reduce cross-contamination in busy, post-pandemic work environments. SATS, the chief ground-handling and in-flight catering service provider at Singapore Changi Airport, has equipped 600 of its employees with M300 smart glasses, replacing pen-and-paper methods during luggage handling with quick QR scanning and saving a reported 15 minutes per flight.

What Can Passengers Expect?

Passengers are also becoming part of this landscape; VR can offer flyers new forms of entertainment during long journeys. Air France has partnered with SkyLights, a VR in-flight entertainment company based in San Francisco; together they have created a unique headset for Airbus A340 flights. SkyLights reports strong results among passengers using its VR entertainment headsets during flights, with a 90% recommendation rate and an average usage time of four hours per passenger. Lufthansa is also innovating for its passengers, creating a 360-degree immersive experience to watch while travelling. When flyers return to airports in force, they could be presented with more AR and VR options than ever, making the return to the runway a breeze.




Vuzix AR Glasses For EMTs

During the experimental program, select ambulances are given access to the Vuzix M400, lightweight smart glasses capable of projecting virtual images over the real world, which EMTs can use to convey critical information to hospitals before arrival via two-way audio and video calls.

By giving doctors and nurses access to a patient’s vital signs, ECG readouts, and facial expressions in real time, Vuzix claims that various departments can begin examinations and preliminary medical treatment before the ambulance even arrives. Hospital staff can also advise EMTs during in-transit emergency treatments, such as a blood transfusion or surgery.

“Among their expanding healthcare uses, Vuzix smart glasses can be an important life-saving tool for EMTs that require critical interaction and support from the hospitals to which they are headed,” said Paul Travers, President and Chief Executive Officer at Vuzix, in an official release.

“Our glasses are lightweight, comfortable and completely wireless, making them ideal to be used alongside the other head-mounted equipment EMTs must wear. We look forward to seeing an expansion of this trial by its participants, as well as adoption for similar usage by other providers in Japan and around the world.”

Vuzix AR smart glasses are currently being tested in select ambulances operating out of the Shunto Izu Fire Department in Japan, with plans to expand to additional ambulances in the future. The collaborative effort is being spear-headed by Juntendo University, Shizuoka Hospital, the Shunto Izu Fire Department, and AVR Japan Co., Ltd.




Theorem-AR: Multi-Model – Visualisation using familiar devices

What is Multi-Model Loading?

We’ve recently increased our Augmented Reality (AR) capabilities to include multi-model loading, to meet the evolving industry requirements and customer needs for XR. Users can now load multiple models at once into the same scene, making the technology even more flexible.

Previously, only one model at a time could be loaded into an AR session. With multi-model loading, users can now visualize and mark up multiple models at once, giving them greater flexibility in their everyday working processes and allowing them to quickly alternate between one model and another to see how they compare, line up, or fit the available space. Pre-defined digital layouts that were previously only available in Mixed Reality and Virtual Reality are now also available in Augmented Reality. With Theorem-XR supporting multiple devices and data types, and only needing to prepare data once, this additional functionality in AR is closing the gap in what devices can be used for XR use cases.
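Conceptually, multi-model loading amounts to a scene that tracks an independent placement per loaded model instead of a single slot that each new load overwrites. A toy sketch of that idea (class and method names are hypothetical, not Theorem's API):

```python
class ARScene:
    """Minimal multi-model scene: each loaded model keeps its own placement."""

    def __init__(self):
        self.models = {}

    def load(self, name, position, scale=1.0):
        # Each model gets an independent transform rather than replacing the last one.
        self.models[name] = {"position": position, "scale": scale}

    def loaded(self):
        return sorted(self.models)

scene = ARScene()
scene.load("gearbox", position=(0.0, 0.0, 1.5))
scene.load("housing", position=(0.4, 0.0, 1.5))  # a second model in the same session
print(scene.loaded())  # ['gearbox', 'housing']
```

The single-model limitation described above corresponds to a scene whose `load` discards the previous entry; keeping a keyed collection is what makes side-by-side comparison and pre-defined layouts possible.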

How is it Used? Real Augmented Reality Examples

Factory layouts are an excellent example of a use case where Theorem-AR’s new multi-model loading is vital.

Part of a factory layout being visualized in the Theorem-AR application.

Being able to load pre-defined layouts on your smartphone or tablet enables you to work on much larger use cases such as defining shop floor plans in XR. You can visualize the relative scenery, components, and poseables, all on your handheld device.

It also gives you a good idea of how people will interact with a proposed factory layout, including identifying what is within reach from a certain position, determining whether areas are accessible, and assessing any risks. This can all be done in a re-configurable environment, allowing users to completely plan and adjust their layouts from the desktop before reviewing in AR.

The advantage of having this feature in Augmented Reality is that you can place the equipment models in your current environment. This means that you can visualize solid models in the room the equipment is planned to be in. The ability to analyze a proposed layout in this way means users can ensure layouts are correct before attempting to implement them. And since AR doesn’t require expensive headsets it’s easy to adopt for everyone involved.

A picture of the markup tool being used in the Theorem-AR app on the Samsung Galaxy Tab S7.

Enhance Your Design Processes

Another feature that is improved by multi-model loading is the ability to snap a digital model to a physical object. With this feature, a physical object can be used as a reference point in order to automatically overlay a digital version. Users can now also arrange other parts around the digital model on the desktop, and these will appear when using the Snap To feature in AR. This allows users to test space requirements for a collection of parts using one part as a reference.
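The Snap To behaviour can be sketched as storing each surrounding part's pose relative to the reference part on the desktop, then composing those relative poses with the detected pose of the physical reference at runtime. A hedged illustration, not Theorem's implementation:

```python
import numpy as np

def translate(x, y, z):
    """4x4 homogeneous translation matrix."""
    t = np.eye(4)
    t[:3, 3] = (x, y, z)
    return t

def place_assembly(reference_world, relative_poses):
    """Place every part of a layout from the pose of one snapped reference part."""
    return {name: reference_world @ rel for name, rel in relative_poses.items()}

# The physical reference part is detected 1 m in front of the camera ...
reference = translate(0.0, 0.0, 1.0)
# ... and the desktop layout stored the other parts relative to it.
layout = {"bracket": translate(0.2, 0.0, 0.0), "cover": translate(0.0, 0.3, 0.0)}

for name, pose in place_assembly(reference, layout).items():
    print(name, pose[:3, 3])
```

Because every part is expressed relative to the reference, snapping the one digital model onto its physical counterpart brings the whole arrangement into the room in a single step.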

This combined with existing features, such as the mark-up tool to add notes and drawings, opens up the opportunity for engineers to collaborate with each other by identifying and easily sharing obstacles or flaws within a design.

To Recap

Extended reality is an excellent tool to remotely visualize design data from anywhere, and AR makes adoption even easier thanks to only requiring a handheld device such as a smartphone or tablet, which we all have access to. With the addition of multi-model loading users can now do even more with their data in AR; all while using a familiar technology that requires minimal training to use.

Factory layout planning is the best example of this, with users now having the ability to visualize layouts in the real world. Additionally, with design reviews, users can review multiple models from anywhere in the world.

Multi-model loading provides more options to address new use cases with AR, using devices that everyone has access to. Working around 3D design data has never been easier.




Factory Layout Experience – Theorem Solutions

Optimize designs in immersive XR

The Factory Layout Experience enables a planning or layout engineer, working independently or with a group of colleagues, locally or in remote locations, to optimize factory layouts through the immersive experience of eXtended Reality (XR) technologies. Seeing your data at full scale and in context instantly reveals the clashes, access issues, and missing items that a CAD screen cannot show.

On the shop floor there are thousands of pieces of equipment, much of it bought in and designed externally. Building designs may only exist as scans or in architectural CAD systems, and robot cells may be designed in specialist CAD systems. There will be libraries of hand tools, storage racks, and stillage equipment designed in a range of CAD systems, and product data designed in-house in mechanical CAD. To understand the factory and assess changes, all of that has to be brought together to get a full picture of where a new line, robot cell, or workstation will fit.

A catalogue of 3D resources can leverage 2D factory layouts: resources can be snapped to these layouts to quickly realize a rich 3D layout. Advanced positioning makes it very easy to move, snap, and align 3D data, and widely used plant and equipment is readily available, so there is no need to design it from scratch for every new layout. Simplified layout tools enable you to position, align, and snap layout objects quickly. They can be used by non-CAD experts, enabling all stakeholders to be involved in the process and improving communication.
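At its simplest, the snapping described above is rounding a dragged position to the nearest grid intersection of the 2D layout. A minimal sketch, assuming a uniform grid cell size (the function name and half-metre cell are illustrative assumptions):

```python
def snap_to_grid(position, cell=0.5):
    """Snap a 2D layout coordinate to the nearest grid intersection (cell size in metres)."""
    return tuple(round(c / cell) * cell for c in position)

# A storage rack dragged to (1.73, 0.94) lands on the nearest half-metre grid point.
print(snap_to_grid((1.73, 0.94)))  # (1.5, 1.0)
```

Real layout tools also snap to edges and faces of neighbouring equipment, but grid snapping alone is enough to let non-CAD users place objects cleanly without precise input.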

Testing Design and Operational Factors

Human-centred operations can be analysed using mannequins that can be switched to match different characteristics. You can test design and operational aspects of a variety of human factors to determine reachability, access, and injury-risk situations, ensuring compliance with safety and ergonomic standards.

The experience enables companies to avoid costly layout redesign by letting all parties involved review the layout collaboratively, make or recommend changes, and capture those decisions for later review by staff who could not attend the session.




Building an immersive pharma experience with XR technology

In the world of pharma manufacturing, precision is key. To execute flawlessly, pharmaceutical scientists and operators need the proper training and tools to accomplish the task. User-friendly augmented reality (AR) and extended reality (XR) technology that can provide workflow guidance to operators is invaluable, helping name-brand companies get drugs, vaccines, and advanced therapies to patients faster.

AR has been a cost-effective way to improve training, knowledge transfers, and process execution in the lab during drug discovery and in the manufacturing suite during product commercialization. Apprentice’s AR Research Department is now seeing greater demand within the pharma industry for XR software capabilities that allow life science teams to use 3D holograms to accomplish tasks.

For example, operators are able to map out an entire biomanufacturing suite in 3D using XR technology. This allows them to consume instructional data while they work with both hands, or better understand equipment layouts. They can see and touch virtual objects within their environment, providing better context and a much more in-depth experience than AR provides.

Users can even suspend metadata in a 3D space, such as the entrance to a room, so that they can interact with their environment in a much more complete way, with equipment, objects and instruments tethered to space. Notifications regarding gowning requirements or biohazard warnings for example will automatically pop up as the operator walks in, enriching the environment with information that’s useful to them.
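One way such proximity-triggered, anchored metadata could be modelled is a list of spatial anchors checked against the operator's tracked position each frame. A simplified sketch; the anchor data, messages, and 2 m radius are illustrative assumptions, not Apprentice's design:

```python
import math

def notifications_in_range(anchors, user_pos, radius=2.0):
    """Return the message of every spatial anchor within `radius` metres of the user."""
    return [message for pos, message in anchors
            if math.dist(pos, user_pos) <= radius]

# Illustrative anchors: a gowning notice at a room entrance, a biohazard warning elsewhere.
anchors = [((0.0, 0.0, 0.0), "Gowning required beyond this point"),
           ((30.0, 0.0, 5.0), "Biohazard: authorized staff only")]

# Checked against the operator's tracked position as they walk in.
print(notifications_in_range(anchors, user_pos=(1.0, 0.0, 0.5)))
# ['Gowning required beyond this point']
```

The key point is that the metadata is tethered to a world-space position rather than to the display, so the right notice surfaces only when the operator physically approaches the anchored spot.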

“It’s all about enhancing the user experience,” said Linas Ozeratis, Mixed Reality Engineer at Apprentice.io. “At Apprentice, our AR/XR Research Team has designed pharma-specific mixed-reality software for the HoloLens device that will offer our customers an easier, more immersive experience in the lab and suite.”

Apprentice’s XR/AR Research Team is currently experimenting with new menu design components for the HoloLens device that will reshape the future of XR user experiences, making it easier for users to interact with menus using just their fingers.

Apprentice’s “finger menu” feature allows users to trigger an action or step by ‘snapping’ together the thumb and individual fingers of the same hand. Each finger contains a different action button that can be triggered at any time during an operator’s workflow.

“Through our research, we’ve determined that the fingers are an ideal location for attaching AR buttons, because it allows users to trigger next steps without their arm or hand blocking the data they need,” Ozeratis added. “It’s quite literally technology at your fingertips.”
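A pinch-style finger menu of this kind can be approximated by thresholding the distance between the tracked thumb tip and each fingertip. The sketch below is a generic illustration, not Apprentice's software; the action names and the 2 cm pinch threshold are assumptions:

```python
import math

# Hypothetical mapping of fingers to menu actions, one button per finger.
ACTIONS = {"index": "complete_step", "middle": "take_photo",
           "ring": "flag_issue", "pinky": "open_notes"}

def triggered_action(thumb_tip, finger_tips, threshold=0.02):
    """Fire a finger-menu action when the thumb tip 'snaps' to a fingertip.

    Positions are 3D hand-tracking coordinates in metres; the 2 cm
    threshold is an assumed value, not taken from Apprentice.
    """
    for finger, tip in finger_tips.items():
        if math.dist(thumb_tip, tip) < threshold:
            return ACTIONS.get(finger)
    return None  # no pinch detected this frame

# Illustrative fingertip positions from one hand-tracking frame.
tips = {"index": (0.01, 0.10, 0.30), "middle": (0.04, 0.11, 0.30),
        "ring": (0.06, 0.11, 0.30), "pinky": (0.08, 0.10, 0.30)}

print(triggered_action((0.012, 0.101, 0.299), tips))  # 'complete_step'
print(triggered_action((0.20, 0.20, 0.20), tips))     # None
```

Because the buttons ride on the hand itself, the gesture works without the arm occluding on-screen data, which matches the rationale quoted above.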

Why does the pharma industry want technology like this? Aside from the demand, there are situations where tools like voice commands are simply not feasible. The AR Research Team also learned that interactive finger menus feel more natural to users and can be mastered quickly. Life science teams are able to enhance training capabilities, improve execution reliability and expand the types of supporting devices they can apply within their various environments.

“Introducing these exciting and highly anticipated XR capabilities is just one stop on our roadmap,” Ozeratis adds. “There are bigger and bolder things ahead that we look forward to sharing as the pharma industry continues to demand more modern, intelligent technologies that improve efficiency and speed.”