
Theorem Solutions – Placing models using QR codes in Augmented & Mixed Reality

How to Use QR Codes in HoloLens 2 Mixed Reality

Video: Using the QR Code Offset tool in Microsoft HoloLens 2

The QR code offset feature, which uses QR Code Detection in Microsoft HoloLens 2, allows a QR code to be used as an origin point when visualizing 3D models in MR. In Theorem Solutions’ Visualization Pipeline, users can set where the digital model will appear in relation to a QR code. Any time that QR code is then used to load the model, it will appear in the same place.

This helps put models in context and allows users to see if something will fit in a certain location. For example, when seeing if parts would fit within an automotive setup, a QR code can be used to set the origin in the center of a car and digital models of parts can be positioned using the offset feature. This allows users to be more exact with the placement of their models when working with physical objects and digital models together.

Additionally, provided the QR code isn’t moved, this feature allows users to load a model in the same place every time. This gives users greater flexibility in their work process: they can look at multiple models in succession, then revisit a previous model with the assurance that it will appear exactly where it needs to be.
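Conceptually, the stored offset is just a rigid transform composed with the QR code’s detected pose. Theorem’s actual implementation isn’t public, so the following is only a minimal sketch of the underlying math, with made-up positions and a simple 4x4 matrix representation:

```python
import numpy as np

def pose_matrix(position, rotation=None):
    """Build a 4x4 rigid-transform matrix from a rotation and translation."""
    m = np.eye(4)
    m[:3, :3] = np.eye(3) if rotation is None else rotation
    m[:3, 3] = position
    return m

# Pose of the detected QR code in world space (hypothetical values,
# as reported by the device's tracking system).
qr_in_world = pose_matrix(np.array([2.0, 0.0, 1.5]))

# Offset of the model relative to the QR code, authored once by the user.
model_offset = pose_matrix(np.array([0.5, 0.0, -0.25]))

# The model's world pose is the QR pose composed with the stored offset,
# so reloading against the same (unmoved) QR code reproduces the placement.
model_in_world = qr_in_world @ model_offset
print(model_in_world[:3, 3])  # translation is (2.5, 0.0, 1.25)
```

Because the offset is stored relative to the code rather than in world coordinates, the same placement is reproduced on every load as long as the printed code stays where it is.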

Using QR Codes in Augmented Reality

QR codes are also useful in handheld Augmented Reality. For example, the Image Tracking feature in Theorem-AR can be used to load a large factory layout or production line and position it over a QR code. This is ideal when you want to visualize large designs on a table top and see all the data at once. Being able to do this on a handheld device makes XR technology far more accessible for such use cases.

Use Cases

QR codes give users flexibility when digital objects interact with a physical environment, and the feature is designed to adapt to a wide variety of use cases. Here are some examples of use cases that are enhanced by QR codes.

Precise Placement in MR – QR codes are particularly useful in XR when consistent precision is required. If you needed to line up two holes for a bolt to pass through, for instance, you could use the QR Code offset feature to position the digital model correctly and ensure everything lines up.

Scaling large data in AR – the ability to set a point of origin in the real world is also useful when visualizing large data in AR. A positionable point of origin makes scaling much easier, particularly with larger datasets: you can scale the model up or down, and the point of origin remains the same.
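The reason the origin stays put is simple geometry: the scale is applied about the chosen origin point rather than the model’s own centre. A minimal sketch of that idea, with hypothetical coordinates rather than Theorem’s actual code:

```python
import numpy as np

def scale_about_origin(points, scale, origin):
    """Scale a set of 3D points about a fixed origin point.

    The origin (e.g. the QR code's position) maps to itself for any
    scale factor, so the model grows or shrinks around it instead of
    drifting away as it is resized.
    """
    points = np.asarray(points, dtype=float)
    origin = np.asarray(origin, dtype=float)
    return origin + scale * (points - origin)

origin = [1.0, 0.0, 1.0]                       # QR code position
verts = [[1.0, 0.0, 1.0], [3.0, 0.0, 1.0]]     # two model vertices
scaled = scale_about_origin(verts, 2.0, origin)
print(scaled)  # the vertex at the origin stays fixed; the other moves outward
```

Doubling the model this way leaves the vertex at the QR code exactly in place, which is why the placement survives rescaling.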

In Summary

QR codes can be used in Augmented and Mixed Reality in a variety of different ways depending on the use case. They allow users to set a point of origin in the real world using a QR code, and with Microsoft HoloLens 2 users can position models in relation to the QR code.

With QR codes you:

• Can load 3D models into the same position multiple times
• Gain more precision when arranging digital models alongside physical objects
• Are able to scale models more easily in AR
• Have greater flexibility with how your digital models are positioned in relation to the real world




Are AR and VR in Commercial Aviation Taking Off?

Which Aviation Groups are Providing AR and VR Solutions?

Celebi Aviation Holding is setting up an aviation academy in Turkey. The Celebi Aviation Academy has been certified by the International Air Transport Association (IATA) under its Training Validation Program (TVP), recognising the academy as an official Center of Excellence in Training and Development. Its VR training lets students virtually sit airside alongside Airbus and Boeing planes, covering various elements of commercial aviation, from pre-arrival to post-departure inspection. The scenarios presented give aviation students a safe environment in which to find solutions to identified faults, including difficult settings such as varied environmental conditions, low visibility operations (LVO) and night-time operations.

Japan Airlines is teaming with Tecknotrov, Asia’s largest manufacturer, and Qatar Airways to invest in more autonomous training for pilots and engineers. Aviation training and AR/VR have always had a strong relationship, both recently and throughout previous generations of the technology. So it comes as no surprise that, on this well-respected foundation, the AR and VR aviation market is predicted to grow to more than $1,372 million by the end of 2025.

The practical strides come from where AR and VR fit into a worker’s day-to-day routine, with solutions for almost every team member involved in the flyer’s journey. AR offers flight attendants and handlers a paperless workflow, helping to reduce cross-contamination in busy, post-pandemic work environments. SATS, the chief ground-handling and in-flight catering service provider at Singapore Changi Airport, has issued M300 smart glasses to 600 of its employees, replacing pen-and-paper methods during luggage handling with quick QR scanning and saving a reported 15 minutes per flight.

What Can Passengers Expect?

Passengers are also becoming part of this landscape; VR can offer flyers new forms of entertainment during long journeys. Air France has partnered with SkyLights, a VR in-flight entertainment company based in San Francisco, to create a unique headset for Airbus A340 flights. SkyLights boasts great success with passengers using its VR entertainment headsets during flights, with a 90% recommendation rate and a four-hour average usage time. Lufthansa is also innovating for its passengers, creating a 360-degree immersive experience to watch while travelling. With worldwide prospects for flyers and aviation workers alike, when flyers return to airports en masse they could be presented with more AR and VR options than ever, making the return to the runway a breeze.




Vuzix AR Glasses For EMTs

During the experimental program, select ambulances are given access to the Vuzix M400, lightweight smart glasses capable of projecting virtual images over the real world, which EMTs can use to convey critical information to hospitals before their arrival via two-way audio and video calls.

By allowing doctors and nurses access to a patient’s vital signs, ECG readouts, and facial expressions in real time, Vuzix claims that various departments can perform examinations and preliminary medical treatment before the ambulance even arrives. Hospital staff can also advise EMTs during in-transit emergency treatments, such as a blood transfusion or surgery.

“Among their expanding healthcare uses, Vuzix smart glasses can be an important life-saving tool for EMTs that require critical interaction and support from the hospitals to which they are headed,” said Paul Travers, President and Chief Executive Officer at Vuzix, in an official release.

“Our glasses are lightweight, comfortable and completely wireless, making them ideal to be used alongside the other head-mounted equipment EMTs must wear. We look forward to seeing an expansion of this trial by its participants, as well as adoption for similar usage by other providers in Japan and around the world.”

Vuzix AR smart glasses are currently being tested in select ambulances operating out of the Shunto Izu Fire Department in Japan, with plans to expand to additional ambulances in the future. The collaborative effort is being spearheaded by Juntendo University, Shizuoka Hospital, the Shunto Izu Fire Department, and AVR Japan Co., Ltd.




Theorem-AR: Multi-Model – Visualisation using familiar devices

What is Multi-Model Loading?

We’ve recently increased our Augmented Reality (AR) capabilities to include multi-model loading, to meet the evolving industry requirements and customer needs for XR. Users can now load multiple models at once into the same scene, making the technology even more flexible.

Previously, only one model at a time could be loaded into an AR session. However, with multi-model loading, users can now visualize and mark up multiple models at once. This gives users greater flexibility in their everyday working processes, allowing them to quickly alternate between one model and another to see how they compare, line up or fit the available space. Pre-defined digital layouts that were previously only available in Mixed Reality and Virtual Reality are also now available in Augmented Reality. With Theorem-XR supporting multiple devices and data types, and only needing to prepare data once, this additional functionality in AR is closing the gap for what devices can be used in XR use cases.

How is it Used? Real Augmented Reality Examples

Factory layouts are an excellent example of a use case where Theorem-AR’s new multi-model loading is vital.

Part of a factory layout being visualized in the Theorem-AR application.

Being able to load pre-defined layouts on your smartphone or tablet enables you to work on much larger use cases such as defining shop floor plans in XR. You can visualize the relative scenery, components, and poseables, all on your handheld device.

It also gives you a good idea of how people will interact with a proposed factory layout, including identifying what is in reach from a certain position, determining whether areas are accessible, and assessing any risks. This can all be done in a re-configurable environment, allowing users to plan and adjust their layouts completely from the desktop before reviewing in AR.

The advantage of having this feature in Augmented Reality is that you can place the equipment models in your current environment. This means that you can visualize solid models in the room the equipment is planned to be in. The ability to analyze a proposed layout in this way means users can ensure layouts are correct before attempting to implement them. And since AR doesn’t require expensive headsets it’s easy to adopt for everyone involved.

A picture of the markup tool being used in the Theorem-AR app on the Samsung Galaxy Tab S7.

Enhance Your Design Processes

Another feature that is improved by multi-model loading is the ability to snap to a physical object with a digital model. With this feature, a physical object can be used as a reference point in order to automatically overlay a digital version. Users can now also arrange other parts around the digital model on desktop, which will appear when using this Snap To feature in AR. This allows users to test space requirements for a collection of parts using one part as a reference.
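In principle, once the reference part’s pose has been recovered by snapping onto the physical object, every other part can be placed by composing that pose with the relative transforms authored on the desktop. A rough sketch of that idea, with invented part names and offsets (not Theorem’s actual code):

```python
import numpy as np

def translate(x, y, z):
    """4x4 translation matrix (rotation omitted for brevity)."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

def place_parts(reference_in_world, offsets_from_reference):
    """Place every part by composing the snapped reference pose with the
    relative transform each part was given on the desktop."""
    return {name: reference_in_world @ offset
            for name, offset in offsets_from_reference.items()}

# Pose recovered by snapping the digital twin onto the physical part.
reference_in_world = translate(4.0, 0.0, 2.0)

# Relative layout defined once on the desktop (hypothetical parts).
offsets = {"bracket": translate(0.3, 0.0, 0.0),
           "housing": translate(-0.3, 0.1, 0.0)}

placed = place_parts(reference_in_world, offsets)
for name, pose in placed.items():
    print(name, pose[:3, 3])
```

Because the surrounding parts are stored relative to the reference, the whole arrangement follows wherever the physical object is, which is what makes testing space requirements around one real part possible.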

This combined with existing features, such as the mark-up tool to add notes and drawings, opens up the opportunity for engineers to collaborate with each other by identifying and easily sharing obstacles or flaws within a design.

To Recap

Extended reality is an excellent tool for remotely visualizing design data from anywhere, and AR makes adoption even easier because it requires only a handheld device such as a smartphone or tablet, which we all have access to. With the addition of multi-model loading, users can now do even more with their data in AR, all while using familiar technology that requires minimal training.

Factory layout planning is the best example of this, with users now having the ability to visualize layouts in the real world. Additionally, with design reviews, users can review multiple models from anywhere in the world.

Multi-model loading provides more options to address new use cases with AR, using devices that everyone has access to. Working around 3D design data has never been easier.




Building an immersive pharma experience with XR technology

In the world of pharma manufacturing, precision is key. To execute flawlessly, pharmaceutical scientists and operators need the proper training and tools to accomplish the task. User-friendly augmented reality (AR) and extended reality (XR) technology that can provide workflow guidance to operators is invaluable, helping name-brand companies get drugs, vaccines, and advanced therapies to patients faster.

AR has been a cost-effective way to improve training, knowledge transfers, and process execution in the lab during drug discovery and in the manufacturing suite during product commercialization. Apprentice’s AR Research Department is now seeing greater demand within the pharma industry for XR software capabilities that allow life science teams to use 3D holograms to accomplish tasks.

For example, operators are able to map out an entire biomanufacturing suite in 3D using XR technology. This allows them to consume instructional data while they work with both hands, or better understand equipment layouts. They can see and touch virtual objects within their environment, providing better context and a much more in-depth experience than AR provides.

Users can even suspend metadata at a point in 3D space, such as the entrance to a room, so that they can interact with their environment in a much more complete way, with equipment, objects and instruments tethered to the space. Notifications regarding gowning requirements or biohazard warnings, for example, will automatically pop up as the operator walks in, enriching the environment with information that’s useful to them.
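One simple way such location-triggered notes could work is a proximity check of the user’s position against each anchored item. This is only an illustrative sketch: the anchor messages, positions and trigger radii are invented, not Apprentice’s actual design:

```python
import numpy as np

# Hypothetical notes anchored in the room, each with a trigger radius
# (metres) around its position.
ANCHORS = [
    {"text": "Gowning required beyond this point",
     "pos": (0.0, 1.6, 0.0), "radius": 1.5},
    {"text": "Biohazard: authorized staff only",
     "pos": (5.0, 1.6, 3.0), "radius": 2.0},
]

def notifications_for(user_pos):
    """Return the anchored messages whose trigger region contains the
    user, so a note tethered to a doorway pops up as the operator
    walks through it."""
    user = np.asarray(user_pos, dtype=float)
    return [a["text"] for a in ANCHORS
            if np.linalg.norm(user - np.asarray(a["pos"])) <= a["radius"]]

print(notifications_for((0.2, 1.6, 0.5)))  # near the entrance: gowning notice
```

A real headset would run this check against its tracked head pose every frame; the principle, distance to a spatially anchored payload, is the same.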

“It’s all about enhancing the user experience,” said Linas Ozeratis, Mixed Reality Engineer at Apprentice.io. “At Apprentice, our AR/XR Research Team has designed pharma-specific mixed-reality software for the HoloLens device that will offer our customers an easier, more immersive experience in the lab and suite.”

Apprentice’s XR/AR Research Team is currently experimenting with new menu design components for the HoloLens device that will reshape the future of XR user experiences, making it easier for them to interact with menus using just their fingers.

Apprentice’s “finger menu” feature allows users to trigger an action or step by ‘snapping’ together the thumb and individual fingers of the same hand. Each finger contains a different action button that can be triggered at any time during an operator’s workflow.

“Through our research, we’ve determined that the fingers are an ideal location for attaching AR buttons, because it allows users to trigger next steps without their arm or hand blocking the data they need,” Ozeratis added. “It’s quite literally technology at your fingertips.”
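A “snap” of this kind can be detected by thresholding the distance between the thumb tip and each fingertip reported by the hand tracker. The sketch below is illustrative only; the action names, threshold and coordinates are assumptions, not Apprentice’s implementation:

```python
import numpy as np

# Hypothetical action bound to each fingertip of the same hand.
FINGER_ACTIONS = {"index": "next_step", "middle": "previous_step",
                  "ring": "capture_photo", "pinky": "flag_issue"}

PINCH_THRESHOLD = 0.02  # metres between thumb tip and a fingertip

def triggered_action(thumb_tip, finger_tips):
    """Return the action for whichever finger the thumb is 'snapped'
    against, or None if no fingertip is within the pinch threshold."""
    thumb = np.asarray(thumb_tip, dtype=float)
    for finger, tip in finger_tips.items():
        if np.linalg.norm(thumb - np.asarray(tip)) < PINCH_THRESHOLD:
            return FINGER_ACTIONS.get(finger)
    return None

# Fingertip positions as a hand tracker might report them (metres).
tips = {"index": (0.101, 0.20, 0.30), "middle": (0.15, 0.21, 0.30)}
print(triggered_action((0.10, 0.20, 0.30), tips))  # → next_step
```

Keeping the buttons on the hand itself means the trigger never requires looking away from, or reaching across, the data being displayed.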

Why does the pharma industry want technology like this? Aside from the demand, there are situations where tools like voice commands are simply not feasible. The AR Research Team also learned that interactive finger menus feel more natural to users and can be mastered quickly. Life science teams are able to enhance training capabilities, improve execution reliability and expand the types of supporting devices they can apply within their various environments.

“Introducing these exciting and highly anticipated XR capabilities is just one stop on our roadmap,” Ozeratis adds. “There are bigger and bolder things ahead that we look forward to sharing as the pharma industry continues to demand more modern, intelligent technologies that improve efficiency and speed.”




AREA podcast features PwC’s Jeremy Dalton’s new book, Reality Check


As the Head of XR at PwC UK, Jeremy Dalton saw a fundamental problem in the marketplace: too many enterprises had misconceptions about – or simply didn’t understand – the tremendous potential of AR and VR to transform their businesses. So, Dalton took it upon himself to get the message out. 

The result is his new book, Reality Check. Reality Check dispels common myths about AR and VR and details how business leaders can integrate immersive technologies into their organizations to deliver more efficient, impactful and cost-effective solutions. Dalton backs up his argument with compelling case studies from organizations such as Cisco, Ford, GlaxoSmithKline, LaLiga, and Vodafone. 

AREA Executive Director Mark Sage hosted a podcast with Jeremy Dalton recently to discuss the book and its findings. Go here to hear their conversation. To order a copy of Reality Check and receive an AREA 20% discount, go here and enter the discount code AREA20 before March 5, 2021. 




Augumenta’s Eve Lindroth on Shop Floor AR, Taiwan and the Future


When AREA member Augumenta participated in a recent AREA webinar about implementing AR on factory shop floors, we thought it would be worth catching up on the company and its activities. So we spoke with Eve Lindroth, the company’s head of Marketing Communications. Here’s our conversation.

AREA: Augumenta has distinguished itself as a leader in industrial shop floor uses of AR. To what do you attribute your success so far?

Lindroth: We have a large number of big and well-known industrial companies as our clients, and within these projects, our solutions have been adopted with very few changes. That tells us that we are taking the right approach to developing solutions for the industry. Our clients also praise the ease-of-use of our applications, and appreciate that there is no steep learning curve to start using them. Quite the opposite, they are considered easy to learn.

AREA: What’s a typical Augumenta client?

Lindroth: Most of our business is outside Finland. We have many manufacturing customers in France and Germany, for example, such as Siemens. We also have a presence in Japan and Taiwan which is important considering our focus on the Asian markets and the key customer projects we have ongoing there.

A typical client is a larger industrial company that is active in developing their operations – or during the pandemic, companies that are simply looking for the most efficient and practical ways to keep operating.

AREA: Speaking of that, in October, you announced a partnership with IISI of Taiwan. Tell us about the partnership, its goals, and its progress to date.

Lindroth: IISI is a system integrator and they have a very strong customer base in the fields of manufacturing and government. In our partnership, Augumenta acts as a technology/applications provider and the IISI experts do the final customization and integration with the end customer’s backend systems. Both companies can focus on their key strengths: we on the cutting-edge AR technology, and IISI on developing and managing the overall systems.

We started working together in the springtime and we have finalized all the customization needed for the end customer, a major semiconductor factory in Taiwan. We continue working in close cooperation with IISI and believe we are in a good position to advance enterprise AR in Taiwan together with them.

AREA: What do you see as the most significant barriers to AR adoption, and what is Augumenta doing to overcome them?

Lindroth: We have seen in many pilot projects that the organization has identified the problem it wants to solve, but has difficulty quantifying the current status with accurate numbers. Take downtime, for example: how much is there, and exactly which factors are causing it? Those figures can be hard to come by. Another issue is user acceptance, but that can often be tackled by involving people in planning the solutions from an early stage.

At Augumenta, we’re working to address those issues. For industrial pilots, for example, we created a simple checklist, just to remind the project managers and team leaders responsible for the pilot to consider the factors we have learned to be essential for an AR pilot’s success. These are related to things like target setting, planning together with your people and getting them involved throughout the process, or measuring the results. The checklist is available on our website.

AREA: What can we expect from Augumenta in 2021?

Lindroth: In the future, we believe that discrete industrial AR applications will become more integrated solutions. That means, for example, that there aren’t separate apps for alerting a user and guiding a user through tasks. There will be one solution that does all of this, without the end user even noticing that many use cases are included in the app. At some point, things like AI will make the end user’s job even easier, for example by guiding them to the right data or expert automatically.

A key success factor in such a solution is usability. Apps have to integrate seamlessly and be simple and intuitive to use independent of the use case at hand.

The pandemic has meant growth in demand for our services along with our clients’ need to find new ways to do things. In 2021, you’ll see closer integration of our apps. We’re working with new app features that are enabling efficient and sustainable working methods in the new normal. We’ll keep you posted with the latest developments during 2021.

AREA: Finally, how has Augumenta benefitted from its membership in the AREA?

Lindroth: The AREA has provided us with access to research, and there have been some great and very interesting research projects completed. We have also made many new contacts within the ecosystem via the AREA, and it’s always great to see and hear what’s going on with other ecosystem members. The AREA updates its social media channels very actively, and we appreciate the visibility they provide us.




Is AR Emerging as a Key to Resilience and Business Continuity?


The coronavirus pandemic has forced many organizations to reconsider how well-equipped they are to deal with business disruptions that require more remote work. That’s especially true for industrial companies that succeeded pre-COVID through optimized supply chains and manufacturing processes and specialized employee skill sets.

AREA Executive Director Mark Sage recently spoke with Umar Arshad, Head of Growth for AR Products at PTC, to discuss how more organizations will now leverage AR to maintain business continuity and build resiliency.

Watch the discussion in the video below:






AREA Research Committee Issues Call for Proposals to Study AR and 5G in the Enterprise

The AREA seeks to receive proposals for a funded research project that will examine and capture in a report the current status of 5G in enterprise environments, assessments of the risks and opportunities of using 5G technologies for AR use cases, and areas for future research and potential investment for AREA members. The project will also deliver tables containing objective, vendor-neutral information about current component costs, product and service offerings, past and current trials, proof of concept projects and guidelines for AREA members. 

Organizations with relevant expertise in the research topic may respond to the invitation on or before 12 PM Eastern Daylight Time on February 10th.

Industry Context for the Research

Investments in 5G are fueled by the potential for new low-latency, high-throughput network technologies to reduce or remove barriers to implementation of new and powerful use cases. By providing connected devices and machines access to high performance computing and other limited and costly resources, 5G networks will significantly expand and lower the cost of use of powerful computing hardware and software, data sets and other services (e.g., privacy, security, localization and other artificial intelligence-based platforms). 

Telecommunications companies around the world are heavily promoting 5G technology for delivering AR for entertainment and other consumer-facing services. The 5G-based services will be provided by network operators, some of whom are partnering with AR device and software providers to offer solutions to enterprise customers.  

Managers of large enterprise IT organizations are aware of the emerging 5G networks and components, including 5G-ready wearable and mobile devices, but many questions remain to be answered prior to the introduction of these in an enterprise infrastructure.  Before AREA customer segment members begin testing AR over 5G in their facilities, they need deeper understanding of key concepts of 5G, and the requirements, opportunities or benefits 5G could bring.   

Before AREA provider segment members begin evaluating and planning for 5G-enabled product or services to offer to their customers, they must build out 5G expertise internally or partner with companies that have 5G offerings.  

Project Goal

The AREA seeks to provide its members with knowledge about the current status of AR and 5G for enterprise, and actionable information which members can use when planning their AR and 5G strategies. 

Fixed Fee Project

The AREA Research Committee budget for this project is $15,000. Organizations interested in conducting this research for the fixed fee are invited to submit proposals.  

More information

Full information on the project needs, desired outcomes and required components of a winning proposal, including a submission form, can be found here.

If you have any questions concerning this project and the AREA Research Committee, please send an email to the Research Committee.






Here’s What AR and Other Similar Technologies Can Do for Your Business

What is one piece of advice you’d give to businesses looking to invest in Augmented Reality (AR) technology?

It’s the simple and classic advice, really. If you are an enterprise looking to bring AR into your organization, be very clear on what business problem you are trying to solve. Companies often want to “try out” new technology, to play with the latest gadgets and see what they do rather than focusing on solving a real business problem.

There are many AR use cases that provide real benefit by improving the performance and efficiency of the company operations. It is important to understand your business problem, then pilot a suitable AR solution and measure the outcome. This may include reducing time to complete a task, minimizing errors, and/or lowering costs of interruptions. These are all benefits that improve the bottom line.

The AREA portal offers more information on how to get started.

Can you discuss a few use cases of augmented reality for industrial professionals? Are there any barriers to adoption businesses should be aware of?

Based on my experience of speaking to the many enterprises and providers in the AR ecosystem, the use cases that are currently getting most traction include:

  • Remote assistance — being able to discuss with an expert (anywhere in the world) and use AR technology to show how to fix the problem.
  • Step-by-step guidance — using an AR-enabled mobile, tablet, or wearable device to show how and what to do when completing a task. This use case works particularly well for infrequent and complicated tasks.

In terms of barriers, the technology is still being developed and will continue to improve.

AR for Enterprise Alliance (AREA) has also identified business problems that it is working to overcome. These include issues when moving from pilot to full deployment. The members are working to understand and overcome safety, security, and human issues (e.g., convincing stakeholders and ensuring the workers are involved), as well as providing useful tools like an ROI calculator and Safety/Human framework.

What is one myth surrounding this technology, or Industry 4.0 in general, you’d like to debunk for our readers?

That it is complicated and difficult to deploy! This is simply not the case, and the most successful implementers of AR solutions and Industry 4.0 have started with solutions using IoT data, with simple analysis, and using tablets, phones, or assisted-reality devices to display actionable information that brings quick and substantial benefit to the company and worker.

Where do you see Industry 4.0 heading?

For Industry 4.0 to continue to provide benefit to manufacturing, Internet of Things (IoT), Artificial Intelligence (AI), and AR technologies need to interact and work together better to help deliver more actionable outcomes. Benefit will also increase as the concept of the digital twin becomes commonplace, enabling designers to plan, develop, and test more efficient processes and products. These can be tested in the augmented world before being implemented in the physical.

This will be explored in the next AREA research project (voted for by the members), which will research best practices for merging IoT, AI, and AR technologies.

In the future, you can envisage a self-supporting manufacturing process that solves its own simple problems, allowing staff to see (via AR) only the issues that need timely intervention.
