
HP is Using HoloLens to Help Customers Remotely Repair Industrial Printers

While many AR companies are focused on building AR products, HP is making an interesting move in using the technology as an add-on to improve an existing line of its business. The company’s newly announced xRServices program promises to deliver remote AR support for its industrial printer customers.

The program employs Microsoft’s HoloLens 2 headset, which HP’s customers can use to access AR training and live guided instructions to fix issues that arise with complex, commercial-scale printers.

HP is pitching the solution as a way to allow even untrained individuals to fix issues with the help of a specialist on the other end who can guide them step-by-step through troubleshooting and repairs with AR instruction. Further, the company says the service can be used to provide AR training for various workflows and issues that may arise with the company’s industrial printers.

HP hasn’t detailed exactly what software it’s running on HoloLens to facilitate xRServices, but it seems likely that it is leveraging Microsoft’s Dynamics 365 Remote Assist platform, which includes many of the AR functions that HP showcased in its xRServices concept video, like augmented annotation, document visualization, and video chatting through the headset.





Visor-Ex® 01 – Collaboration between ECOM Instruments and IRISTICK

The Pepperl+Fuchs brand ECOM Instruments introduces Visor-Ex® 01 smart glasses for industrial use in hazardous areas.

The intelligent wearable, weighing just 180 g, combines high camera and display quality with reliable communication features in an ergonomic design for the user’s utmost comfort. This provides mobile workers with an optimal companion for tasks that require hands-free use as well as continuous communication, for example with a remote expert.

This product is the result of a long-term, close collaboration between ECOM Instruments and IRISTICK, bringing together ECOM’s in-depth knowledge of the requirements of hazardous areas with IRISTICK’s profound experience in smart glasses development.

Innovative tool for the mobile worker in hazardous areas

A total of three integrated cameras transform Visor-Ex® 01 into the remote expert’s bionic eye. Two 16-megapixel cameras are centrally positioned to depict the wearer’s natural field of vision – this way the remote expert views what is happening from the same angle and perspective as the mobile worker. A secondary camera offers a 6x optical zoom for zooming in without loss of quality and fast scanning of barcodes and QR codes. The system utilises the ECOM Smart-Ex® 02 smartphone for hazardous areas as a computing unit with LTE connectivity and a pocket unit with a replaceable battery for power supply, all combined in an intelligent ecosystem for a wide range of application scenarios in the industrial sector.

The distribution of functions across the individual system components helps to minimise the weight of the headset unit – without compromising on performance, connectivity or battery life.

By connecting to the Smart-Ex® 02, users can continue to use their tried-and-tested smartphone for harsh environmental conditions without restriction and benefit from all the advantages, security features, and controls of the Android 11 operating system, including over-the-air updates. This leads to ease of use and a low total cost of ownership.

Visor-Ex® 01 will be certified for ATEX/IECEx Zone 1/21 and 2/22 as well as NEC/CEC Division 1 and 2 and will have protection class IP68. It can be used within a temperature range of -20 to +60 °C.

Read Iristick’s AREA member profile 

For more information: www.visor-ex.com

For sales information: [email protected] or [email protected]




Magic Leap partners with Geopogo on Augmented Reality solution for architecture and design

Geopogo is a California-based 3D design software company that is working to transform the design and construction process. The company’s software allows architects and designers to create renderings and a virtual reality (VR) or augmented reality experience in minutes by importing existing CAD models or building directly with the Geopogo 3D creator tool.

Now, with Geopogo’s software on Magic Leap’s AR headset platform, the interaction of digital content with the physical world will help to bring architectural designs to life, according to the companies. “This is a phenomenal opportunity to make architectural design understandable and accessible to project clients, city officials, and the general public,” said Geopogo’s Creative Director, Michael Hoppe.

According to Magic Leap, the American Institute of Architects, San Francisco (AIASF) utilized the partnership’s technology as part of its ‘Shape Your City’ campaign, an ongoing fundraising effort to build its new headquarters in the Bay Area’s new Center for Architecture + Design. The organization also sought to fund expanded architecture-focused tours, exhibitions, educational programs, and events for people of all ages.

As a result, AIASF hosted on-site building tours to build excitement and engagement for the project from the architectural community and the public, and offered tour participants a 3D virtual model of the future Center. The integration of AR technology during the building tours allowed for a more interactive, transparent, immersive, and exciting way to visualize what the space will look like, even before construction has started.

“The power of the AR experience succeeded in inspiring donors to contribute much-needed construction funding for the project, as hoped for by the non-profit organizations. We were especially happy to see how the AR experience brought so much delight to the faces of the non-profit Board, the organization members, and members of the larger community,” said Dave Alpert, Geopogo CEO and Cofounder. 

“The AR model has allowed our project partners, Board members, potential donors, and community to experience the future Center first-hand and visualize the positive impact it will have on future generations,” agreed AIASF Executive Director, Stacy Williams.

For more information on Geopogo and its augmented reality solutions for the architecture and design industry, click here. For more information on Magic Leap and its AR hardware solutions, click here.

 




Qualcomm is trying to simplify app creation for AR glasses

The ultimate aim is to make AR more accessible. Ideally, developers will make apps directly available to you through mobile app stores, using glasses tethered to smartphones. You might not see Snapdragon Spaces used for stand-alone glasses, at least not at first.

The manufacturer support will be there. Spaces won’t be widely available until spring 2022, but Qualcomm has lined up partners like Lenovo (including Motorola), Oppo and Xiaomi. Carriers like T-Mobile and NTT DoCoMo will help build “5G experiences” using Spaces. Lenovo will be the first to make use of the technology, pairing its ThinkReality A3 glasses with an unnamed Motorola phone.

It’s too soon to know if Snapdragon Spaces will have a meaningful effect on AR. While this should streamline app work, that will only matter if there are both compelling projects and AR glasses people want to buy. This also won’t be much help for iPhone owners waiting on possible Apple AR devices. Efforts like this might lower some of the barriers, though, and it’s easy to see a flurry of AR software in the near future.

 




Extended Reality – Mixed Reality Versus Augmented Reality

Augmented Reality Defined

Augmented Reality is quickly making its way into a variety of settings. Retailers use it to help customers visualize a product before they buy it. Engineers turn to augmented reality as a way of accessing valuable information about a product without fumbling with physical manuals. With AR, users can embed or overlay elements of the digital world into the physical world.

Tools like Apple’s ARKit and Google’s ARCore even allow users to build their own immersive smartphone experiences. However, it is possible to further enhance AR experiences through devices like smart glasses. These overlay the digital content you need to see onto the real world in a much more immersive way, without requiring you to hold a phone in front of your face.

Mixed Reality Defined

Mixed Reality is a hybrid of AR and VR (virtual reality), though it goes further than AR when it comes to immersion. Through MR, virtual or digital content isn’t just overlaid on the real world; it’s embedded in a way that users can interact with it.

This form of MR is an advanced kind of AR, which makes the digital elements you bring into your environment feel more authentic and realistic. MR can have elements of both virtual and augmented reality within it. However, the major difference is that the focus is on blending everything together. You’re not entirely replacing an environment, or simply augmenting it with new content. Instead, you’re creating an entirely new reality by combining both the physical and digital environment.

Exploring AR and MR

There are numerous differences between AR and MR, but the biggest noticeable aspects are:

  • Device requirements – AR is usable on most smartphones or tablets, with the added option of specialist headsets. However, to provide an MR experience, more power and more sensors are required.
  • Realistic interaction – AR offers limited interactivity with the virtualized elements. The computer-generated content can’t interact with the real-world elements users see.

It’s up to you whether to use AR or MR for your project. Each of them is made for particular tasks. For many companies, augmented reality will be one of the easiest ways to enter the world of extended reality. The environment is accessible because you can create applications and tools that work on smartphones, as well as through smart glasses and headsets. However, as the technology available to us continues to evolve, Mixed Reality may also become more accessible.

Many leading companies are experimenting with MR already, though it’s still technically the youngest technology in the XR space.

In manufacturing, an important hurdle to overcome when trying to bring together several emerging technologies in one place is data connectivity. At the Manufacturing Technology Center (MTC) in the UK, they understand this issue all too well and are working to combat it using ATS Bus.

ATS Bus is a platform for their VIVAR (Virtual Instruction, Inspection and Verification using Augmented and/or Virtual Reality) project which investigates “how augmented and virtual reality could be used to enhance the operator experience when viewing work instructions and increase efficiency and accuracy for both instruction delivery and data capture.”

Work orders received are translated by ATS Bus into a standard data format and sent down to the shop floor, where ATS Bus translates them again into the format required by the Adv (Advanced Display Device) server.
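The hub-and-spoke translation pattern described above can be sketched in a few lines of Python. Everything below — the field names, formats, and function names — is invented purely for illustration; it is not the actual ATS Bus API:

```python
# Hypothetical sketch of a message-bus translation layer in the style of
# ATS Bus: each producer's native format is mapped into one canonical
# work-order schema, then mapped again into each consumer's native format.

def from_erp(erp_record: dict) -> dict:
    """Translate a (hypothetical) ERP work order into the canonical schema."""
    return {
        "order_id": erp_record["WO_NUM"],
        "part": erp_record["PART_CODE"],
        "quantity": int(erp_record["QTY"]),
    }

def to_display_device(order: dict) -> dict:
    """Translate the canonical schema into a (hypothetical) display-server format."""
    return {
        "id": order["order_id"],
        "title": f"Build {order['quantity']} x {order['part']}",
    }

erp_message = {"WO_NUM": "WO-1042", "PART_CODE": "BRKT-7", "QTY": "25"}
shop_floor_message = to_display_device(from_erp(erp_message))
print(shop_floor_message)
```

Because every system only needs a mapping to and from the canonical schema, adding an n-th system requires one new translator pair rather than point-to-point converters for every other system.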

You can read the original article on INFRASI’s website.




As the Metaverse & AR Mature, Will They Fall Into Tech’s Common Silos?

As the world of AR and the Metaverse matures, the ability for software and hardware products to integrate with one another becomes a huge factor in the adoption and use of these technologies.

Dan chats with Christine Perey, the founder and principal analyst of Perey Research & Consulting and founder of The AREA, on how history reflects tech’s tendency to embrace operational and hardware silos, and why siloed products cause significant inefficiencies and increase cost.

Abridged Thoughts:

“[Interoperability in the AR world] is the ability for components, software, hardware, services from any vendor, to be able to exchange data without the user needing to concern themselves with who made that part, and so it’s the ability for multiple vendors to combine parts and their customers also to be able to combine parts into new and unique ways and come up with new, innovative solutions that solve a specific problem.

And so the interoperability also allows the market to go to scale because you’re no longer going to be focusing only on one use case or only on one component of the whole system. You can take your component into many, many different pieces of hardware, for example, something I know a lot about, or software; you could take your content and deliver it on any browser, any player.”

– Christine Perey 

 




AR enables efficient remote support – XMReality

One of the greatest examples of AR technology is the popular mobile app Pokémon Go, which allows players to locate and capture Pokémon characters that appear in the real world. In addition to entertainment, augmented reality is also used in other areas, such as marketing, fashion, tourism, and retail.

Overall, the use of AR is growing as mobile devices that are powerful enough to handle AR software become more accessible around the world. However, AR is not a new invention. In fact, the first AR technology was developed back in 1968, when the Harvard computer scientist Ivan Sutherland created an AR head-mounted display system.

Following in Sutherland’s footsteps, university labs, companies, and national agencies developed AR for wearables and digital displays. But it was not until 2008 that the first commercial AR application was created by German agencies in Munich. They designed a printed magazine ad for a BMW Mini car. When held in front of a computer’s camera, the ad let the user control the car on the screen simply by manipulating the printed page.

Since then, one of the most successful uses of AR for commercial purposes has been the ability to try on products, such as clothes, jewelry, and even make-up, without having to leave your house. In addition, many tourism apps use AR technology to bring the past to life at historical sites. For example, at Pompeii in Italy, AR can project views of ancient civilizations over today’s ruins. Other examples include neurosurgeons using an AR projection of a 3D brain to aid them in surgeries and airport ground crews wearing AR glasses to see information about cargo containers. Needless to say, the potential of augmented reality is endless.

 

AR enables efficient remote support 

At XMReality, we have embraced augmented reality from the beginning. Founded in 2007 by researchers from the Swedish Defense Research Agency, our first project was to help bomb disposal experts defuse landmines in the field. For six years, we performed advanced contract research in AR for the Swedish Defense Materiel Administration and BAE Systems.

Though we continue to work and innovate in the defense sector, we have expanded to help other industries with our remote support solution, XMReality Remote Guidance. In remote support calls, you can use the AR feature Hands Overlay to guide your counterpart by overlaying your hand gestures on top of real-time video.

This is especially useful when you need to show someone how to turn a screw, explain what cord goes where, or provide other instructions where technical support is needed. And it comes in handy when you need both your hands to give instructions or guide someone through complex tasks.
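Conceptually, an overlay feature like this is an alpha-compositing problem: segmented hand pixels from the expert’s camera are blended over the worker’s live video frame. The minimal NumPy sketch below illustrates that blending step only — it is our simplified illustration, not XMReality’s actual implementation:

```python
import numpy as np

def overlay_hands(frame: np.ndarray, hands: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """Alpha-blend a segmented hand image onto a video frame.

    frame, hands: (H, W, 3) uint8 RGB images.
    alpha:        (H, W) float mask in [0, 1]; 1 where the guide's hand is.
    """
    a = alpha[..., None]  # broadcast the mask over the color channels
    blended = a * hands.astype(float) + (1.0 - a) * frame.astype(float)
    return blended.astype(np.uint8)

# Tiny 2x2 example: the left column shows the hand, the right the scene.
frame = np.full((2, 2, 3), 200, dtype=np.uint8)  # light background scene
hands = np.zeros((2, 2, 3), dtype=np.uint8)      # dark "hand" pixels
mask = np.array([[1.0, 0.0], [1.0, 0.0]])
out = overlay_hands(frame, hands, mask)
print(out[:, 0].tolist(), out[:, 1].tolist())
```

In a real product the mask would come from hand segmentation on each incoming frame; here it is hard-coded to keep the example self-contained.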

The user-friendly software and AR technology enable you to improve operational efficiency and quality for processes like audits, maintenance, service, repair, training, and support at production sites, packaging lines, energy grids, or properties. Find more information about how to use remote support in different industries here.

Don’t tell it, show it with AR

In a rapidly growing AR marketplace, we continually develop our use of AR technology. To enhance the Hands Overlay experience, we have introduced additional hardware: the Pointpad.

Together with the Hands Overlay, the Pointpad is useful for experts in a helpdesk setup who are using XMReality Remote Guidance from a desktop computer or support station. It allows you to enhance hand gestures for clear instructions during everyday calls.

Imagine that you are a technician dealing with electricity sub-stations: extremely complex industrial installations with myriad switchgear, screens, and interfaces. When you are restricted to voice-only support, you have to rely on the customer to explain what they see in front of them, and you must give them support while effectively working blind.

By using XMReality and its AR technology, you can not only see exactly what the customer sees but also guide their hands with your own. This way you don’t have to trust the customer to explain everything just right, and you don’t need to remember every detail the customer has said, since you can see it continuously while you and the customer troubleshoot together. You also don’t need to worry about language barriers or having to phrase every instruction in the most easily understood way, since you can use your hands to show the customer what to do with their own. The reduced risk of misunderstanding, combined with faster trouble resolution, is a great way to achieve happier customers and more efficient processes.

You can read the original blog post by XMReality here.




Case Study of AR Technology Hirschmann Automotive and RealWear

The Challenge

With seven factories worldwide, Hirschmann Automotive needed a more cost-effective and time-efficient knowledge-transfer approach to maintaining and repairing equipment than flying experts around the world.

“If something isn’t working properly at one of our plants, technicians have to call our headquarters in Austria. And even then, they might not be able to solve the problem. Then it becomes an issue of flying someone around the world to assess the problem in person.”

That’s when Fliri and his team looked at virtual and augmented reality solutions. Unfortunately, most devices were too delicate for the production plant environment — until Fliri discovered the RealWear HMT-1.

The Solution

Deploying RealWear running Cisco Webex Expert on Demand allowed Hirschmann Automotive to streamline collaboration and reduce equipment downtime.

The Results

  • Reduced travel needs and costs
  • Improved maintenance and repair response
  • Streamlined information accessibility and collaboration
  • Increased first-time fix rates
  • Shortened first-time resolution time

Hands-Free Use Case

  • Remote mentoring

Readers can download the case study for free on RealWear’s website.




Boeing’s Dr. Greg Garrett on the Work of the AREA Safety Committee

AREA: Are you an AR guy who got into safety, or a safety guy who got into AR?

Dr. Garrett: It’s the latter. In 2017, I was supporting the Boeing 767 tanker program when a couple of colleagues approached us in the Safety organization looking for safety and ergonomics guidelines on an Augmented Reality project using HoloLens for wiring work. We looked at each other and said, “What’s a HoloLens?” (laughs) I did some looking around and I couldn’t find any research on the safety ramifications of AR. I finally landed on some ergonomic recommendations for helicopter pilots using night vision goggles. That was the closest thing I could find, but at least it was a starting point. I put some recommendations together and very quickly became the subject matter expert for AR safety.

AREA: It sounds like everybody involved in studying safety requirements in enterprise AR has had to learn as they go along.

Dr. Garrett: It has been a very hands-on learning experience, but the technology is still a hands-on learning experience in a lot of ways. And as we’ve gone along, my interest has been pushed more into fully immersive technologies, not just the AR space. Once I became known as the AR guy, people started coming to me and asking me to help them with their VR projects. So that’s become part of my work now.

AREA: What is the AREA Safety Committee focused on right now?

Dr. Garrett: The past few years have been largely project-focused. There was the AREA Safety and Human Factors Assessment Framework and Best Practice Report. Things have changed a lot since that was published, so we’ll be doing a refresh of it. And then we put together the AREA Safety Infographic. We’ve now moved into the development of a playbook of sorts, a general guide to things to be aware of when you’re implementing AR solutions from a safety perspective. What kind of infrastructure do you need? What kind of issues should you be aware of? How should you assess the environment? We’ve also brought in outside experts from academia and industry to provide their viewpoints and lessons learned. For example, at our next meeting in November, the CEO of Design Interactive will present some of the things they’ve been working on from a product design perspective, but also some of the research they’ve been involved in with their customers on usage requirements. We’ll be learning about the impact they’re beginning to see on the individuals who use AR.

AREA: What are the top AR safety issues that people are concerned about?

Dr. Garrett: Situational awareness is a big one. The restricted field of view. These are of particular concern in environments that have potential hazards. If you’re interacting with the system, you may not hear emergency or other messaging going on in your area. And with a restricted field of view, you might trip over something or bump into someone. Those are probably the top two. Cyber sickness is not generally a concern with AR, but we are starting to see some research that there are some impacts among those who are exposed for two hours or more. There is a correlation between the amount of usage and how much downtime you should have. As that research continues, we’ll be able to develop some requirements to address that issue.

AREA: What can we look forward to from the AREA Safety Committee in the near future?

Dr. Garrett: Last year, we entered into a partnership with the National Safety Council. We’re going to be working with them on the further refinement of the framework tool. It will give new AR adopters a checklist whereby they answer a series of yes/no questions to evaluate the job or their work environment from a safety perspective. In addition to the AREA sharing that framework tool with the AR ecosystem, the National Safety Council will be able to share it with their membership. We’re currently waiting for the NSC to arrange the resourcing of that work, but I expect we’ll see that completed next year.

AREA: Why should AREA members consider joining the Safety Committee?

Dr. Garrett: It’s really about having a voice and a say as to what content is being delivered to protect all employees. International standards are another area where we need a lot of support. There are standards development efforts underway right now at Underwriters Laboratories, IEEE, and ISO, and we need AR users to be represented in the room. There’s a lot of manufacturers and academics involved, but not enough AR customers, and their voices need to be heard.

 

If you’re an AREA member and would like more information about joining the AREA Safety Committee, contact Dr. Greg Garrett or AREA Executive Director Mark Sage. If you’re not yet an AREA member but care about ensuring safety in enterprise AR, please consider joining; you can find member information here.




How Assisted Reality differs from Augmented Reality

In Industry 4.0, Augmented Reality (AR) and Virtual Reality (VR) often get the spotlight as the next great leap in boosting worker productivity. But these X-Reality (XR) technologies aren’t always practical when used as manufacturing or frontline tools.

Enter another aR: assisted Reality.

What is assisted reality? How does it differ from augmented reality?

Assisted Reality gives you access to the right information right when you need it, allowing you to maintain full situational awareness. Unlike AR, it is a reality-first, digital-second experience. Assisted Reality allows a person to view a screen within their immediate field of vision, hands-free. Information is not overlaid on the real-world view.

Let’s explore this by looking at heads-up displays (HUDs). HUDs in vehicles give an extra layer of relevant information without hampering vision or distracting the driver. The driver doesn’t have to shift their gaze to the dashboard. They can keep their eyes on what’s most important (the road) and have both hands free to control their vehicle.

Assisted reality devices can also be worn, making them more practical in certain situations:

  • Headsets with micro-displays: A small but high-resolution screen that’s positioned in front of the user’s eye. With the appropriate focal depth, a half-inch display can look like a 7-inch tablet held at arm’s length.
  • Smart glasses: Worn like ordinary glasses, purpose-built smart glasses project images directly onto the lenses (note: most assisted reality use cases do not depend on SLAM (simultaneous localization and mapping) computer vision).
  • RealWear devices with assisted reality technology are leading the industrial field’s digital transformation with hands-free, Android-based headsets, designed specifically with safety in mind.
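The tablet-equivalence claim for micro-displays above is really a statement about visual angle. Assuming arm’s length is roughly 24 inches (our assumption, not a figure from the article), the standard angular-size formula shows that a 0.5-inch display viewed at the matching effective optical distance subtends the same angle as a 7-inch screen:

```python
import math

def visual_angle_deg(size_in: float, distance_in: float) -> float:
    """Visual angle (degrees) subtended by an object of the given size at the given distance."""
    return math.degrees(2 * math.atan(size_in / (2 * distance_in)))

tablet = visual_angle_deg(7.0, 24.0)  # 7-inch screen at ~arm's length (assumed 24 in)
# The headset optics place the 0.5-inch micro-display at a short effective
# viewing distance; for matching angles that distance is 24 * (0.5 / 7) ≈ 1.7 in.
micro = visual_angle_deg(0.5, 24.0 * 0.5 / 7.0)
print(round(tablet, 1), round(micro, 1))
```

Both work out to roughly 16–17 degrees of visual angle, which is why a half-inch panel behind the right optics can feel like a handheld tablet.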

How is assisted reality different from augmented reality?

Assisted reality differs from augmented reality in a key way: assisted reality gives users access to relevant information in their immediate field of view (FoV), while augmented reality uses computer-generated digital content to create an interactive experience within real-world environments.

Read the full article on the RealWear blog here.