RealWear Navigator: A First Look at the Future of Assisted Reality

This frontline connected worker platform integrates multiple assisted and augmented reality experiences into a high-performance industrial solution.

The RealWear Navigator™ 500 is an all-new head-mounted product platform designed specifically to engage, empower and elevate the frontline worker for the next several years.

Building on four years of accumulated experience working with 5,000 enterprise customers in 60 countries on solutions based on our HMT-1™ and HMT-1Z1™ platforms, this new product brings targeted innovation in the key areas that matter most to achieving solid results at scale.

RealWear has become known for establishing major customer deployments of frontline worker solutions based on “assisted reality”.

The core concept of assisted reality is that it makes a different tradeoff than mixed reality. Assisted reality is better suited to the majority of industrial use cases where user safety is paramount.

The goal of assisted reality is to keep the user’s attention in the real world, with a direct line of sight that is, for the most part, unoccluded by digital objects or “holograms” that require extra cognitive focus to process.

Situational awareness of moving machinery, approaching forklifts or other vehicles, steam escape valves, slip and trip hazards, and electrical and chemical hazards is key for RealWear’s customers. These are the same working environments that mandate specific personal protective equipment, from safety glasses and goggles to hard hats, hearing protection, heavy gloves and even respirators. Users in these situations mostly require both hands to be available for the use of tools and equipment, or to hold on to railings, ropework, etc.

In turn, the user interface for assisted reality cannot rely on the availability of hands to operate handheld controllers or to draw gestures in the air. RealWear’s assisted reality solutions rely on voice recognition that is field-proven in very-high-noise environments, plus minimal use of head-motion detection. The platform uses a single articulated micro-display, easily adjusted to sit below the dominant eye, that does not obstruct direct vision and provides the user a view similar to a 7-inch tablet screen at arm’s length.

A core concept of mixed reality has been the placement of virtual 3D digital objects overlaid on the physical world – such as 3D models or animations. This requires two stereoscopic see-through displays that are brought to a point of focus that typically is not in the same plane as the real-world object. The resulting vergence-accommodation conflict – where the greater convergence of the eyes when looking at near objects is in conflict with the focal distance, or accommodation of the eye’s lens needed to bring the digital image into focus – is a source of eyestrain, discomfort and in some cases headaches after extended use. In addition, in bright conditions, especially outdoors, mixed reality displays struggle to provide sufficient contrast with the real world and therefore they always either cut a significant amount of light from the real world using darkened glass or have to generate such a bright display that battery life is very short unless tethered with a cord to a separate battery pack. Both situations contribute to eyestrain with extended use.
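The geometry behind the vergence-accommodation conflict can be made concrete with a small worked example (illustrative numbers only; a typical interpupillary distance of about 63 mm is assumed):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle between the two eyes' lines of sight when fixating
    a point at distance_m (assumes a ~63 mm interpupillary distance)."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# A headset with a fixed focal plane at 2 m renders a virtual object at 0.5 m:
# the eyes converge for 0.5 m while the lenses accommodate for 2 m.
converge = vergence_angle_deg(0.5)     # ~7.2 degrees
accommodate = vergence_angle_deg(2.0)  # ~1.8 degrees
mismatch_deg = converge - accommodate  # the cue conflict described above
```

The larger this mismatch, the stronger the conflicting depth cues the visual system must reconcile, which is one driver of the eyestrain described above.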

However, mixed reality applications do allow information to be overlaid on the real-world asset, which in some use cases provides an additional boost in productivity by identifying the item to be worked on.

So how could this tradeoff be solved? Is it possible to tag or overlay information on the real 3D world while also maintaining safety, situational awareness, low eyestrain, hands-free use and full-shift battery life?

We’ve long believed that the answer lies in amping up the amount of “assistance” in assisted reality rather than solely focusing on the amount of reality, with power-hungry, wide field of view, super bright stereoscopic, transparent and ultra-high resolution displays. With advanced camera capabilities and computer-vision processing, key information about real-world assets can be placed on the camera view shown in the single, monocular, non-see-through (opaque) display.
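As a simple illustration of that camera-view approach, the sketch below (a hypothetical helper, not RealWear code) places a text label next to a detected asset's bounding box on an opaque display, keeping the label on screen:

```python
def place_label(bbox, frame_w, frame_h, label_h=24):
    """Pick a screen position for a text label tied to a detected asset.

    bbox is (x, y, w, h) in pixels from a (hypothetical) detector; the
    label is drawn just above the box, clamped so it stays on screen.
    """
    x, y, w, h = bbox
    label_y = y - label_h          # prefer placing the label above the box
    if label_y < 0:                # no room above: fall back to below it
        label_y = y + h
    label_x = max(0, min(x, frame_w - w))  # clamp horizontally to the frame
    return (label_x, label_y)
```

For example, `place_label((100, 10, 80, 60), 640, 480)` falls back to below the box because there is no room above it near the top of a 640×480 frame.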

Read more. 

 




Samsung and Microsoft may be working on future augmented reality hardware

A report by The Elec suggests that Microsoft and Samsung are working together on future augmented reality hardware. It is not disclosed whether this is for the consumer market, enterprise market, or both. All that is known is that the project is AR-related and may involve some sort of hardware Samsung will be producing (rather than Microsoft). Samsung’s investments in DigiLens, the company behind tech found in AR display devices, may further substantiate the idea that the former will be handling the physical gadgetry in its collaboration with Microsoft.

Apparently, several divisions of Samsung are involved in the project, with Samsung Display, Samsung Electro-Mechanics, and Samsung SDI all tied in. This AR project started in the middle of 2021 and aims to result in a commercially viable product by 2024.

 

 




RealWear Introduces RealWear Navigator™ 500 Industrial-Strength Assisted Reality Wearable for Frontline Workers

Fully optimized for hands-free use, RealWear Navigator 500 is an innovative platform solution that combines hardware, software, and cloud-ready services with a rugged design that is one-third lighter and slimmer than the previous generation, making it easier for frontline workers to wear the device for their entire shift. The hardware is designed as a modular platform with an upgradeable 48-megapixel (MP) camera system, a truly hot-swappable battery, Wi-Fi, and an optional 4G (and soon-to-be-available 5G) modem. The voice-controlled user interface includes unique noise-cancelation technology designed for high-noise environments. RealWear has more than 200 optimized partner apps supporting a variety of use cases, such as remote collaboration, guided workflow, and IoT and AI data visualization.

Assisted reality [infographic available] is a non-immersive experience and has become the preferred Extended Reality (XR) solution for frontline industrial workers, especially where high situational awareness is necessary. Assisted reality experiences are closer to the physical world, compared to virtual reality (VR) and augmented reality (AR) experiences that immerse workers in the metaverse.

With RealWear Navigator 500, RealWear has again raised the bar for how assisted reality and other XR technologies are deployed at the world’s leading industrial companies. Automotive, logistics, manufacturing, food & beverage and energy companies, among others, can use RealWear Navigator 500 to deliver real-time access to online information and expertise to the world’s more than 100 million industrial frontline workers.

“With pandemic concerns continuing to press upon the global economy, how technology is enabling a ‘new way to work’ is very much in focus, particularly for industrial frontline workers,” said Andrew Chrostowski, Chairman and CEO of RealWear. “Today we’re unveiling something far bigger than a product. The RealWear Navigator 500 delivers the next generation of work with a ‘reality-first, digital-second’ enterprise solution for remote collaboration, operational efficiency, and hybrid work in safety-critical industries. Assisted reality – more so than augmented or virtual reality – is designed specifically for the frontline worker who requires both hands for the job, striking the perfect balance of keeping workers 100% present and self-aware with the ability to safely navigate industrial surroundings. After all, nobody wants to be near hazardous equipment with their head stuck into the metaverse.”

Read the full press release here. 




Ford Technical Assistance Center Using TeamViewer Frontline Augmented Reality Solution to Streamline Customer Vehicle Repairs Worldwide

The new service is offered by Ford’s Technical Assistance Center (TAC), a centralized diagnostic troubleshooting team that provides support to all Ford and Lincoln dealerships’ technicians who diagnose and repair customer vehicles.  Dealer technicians can initially reach out to TAC specialists via a web-based portal or even on a phone.  With the new See What I See program, TAC specialists can now start a remote AR session using TeamViewer Frontline through a pair of onsite RealWear smart glasses to share, in real time, exactly what the repair technician is looking at.  TAC specialists can add on-screen annotations and additional documentation directly in the line of sight of the repair technicians, as well as zoom in, share their screen, record the session and even turn on flashlights remotely.

“My team diagnoses some of the most complex and complicated vehicle issues,” says Bryan Jenkins, TAC powertrain operations manager.  “I would frequently hear my team say that if they could only see what that technician is talking about, or what the technician is doing or how they’re completing a test, then they could solve the problem more accurately.  A picture is worth 1000 words, but sometimes that still wasn’t quite enough, and we needed a way to see something live and in action.  And that’s what really kicked this whole program off.”

Ford’s See What I See program is an additional layer of support that is already used by more than 400 dealers in the U.S., Mexico, South Africa, Thailand, Australia, New Zealand and the U.K.  Currently Ford is promoting the new program to its full network of 3,100 U.S. based dealers, with a positive response. “Feedback from the dealers has been really good,” says Jenkins.  “From the dealer technician perspective, they just turn on their smart glasses and accept an incoming call, then it is like my specialists are there looking over their shoulder to help resolve the problem.”

“We are very excited to add Ford to our growing list of forward-thinking customers that are leveraging AR solutions to improve business processes,” says Patty Nagle, president of TeamViewer Americas.  “The majority of workers globally do not sit in front of a desk.  Our goal is to enable those frontline workers with AR guided solutions to enable them to do their jobs better by digitalizing and streamlining processes.”




Tech trends driving Industry to v5.0 – Rockwell Automation

Rarely has industrial automation changed at such an exponential rate. The combination of various technology trends has propelled enterprises into Industry 4.0 so fast that Frost & Sullivan has already delivered an Industry 5.0 blueprint to guide the journey.

Edge-and-cloud integration, converged development environments, artificial intelligence (AI) and autonomous production are far more than conceptual. These technological innovations are already happening.

“This is a unique time in our industry,” explained Cyril Perducat, who shared the automation supplier’s plans for the immediate future at Automation Fair 2021 in Houston. “The future is a trajectory, a path that we are already on. When I think of Industry 4.0, which was first coined in 2011, there is certainly a lot of learning over the past 10 years of what Industry 4.0 can deliver. And COVID has accelerated many of those dimensions.”

Remote connectivity, advanced engineering with multiple digital twins, mixing physical and digital assets, and the change of human-machine interaction are driving industry along that path toward Industry 5.0.

Perducat questioned whether it’s too soon to look at Industry 5.0 when all the promise of Industry 4.0 has not yet been delivered, but he identified five changes that are attainable and impactful in Frost & Sullivan’s comparison of Industry 4.0 to Industry 5.0:

  • delivery of customer experience,
  • hyper customization,
  • responsive and distributed supply chain,
  • experience-activated (interactive) products, and
  • return of manpower to factories.

“We are able to bring more capabilities to people,” said Perducat. “Human resources are scarce. By delivering systems that make the human-machine interaction more efficient, we make it more impactful while remaining safe.”


Rockwell Automation has identified four areas where technology can move companies along that journey:

  • evolution of cloud, edge and software,
  • universal control and converged integrated development environments (IDEs),
  • AI native operation management, including software as a service (SaaS) and digital services, and
  • autonomous systems and augmented workforce.

“We believe in control at the enterprise level,” explained Perducat. “We believe in systems with software-defined architecture and the underlying hardware. It doesn’t mean hardware is becoming obsolete. And it’s not that every piece of the system needs to be smart. The entire system, from the device to the edge and to the cloud, is smart. Edge + cloud architecture is fundamental.”

In the converged environment, control, safety and motion all come together and must work in an integrated fashion. This is especially true with the growth of robotics. “The boundaries between control and robotics are becoming more and more blurred,” said Perducat. “Safety is very fundamental in this more complex architecture. It does not work if it is not safe.”

Operations management becomes more efficient when AI is native to the architecture and is at the level of the enterprise. “A holistic view requires a lot of data and the ability to process that data,” explained Perducat. “Part of this has to be autonomous using the power of applied AI; it’s not just one more tool but is everywhere in the architecture. We can use AI on the machine to translate vibrations into data. We can think of AI in terms of process modeling. And model predictive control is evolving with AI. When you can orchestrate all the elements of the architecture, that is a system.”

FactoryTalk Analytics LogixAI is a modeling engine that enables closed-loop optimization through four steps—observe (sensor), infer (model), decide (controller) and act (actuator).
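The observe/infer/decide/act cycle can be sketched as a minimal closed loop (an illustrative proportional controller only; the actual LogixAI modeling engine is far more sophisticated and is not described in that detail here):

```python
def run_closed_loop(read_sensor, setpoint, apply_correction, steps=50, gain=0.4):
    """Minimal observe -> infer -> decide -> act cycle."""
    for _ in range(steps):
        measured = read_sensor()        # observe (sensor)
        error = setpoint - measured     # infer (here, a trivial error model)
        correction = gain * error       # decide (controller)
        apply_correction(correction)    # act (actuator)

# Toy process: drive a scalar value toward a setpoint of 10.0.
state = {"value": 0.0}
run_closed_loop(lambda: state["value"], 10.0,
                lambda c: state.update(value=state["value"] + c))
```

With each pass the error shrinks by the gain factor, so the toy process converges on the setpoint; in a real deployment the "infer" step would be a learned model rather than a raw error term.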

Finally, transforming from automated systems to autonomous systems enables better decisions that expand human possibility.

AI can also help to simplify a new generation of design. “You can use AI to help to generate blocks of code, like individuals working together peer-to-peer, but one of them is AI, augmenting human possibility,” explained Perducat.

“We see the next step to autonomous manufacturing as an opportunity to deliver value to our customers,” he said. “The autonomous system is reimagining the fundamental principles of autonomous control systems. You don’t need to rip and replace. We have the ability to augment existing systems with new technology.”

Perducat stressed that it cannot be just technology innovation. “Technology only creates possibilities or potential values,” he explained. “It has to be accessible by users, so we have to innovate on the user experience point of view. We want to bring that to all the products, experiences and models. In a digital native world, innovation extends beyond technology and features.”




HP is Using HoloLens to Help Customers Remotely Repair Industrial Printers

While many AR companies are focused on building AR products, HP is making an interesting move in using the technology as an add-on to improve an existing line of its business. The company’s newly announced xRServices program promises to deliver remote AR support for its industrial printer customers.

The program employs Microsoft’s HoloLens 2 headset, which HP’s customers can use to access AR training and live guided instructions to fix issues that arise with complex, commercial-scale printers.

HP is pitching the solution as a way to allow even untrained individuals to fix issues with the help of a specialist on the other end who can guide them step-by-step through troubleshooting and repairs with AR instruction. Further the company says the service can be used to provide AR training for various workflows and issues that may arise with the company’s industrial printers.

HP hasn’t clearly detailed exactly what software it’s running on HoloLens to facilitate xRServices, but it seems likely that it is leveraging Microsoft’s Dynamics 365 Remote Assist platform which includes many of the AR functions that HP showcased in its xRServices concept video—like augmented annotation, document visualization, and video chatting through the headset.





Visor-Ex® 01 – Collaboration between ECOM Instruments and IRISTICK

The Pepperl+Fuchs brand ECOM Instruments introduces Visor-Ex® 01 smart glasses for industrial use in hazardous areas.

The intelligent wearable, weighing just 180 g, combines high camera and display quality with reliable communication features in an ergonomic design for the user’s utmost comfort. This provides mobile workers with an optimal companion for tasks that require hands-free use as well as continuous communication, for example with a remote expert.

This product is the result of a long-term, close collaboration between ECOM Instruments and IRISTICK, bringing together ECOM’s in-depth knowledge of the requirements of hazardous areas with IRISTICK’s profound experience in smart glasses development.

Innovative tool for the mobile worker in hazardous areas

A total of three integrated cameras transform Visor-Ex® 01 into the remote expert’s bionic eye. Two 16-megapixel cameras are centrally positioned to depict the wearer’s natural field of vision – this way the remote expert views what is happening from the same angle and perspective as the mobile worker. A secondary camera offers a 6x optical zoom for zooming in without loss of quality and fast scanning of barcodes and QR codes. The system utilises the ECOM Smart-Ex® 02 smartphone for hazardous areas as a computing unit with LTE connectivity and a pocket unit with a replaceable battery for power supply, all combined in an intelligent ecosystem for a wide range of application scenarios in the industrial sector.

The distribution of functions across the individual system components helps to minimise the weight of the headset unit – without compromising on performance, connectivity or battery life.

By connecting to the Smart-Ex® 02, users can continue to use their tried-and-tested smartphone for harsh environmental conditions without restriction, and benefit from all the advantages, security features and controls of the Android 11 operating system, including over-the-air updates, leading to ease of use and a low total cost of ownership.

Visor-Ex® 01 will be certified for ATEX/IECEx Zone 1/21 and 2/22 as well as NEC/CEC Division 1 and 2 and will have protection class IP68. It can be used within a temperature range of -20 to +60 °C.

Read Iristick’s AREA member profile 

For more information: www.visor-ex.com

For sales information : [email protected] or [email protected]




Magic Leap partners with Geopogo on Augmented Reality solution for architecture and design

Geopogo is a California-based 3D design software company that is working to transform the design and construction process. The company’s software allows architects and designers to create renderings and a virtual reality (VR) or augmented reality experience in minutes by importing existing CAD models or building directly with the Geopogo 3D creator tool.

Now, with Geopogo’s software on Magic Leap’s AR headset platform, the interaction of digital content with the physical world will help to bring architectural designs to life, according to the companies. “This is a phenomenal opportunity to make architectural design understandable and accessible to project clients, city officials, and the general public,” said Geopogo’s Creative Director, Michael Hoppe.

According to Magic Leap, the American Institute of Architects, San Francisco (AIASF) utilized the partnership’s technology as part of its ‘Shape Your City’ campaign, an ongoing fundraising effort to build its new headquarters in the Bay Area’s new Center for Architecture + Design. The organization also sought to fund expanded architecture-focused tours, exhibitions, educational programs, and events for people of all ages.

As a result, AIASF hosted on-site building tours to build excitement and engagement for the project from the architectural community and the public, and offered tour participants a 3D virtual model of the future Center. The integration of AR technology during the building tours allowed for a more interactive, transparent, immersive, and exciting way to visualize what the space will look like, even before construction has started.

“The power of the AR experience succeeded in inspiring donors to contribute much-needed construction funding for the project, as hoped for by the non-profit organizations. We were especially happy to see how the AR experience brought so much delight to the faces of the non-profit Board, the organization members, and members of the larger community,” said Dave Alpert, Geopogo CEO and Cofounder. 

“The AR model has allowed our project partners, Board members, potential donors, and community to experience the future Center first-hand and visualize the positive impact it will have on future generations,” agreed AIASF Executive Director, Stacy Williams.

For more information on Geopogo and its augmented reality solutions for the architecture and design industry, click here. For more information on Magic Leap and its AR hardware solutions, click here.

 




Qualcomm is trying to simplify app creation for AR glasses

The ultimate aim is to make AR more accessible. Ideally, developers will make apps directly available to you through mobile app stores, using glasses tethered to smartphones. You might not see Snapdragon Spaces used for stand-alone glasses, at least not at first.

Spaces will also be used to help build “5G experiences”. Lenovo will be the first to make use of the technology, pairing its ThinkReality A3 glasses with an unnamed Motorola phone.

Rumors of possible Apple AR devices persist, but efforts like this might lower some of the barriers, and it’s easy to see a flurry of AR software in the near future.

 




Extended Reality – Mixed Reality Versus Augmented Reality

Augmented Reality Defined

Augmented Reality is quickly making its way into a variety of settings. Retailers use it to help customers visualize a product before they buy it. Engineers turn to augmented reality as a way of accessing valuable information about a product without fumbling with physical manuals. With AR, users can embed or overlay elements of the digital world into the physical world.

Tools like ARKit from Apple and Google’s ARCore even allow users to build their own immersive smartphone experiences. However, it is possible to further enhance AR experiences through devices like smart glasses, which overlay the digital content you need onto the real world in a much more immersive way, without requiring you to hold a phone in front of your face.

Mixed Reality Defined

Mixed Reality is a hybrid of AR and VR (virtual reality), though it goes further than AR when it comes to immersion. Through MR, virtual or digital content isn’t just overlaid on the real world; it’s embedded in a way that lets users interact with it.

This form of MR is an advanced kind of AR, which makes the digital elements you bring into your environment feel more authentic and realistic. MR can have elements of both virtual and augmented reality within it. However, the major difference is that the focus is on blending everything together. You’re not entirely replacing an environment, or simply augmenting it with new content. Instead, you’re creating an entirely new reality by combining both the physical and digital environment.

Exploring AR and MR

There are numerous differences between AR and MR, but the biggest noticeable aspects are:

  • Device requirements – AR is usable on most smartphones or tablets, with the added option of specialist headsets. However, to provide a MR experience, more power and sensors are required.
  • Realistic interaction – AR offers limited interactivity with the virtualized elements. The computer-generated content can’t interact with the real-world elements users see.

It’s up to you whether to use VR, AR, or MR for your project. Each is made for particular tasks. For many companies, augmented reality will be one of the easiest ways to enter the world of extended reality: the environment is accessible because you can create applications and tools that work on smartphones as well as through smart glasses and headsets. However, as the technology available to us continues to evolve, Mixed Reality may also become more accessible.

Many leading companies are experimenting with MR already, though it’s still technically the youngest technology in the XR space.

In manufacturing, an important hurdle to overcome when trying to bring together several emerging technologies in one place is data connectivity. At the Manufacturing Technology Center (MTC) in the UK, they understand this issue all too well and are working to combat it using ATS Bus.

ATS Bus is a platform for their VIVAR (Virtual Instruction, Inspection and Verification using Augmented and/or Virtual Reality) project which investigates “how augmented and virtual reality could be used to enhance the operator experience when viewing work instructions and increase efficiency and accuracy for both instruction delivery and data capture.”

The work orders received are translated by ATS Bus into a standard data format and sent down to the shop floor, where ATS Bus translates them again into the format required by the Advanced Display Device server.
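That two-step translation can be pictured as follows (the field names are hypothetical; the real ATS Bus schemas are not described in the article):

```python
# Hypothetical mapping from an ERP-shaped order to the bus's standard format.
CANONICAL_FIELDS = {"order_id": "OrderNo", "part": "PartNumber", "qty": "Quantity"}

def to_canonical(erp_order):
    """Translate an ERP-shaped work order into the bus's standard format."""
    return {std: erp_order[src] for std, src in CANONICAL_FIELDS.items()}

def to_display_device(canonical):
    """Translate the standard format into what the display server expects."""
    return {"id": canonical["order_id"],
            "title": f'{canonical["part"]} x {canonical["qty"]}'}
```

Keeping one canonical format in the middle means each new system on the shop floor only needs one translator to and from the bus, rather than one per peer system.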

You can read the original article on INFRASI’s website.