RealWear Navigator: First Look at the Future of Assisted Reality

The latest news from AREA member RealWear is its new RealWear Navigator™ 500 solution. Readers can view product information and RealWear blogs here, which explain the technical details of assisted reality alongside real-world examples and frontline business use cases.

It offers a frontline connected-worker platform for integrating multiple assisted and augmented reality (SLAM) experiences into a high-performance industrial solution.

The RealWear Navigator™ 500 solution is an all-new head-mounted device platform designed specifically to engage, empower and elevate the frontline worker for the next several years.

Building on four years of accumulated experience working with 5,000 enterprise customers in 60 countries on solutions based on our HMT-1™ and HMT-1Z1™ platforms, this new product brings targeted innovation in the key areas that matter most to achieving solid results at scale.

RealWear is known for establishing major customer deployments of frontline worker solutions based on “assisted reality”.

The core concept of assisted reality is that it makes a different tradeoff from mixed reality, one better suited to the majority of industrial use cases where user safety is paramount.

The goal of assisted reality is to keep the user’s attention in the real world, with a direct line of sight that remains largely unoccluded by digital objects or “holograms”, which require extra cognitive focus to process.

Situational awareness of moving machinery, approaching forklifts or other vehicles, steam escape valves, slip and trip hazards, and electrical and chemical hazards is key for RealWear’s customers. These are the same working environments that mandate specific personal protective equipment, from safety glasses and goggles to hard hats, hearing protection, heavy gloves and even respirators. Users in these situations usually need both hands free to use tools and equipment, or to hold on to railings, ropework and the like.

In turn, the user interface for assisted reality cannot rely on hands being available to operate handheld controllers or to draw gestures in the air. RealWear’s assisted reality solutions rely on voice recognition that is field-proven in very high-noise environments, plus the minimal use of head-motion detection. The platform uses a single articulated micro-display, easily adjusted to sit below the dominant eye, that does not obstruct direct vision and gives the user a view similar to a 7-inch tablet screen at arm’s length.
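To make the hands-free interaction model concrete, the sketch below shows a minimal “say-what-you-see” command dispatcher in Python. It is purely illustrative and not RealWear’s actual software: the phrases, actions and VoiceCommandRouter class are hypothetical, and it assumes a separate, noise-robust speech engine has already converted audio into text.

# Illustrative sketch only: a minimal "say-what-you-see" command dispatcher,
# not RealWear's software. A speech engine elsewhere is assumed to have
# already turned audio into a text phrase.

from typing import Callable, Dict

class VoiceCommandRouter:
    """Maps spoken phrases (as recognized text) to on-screen actions."""

    def __init__(self) -> None:
        self._commands: Dict[str, Callable[[], None]] = {}

    def register(self, phrase: str, action: Callable[[], None]) -> None:
        # Phrases mirror visible UI labels so users can simply read them aloud.
        self._commands[phrase.lower().strip()] = action

    def handle(self, recognized_text: str) -> bool:
        action = self._commands.get(recognized_text.lower().strip())
        if action is None:
            return False  # Unrecognized phrase: ignore rather than guess.
        action()
        return True

if __name__ == "__main__":
    router = VoiceCommandRouter()
    router.register("open document", lambda: print("Opening document viewer"))
    router.register("zoom in", lambda: print("Zooming in"))
    router.register("navigate home", lambda: print("Returning to home screen"))

    # Simulated recognizer output; on a real device this would come from the
    # speech engine mentioned above.
    for phrase in ["open document", "zoom in", "take photo"]:
        handled = router.handle(phrase)
        print(f"'{phrase}' handled: {handled}")

The key design point is that every spoken command corresponds to a label the wearer can already see, so no hands and no memorized gestures are needed.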

A core concept of mixed reality has been the placement of virtual 3D digital objects overlaid on the physical world – such as 3D models or animations. This requires two stereoscopic see-through displays brought to a point of focus that typically is not in the same plane as the real-world object. The resulting vergence-accommodation conflict – where the greater convergence of the eyes when looking at near objects is in conflict with the focal distance, or accommodation of the eye’s lens, needed to bring the digital image into focus – is a source of eyestrain, discomfort and in some cases headaches after extended use. In addition, in bright conditions, especially outdoors, mixed reality displays struggle to provide sufficient contrast with the real world, and therefore either cut a significant amount of light from the real world with darkened glass or must drive the display so bright that battery life is very short unless the headset is tethered by a cord to a separate battery pack. Both situations contribute to eyestrain with extended use.
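As a simplified, illustrative model of that conflict (not drawn from RealWear’s materials), consider a virtual object anchored on a machine 3 m away while the display optics focus at a fixed 2 m, assuming a 64 mm interpupillary distance:

\[
\theta_{\text{vergence}} = 2\arctan\!\left(\frac{\mathrm{IPD}}{2\,d_v}\right) = 2\arctan\!\left(\frac{0.064}{2 \times 3}\right) \approx 1.22^{\circ}
\]
\[
\text{accommodation demand} = \frac{1}{d_a} = \frac{1}{2\,\mathrm{m}} = 0.5\,\mathrm{D}
\qquad
\text{vergence-implied demand} = \frac{1}{d_v} = \frac{1}{3\,\mathrm{m}} \approx 0.33\,\mathrm{D}
\]

The eyes converge as if the object were 3 m away but must focus at 2 m; that roughly 0.17-dioptre mismatch, sustained across a shift, is what drives the eyestrain described above.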

However, mixed reality applications do allow information to be overlaid directly on the real-world asset, which in some use cases provides an additional productivity boost in identifying the item to be worked on.

So how could this tradeoff be resolved? Is it possible to tag or overlay information on the real 3D world while also maintaining safety, situational awareness, low eyestrain, hands-free use and full-shift battery life?

We’ve long believed that the answer lies in amping up the amount of “assistance” in assisted reality rather than focusing solely on the amount of reality delivered by power-hungry, wide-field-of-view, super-bright, stereoscopic, transparent, ultra-high-resolution displays. With advanced camera capabilities and computer-vision processing, key information about real-world assets can be placed on the camera view shown in the single, monocular, non-see-through (opaque) display.
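As a rough illustration of that idea (a sketch only, not RealWear’s pipeline or Navigator 500 software), the Python/OpenCV snippet below finds a QR-tagged asset in the camera feed and draws its label onto the camera image that a monocular, non-see-through display would show. The asset IDs, labels and ASSET_INFO mapping are hypothetical.

# Illustrative sketch only: tag an asset with a QR code, find it in the live
# camera frame, and draw its label onto the image shown on an opaque display.
# Assumes OpenCV (opencv-python) and a camera at index 0.

import cv2
import numpy as np

# Hypothetical mapping from QR payloads to asset information.
ASSET_INFO = {
    "PUMP-101": "Centrifugal pump | see work order",
    "VALVE-7": "Steam valve | caution: hot surface",
}

def annotate_frame(frame: np.ndarray, detector: cv2.QRCodeDetector) -> np.ndarray:
    """Detect a QR-tagged asset and overlay its label on the camera image."""
    data, points, _ = detector.detectAndDecode(frame)
    if data and points is not None:
        corners = points.reshape(-1, 2).astype(np.int32)
        # Outline the detected tag.
        cv2.polylines(frame, [corners], isClosed=True, color=(0, 255, 0), thickness=2)
        # Place the asset label just above the tag.
        label = ASSET_INFO.get(data, data)
        x, y = int(corners[0][0]), int(corners[0][1])
        cv2.putText(frame, label, (x, max(y - 10, 20)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return frame

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # device camera
    detector = cv2.QRCodeDetector()
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("assisted-reality overlay (sketch)", annotate_frame(frame, detector))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

Because the overlay is drawn on the camera image rather than projected through see-through optics, it remains legible in bright outdoor light and never occludes the wearer’s direct line of sight.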

Read more. 

 
