AR and VR Hurdles To Clear In The Enterprise

An article by David Brebner appeared on VentureBeat stating that AR and VR still have hurdles to clear in the enterprise, and that open standards will help.

This opinion piece argues that vendors should focus on two key problem areas: cospacing capabilities and headset deficiencies.

The author argues that with a few improvements and modifications, AR and VR technology could become useful to companies across a range of scenarios, including training, maintenance, product design and more.

What are the major issues with the current capabilities, and how can they be improved for enterprise use?

 

Cospacing

 

The author states that perhaps the biggest shortcoming of AR and VR is the lack of cospacing functionality. Cospacing is the ability for multiple people to interact in the same experience and have real objects — like a mechanical wrench or a CPR dummy — be included, which is essential to making training experiences effective. Currently in the AR landscape, there is no single vendor product that allows for this, making the technology much harder for enterprises to adopt.

Why is cospacing essential for enterprises to get value from AR and VR? Consider the limitations of a training exercise that does not allow multiple employees to participate at the same time, in the same place. Further, imagine that objects needed to make the training realistic, like a specific tool that will be used in the real-life scenario, cannot be incorporated into the experience. This would severely limit the effectiveness of the training exercise, because it would not allow for real-time collaboration or practice with the actual tools. It would be like a soccer team trying to practice with each player on a separate field and no balls or goals.

 

Another example of cospacing's importance comes from product design teams using AR or VR. None of the major vendors currently offer a solution that allows people to collaborate in real time with a shared view. You can imagine two product designers working together in the same space on a design, but today this isn't possible: they each have their own separate view, making it hard to understand what the other designer is doing and making collaboration extremely difficult.

 

Luckily, there is already technology that solves these issues (here's one example) and allows for collaborative experiences and the use of real-life tools. More platforms need to offer room-tracking systems with sensors placed on all involved people and objects, allowing them to be easily connected and included in the experience. New objects and people can be included and visualized simply by adding new sensors. This also enables motion capture, allowing custom scenarios to be pre-recorded and programmed without waiting for a studio to develop and release them, which is especially useful for specialized training scenarios.
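To make the idea concrete, here is a minimal sketch in Python of how such a room-tracking cospace might represent tracked people and objects; the class and field names are purely illustrative and not taken from any vendor's SDK.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # Position in metres and orientation as a quaternion, in the shared room frame.
    x: float
    y: float
    z: float
    qx: float = 0.0
    qy: float = 0.0
    qz: float = 0.0
    qw: float = 1.0

@dataclass
class TrackedEntity:
    sensor_id: str   # ID of the physical tracker attached to the person or object
    kind: str        # e.g. "person", "wrench", "cpr_dummy"
    pose: Pose

class Cospace:
    """A shared session: every participant sees the same set of tracked entities."""

    def __init__(self):
        self.entities = {}   # sensor_id -> TrackedEntity
        self.recording = []  # (timestamp, sensor_id, pose) log for later replay

    def register_sensor(self, sensor_id, kind, pose):
        # Adding a new person or real object is just registering another sensor.
        self.entities[sensor_id] = TrackedEntity(sensor_id, kind, pose)

    def update_pose(self, timestamp, sensor_id, pose):
        # Called every tracking frame; the log doubles as motion capture, so the
        # session can be replayed later as a pre-recorded training scenario.
        self.entities[sensor_id].pose = pose
        self.recording.append((timestamp, sensor_id, pose))

# Example: a trainer, a trainee and a real wrench in one shared session.
session = Cospace()
session.register_sensor("tracker-01", "person", Pose(0.0, 0.0, 1.7))
session.register_sensor("tracker-02", "person", Pose(1.2, 0.0, 1.6))
session.register_sensor("tracker-03", "wrench", Pose(0.6, 0.8, 0.9))
session.update_pose(0.016, "tracker-03", Pose(0.62, 0.81, 0.93))
print(session.entities["tracker-03"].pose)
```

The key design point is that people and physical props are treated identically: anything with a sensor attached simply shows up in the shared experience.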

 

Headsets and controllers

 

Another major issue with AR and VR technology is the hardware on the market. While major leaps have been made since the technology was first conceived, current headsets and controllers still fall woefully short of providing a practical and useful experience for enterprises.

A major issue with today's AR and VR headsets is the lack of eye-tracking capabilities. This means users shift their view by moving their entire head rather than looking in a different direction as they would in real life. In other words, to look up, one must tilt the entire head upwards rather than just glancing up with the eyes. This creates a number of issues, the largest being neck aches from moving the sheer weight of the headset around to change views, which makes for an unpleasant user experience and makes more complex scenarios difficult to navigate.

 

The other downside of not tracking users' eyes is that a treasure trove of potential data is lost. In a training scenario, if eyes are tracked, an employee can get more detailed and useful feedback on how to improve, and a company can find out which potential obstacles its employees fail to see.
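As a purely hypothetical sketch (not a real headset API), per-session gaze data could be compared against the hazards a trainee was expected to notice:

```python
# Hypothetical sketch: checking which expected hazards a trainee actually looked at.
# "gaze_log" would come from the headset's eye tracker; all names are illustrative.

def missed_hazards(gaze_log, hazards, min_fixation_s=0.3):
    """Return hazards the trainee never fixated on for at least min_fixation_s seconds.

    gaze_log: list of (target_id, fixation_seconds) samples from the eye tracker.
    hazards:  set of target_ids the scenario expects the trainee to notice.
    """
    seen = {}
    for target_id, fixation_seconds in gaze_log:
        seen[target_id] = seen.get(target_id, 0.0) + fixation_seconds
    return {h for h in hazards if seen.get(h, 0.0) < min_fixation_s}

# Example: the trainee looked at the valve but never at the exposed wiring.
log = [("pressure_valve", 0.4), ("pressure_valve", 0.2), ("exit_sign", 1.1)]
print(missed_hazards(log, {"pressure_valve", "exposed_wiring"}))
# -> {'exposed_wiring'}
```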

 

Another issue that plagues most headsets is their field of view (FOV), which is still quite limited. Ideally, AR glasses would provide a minimum of 114 degrees of horizontal FOV per eye, which covers human stereo vision (the remaining 40 degrees is monocular peripheral vision). To put the inadequacies of current options in perspective, the HoloLens offers only about 40 degrees and the ODG R9 around 50 degrees. The Meta 2 is more respectable at around 90 degrees, with the SmokeHMD VR/AR unit at around 100. A limited FOV is a problem because it restricts the user's immersion in the experience, creating a keyhole or windowed effect.
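A quick back-of-the-envelope calculation, using only the FOV figures quoted above, shows how much of the 114 degree stereo field each headset covers:

```python
# Fraction of the ~114 degree human stereo (binocular) field covered,
# using the FOV figures quoted in the article.
BINOCULAR_FOV_DEG = 114

headsets = {"HoloLens": 40, "ODG R9": 50, "Meta 2": 90, "SmokeHMD": 100}

for name, fov_deg in headsets.items():
    print(f"{name}: {fov_deg} deg = {fov_deg / BINOCULAR_FOV_DEG:.0%} of stereo vision")
# HoloLens: 40 deg = 35% of stereo vision
# ODG R9: 50 deg = 44% of stereo vision
# Meta 2: 90 deg = 79% of stereo vision
# SmokeHMD: 100 deg = 88% of stereo vision
```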

 

Projection systems with a focal plane that is too close to the user, chromatic aberration and other visual distortions can cause eye strain, headaches and confusion, adding to overall stress. Headsets that block the user's downward peripheral vision prevent them from seeing their feet, making walking and moving around a space much more dangerous.

 

Hand controllers that pair with headsets present further difficulties, because they require users to hold their hands outstretched for long periods of time, causing discomfort. A solution to this issue is building more content that uses voice controls; when combined with eye tracking and a visual prompt, this reduces the potential for misunderstandings.
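A minimal sketch of that combination, with entirely hypothetical names: the gaze target supplies the object, the voice command supplies the action, and a visual prompt asks for confirmation before anything happens.

```python
# Hypothetical sketch: combining gaze targeting with voice commands so the user
# never has to hold a controller outstretched. Names are illustrative only.

def handle_voice_command(command, gazed_object):
    """Resolve a spoken command against whatever the user is currently looking at."""
    if gazed_object is None:
        # Nothing is being looked at, so ask the user to disambiguate.
        return "Prompt: 'Look at an object and repeat the command.'"
    # Show a visual confirmation prompt before acting, to reduce misunderstandings.
    return f"Prompt: '{command.capitalize()} {gazed_object}?' (say 'yes' to confirm)"

print(handle_voice_command("inspect", "hydraulic pump"))
# Prompt: 'Inspect hydraulic pump?' (say 'yes' to confirm)
print(handle_voice_command("open", None))
# Prompt: 'Look at an object and repeat the command.'
```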

 

The full article concludes with notes on the likely resolution of these issues, along with information about the author.

 
