Tech Interfaces Need to Get Better – Extended Reality Can Help

A recent article on Singularity Hub discusses how Extended Reality (XR) can improve interfaces. It argues that current electronic devices are unintuitive and force users to multi-task, since operating the tool and performing the task at hand must happen simultaneously, which is inefficient.

Jody Medich, Director of Design for Singularity University Ventures, is quoted as saying that devices should become ‘ready to hand’, acting seamlessly with our hand or body movements. She urges a move away from training humans to operate devices, arguing instead that our minds should be free to pay attention to the task rather than the tool.

Medich has demonstrated that XR can make technology ready to hand, leading to an improved interface and a more effortless experience. One example is firefighters wearing Augmented Reality helmets with thermal imaging capabilities and toxicity sensors: the helmets need no separate interface because the thermal images are as clear as the physical world.

In the medical sector, vein visualisation is an AR technique that projects near-infrared light onto a patient’s skin, allowing professionals to ‘see through’ the skin and view the veins beneath. The surface of the patient’s skin effectively becomes the interface, as no screen or external device is needed. AccuVein, one of the manufacturers of vein visualisation tools, claimed the product increased the chance of a successful first stick by 3.5 times, leading to increased patient satisfaction and reduced costs. Medich has pointed out that a large amount of the work in Augmented and Virtual Reality is happening in the healthcare industry; a medical research team from Duke University found evidence that VR can create an immersive, distracting environment, an effect that persists even while a patient is experiencing pain.

Another aspect of human psychology significant for human-machine interface design, Medich notes, is that we tend to learn more efficiently when we use our whole bodies. Embodied cognition is the idea that cognition is not confined to the brain: experiences of the body affect our mental constructs and our performance on cognitive tasks. Screens, being flat, visual, and text-based, do not involve the body. In contrast, XR can take advantage of embodied cognition by offering an immersive experience and creating the perception that the body is engaged.

The article briefly touches on VR’s failure so far to achieve a breakthrough, which a 2016 MIT Technology Review article attributed to the high cost of devices such as the Oculus Rift.

The article concludes by likening effortlessly accessible technology to having ‘superpowers’; Clay Bavor, Vice President of Virtual and Augmented Reality at Google, is quoted as saying that VR transports you elsewhere whereas AR leaves you in place, but despite this difference, both give us superpowers.
