
Standalone AR/VR headsets are finally ready to make a big leap forward

Key points in the article include:

  • In 2019, Qualcomm signalled that the XR2 was on the cusp of being adopted by AR/VR headset makers.
  • The Snapdragon XR2 and Niantic’s XR2-powered AR glasses were both announced by their respective companies without a shipping product, timeline or imagery to go with them.
  • However, Geekbench 5 results surfaced last week for an HTC Vive Focus model with the XR2, and its likely configuration matched the expected specs.
  • Completed XR2-powered headsets will vary by company, even though Qualcomm provides the reference platform and chipset.
  • Horwitz, the writer of the article, believes that the general trend will favour higher-resolution VR displays.
  • Pico’s high-resolution Neo 2 is powered by a Snapdragon 845, so Horwitz expects XR2 headsets to surpass its quality. The new headsets will likely use 90Hz refresh rates, matching PC-grade display speeds and thereby reducing nausea.
  • Facebook’s Quest achieved complex visuals from the Snapdragon 835, and Qualcomm’s claim that the XR2 has twice the GPU and CPU power of the 835 suggests that XR2-powered titles will comfortably surpass those earlier visuals.
  • Since “current-generation” and “entry-level” are moving targets, mobile-class XR2 headsets will not entirely eliminate demand for high-spec hardware; however, the visual gap between tethered and untethered headsets will matter less.
  • Snapdragon XR2’s Artificial Intelligence processing capabilities could also be a key factor in enhancing MR headset performance.
  • Raw processing quantity does matter for AI performance, but so do quality, system-level engineering, and software that makes full use of those capabilities.
  • AI can further impact MR headset performance by generating solutions to partly novel problems, empowering computer opponents, enabling richer voice controls, and segmenting live visuals to blend with digital content.

The article concludes by acknowledging that nothing is currently certain regarding the release of XR headsets due to COVID-19; however, it appears that Snapdragon XR2 headsets will be in stores relatively soon.




AREA member RealWear’s Firmware Release 11.2

Release Highlights:

  • Cloud Sync
    • A new application that enables customers to easily authenticate to cloud storage drives including Microsoft OneDrive, upload tagged photos / videos captured in My Camera and browse cloud drives in My Files
  • Ease of Use
    • Tetrominos, a fun Tetris-like game which helps users get familiar with RealWear’s user interface
    • Wi-Fi Band Control which allows end users or IT Admins to lock their RealWear devices to either the 2.4 or 5 GHz band
    • My Controls Grid View to easily navigate the growing functionality in My Controls
  • Security
    • Android Security Patches from March – July 2020 integrated into the RealWear device firmware
    • Updated Lock Screen which leverages a secure keyboard instead of head tracking
  • Equity and Inclusion
    • Changes in software and documentation terminology in support of equity and inclusion
  • Full Language Support for Traditional Chinese
  • Bug Fixes – As with any release, bug fixes and minor enhancements are incorporated.

See RealWear’s AREA member profile here

Visit RealWear’s website here




Porsche Triples Down on AR

Mike Boland of AR Insider claims that there are key lessons to be learned from Porsche’s investment, as it indicates that AR is working. Porsche also got past ‘pilot purgatory’, which is when enterprises integrate AR but fail to gain mass adoption, often a result of poor communication with front-line workers who are uncertain how to use the technology. Guided by Atheer’s advice to “think like a marketer”, Porsche has now avoided pilot purgatory.

Another way in which Atheer has guided Porsche is by deploying AR in the most impactful areas. For example, Atheer has reported that AR is more useful for guidance than for training, despite VR’s ability to increase knowledge retention in training. AR therefore has a greater impact in non-repetitive jobs.

Amar Dhaliwal, CEO of Atheer, says that they start by assessing what it is that Porsche is trying to do; if this is training-related, then Atheer will advise against deploying AR, as it will not have a suitable ROI.




Ultraleap and Qualcomm announce a multi-year agreement

The leading standalone VR headset, Oculus Quest, has been increasingly focusing on controllerless hand-tracking as a means of input for the device. Other major headset makers, like Microsoft and its HoloLens 2, have also honed in on hand-tracking as a key input method. As industry leaders coalesce around hand-tracking, it becomes increasingly important for competing devices to offer similar functionality.

But hand-tracking isn’t a ‘solved’ problem, making it a challenge for organizations that don’t have the resources of Facebook and Microsoft to work out their own hand-tracking solution.

Ultraleap’s fifth generation hand tracking platform, known as Gemini, will be pre-integrated and optimised on the standalone, untethered Snapdragon XR2 5G reference design, signalling a significant step change for the XR space. The Gemini platform delivers the fastest, most accurate and most robust hand tracking and will provide the most open and accessible platform for developers.

The Snapdragon XR2 5G Platform is the world’s first 5G-supported platform designed specifically for untethered VR, MR and AR (collectively, extended reality or XR). Gemini has been optimised for the Snapdragon XR2 5G platform to allow for an ‘always on’ experience and the most natural interaction in untethered XR.

Steve Cliffe, CEO of Ultraleap, said: “Qualcomm Technologies recognises the importance of high-precision hand tracking in order to revolutionise interaction in XR. The compatibility of our technology with the Snapdragon XR2 5G Platform will make the process of designing hand tracking within a very wide variety of products as simple as pick and place. Qualcomm Technologies is in the position to bring transformation to XR by making state-of-the-art technologies – including 5G and spatial computing – available to a broad market. We are proud to be at the forefront of this fast-growing ecosystem alongside them.”

Hiren Bhinde, Director of Product Management, Qualcomm Technologies, Inc., said: “Hand tracking is becoming a table stakes feature in next-gen XR devices. True immersive XR experiences require seamless, natural and intuitive usage and interaction of the users’ hand when interacting in the digital world as they do in the physical world. Ultraleap’s hand tracking technology enables this seamless interaction through a natural connection between people and technology, which is incredibly important for the next generation of XR devices. We are excited to work with Ultraleap to help deliver more immersive experiences on the Snapdragon XR2 5G reference design.”

Read the original Ultraleap news press release here 

 




A new 3D approach to remote design engineering

Trying to untangle complex problems remotely from thousands of miles away is fraught with difficulties – even when using products like Microsoft’s Remote Assist. The expert often has to resort to waving their hands around on a screen to communicate to the technician which part of a machine they should be fixing – and which parts should be left alone.

Real-time immersive 3D collaboration is now adding a new dimension to such problem solving – users can share live, complex 3D files such as CAD data, interact with them and reveal ‘hidden’ parts deep within a machine that may be causing an issue. The technology also transforms day-to-day collaboration between remote engineering team members. Design reviews, for example, can be brought to life, with participants ‘walking through’ a model, no matter where they are in the world.

 

The fundamental problem at the root of many of these issues until now has been that enterprise teams have lacked the ability to effectively collaborate in real time using live, complex 3D data. The solution lies in purpose-built framework technology for natively integrating real-time collaboration and immersive device support directly into legacy enterprise software packages.

The key to enabling true real-time collaboration is to start where the data ‘sits’ and ensure that this original data ‘truth’ is the same for everybody when working together, no matter where they are located or what device they wish to use. This way, everyone in the team has the correct and most up-to-date information available.

Whether it is a CAD package, PLM software, an MRI scanner, robotic simulation software or a laser scanning system, many industries are becoming increasingly dependent on spatial data types and digital twins. These complex data formats are usually incompatible or just too cumbersome to use ‘as is’ in existing collaboration platforms such as Webex, Skype, Google Docs or Slack – all built primarily for 2D data formats such as video, text or images.

Moreover, the legacy software generating the data itself is unlikely to have any in-built real-time collaboration functionality – forcing enterprise users to resort to one of two methods. One option is to manually export the data, carry out a painful and time-consuming reformatting process, then manually import the newly crunched data into some type of third-party standalone collaboration package. The alternative is to ignore the spatial nature of the data entirely and instead screen-grab or render out 2D ‘flat’ images of the original 3D data for use in a basic PowerPoint presentation or something similar.

Neither of these methods allows teams to efficiently collaborate using a live data truth – i.e. the original data itself instead of a reformatted, already out-of-date interpretation of it. So, both methods only compound the root collaboration problem instead of helping to solve it.

The latest generation of real-time immersive 3D collaboration technology is integrated directly into the host software, grabbing the original data at source before efficiently pushing it into a real-time environment which users can access using their choice of device (VR, AR, desktop, browser or mobile) for instant and intuitive collaboration. End-to-end encryption ensures that even the most sensitive data may be confidently shared across remote locations.

The integration into the host package provides not only a live connection to the data but also a bi-directional connection, meaning that users are still connected to the host software package running in the background. The advantage of this over standalone applications is that it still gives access to core features of the host package – enabling accurate measurement of a CAD model using vertex or spline snapping to the original B-Rep held in the CAD package, for example. All the underlying metadata from the host package is also available to the immersive experience – and annotations, snapshots, action lists or spatial co-ordinate changes can be saved back into the host package.
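The bi-directional pattern described above – users working in an immersive session while changes flow back into the host package – can be sketched as follows. This is a minimal illustration of the architecture only; all class and method names are hypothetical, since the real integration APIs vary by host CAD/PLM package.

```python
# Minimal sketch of a bi-directional bridge between a host CAD package and a
# real-time collaboration session. All names here are hypothetical illustrations.

class HostPackage:
    """Stands in for the CAD/PLM application holding the original data 'truth'."""
    def __init__(self):
        self.annotations = []

    def save_annotation(self, note):
        # In a real integration this would write back into the host's own model,
        # e.g. as part of a change request.
        self.annotations.append(note)


class CollaborationSession:
    """Stands in for the shared real-time environment that users join."""
    def __init__(self, host):
        self.host = host  # live connection back to the host package

    def annotate(self, part_id, text):
        note = {"part": part_id, "text": text}
        # Bi-directional: the annotation made in the immersive session is
        # immediately persisted back into the host package.
        self.host.save_annotation(note)
        return note


host = HostPackage()
session = CollaborationSession(host)
session.annotate("fan-assembly-042", "Increase clearance by 2 mm")
print(host.annotations)
```

The key design point is that the session holds a live reference to the host rather than an exported copy, so everyone works against the same up-to-date data.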

The new post-pandemic requirement to have a distributed workforce – in conjunction with the rollout and adoption of key technology enablers such as server-side rendering and high-capacity, low-latency connectivity – is set to accelerate the adoption and integration of real-time immersive collaboration solutions. In the future, 5G technology will also open up the potential to stream to immersive AR and VR devices – untethering the experience and facilitating factory-wide adoption of immersive solutions. For example, as ‘Industrial Internet of Things’ (IIoT) data streams from smart devices in the factory, it will be overlaid via AR glasses in the physical space. And as cloud service providers build out features such as spatial anchoring to support ever-larger physical spaces, these new services will be used within collaborative environments rich with real-time data.

Factory workers, for example, will have the ability to ‘dial an expert’ directly from a virtual panel on a smart factory device. This offsite expert will appear as a holographic colleague and bring with them live 3D data for that individual machine. Both users will have real-time IIoT data overlaid intuitively on the fully interactive 3D model to facilitate a more effective diagnosis and maintenance process.

Empowering shop-floor workers with hands-free AR and detailed 3D data will dramatically improve assembly line efficiency, with an intuitive environment where product data is fully interactive. Users will be able to move, hide, isolate and cross-section through parts, while using mark-up and voice tools to create efficient instructions for the assembly or disassembly of complex products. These instructions will be recorded and delivered as holographic guides via AR directly on the assembly line.

The next generation of real-time immersive 3D collaboration technology is even set to enable you to have a scaled-down hologram of your latest engine design sitting right in front of you on your desk. As you work on the design and refine it using your CAD software, the changes will be dynamically loaded into the hologram so that you can see the effects immediately and make any further necessary adjustments.

Meanwhile, digital sleeving – with 3D images overlaid on physical designs – will enable you to check how two parts of the engine come together, even when they are being designed by different teams in different locations. Similarly, you will be able to see how, for example, cabling will fit inside your latest aircraft seat design or where best to put the maintenance pockets for easy access.

This kind of approach adds a new dimension to the handoff between design and manufacturing. If adjustments need to be made to a fan assembly design, for example, the relevant part can be isolated within an immersive design review – and speech-to-text notes can be added to the part number and automatically become part of the change request. It’s all a far cry from endless design iterations, spreadsheets and printouts – or CAD screen shares using a 2D representation of a 3D problem.

In the post-pandemic remote world, conferencing is bringing people, video and documents together. Collaboration is now adding the fourth dimension of 3D immersive experience to complete the picture.

 




Smart Glasses In Surgery: Expert Analysis Outside The Operating Room

Surgical teams around the world consist of doctors with diverse levels of training, experience and expertise. Sometimes, members of those teams need to consult with a specialist about a surgery they’re performing while the patient is on the operating table, to decide the best steps to take in their care.

Historically, an on-call consultant at a hospital where a surgery is being performed would have to don the necessary personal protective equipment (PPE), head into the theatre and give their verdict. Now, thanks to smart glasses technology, there is a much more efficient route forward.

Iristick, a company that makes smart glasses for industrial purposes, has partnered with Rods&Cones, which focuses on remote assistance in the operating theatre, to create a specialist solution. The two organisations have developed a specially designed pair of smart specs customised for use during surgeries to enhance communication and interaction within an operating theatre.

The smart glasses enable a surgeon to share what they are seeing with a remote specialist. Through the glasses’ microphone, and its two cameras with optical zoom lenses, a consultant outside of the operating room can have an unrestricted, close-up view of a surgery as it progresses. Watching the operation unfold, they have the ability to speak to the surgeon and provide real-time feedback and advice.

As the smart glasses are technically classed as a telecommunications device, rather than a medical one, they haven’t had to seek European CE approval to start being used in hospitals. Currently, they’re being used in the Netherlands, Belgium, Spain and Italy, with plans for further international expansion.

JUST A QR CODE AWAY

“We keep the surgeon in full control over the communication, while all the handling of the cameras is done by the remote expert,” says Rods&Cones founding partner and CEO Bruno Dheedene.

Let’s say a surgeon is implanting a patient with a device that hasn’t been on the market for long, and which as a result they aren’t overly familiar with. The smart glasses feature a QR code scanner that enables a surgeon to dial in an on-call expert, perhaps even somebody from the team that developed the new device, simply by looking at the code.

“YOU JUST HAVE TO ASK A CIRCULATORY NURSE FOR THE QR CODE OF THE PERSON YOU WANT TO CALL.”

“You wash your hands, you start the surgery, and half an hour later you want to get some expert advice from a colleague,” says Dheedene. “You just have to ask a circulatory nurse for the QR code of the person you want to call.”

The remote expert will then be able to see everything the surgeon can see through the cameras of the glasses. They have full control of the footage streamed to them and can enhance it by zooming in, taking pictures, and even adjusting the exposure and contrast of the images.
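The exposure and contrast adjustments described above can be illustrated with a simple sketch. This is an illustration of the general technique on 0–255 grayscale pixel values, not the actual Rods&Cones processing pipeline.

```python
# Sketch of the exposure and contrast adjustments a remote viewer might apply
# to a streamed frame. Pixels are 0-255 grayscale values; illustration only.

def adjust(frame, exposure=0, contrast=1.0):
    """Apply an additive exposure offset, then scale contrast around mid-grey (128)."""
    out = []
    for px in frame:
        v = px + exposure                 # exposure: additive brightness shift
        v = 128 + (v - 128) * contrast    # contrast: scale distance from mid-grey
        out.append(max(0, min(255, round(v))))  # clamp to the valid 0-255 range
    return out

frame = [10, 100, 128, 200, 250]
print(adjust(frame, exposure=20, contrast=1.5))  # → [0, 116, 158, 255, 255]
```

Dark pixels are pushed toward black and bright pixels toward white, while mid-grey values stay near the middle – the basic behaviour of a contrast control.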

Rods&Cones have also made specific enhancements to the glasses so they can handle X-ray video feeds, the high-contrast screens of in-theatre devices and red balance issues.

IMPROVED ACCESS TO EXPERTS, PPE SAVINGS AND A BETTER VIEW

There are a number of advantages to allowing operating surgeons to consult remotely with experts outside of the hospital they’re working in. They have access to a much wider field of specialists than they would otherwise have, and could even speak to multiple people about the same issue if it proves to be particularly complex.

Additionally, the hospital saves on PPE. No one has to gown up to look over a surgery for a few minutes when they can dial in from outside the room. In the age of Covid-19, when PPE supplies are running low, this is particularly significant. The smart glasses can help to enforce social distancing too, by keeping the number of people inside each operating room that’s currently up and running to a minimum.

The glasses also provide an arguably better view of the surgical field than could be gained from actually standing in the room. Remote assistants now have what are effectively the best seats in the house.

“Surgery is mostly happening in a very small cavity. If you go into surgery and stand next to the doctor, you won’t be able to see everything he’s doing, because he’s working in between his hands,” says Dheedene.

THE FUTURE OF SMART GLASSES IN SURGERY
Rods&Cones chose to partner with Iristick for the development of the device due to the quality of the glasses the company was already manufacturing.

Alongside the video quality of the intuitively positioned cameras, the glasses are incredibly light at only 70g, meaning they’re unlikely to prove bothersome to wear for long stretches of time. Instead of having hardware weighing the device down, Iristick’s glasses are fibre-optic, and all streaming and processing is carried out via a module worn in the surgeon’s pocket.

That said, the Rods&Cones software can integrate with other smart glasses too.

“It’s not a mutually exclusive partnership, so in the future we might go in with other partners,” says Dheedene. “We want to adapt existing technology, as far as possible, to the use-case of surgery. We have made our software such that we can integrate with any glass. You just need to put a module in-between, to connect the parameters of our platform and the glass platform.”

Introducing video conferencing to an operating room in such a sophisticated fashion could well be a gamechanger. When it’s possible for operations to be carried out from miles away by utilising a 5G mobile network connection, using a pair of smart glasses to dial-in a consultant when needed seems only logical. With the world being as interconnected as it is, having on-demand access to specialist feedback and advice during an operation is more than just a futuristic luxury – it may, instead, become a daily essential.




AfterNow – New Remote Software Platform Avoids “Zoom Fatigue” by using AR/VR

This technology, combined with Oculus or Microsoft headsets, empowers executives, sales teams, trainers, and educators to present immersive content to their audience, resulting in increased engagement, satisfaction, and retention while avoiding burnout and fatigue.

As the Coronavirus COVID-19 pandemic sweeps across the globe, its ramifications can be felt in every facet of daily life – perhaps most visibly in the workplace and in education. With many employees and students working and learning from home, they rely heavily on remote software. According to Time Is Ltd., a productivity software company, online meetings doubled from February to April 2020. As the debate rages over the safety of in-person schooling, the quest to make remote learning efficient and efficacious is paramount. The drawbacks of more popular platforms have resulted in experiences such as ‘Zoom Burnout’. Additionally, many people feel the typical video set-up doesn’t fit their particular working style. Current research shows that Virtual Reality training actually results in increased engagement, retention and satisfaction. AfterNow Prez Remote makes use of Virtual and Augmented Reality, proving more effective than Zoom or PowerPoint for communicating, learning, and exchanging ideas.

“Over the past few years, we’ve been building high-end custom immersive presentations for the largest companies in the world and we’ve seen how effective the technology is – we’ve seen positive outcomes when our customers have used AfterNow Prez for sales and internal meetings,” said Philippe Lewicki, Co-founder of AfterNow. “This private beta of our new remote platform is the first step to bringing this technology to everyone – small and medium companies, schools and universities.”

AfterNow is an augmented reality company specializing in custom visualizations and presentations. As a Microsoft Mixed Reality partner, AfterNow has created and designed AR presentations for Fortune 500 companies.

Though there are existing VR/AR collaboration platforms like Spatial, Glue, MeetinVR, EngagedVR, and Virbela, to name a few, AfterNow Prez Remote is designed specifically for presenting, which makes it more effective for sales, training, and education. The presenter prepares content specifically for the meeting and goes through that content with the team or students. “Almost all of us present ideas to one another. Presenting has become central to how we educate our children and work with our colleagues,” says Dave Birnbaum, HCI expert and AfterNow client. “As we spend an increasing amount of our time in immersive spaces, we will need well-designed software that allows us to make presentations that are richly sensory, nuanced, and convincing. Software that gets this right will be a key enabler of the XR revolution.” AfterNow Prez Remote requires an internet connection and supports one presenter with up to 500 participants. Supported platforms are HoloLens 1 & 2 and Oculus Quest.

The easy-to-use platform allows a presenter to upload images, audio, videos and 3D objects, as well as enter text, and organize them into slides with transitions. “In 5 to 10 years everyone will be using this technology. Today it’s a great solution for teachers to bring their classes into a student’s home with engaging 3D animated content and live interaction, or for a sales team that can go and see their prospects or present a virtual tradeshow,” adds Lewicki. Large tech companies such as Facebook, Microsoft and Magic Leap have invested billions in Augmented and Virtual Reality. The Coronavirus pandemic can be seen as the type of black swan event that illustrates how dire circumstances often lead to innovation.

Humans are three-dimensional creatures, and with AfterNow Prez Remote – moving away from 2D flat screens and into immersive technology to better learn, communicate, convince and understand – the future just got real.

For more information, please visit www.afternow.io or email [email protected] or call 424 258 0776.

Read AfterNow’s AREA member profile 

About AfterNow

Since 2015, AfterNow has specialized in envisioning, designing, and building mixed reality applications for Microsoft’s HoloLens (1 and 2), iOS, Magic Leap and Quest. Notable clients include Anthem, T-Mobile, Sprint PCS, Hershey, Marvel Disney, WB, Hyperloop (HTT), Boeing, Becton Dickinson and Qualcomm. An Auggie Award winner and Microsoft Mixed Reality partner since 2016, AfterNow is building the future of spatial computing.




Essential Steps For Any Business To Prepare For Augmented Reality

How can a business make itself ready to successfully apply AR? What will make implementation easier and more effective and ensure that the initial efforts provide a solid foundation for future transformation?

Knowing Where You Are And Where You Want To Go

There are two things you need to do at the very beginning: Identify a business goal, and assess what you are currently doing to achieve that goal.

A business goal can be retaining expertise by transferring skills from older or retiring workers to newer or unskilled workers. It can be providing product demos to prospects for products in a portfolio. It can be ensuring that engineers collaborate successfully on meeting permitting and safety requirements for new assembly lines across global locations.

Most importantly, what are the processes and procedures? Where are the bottlenecks or particular difficulties?

When considering where best to apply AR, further assessment is necessary — technology readiness. An impressive AR demo can be created for almost any business situation, but it’s important to choose a use case that can scale. AR can be used to improve routine, repetitive activities, but it won’t show its true value there, and investment in it will show less return.

AR really shines at helping with complex, varied and changing circumstances. The wider the range of product types, manufacturing procedures or workforce capabilities, the more clearly AR will show its value and the wider the organizational uptake will be.

Delivering AR To The User

AR content can be delivered to the end user in a variety of ways, and careful consideration of that user’s needs and the constraints of their work environment is necessary for a successful demonstration.

For example, a sales rep may need to present a broad portfolio of thousands of product configurations to prospects and customers. Currently, that may involve shipping samples to trade shows, providing spec sheets, and linking to diagrams and videos on webpages.

With AR, a customer can see all the details of a specific product, get a good idea of how it works and understand how it differs from the competition. Implementing AR on a phone or tablet can allow that sales rep to easily build a relationship with that customer, demonstrate a product of interest, communicate its details and use, and answer any technical questions while maintaining the touch essential to the sales process.

However, if the goal is to improve worker productivity on the production line, where various tools need to be picked up and used, AR content can be best delivered through a hands-free wearable, whether binocular eyewear such as Microsoft HoloLens or Magic Leap or monocular eyewear such as Google Glass Enterprise or the RealWear HMT-1. That information is overlaid on what the worker is seeing, whether it is instructions, fill levels or safety precautions, without interfering with the worker’s tasks.

It’s worth spending some time to really consider the various possible ways your AR could be used now and in the future so the chosen technology presents the information in an optimal way for the user.

Ensuring Access To The Necessary Content

An audit of the information necessary to build an AR experience that communicates effectively to the user can turn up gaps. This is fairly common because the range of information AR can communicate is much wider than is possible with existing channels. Ensuring the availability of this information as early in the process as possible can make for effective implementation.

If you want to provide procedural guidance to line workers, you must have — or be able to create or capture — digitized work instructions. If you want to provide 3D instructions on how to maintain and service a newly acquired machine, you must have the 3D CAD data. If you want workers to see diagnostic information about a machine’s performance such as vibration, temperatures and fill levels, that machine must have the necessary sensors and connectivity.

Providing this information will require acquiring, storing, managing, distributing and analyzing new types of data, and repurposing data you already have.

While not ideal, the lack of some information is not fatal. For example, if there is no 3D CAD data for your machine, using a head-mounted device to record an expert performing all the required maintenance procedures can fill the gap. However, identifying those gaps and planning methods for filling those gaps is essential.

Presenting That Content In A Useable Way

Technologies such as web and mobile apps, which were new not so long ago, are now established, and the methods for creating them and making them usable are defined. AR is much earlier in the process of becoming routine, so the specifics of AR usability still require attention.

Even an AR project that addresses a business goal, understands user needs and is supplied with the right content can fail if the user experience is inadequate. There are many ways to go wrong, from excessive or poorly organized information to inadequate visual contrast.

The need for usability is great, and tools to assist in AR content authoring are developing quickly. They’re already providing significant assistance to content developers, but understanding the capabilities and needs of the worker and rigorously establishing what information is most important in what context is key in this step.

You Are Ready For AR

Almost every business can improve efficiency, reduce costs, more quickly skill workers or ensure compliance through the information AR communicates. Choosing the right place to try AR first takes some thought and planning, which will enable an effective AR implementation that will provide a foundation for future growth.

 




AR, AI and IIoT empower Front Line teams

The technologies underpinning the Industrial Internet of Things (IIoT) are key to the success of Industry 4.0. I recently had the honor of hosting a panel during the IIoT World Days virtual conference looking at the role and power of analytics in IIoT.

Each of the five guests on the panel had insights into what they called the “Manufacturing Analytics Journey” – taking a detailed look at how analytics impacts profitability, powers prediction, informs intelligent optimization and leverages big data.

The insights they offered about the importance of data and analytics got me thinking about the important role that AR, AI and mobile devices can play in actually making use of that data on the front line.

As it happens, integration with IIoT infrastructures is something that our team has spent a great deal of time working on over the last several years. Since the first release of our “Transforming the Enterprise” white paper back in late 2018, we have been clear about the relationship between IIoT, AR and AI.

In the latest release of that white paper, we spelled out exactly how we saw the connections between AR, AI, IIoT and machine learning. We start with the context of the frontline team member in an industrial setting who is servicing a piece of equipment.

This context could leverage data about:

  • the work identity profile of the frontline team member
  • the skill set data of the frontline team member
  • historical data covering the work instructions they may have previously worked with in relation to a particular piece of equipment they are servicing
  • the remote experts or colleagues they typically work with
  • and what level of certification and training they may have in undertaking the job they’re about to do.

Once we have that foundational context, we can combine it with information about location, time and date (all drawn from the mobile device itself) – and then start using relevant industrial IoT data to provide:

  • very specific assistance that is relevant to the task at hand,
  • insights into how the equipment that the frontline team member is working on may relate to other useful IoT data from similar equipment
  • live diagnostic data from the equipment itself.
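The pattern described above – foundational worker context combined with live equipment data to select relevant assistance – can be sketched as follows. All field names, thresholds and rules here are hypothetical illustrations, not part of any actual Front Line OS implementation.

```python
# Sketch of combining a frontline worker's context with live IIoT telemetry
# to select task-relevant assistance. All names and rules are hypothetical.

from dataclasses import dataclass, field


@dataclass
class WorkerContext:
    """Foundational context: role, skills and certifications of the worker."""
    role: str
    skills: list = field(default_factory=list)
    certifications: list = field(default_factory=list)


@dataclass
class EquipmentTelemetry:
    """Live diagnostic data drawn from the equipment itself."""
    equipment_id: str
    vibration_mm_s: float
    temperature_c: float


def assistance_for(worker: WorkerContext, telemetry: EquipmentTelemetry) -> list:
    """Return task-specific guidance based on worker profile and live data."""
    hints = []
    if telemetry.vibration_mm_s > 7.0:  # hypothetical alarm threshold
        hints.append(f"High vibration on {telemetry.equipment_id}: check bearings")
    if "pump-maintenance" not in worker.certifications:
        hints.append("Certification missing: route to remote expert")
    return hints


worker = WorkerContext(role="technician", skills=["hydraulics"])
telemetry = EquipmentTelemetry("pump-17", vibration_mm_s=9.2, temperature_c=61.0)
print(assistance_for(worker, telemetry))
```

The same lookup could equally be driven by a QR code scan that resolves to an `equipment_id`, pulling in service records alongside the live telemetry.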

We believe that front line teams need to be able to use their mobile devices (including smart glasses, tablets and smartphones) to get information from machines, sensors, and the IIoT infrastructure and see the data flow into their field of vision.

The IoT data can come from the frontline team member’s immediate work environment – with QR code or object-recognition scans used to pull information about when a piece of equipment was last serviced and to provide immediate access to all relevant service records, work instructions and performance data for the equipment itself.

And the utility of having these technologies linked doesn’t stop there. Context is also a vital component of helping systems become more intelligent (through ML and AI technologies) and predictive.

Leveraging both edge computing and AR technologies, enhanced by machine learning and artificial intelligence, creates a platform that can anticipate what members of the extended enterprise will need to do next – sometimes before they know it themselves.

It builds on the idea that an organization has the capability, with the simple introduction of something like our Front Line OS (powered by AR and AI), to hold up a mirror to itself – and its supply chain – to gain true predictive insight in both the specific and broad collaborations of the extended enterprise.

 




First knee replacement surgery successfully completed with Augmented Reality Technology

Pixee anticipates the number of total knee replacement cases using the Knee+ technology will increase quickly as they already have a sizable list of surgeons interested in trying this innovation using the Vuzix M400 Smart Glasses. The combination is compact, easy to use, wireless and does not require disposables.

Pixee Medical expects to sign their first distribution agreements with implant manufacturers over the next few weeks, allowing their solution built around the Vuzix M400 Smart Glasses to be promoted by them worldwide. Pixee Medical is pursuing and expecting FDA 510(k) clearance for Knee+ before the end of 2020.

“The team at Pixee Medical created an innovative path to bring the Vuzix M400 Smart Glasses into the operating room to perform knee replacement surgeries and we look forward to supporting the worldwide distribution of their innovative AR solution,” said Paul Travers, President and Chief Executive Officer at Vuzix.

Read more in the full press release