AREA Research Project on Best Practices of Merging IoT + AI + AR

The AREA is issuing a request for proposals for a funded research project. The project will produce tools to increase dialog between enterprise stakeholders and suppliers of IoT, AI, and AR technologies, along with a report that describes the state of the art and proposes potential courses of action to address the challenges facing enterprises that seek to combine IoT, AI, and AR in the workplace.




ThirdEye Gen: The company behind “the world’s smallest Mixed Reality smart glasses”

AREA: How and when was ThirdEye Gen founded?

CHERUKURI: The company’s been around for about three years. The first two years were spent researching and designing. Then we came up with our first product, our X1 smart glasses. We launched the X1 at CES in January of 2018 and we just unveiled our new X2 product, the world’s smallest Mixed Reality smart glasses with built-in SLAM, at CES 2019.

AREA: How does the X2 compare to other smart glasses?

CHERUKURI: The main differentiator for the X2 is its field of view, which, at 42 degrees, is very wide. We designed and built the product mainly for the enterprise market, so it also features a very high brightness level and includes massive battery packs so you can wear the glasses six to eight hours at a time. And it’s based on Android, so it’s very easy to customize and create applications. The built-in SLAM (Simultaneous Localization and Mapping) we developed in-house allows for advanced tracking applications, such as 3D machine instructions.

AREA: What made you decide to get into the smart glasses business?

CHERUKURI: We had a related background in the military. In addition, our engineers had been working with this technology for the past 20 or 30 years. With the growth of AR, we saw an opportunity to expand our business into the enterprise market.

AREA: What applications or industries have you had success in?

CHERUKURI: The most popular application for the X2 so far has been the ThirdEye App Suite. We offer our own remote help software, but we also partner with third-party software companies that have their own platforms. A lot of these companies buy our glasses, load them with their software, and resell them with their value added. That’s the most common use case, but we have others. For example, we work with many companies in the healthcare space who are VARs and use the glasses for applications serving the visually impaired, surgical use cases, and other purposes.

AREA: What kinds of benefits are ThirdEye customers getting from your products?

CHERUKURI: Most of our customers are still in the pilot phase, but even at that early stage, they’re seeing that the glasses pay for themselves by saving the cost of sending an expert to the customer site just once to fix a problem. The ROI is huge. The main challenge to mass deployment is integrating with legacy installed systems.

AREA: Do you consider that to be the greatest barrier to AR adoption?

CHERUKURI: Yes. Whenever a new technology emerges, the integration required to bring that solution into the day-to-day life of a large company takes time, which slows the move from a pilot program to wide-scale use.




Fieldbit on AR-enabled Industrial Field Service Applications

AREA: How and when did Fieldbit originate?

RAPOPORT: My partners and I came from the world of industrial automation technologies. In 2013, Google Glass came out and we started to think about how this gadget could be used in an industrial setting. We had a feeling that this technology would have a significant impact on the way we work, especially in field services for the industrial world. Initially, our idea was to create procedures on smart glasses showing users how to repair complex equipment. After interviewing many prospective customers, we realized that converting existing procedures to smart glasses and an AR format would require considerable resources from the organizations themselves. What customers really needed was a way to ensure that on-site fixes are completed by technicians on the first visit in order to avoid sending experts to resolve the problem. This could save everyone – vendors and customers – a lot of money. So we decided to develop a solution using Augmented Reality and smart glasses that would help the technician solve a complex technical problem by receiving augmented instructions from an expert. This was our first product – Fieldbit Hero – an end-to-end visual collaboration platform for field services.

AREA: In just five years, Fieldbit has developed a pretty comprehensive, integrated set of solution offerings.

RAPOPORT: After Fieldbit Hero rolled out to the market, we realized that remote assistance alone was not enough. Customers wanted to digitize the entire process of on-site service and repair. So we added Fieldbit Knowledge to capture on-the-job processes and share best working practices across the organization. Later, we also added Fieldbit Cosmic, which is used to augment real-time data from equipment and control systems or information from back office applications using Fieldbit’s SDK and APIs.

AREA: What distinguishes Fieldbit in the AR marketplace?

RAPOPORT: There are several key differentiators. First, we are focused solely on industrial field service applications. Second, we enable our users to build Augmented Reality online. This real-time capability is extremely important for field service. Our customers, like machine builders or technical service providers, don’t have hours, or even minutes, to prepare AR instructions. Their problems need to be fixed fast. This is our core strength – enabling users to immediately create AR and send it to a remote service technician or equipment operator. Third, Fieldbit allows customers to record the entire service process including all interactions and exchange of information, so they can analyze how the relevant service procedures were conducted and reuse this information. This enables Fieldbit users to capture the practical expertise of experienced technicians and share it across the organization. Capturing on-the-job knowledge is a tremendous asset for organizations looking for a way to preserve the knowledge of their aging field service workforce. Without such a solution, experienced technicians and experts would retire and their decades of accumulated knowledge would simply vanish.

AREA: In what industries have you seen the greatest traction to date?

RAPOPORT: We’re focused primarily on industries where sophisticated, complex equipment is installed, such as oil and gas exploration and production, industrial machine manufacturing and complex telecom installations. We’ve seen success, for example, in German-speaking countries with machine builders. In Germany alone there are around 6,000 industrial machine builders. We’ve also been expanding our presence in the US market, primarily in the energy sector.

AREA: Can you give us a few brief examples of Fieldbit customers and the benefits they have gained from your technology?

RAPOPORT: BP is a good example. BP’s Lower 48 division has a fully deployed Fieldbit solution being used by all of its field engineers. They operate in the upstream oil and gas market and have some 13,000 wells in the US. We started working with BP over two years ago. During that time, they’ve dramatically shortened the problem resolution cycle and reduced the number of trips top-notch experts make to remote sites. Using Fieldbit, BP’s experts guide field technicians to resolve the problem on the first visit. These factors have enabled BP to increase production.

We also work closely with equipment manufacturers. For example, Emerson Electric uses Fieldbit to support not only their engineers and technicians, but also their customers. When an equipment operator has a question, an Emerson subject matter expert in a support center can send the operator a link by text message or email and invite him or her to open a Fieldbit interactive session. The expert then guides the operator using Augmented Reality tools until the problem is resolved, without the need to dispatch an Emerson engineer to a remote site. With Fieldbit, many technical issues and even complex problems can be resolved remotely.

AREA: Tell us about your recent product announcement.

RAPOPORT: In December, we launched Fieldbit 5.0, a major new release. There are two key highlights in this release. First, we introduced new AR capabilities in our iOS application that combine image recognition with SLAM (Simultaneous Localization and Mapping) algorithms. This improves the AR user experience, making it more stable, vivid, and accurate. Second, we completely redesigned our mobile user interface. It’s very intuitive, so now mobile users can begin using the Fieldbit application immediately without any training.

AREA: What do you see as the greatest barriers to AR adoption?

RAPOPORT: We believe that smart glasses will eventually be the most useful smart device in field services. Using smart glasses, technical personnel can work hands-free, access information and get instructions from a domain expert. However, smart glasses are still cumbersome to wear and operate. In addition, optical limitations such as narrow field of view have not yet been resolved. That said, we are seeing significant progress in this area. I believe that when smart glasses become easier to use, with better optical functionality, we will see a leap forward in AR adoption.

AREA: What do you hope to gain by being a member of the AREA?

RAPOPORT: We look forward to participating in committees and exchanging views, knowledge and experiences with other AR providers and equipment manufacturers. We hope to learn from AR customers and other members about changing requirements. And we’re already taking advantage of AREA resources, such as reports and market studies.




ExxonMobil Becomes the AREA’s 50th Member

“Joining the AREA means we can collaborate with fellow technology leaders to further explore immersive technology,” said Adam Wariner, architecture and technology manager, ExxonMobil. “ExxonMobil has been expanding its world-class technology leadership through applications in augmented reality and other emerging technologies, and we are excited about the opportunities to revolutionize the way we work.”

In addition to ExxonMobil, new Contributing Members include Dwr Cymru Welsh Water and Vuzix. The AREA has also welcomed seven organizations to its ranks of Startup members: Apprentice, Augumenta, HART Influencers, LogistiVIEW, Sarcos Robotics, ThirdEye Gen, and Threesixty Reality.

“With each new member that joins, the AREA becomes a more diverse, resourceful, and valued organization,” said Mark Sage, AREA Executive Director. “Our activities and resources will be greatly enhanced by the expertise and insights of these forward-thinking enterprises. The AREA is focused on helping to reduce the barriers to AR adoption and helping to accelerate the enterprise AR ecosystem.”

The AREA’s membership benefits include access to high-quality, vendor-neutral content and participation in various programs to help reduce the barriers to AR adoption within organizations, a research framework to address key challenges shared by all members, discounts for fee-based events, and more. Sponsor members have a direct role in shaping the rapidly expanding AR industry and demonstrate their companies’ leadership and commitment to improving workplace performance.




Highlights of 2018 at The AREA

Looking ahead to the coming year, Mark Sage, Executive Director, said:

“As we head into 2019, we look forward to continued growth in the ecosystem. More enterprises are researching, developing pilots, and/or moving towards commercialising AR technology and gaining great ROI. Thank you to all our members, partners and associates for being a part of this exciting journey and development during 2018. I look forward to working with you to see what the new year will bring for AR in the enterprise.”

We would like to wish all our followers, readers, associates, colleagues, staff, leaders and members a very Happy New Year.




Sarcos’ Brad Kell on AR in Robotics

AREA: Sarcos has been a pioneer in robotics for 25 years now, most recently with its Guardian XO exoskeleton. How does AR fit into the picture?

KELL: While Sarcos has been building robots for a long time, the Guardian XO under development will be our first exoskeleton product with an intelligent user interface. We need to understand how people are going to interact with the XO, for example, how information is presented to the user, how that information is used, and how to make things easier for the user so they can focus on the task at hand. We see AR fulfilling many of these requirements and being an important part of the product.

AREA: What kinds of user requirements are you expecting to see in the future?

KELL: We see a number of possibilities as more exoskeletons and similar systems come into use, particularly regarding access, presentation, and use of information, hands-free operation, and awareness of and interaction with the surrounding environment – for example, things like hazard avoidance and semi-automated user guidance.

AREA: It sounds as if somebody at Sarcos was looking at what the capabilities of the exoskeleton would be, how complex the future of this product would be, and decided that traditional forms of user interface were not going to be sufficient.

KELL: We could make an exoskeleton that is functional today using a traditional interface, but we are looking towards the future along with our customers. They are looking for a good user experience and better ways to control new technology that also integrates with AR systems they are already evaluating for other applications. Given the nature of our product, hands-free operation is particularly important and is a critical driver to make sure we are meeting our customers’ needs.

AREA: Is AR being considered exclusively for the exoskeleton or will it be used in other Sarcos products?

KELL: When we talk about long-term strategy, we recognize that AR is very important. We’ve already discussed wider use of AR internally and with our customers and we’ve started to evaluate select existing technologies as a first step. We have several products that could really benefit from AR technologies. With our Guardian S product we expect to enable enhanced inspections through the use of AR.

AREA: What aspects of AR are you exploring right now?

KELL: A user in an exoskeleton is not necessarily going to have hands available to push buttons. They want to tell the system what to do, and the system should give them audio and visual feedback. They may want to visualize digital three-dimensional models and documentation. We can see the potential AR can bring to our integrated strategy. It’s a matter of us selecting an appropriate technology and seeing how it fits into our product delivery schedule.

AREA: Can you give us a sense of the next near-term milestone in the Sarcos AR strategy?

KELL: Our targeted commercial design release for the delivery of the XO is late 2019. Until then, we must work through technology selection and make sure it is compatible with what our customers want, expect, or will allow. Once we decide what to do, the selected AR technology can be integrated with our existing systems and tested to make sure that it meets specifications.

AREA: What do you hope to gain through your membership in the AREA?

KELL: We’ve already been exposed to some new technologies and met some other AREA members. Our goals are to get a better understanding of what’s out there, meet potential partners, and have an opportunity to evaluate ideas and technologies to see how they might address our customers’ needs. The other thing that’s very interesting is access to early adopters. There’s nothing better than working with people to get insight on something they have already tried.




AREA-supported ISMAR 2018 Workshop Explores Enterprise AR Requirements

Led by Michael Rygol and Christine Perey, the workshop attracted 33 attendees from a variety of commercial users of AR, as well as AR technology suppliers and academic/research institutions. With a truly international feel, we were pleased to welcome delegates from throughout Europe and North America, in addition to those from Japan, China and South Korea.

The workshop objectives included:

  • Discussing five widely-validated AR use cases to collaboratively derive a set of structured and related requirements;
  • Generating new content for the AREA’s recently launched AREA Statements of Need (ASON) management tool;
  • Providing networking opportunities.

Following the introductions, Nicole Le Minous and Muriel Deschanel, of B<>COM (a Technology Research Institute focused on digital technologies, based in France) presented insights from a recent survey of 77 participants regarding AR use cases and challenges. The presentation is now available to download.

Michael then discussed the need for structured and connected use cases and requirements in order to help move the AR ecosystem forward, with reference to the ASON taxonomy. The goals were to help vendors understand the AR requirements of their customers, and to provide companies seeking to use AR with a neutral view of needs from the industry itself.

The AREA’s ASON provides users with the ability to manage, connect, review and report numerous artifacts used to help articulate a neutral set of needs. Such artifacts include business process scenarios, use cases, requirements, personas, industries and settings, and more to help create an actionable repository of connected data.

The audience was then organized into five groups, each of which included AR users, suppliers and researchers. Each group discussed its overview use case definition and applied it to a setting of their choice, from which they created a set of requirements. These use cases were (with the group-chosen specific application in parentheses):

  1. Remote assistance with AR (onsite troubleshooting with a remote expert in an industrial setting)
  2. AR for complex assembly (factory assembly of medical devices)
  3. Inspection and quality assurance with AR (as-built vs. as-designed inspection for hotel construction)
  4. Virtual user interfaces with AR (utilities controls within restricted access buildings)
  5. AR for training (rapid training and instructions for business audio/video hardware installations)

To finish, a representative from each group presented a summary of their discussions.

The workshop was a very useful exercise and the content captured is now being entered into ASON for further use by the AREA community.




AREA Completes Safety and Human Factors Research Project

The AREA Research Committee recently distributed to members two deliverables produced as part of the organization’s third research project, Assessing Safety and Human Factors of AR in the Workplace. This groundbreaking, member-exclusive research project produced the first framework for assessing and managing safety and human factors risks when introducing AR in the workplace. In addition to a tool to support decision-making, members also received an in-depth report of findings based on primary research.

Drawing on the knowledge of its members and on detailed interviews and research conducted across the wider enterprise AR ecosystem, the AREA’s reusable framework will promote a consistent approach to assessing safety and human factors of AR solutions.

This research was undertaken by AREA member Manufacturing Technology Centre (MTC) and managed by AREA sponsor member Christine Perey of PEREY Research and Consulting.

“For the first time, AREA members have a framework that will enable them to consider important requirements from the perspectives of key project roles and at each stage of the AR project,” said Perey. “The framework and supporting report are invaluable tools, built on the experience and knowledge gained by members and the larger community through many AR projects.”

“Through a combination of desk research and interviews with experts in the enterprise AR field, we captured rich and comprehensive insights into best practices and potential issues to overcome in these previously under-researched areas,” noted Amina Naqvi of the MTC, the author of the framework and research paper.

“This is another great example of the value the AREA brings to its members and the wider enterprise AR ecosystem,” said Mark Sage, Executive Director of the AREA. “By working together and learning from our fellow members, we’ve been able to produce research results that bring real benefits, and help to reduce the barriers to adoption for AR projects.”

The AREA has prepared a free Executive Summary of the Best Practice Report and a case study for non-members, “Assessing AR for Safety and Usability in Manufacturing,” to help companies in the AR ecosystem adopt or design safer and more usable wearable AR solutions.

If you’d like access to these resources, please follow the links below to download them.




AREA Launches Research into AR Manufacturing Barriers

Hot on the heels of delivering its third research project, the AREA has launched a new project, defined and voted for by the AREA members, targeting barriers to AR adoption in manufacturing.

While many manufacturers have implemented AR trials, proofs of concept, and tests, relatively few have rolled out fully industrialized solutions throughout their organizations. The goal of the fourth AREA research project is to identify issues and possible strategies to overcome these barriers.

This is the first AREA research project that focuses on a single industry in which there are many use cases that can improve performance, productivity and safety, and reduce risks and downtime. The project will have both quantitative and qualitative components and the deliverables will include an AREA member-exclusive report and a framework for identification of common barriers and the best mitigation strategies. In addition, there will be a case study illustrating the use of the framework that will be published for the AR community.

Dr. Philipp Rauschnabel of the xReality Lab at Universität der Bundeswehr in Munich and his team will be leading this research. Enterprises interested in providing input to the project may complete this form or send an email to [email protected].




Mozilla Takes Aim at the AR Browser

Mozilla is the not-for-profit behind the popular web browser, Firefox. The company works to ensure the web stays open by building products, technologies and programs that put people in control of their online lives. In April of this year, Mozilla officially announced Firefox Reality, a browser designed specifically for VR and AR headsets. Firefox Reality currently runs in developer mode on Google’s Daydream and Samsung’s Oculus-powered Gear VR, with more devices on the way. We spoke recently with Lars Bergstrom, Director of Engineering, Mixed Reality at Mozilla Research, to learn more about Mozilla’s plans.

AREA: Tell us about how Mozilla decided to get into augmented reality.

Bergstrom: As a nonprofit, mission-driven organization, Mozilla believes in protecting the user online. We want everyone to be able to get online, use information on the Internet and have it delivered to them in a safe and private manner. We had been experimenting with VR and AR headsets for a while and saw a couple of years ago that the technology was finally getting to a point where it’s ready for wide-scale adoption in enterprises and very soon, consumers. And so, in order to get to a place where we could ensure that the web was safe, private and relevant for consumers, we knew that we needed to start investing in building browsers and web-based technologies for these VR and AR headsets so that, by the time people were ready to buy them, the web was already there.

AREA: What’s different about Firefox for AR compared to other headset-enabled browsers?

Bergstrom: The biggest thing we’re trying to do is to partner and build a really great immersive experience for each headset, customized for the hardware. Each one has its own sweet spot, field of view, and ergonomics. What gestures does it support? How long can people use it and be comfortable? We’ve tried to work with each of the headsets, either with the manufacturers directly or ourselves, to figure out how to build a browser that’s customized to work well on that hardware, is comfortable, and offers something that users want to be in all the time. So, one of the biggest things that distinguishes us is that we’re working directly with the hardware vendors.

AREA: What are some of the fine points of browser design when you’re talking about the augmented reality environment?

Bergstrom: The web as a platform didn’t do a lot with computer vision. The camera is only available to the browser through generic media APIs, but when we transition to augmented reality, everyone using these devices expects to have access to the specialized hardware APIs available on those devices. They expect to have some sort of object recognition, they expect to be able to find QR codes, and they expect reasoning about the environment, whether that’s object detection or surface detection. So one of the big things that we’ve had to do in the browser was work with the hardware vendors and other people to determine how we were going to expose this information up to web pages and do it in a safe way. How are we going to handle user privacy? Because right now, if you build a Unity app, you just get access to all of these APIs and then there’s some assumption of trust. You install the application and it doesn’t really prompt you again. However, with the web, we’re talking about potentially untrusted content; you go to some website that’s selling a poster and they ask you if you want to put the poster on the wall. We need to be very careful for the user. How are we asking them for permission to display that in the real world and how much information is actually being transmitted back to that website? Because the expectation of trust is very different.

AREA: It sounds as if you’re trying to be very responsible about safety and privacy.

Bergstrom: We have to. We are a nonprofit and we’re mission-driven so that is our measure of success – getting people to safely and privately use the Internet on these devices.

AREA: You announced Firefox Reality in April. What’s the roadmap and what milestones can we look forward to?

Bergstrom: We’re focusing on the standalone device experience right now. We think mobile phone AR is really great, as is mobile phone VR, for getting developers experimenting with the hardware. But it’s hard to see people integrating it with their daily lives until it’s in a standalone headset. So, we’ve been focusing on all of the unique requirements there. For example, how do you load a URL when you don’t have an onscreen keyboard to fall back on? Our latest milestone is the release of versions of both the VR and AR browsers in application stores for several standalone headsets.

AREA: And after that?

Bergstrom: The big thing we’re working on is going beyond stores and actually getting it installed on devices for distribution with the headsets because that’s really where we want to be. We want to make sure that all of these headsets that ship have a good browser on them for doing web-based virtual and augmented reality.

AREA: What does Mozilla hope to get out of its membership in the AREA?

Bergstrom: One of the things that I really like about the AREA is that it brings together a lot of very experienced large organizations that have been running pilots and programs with augmented reality for many years. These are organizations we normally wouldn’t have contact with, as a nonprofit organization that’s mainly consumer focused. The AREA bridges that gap to help us connect to enterprises and industry in the way that we haven’t traditionally had. I am always surprised at how much I learn every time I talk to members of the AREA.

AREA: What kind of response have you had from the marketplace about the Firefox Reality announcement?

Bergstrom: We wanted to reach out to more hardware vendors and strongly signal to the market that we’re bringing a browser to AR and VR headsets and they should reach out to us. In our view, there’s no value in being exclusive to one device. Most of the value comes when we are on every device. Then people can build a web app and they get the experience to work across all of them. We’ve gotten a lot of feedback. Immediately after we went live, a number of hardware vendors reached out to us. They’ve tried it on their devices and they’d like us to tune it for their hardware. We’ve also engaged developers to begin experimenting with it. Their feedback has been, “Okay, when are you going to have this installed on devices by default so I can ship my application?”

AREA: What’s the biggest question developers have?

Bergstrom: How is the performance relative to a native solution? In some ways, the web is not as mature a platform as Unity, in particular in terms of access to the raw graphics compute hardware. So we’re working to help developers get answers to those questions.