AREA Requirements Committee Advances Work at F2F Meeting

The future success of enterprise AR depends on vendors and enterprises having a shared understanding of the hardware, software, and use case requirements for each type of AR solution. Establishing those requirements is the work of the AREA Requirements Committee – and on August 11th, the group convened in Boston for two days of face-to-face meetings to advance their work.

Requirements are essential because they enable enterprises to evaluate what they need to implement an AR solution. At the same time, requirements provide AR hardware and software developers with the input they need to build products that fulfill enterprise needs.

Over the past three months, the Requirements Working Group has been meeting on a regular basis to develop and agree on a set of Global Enterprise AR Requirements. The face-to-face meeting in Boston was tasked with finalizing the first phase of the Global Enterprise AR Requirements.

The Working Group included the following AREA members:

 

  • Brian Kononchik – Boston Engineering
  • Jeff Coon – PTC
  • Matthew Cooney – PTC
  • Dan McPeters – Newport News Shipbuilding
  • Malcolm Spencer – Magic Leap
  • Jeremy Marvel – NIST
  • Barry Cavanaugh – MIT Lincoln Lab
  • Doyin Adewodu – Infrasi
  • Mark Sage – Executive Director of the AREA

The two-day workshop was a great success – and highly productive! Bringing together AREA members from all parts of the AR ecosystem (end users, hardware providers, software providers, standards organizations and academics) created a rich, diverse, focused and expert view of the Requirements needed to successfully deploy an enterprise AR solution.

The team focused on three key areas:

  • Hardware Requirements
  • Generic Software Requirements
  • AR Use Case Requirements (based on the defined AREA Use Cases)

The first order of business was to conduct a detailed review and update of the Hardware and Generic Software Requirements that the Working Group had previously drafted. The Working Group then turned to defining the individual Use Case Requirements. Over the two days, the team succeeded in prioritizing the Use Cases and identifying a common set of requirements.

There was also an opportunity to review the updated AREA Statement of Needs (ASoN) tool, a purpose-built online tool for capturing, storing, updating, and publishing AR Requirements. The group reviewed its functionality and reporting, and suggested improvements were captured.

At the end of the event, all the participants agreed it was a very useful and informative workshop that should be run on a regular basis. My thanks to the attendees and to the team at PTC, who provided the space and excellent facilities for the workshop.

Watch this space for more information about next steps and the upcoming launch of the AREA Global Enterprise AR Requirements.




AREA Human Factors Group Developing an AR & MR Usability Heuristic Checklist

Usability is an essential prerequisite for any successful AR application. If any aspect of the application – from the cognitive impact on the user to the comfort of the AR device – has a significant negative impact on usability, it could discourage user acceptance and limit projected productivity gains and return-on-investment.

But how can organizations pursuing an AR application evaluate a solution’s usability? To answer that question, the AREA Human Factors Committee has undertaken the development of an AR and MR Usability Heuristic Checklist. Driven by Jessyca Derby and Barbara S. Chaparro of Embry-Riddle Aeronautical University and Jon Kies of Qualcomm, the Checklist is intended to be used as a tool for practitioners to evaluate the usability and experience of an AR or MR application.

The AR & MR Usability Heuristic Checklist currently includes the following heuristics:

  • Unboxing & Set-Up
  • Help & Documentation
  • Cognitive Overload
  • Integration of Physical and Virtual Worlds
  • Consistency & Standards
  • Collaboration
  • Comfort
  • Feedback
  • User Interaction
  • Recognition Rather than Recall
  • Device Maintainability

The team is in the process of validating these heuristics across a range of devices and applications. So far, they have conducted evaluations with head-mounted display devices (such as Magic Leap and HoloLens), mobile phones, educational applications, and AR/MR games; see their recent journal article for more information.

To ensure that the AR and MR Usability Heuristic Checklist remains valuable across domains and devices, the team is conducting further validation that will consider:

  • Privacy
  • Safety
  • Inclusion, Diversity, and Accessibility
  • Technical aspects of designing for AR and MR (e.g., standards for 3D rendering)
  • Standards for sensory output (e.g., tactile feedback, spatial audio, etc.)
  • Applications that allow multiple users to collaborate in a shared space
  • A range of devices (e.g., AR and MR glasses such as Lenovo’s ThinkReality A3)

In the coming months, the team will move on to identifying and obtaining applications and/or hardware that touch on the areas outlined above. They will then conduct heuristic evaluations and usability testing with the applications and/or hardware to further refine and validate the Checklist. The final step will be to establish an Excel-based toolkit that will house the Checklist. This will enable practitioners to easily complete an evaluation and immediately obtain results.
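
To illustrate what "easily complete an evaluation and immediately obtain results" might look like in practice, here is a minimal Python sketch of how ratings recorded against the heuristics above could be aggregated. The heuristic names come from the AREA list; the rating scale, the averaging approach, and the output format are assumptions for illustration only and do not represent the actual Excel-based toolkit.

    # Illustrative sketch only: aggregates hypothetical 1-5 ratings recorded
    # under each heuristic; this is not the actual AREA toolkit.
    import math

    HEURISTICS = [
        "Unboxing & Set-Up", "Help & Documentation", "Cognitive Overload",
        "Integration of Physical and Virtual Worlds", "Consistency & Standards",
        "Collaboration", "Comfort", "Feedback", "User Interaction",
        "Recognition Rather than Recall", "Device Maintainability",
    ]

    def summarize(ratings: dict) -> dict:
        """Average the checklist item ratings recorded under each heuristic."""
        summary = {}
        for heuristic in HEURISTICS:
            scores = ratings.get(heuristic, [])
            summary[heuristic] = sum(scores) / len(scores) if scores else math.nan
        return summary

    if __name__ == "__main__":
        # Hypothetical ratings for a handful of checklist items.
        example = {"Comfort": [4, 3, 5], "Feedback": [2, 3]}
        for heuristic, score in summarize(example).items():
            label = "not rated" if math.isnan(score) else f"{score:.2f}"
            print(f"{heuristic:45s} {label}")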

Upon completion of the project, the AR and MR Usability Heuristic Checklist will become a vital resource for any organization considering the adoption of AR. If you would like to learn more or have an idea for an application that could be included in this validation process, please contact Dr. Barbara Chaparro or Jessyca Derby.




AREA Issues RFP for Research on AR-Delivered Instructions for High-Dexterity Work

To date, AREA members have funded 10 AR research projects on a wide range of timely topics critical to the adoption of enterprise AR. Now the AREA is pleased to announce a call for proposals for its 11th research project, which will evaluate the effectiveness of AR for delivery of work instructions for tasks requiring high dexterity. Building on prior research, the project will expand the current state of knowledge and shed light on AR support for tasks that involve high levels of variability and flexibility in completing a set of manual operations, such as those found in composite manufacturing.

This project will answer questions such as:

  • How does AR for high dexterity tasks differ from other instruction delivery methods?
  • How are users impacted by the delivery of instructions using AR in high dexterity tasks?
  • What are the key factors informing decision-making and driving return-on-investment in delivering work instructions for particularly dexterous, manual tasks?
  • Can AR-assisted work instructions help improve quality and productivity, or reduce waste and rework of manufactured parts?

This AREA research project will produce: a report on the efficiency and effectiveness of AR work instructions for tasks requiring high levels of dexterity; a research data set; a video summary highlighting the key findings; and an interactive members-only webinar presenting the research findings to the AREA.

The AREA Research Committee budget for this project is $15,000. Organizations interested in conducting this research for the fixed fee are invited to submit proposals. All proposals must be submitted by 12 noon Eastern Daylight Time on July 1, 2022.

Full information on the project needs, desired outcomes and required components of a winning proposal, including a submission form, can be found here.

If you have any questions concerning this project or the AREA Research Committee, please email the Research Committee.

 




AREA ED Explores Immersive Technologies on Mouser Podcast

What does the term “Immersive Technologies” encompass? And how are these technologies evolving to solve more and more business needs? Mouser Electronics’ The Tech Between Us podcast took up these questions – and more – recently when host Raymond Yin spoke with AREA Executive Director Mark Sage.

 

Mark and Raymond take a closer look at everything from remote assistance and guidance to digital twins and remote collaboration. Immerse yourself in this lively discussion.




XR at Work Podcast is Here to Talk Shop with AR Practitioners


We got together with Scott and Dane recently to learn more about the podcast and what they hope to accomplish with it.

AREA: Before we get into XR@Work, could you tell us what you do for a living?

Scott: I’m a Principal XR Product Manager for WestRock, a global consumer packaging manufacturing company. I’m responsible for all things XR-related for our 300 factories and our customer interactions.

Dane: I’m on the business transformation team for INVISTA, a polymer manufacturing company and subsidiary of Koch Industries. I lead XR and digital twin within INVISTA and I also lead the Republic of Science, a community of practice across Koch for XR technologies.

AREA: How did you two meet up?

Dane: We were both on a panel at AWE on real-life practitioners and Scott and I hit it off really well. There’s a fair number of people looking to get into the XR space that don’t have anybody else to reach out to, other than a vendor. Scott and I had conversations about how hard it is getting started and that’s what led to the podcast.

AREA: And when did the podcast start?

Scott: I think it was November of last year.

AREA: What’s the mission of XR at Work?

Scott: What Dane said is absolutely true. New folks starting off in Extended Reality in the workplace are being asked to do something that is still emerging, that can be confusing, and that has a lot of misinformation around it. So our goal is to do two things with XR at Work. Number one, we want to provide insight and guidance to XR practitioners in enterprise. And second, we want to foster and build a community of Extended Reality professionals that work in industrial environments – everything from oil and gas to manufacturing to automotive to logistics. The idea is to get us together to share ideas and best practices.

AREA: So your focus is really complementary to what the AREA focuses on. We’re both serving the enterprise, but XR at Work is more exclusively targeting industrial companies.

Scott: Yeah, I think that’s a fair assessment.

AREA: Where do interested people go to check out XR at Work?

Scott: We have two main places where people can connect with us. Number one is LinkedIn. We have an XR at Work company page where we invite folks to follow us. On that LinkedIn page, we will post when we have a new podcast up or we speak somewhere or we see new opportunities. The second place is YouTube.

AREA: For people who haven’t seen the podcast, what can viewers expect? What’s the range of topics discussed?

Dane: We’ve started with pragmatic discussions around core AR/VR applications and topics, such as remote assistance, guided workflows, and how to scale. More recently, we’ve started doing interviews with people who work in the industry. No offense to vendors, but our goal is to keep it community-focused around the practitioner side of the house. We want to hear from people who are already working with XR – what’s working for them, what’s not, where the field is heading, the whole metaverse concept. We’re also thinking about adding things like hardware reviews, although we want to be careful to keep it community-focused and not be beholden to somebody because they sent us a headset. That’s the key to us – to be authentic.

AREA: It sounds like the range of content really goes from helping people get started in XR to sharing tips and techniques for people who already have some proficiency. What are your long-term goals for the podcast?

Scott: In addition to the stuff Dane talked about, we’re looking at taking part in some larger events, doing a live broadcast from an event this year. We want to be seen as everyman’s XR thought leaders. We live and breathe in the factory and rugged environments, putting devices on the heads and in the hands of industrial workers. Our goal is to be seen as the go-to friendly voice in the wilderness for a community that’s trying to find real answers – not the answers they get from sizzle reels or market videos or salespeople.

AREA: I would presume you’re also hoping to learn from this – so that you can apply new ideas to your “day jobs.”

Dane: XR at Work does give us access to other people who are doing things. A lot of the stuff in the XR space is really hard. How do you manage headsets at 300 facilities like Scott’s doing? How do we go ahead as a business if our favored headset is being discontinued? There are a lot of challenges you run into as you’re managing this across a business. This gives us a chance to talk to other people who have maybe thought differently about it and we can learn from. We also like to understand what’s coming in the hardware space, so my hope is that we can be a partner to people building products to offer them insights to support product development.

Scott: We look forward to building a community and interacting more with the members of the AREA.




Masters of Pie Wants to Hear About Your XR Collaboration Experiences and Plans


The Masters of Pie team is especially interested in hearing from IT managers and C-level executives knowledgeable about the broad application of XR collaboration use cases across their businesses. They’re seeking input from leading companies in a broad range of industries, including manufacturing/engineering, construction, healthcare, defense, and energy. Even organizations that are just beginning to adopt immersive technologies are invited to participate.

 

To take part, please visit the survey site and submit your information by April 20. Thank you for helping further the AR ecosystem’s understanding of how XR collaboration is gaining traction.




AREA Member Apprentice.io Raises $100M for Pharma AR Platform


Apprentice’s Tempo platform brings the transformative power of technology to an industry that is still largely paper-based. It accelerates the entire drug production lifecycle by orchestrating manufacturing across global teams and sites on one shared platform.

 

Tempo also expands Apprentice’s footprint in the AR space. It enables manufacturing operators to use AR to:

  • Reduce human error as operators follow audio or text instructions enhanced with photo, video, or AR overlay directions specific to their work environment or equipment, making each workflow step clear.
  • Increase efficiency and overcome production delays by supporting cross-team collaboration and remote support through video conferencing with AR directional tools such as live drawing, arrows, and laser pointers.

 

Apprentice leverages AR headsets to empower operators and scientists in the lab and on the manufacturing floor to work with greater efficiency and speed, without having to reference cumbersome paper-based procedural manuals or record handwritten documentation. Using voice commands and intelligent data capture, operators can easily access their procedures through their headsets. They can intelligently collect, store, or reference critical data as they go, without any interruption to their workflow. With 1,500+ devices deployed, Apprentice believes it has the largest wearables deployment in enterprise manufacturing.

 

“This recent funding is a testament to the power of Augmented Reality,” says Angelo Stracquatanio, CEO of Apprentice. “AR and wearables have long held the promise to change the way we work. With pharma manufacturing, we’ve found a meaningful application of this technology that truly helps the operator execute better – for the benefit of patients everywhere.”

 

Apprentice is also expanding into Europe and Asia and continues to grow the company on the strength of its 12-fold revenue growth and sixfold increase in employees. Learn more here.




Jon Kies Explores the Potential of the AREA Human Factors Committee


AREA: What does Human Factors in Augmented Reality encompass?

Kies: Human Factors is the study of humans, from both cognitive and physical perspectives. We investigate how humans interact with devices, applications, and services, and incorporate those insights into the design of systems. In the case of AR, it’s especially important because you may be wearing a device on your head, and interacting via an interface overlaid on the real world.  This is arguably one of the most challenging design problems.

 

AREA: Do we still have a lot to learn about the Human Factors implications of AR?

Kies: That’s absolutely the case. The technology is still evolving. Many current devices can’t be used for a significant amount of time. It’s going to get there, but there are some technical hurdles that need to be resolved. That’s why it’s super-important that human characteristics become part of the requirements and are factored into the device design process.

 

AREA: How much of our past user experience knowledge is relatable to AR, and how much is starting from scratch?

Kies: We’re not entirely starting from scratch. A lot of people in the field have experience designing for 2D interfaces like smartphones. But you then have to translate that to a spatial computing paradigm where everything is not only in 3D, but also superimposed on the real world. That’s unlike a smartphone or a PC, where the interface is primarily contained in a rectangle. That’s what makes AR enormously challenging compared to working with other computing platforms. But there has been a lot of research in AR and VR in the military and universities, so there’s a lot to glean from those areas, and established human-centered design processes are still relevant.

 

AREA: What’s your top priority for the AREA Human Factors Committee this year?

Kies: Our overriding goal is to identify and develop best practices to help ensure the best possible AR user experience. In pursuit of that goal, our number-one priority is to engage more with academic research labs – to invite them to share their findings with the AREA membership. They are often experimenting with or building the latest technologies and they’re learning a great deal from their studies. Another thing we’re discussing is compiling a set of unique human-centered design practices that are pertinent to AR systems. And of course, we always want to get more AREA members involved in the Committee.

 

AREA: What’s your pitch for why AREA members should get involved in the Human Factors Committee?

Kies: My bias is toward conversation. Having meetings that act as a forum where people can talk about the challenges they’re facing, the successes they’ve had, and just connect – that’s a compelling reason to participate. By participating in Human Factors Committee meetings, end-user members have an opportunity to hear about other members’ experiences and lessons learned and apply that knowledge to their own efforts. For AR solutions providers, it’s an opportunity to get direct feedback from the AR user community.  We also hope that concrete deliverables, like guidance on design, will enable AREA members to optimize their enterprise AR solutions for their target users.

 

It’s all about making connections and enabling dialogue – between users and providers, between the AR ecosystem and academic institutions – to everyone’s benefit. We’d like to build out a vibrant AR Human Factors community where people are learning from each other, contributing ideas, highlighting new discoveries, and finding solutions.

 

If you’re an AREA member and would like more information about joining the AREA Human Factors Committee, contact Jonathan Kies or AREA Executive Director Mark Sage. If you’re not yet an AREA member but interested in AR human factors and design, please consider joining; you can find member information here.

 




AREA Safety Playbook Offers Step-by-Step Guide to Protect Workers

The Augmented Reality Best Practice Safety Playbook discusses:

  • Risk factors to consider when using AR systems in work environments
  • Risk assessment tools and methods
  • Usability considerations
  • User medical evaluation criteria
  • Cleanliness and disinfection procedures
  • Safety awareness training, and more

 

“Enterprise AR often brings new devices, new working methods, and new modes of user interaction into the workplace. With that in mind, organizations adopting AR need a thorough understanding of health and safety risks and how best to mitigate them,” said Mark Sage, Executive Director, the AREA. “The playbook helps organizations avoid safety issues before they occur and helps ensure AR solutions meet an organization’s expectations for productivity and cost savings.”

 

The AREA Safety Committee provided expert input and insight to produce the playbook.

 

Download the Augmented Reality Best Practice Safety Playbook for more information and a list of contributors. To learn more about AREA membership and the work of the AREA Safety Committee, please get in touch with AREA Executive Director Mark Sage at [email protected].

 

About AREA

The Augmented Reality for Enterprise Alliance (AREA) is the only global non-profit, member-based organization dedicated to the widespread adoption of interoperable AR-enabled enterprise systems. Whether you view it as the next computing paradigm, the key to breakthroughs in manufacturing and service efficiencies, or the door to as-yet unimagined applications, AR will have an unprecedented impact on enterprises of all kinds. AREA is a managed program of Object Management Group® (OMG®). Visit https://thearea.org for more information.

Note to editors: Object Management Group and the OMG acronym are registered trademarks of the Object Management Group. For a listing of all OMG trademarks, visit https://www.omg.org/legal/tm_list.htm. All other trademarks are the property of their respective owners.

 

Media Contact:

Karen Quatromoni

[email protected]




Microsoft Power Apps Make AR Part of Core Business Solutions

Moreover, Uitz and Pile expect enterprise users to add Augmented Reality / Mixed Reality capabilities to their core business applications themselves – without help from AR solutions developers.

 

The key is Microsoft Power Apps.

 

“Power Apps is a low-code, no-code application platform,” explained Uitz. “It enables anyone to quickly and easily build and deploy sophisticated applications by using drag-and-drop controls to pull in data from any data source.” Introduced in 2018, Power Apps got its first Mixed Reality capabilities in 2020.

 

“We added straightforward Mixed Reality capabilities that enable you to build sophisticated, device-centric applications that leverage a phone’s built-in sensors to use MR to see images and models in a space as well as measure things,” said Uitz. “We’ve seen many customers leveraging their mission-critical Power Apps business applications for huge improvements in their workflows.”

 

According to Uitz and Pile, the AR-enhanced Power Apps applications tend to focus on three areas. The first is sales team enablement. For example, salespeople are using Power Apps’ MR capabilities to help their customers visualize their products in their environment before they buy. A consumer packaged goods company salesperson could use AR to show a retailer how their product would look when installed in their stores and what it would mean in terms of incremental sales. That visualization can help close deals.

 

The AR visualization capabilities can be useful post-sales, as well. For example, a company can provide the installation team with images from various angles showing exactly where a product – visualized at real-world scale in their customer’s site through AR – needs to be installed.

 

Microsoft is also seeing its customers embrace the new AR capabilities for measuring applications. Armed with just a mobile phone, a flooring contractor can quickly measure an area to provide an accurate estimate of the amount of flooring needed for the space. Because the measuring capability is integrated with Power Apps, Dynamics 365, and Dataverse, it can be set up to support the entire business workflow.

 

“The user can open the Power Apps application, press ‘measure,’ take the measurements, press ‘submit,’ and the pricing calculations are done automatically, captured in the system, and an estimate email automatically sent to the customer,” said Uitz.
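
As a rough illustration of the automated pricing step in that workflow, the following Python sketch shows how AR-measured dimensions might feed an estimate. The unit price, waste allowance, and email format are assumptions for illustration; this is not Power Apps or Power Fx code, nor a Microsoft API.

    # Illustrative sketch only: turns an AR-measured floor area into a priced
    # estimate and an email body. Unit price and waste factor are assumptions.
    from dataclasses import dataclass

    @dataclass
    class FlooringEstimate:
        measured_area_sq_m: float    # area captured via the phone's AR measurement
        unit_price_per_sq_m: float   # price taken from the product catalog
        waste_factor: float = 0.10   # extra material allowance for cuts and offcuts

        @property
        def material_needed_sq_m(self) -> float:
            return self.measured_area_sq_m * (1 + self.waste_factor)

        @property
        def total_price(self) -> float:
            return self.material_needed_sq_m * self.unit_price_per_sq_m

    def build_estimate_email(estimate: FlooringEstimate, customer: str) -> str:
        """Format the estimate that would be sent automatically after 'submit'."""
        return (
            f"Dear {customer},\n"
            f"Measured area: {estimate.measured_area_sq_m:.1f} sq m\n"
            f"Material needed (incl. waste): {estimate.material_needed_sq_m:.1f} sq m\n"
            f"Estimated price: ${estimate.total_price:,.2f}\n"
        )

    if __name__ == "__main__":
        estimate = FlooringEstimate(measured_area_sq_m=42.5, unit_price_per_sq_m=35.0)
        print(build_estimate_email(estimate, "Example Customer"))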

 

Another popular measuring use case is auditing. For example, users can use the measuring capability to confirm that a building is compliant with their local building code, including enough space for egress and sufficient lines of sight. This can save hours of time and effort doing physical measurements and recording data by hand.

 

“We’re all about democratizing Mixed Reality – making it another tool in the worker’s toolbox,” said Uitz. On top of that, Power Apps provides enterprise security and enterprise scalability, so a user-developed AR-enabled application can ramp up from a small, local trial to an enterprise-wide deployment without difficulty.

 

The democratizing of Mixed Reality extends to the hardware requirements, as well. Power Apps’ MR capabilities do not require a HoloLens; they work with any iPhone, iPad, or Android phone that supports ARKit / ARCore.

 

“The goal is to get companies to integrate MR capabilities into their mission-critical workflows to make their lives better,” said Uitz.

 

As these capabilities become better known, Uitz and Pile are expecting more Power Apps users to take advantage of the ability to: view and manipulate 3D content; overlay 3D content and 2D images onto the feed from the camera; measure distance, area, and volume using a device with AR; and identify spaces in the real world through an AR overlay.

 

Meanwhile, Microsoft is continuing to enhance the software to add additional industrial-strength features, and the Power Apps team is open to working with customers to add capabilities for their particular use cases.

 

“More often than not, it’s not a new thing that they want to do,” explained Pile. “It’s something that they’ve always done, but they want to do it faster, or at lower cost, or integrate into existing workflows. That’s where our primary focus is.”

 

Another key focus is getting the word out. Uitz, Pile, and the rest of the Power Apps team have been offering a variety of resources to increase awareness among customers and get them thinking about what AR capabilities can do for their operations. Readers interested in learning more can go here and here.

 

If the Power Apps team is successful, more enterprises will get their first AR experience, not from super-sophisticated “gee-wizardry” AR pilots, but from AR enhancements that deliver immediate value to their everyday solutions.