Factory Layout Experience – Theorem Solutions

Optimize designs in immersive XR

The Factory Layout Experience enables a planning or layout engineer, working independently or with a group of colleagues, locally or in remote locations, to optimize factory layouts through the immersive experience of eXtended Reality (XR) technologies. Seeing your data at full scale and in context instantly reveals the clashes, access issues and missing items that a CAD screen cannot show.

On the shop floor there are thousands of pieces of equipment, much of it bought in and designed externally. Building designs may only exist as scans or in architectural CAD systems, and robot cells may be designed in specialist CAD systems. There will be libraries of hand tools, storage racks and stillage equipment designed in a range of CAD systems, and product data designed in-house in mechanical CAD. To understand the factory and assess changes, all of that has to be brought together to get a full picture of where a new line, robot cell or workstation will fit.

A catalogue of 3D resources can be snapped to existing 2D factory layouts to quickly realize a rich 3D layout. Advanced positioning makes it very easy to move, snap and align 3D data, and widely used plant and equipment is readily available, so there is no need to design it from scratch for every new layout. Simplified layout tools enable you to position, align and snap layout objects quickly, and can be used by non-CAD experts, enabling all stakeholders to be involved in the process and improving communication.
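As a rough illustration of the snapping behaviour described above, the short Python sketch below shows how a dropped object's position and rotation might be rounded to the nearest grid point and 90-degree increment. The function name, grid size and units are invented for illustration; this is not Theorem's actual implementation.

```python
# Hypothetical sketch of grid snapping for a 3D resource dropped onto a 2D floor plan.
# Position is rounded to the nearest grid point and rotation to the nearest 90 degrees.

def snap_to_layout(x, y, rotation_deg, grid_size=0.5):
    """Snap a dropped object to a grid (metres) and to 90-degree rotations."""
    snapped_x = round(x / grid_size) * grid_size
    snapped_y = round(y / grid_size) * grid_size
    snapped_rot = round(rotation_deg / 90.0) * 90 % 360
    return snapped_x, snapped_y, snapped_rot

# A storage rack dropped at (3.27, 8.91) facing 87 degrees snaps to (3.5, 9.0, 90).
print(snap_to_layout(3.27, 8.91, 87))
```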

Testing Design and Operational Factors

Human-centred operations can be analysed using mannequins that can be switched to match different characteristics. You can test design and operational aspects of a variety of human factors to determine reachability, access and injury-risk situations, ensuring compliance with safety and ergonomic standards.

The experience helps companies avoid costly layout redesign by allowing all parties involved to review the layout collaboratively, make or recommend changes, and capture those decisions for later review by staff who could not attend the session.




AREA Issues RFP for Research on AR-Delivered Instructions for High-Dexterity Work

To date, AREA members have funded 10 AR research projects on a wide range of timely topics critical to the adoption of enterprise AR. Now the AREA is pleased to announce a call for proposals for its 11th research project, which will evaluate the effectiveness of AR for delivery of work instructions for tasks requiring high dexterity. Building on prior research, the project will expand the current state of knowledge and shed light on AR support for tasks that involve high levels of variability and flexibility in completion of a set of manual operations, such as that found in composite manufacturing.

This project will answer questions such as:

  • How does AR for high dexterity tasks differ from other instruction delivery methods?
  • How are users impacted by the delivery of instructions using AR in high dexterity tasks?
  • What are the key factors informing decision-making and driving return-on-investment in delivering work instructions for particularly dexterous, manual tasks?
  • Can AR-assisted work instructions help improve quality and productivity, or reduce waste and/or rework of manufactured parts?

This AREA research project will produce: a report on the efficiency and effectiveness of AR work instruction for tasks requiring high levels of dexterity; a research data set; a video summary highlighting the key findings; and an interactive members-only webinar presenting the research findings to the AREA.

The AREA Research Committee budget for this project is $15,000. Organizations interested in conducting this research for the fixed fee are invited to submit proposals. All proposals must be submitted by 12 noon Eastern Daylight Time on July 1, 2022.

Full information on the project needs, desired outcomes and required components of a winning proposal, including a submission form, can be found here.

If you have any questions concerning this project or the AREA Research Committee, please email the Research Committee.

 




Building an immersive pharma experience with XR technology

In the world of pharma manufacturing, precision is key. To execute flawlessly, pharmaceutical scientists and operators need the proper training and tools to accomplish the task. User-friendly augmented reality (AR) and extended reality (XR) technology that can provide workflow guidance to operators is invaluable, helping name-brand companies get drugs, vaccines, and advanced therapies to patients faster.

AR has been a cost-effective way to improve training, knowledge transfers, and process execution in the lab during drug discovery and in the manufacturing suite during product commercialization. Apprentice’s AR Research Department is now seeing greater demand within the pharma industry for XR software capabilities that allow life science teams to use 3D holograms to accomplish tasks.

For example, operators are able to map out an entire biomanufacturing suite in 3D using XR technology. This allows them to consume instructional data while keeping both hands free for their work, or to better understand equipment layouts. They can see and touch virtual objects within their environment, providing better context and a much more in-depth experience than AR alone provides.

Users can even suspend metadata in 3D space, such as at the entrance to a room, so that they can interact with their environment in a much more complete way, with equipment, objects and instruments tethered to physical space. Notifications regarding gowning requirements or biohazard warnings, for example, will automatically pop up as the operator walks in, enriching the environment with information that is useful to them.
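Conceptually, such spatially anchored notifications amount to tethering a message to a point in the room and triggering it when the tracked headset comes within range. The Python sketch below is a minimal, hypothetical illustration of that idea; the class, names and trigger radius are invented and do not reflect Apprentice's actual software.

```python
# Hypothetical sketch: a notification anchored to a point in the room pops up
# when the tracked headset position comes within a trigger radius.
import math

class AnchoredNotification:
    def __init__(self, anchor_xyz, message, trigger_radius_m=1.5):
        self.anchor = anchor_xyz
        self.message = message
        self.radius = trigger_radius_m
        self.shown = False

    def update(self, headset_xyz):
        """Show the notification once the operator walks within the trigger radius."""
        if not self.shown and math.dist(self.anchor, headset_xyz) <= self.radius:
            self.shown = True
            print(f"NOTIFY: {self.message}")

gowning = AnchoredNotification((0.0, 0.0, 0.0), "Gowning required beyond this point")
gowning.update((4.0, 0.0, 0.0))   # too far away: nothing happens
gowning.update((1.0, 0.0, 0.0))   # within 1.5 m: notification appears
```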

“It’s all about enhancing the user experience,” says Linas Ozeratis, Mixed Reality Engineer at Apprentice.io. “At Apprentice, our AR/XR Research Team has designed pharma-specific mixed-reality software for the HoloLens device that will offer our customers an easier, more immersive experience in the lab and suite.”

Apprentice’s XR/AR Research Team is currently experimenting with new menu design components for the HoloLens device that will reshape the future of XR user experiences, making it easier for users to interact with menus using just their fingers.

Apprentice’s “finger menu” feature allows users to trigger an action or step by ‘snapping’ together the thumb and individual fingers of the same hand. Each finger contains a different action button that can be triggered at any time during an operator’s workflow.
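In essence, the feature maps each finger of one hand to an action and fires that action when the thumb tip pinches the corresponding fingertip. The Python sketch below is a hypothetical illustration of that mapping with faked hand-tracking input; a real HoloLens implementation would read joint positions from the device's hand-tracking API, and the action names here are invented.

```python
# Hypothetical sketch of a "finger menu": each finger maps to an action,
# triggered when the thumb tip pinches that fingertip.
import math

PINCH_THRESHOLD_M = 0.02  # thumb tip within 2 cm of a fingertip counts as a pinch

finger_actions = {
    "index": "complete_step",
    "middle": "flag_deviation",
    "ring": "capture_photo",
    "pinky": "call_remote_expert",
}

def detect_finger_menu_action(thumb_tip, finger_tips):
    """Return the action for whichever finger the thumb is pinching, if any."""
    for finger, tip in finger_tips.items():
        if math.dist(thumb_tip, tip) < PINCH_THRESHOLD_M:
            return finger_actions[finger]
    return None

# Example frame: the thumb tip is touching the index fingertip.
tips = {"index": (0.01, 0.0, 0.0), "middle": (0.05, 0.0, 0.0),
        "ring": (0.09, 0.0, 0.0), "pinky": (0.13, 0.0, 0.0)}
print(detect_finger_menu_action((0.0, 0.0, 0.0), tips))  # -> "complete_step"
```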

“Through our research, we’ve determined that the fingers are an ideal location for attaching AR buttons, because it allows users to trigger next steps without their arm or hand blocking the data they need,” Ozeratis added. “It’s quite literally technology at your fingertips.”

Why does the pharma industry want technology like this? Aside from the demand, there are situations where tools like voice commands are simply not feasible. The AR Research Team also learned that interactive finger menus feel more natural to users and can be mastered quickly. Life science teams are able to enhance training capabilities, improve execution reliability and expand the types of supporting devices they can apply within their various environments.

“Introducing these exciting and highly anticipated XR capabilities is just one stop on our roadmap,” Ozeratis adds. “There are bigger and bolder things ahead that we look forward to sharing as the pharma industry continues to demand more modern, intelligent technologies that improve efficiency and speed.”




Rokid displayed its AR glasses at AWE 2022

Liang Guan, General Manager at Rokid, enthusiastically stated:
“Numerous top tech companies are currently exploring AR, XR, or the metaverse. As early as 2016, Rokid began proactively expanding its AR product pipeline across the leading technology areas of optics, chips, smart voice, and visual imaging. Today, we have X-Craft deployed in over 70 regions, and Air Pro has been widely used in 60+ museums around the world. Moving forward, Rokid will keep delivering real value to enterprises through its line of AR products.”

Rokid products empower the frontline workforce, providing real-time analysis, views, and documents to the control center. Many media representatives and participants were impressed after trying Rokid products, saying that the various control modes provided by Rokid AR glasses are very convenient to operate and can effectively improve work efficiency.

Rokid X-Craft, demonstrated live at AWE 2022, has officially received ATEX Zone 1 certification from the TUV Rheinland Group, becoming the world’s first explosion-proof, waterproof, dustproof, 5G- and GPS-supported XR device. This is not only a great advance in AR and 5G technology but also a breakthrough for explosion-proof AR applications in the industrial field. Many users at the event said after trying it that the safety headset is comfortable to wear and highly competitive in the market. It not only effectively ensures the safety of frontline staff, but also helps oil and gas fields increase production capacity.

Rokid Air Pro, a powerful pair of binocular AR glasses, features voice control to help you enjoy a wide variety of media including games, movies, and augmented reality experiences. Rokid Glass 2 provides real-time analysis, views, and documents to the control center, and has successfully improved traffic management and prevention to ensure the long-term stability of the city.

 

 




AREA ED Explores Immersive Technologies on Mouser Podcast

What does the term “Immersive Technologies” encompass? And how are these technologies evolving to solve more and more business needs? Mouser Electronics’ The Tech Between Us podcast took up these questions – and more – recently when host Raymond Yin spoke with AREA Executive Director Mark Sage.

 

Mark and Raymond take a closer look at everything from remote assistance and guidance to digital twins and remote collaboration. Immerse yourself in this lively discussion.




XR at Work Podcast is Here to Talk Shop with AR Practitioners


We got together with Scott and Dane recently to learn more about the podcast and what they hope to accomplish with it.

AREA: Before we get into XR@Work, could you tell us what you do for a living?

Scott: I’m a Principal XR Product Manager for WestRock, a global consumer packaging manufacturing company. I’m responsible for all things XR-related for our 300 factories and our customer interactions.

Dane: I’m on the business transformation team for INVISTA, a polymer manufacturing company and subsidiary of Koch Industries. I lead XR and digital twin within INVISTA and I also lead the Republic of Science, a community of practice across Koch for XR technologies.

AREA: How did you two meet up?

Dane: We were both on a panel at AWE on real-life practitioners and Scott and I hit it off really well. There’s a fair number of people looking to get into the XR space that don’t have anybody else to reach out to, other than a vendor. Scott and I had conversations about how hard it is getting started and that’s what led to the podcast.

AREA: And when did the podcast start?

Scott: I think it was November of last year.

AREA: What’s the mission of XR at Work?

Scott: What Dane said is absolutely true. New folks starting off in Extended Reality in the workplace are being asked to do something that is still emerging, that can be confusing, and that has a lot of misinformation around it. So our goal is to do two things with XR at Work. Number one, we want to provide insight and guidance to XR practitioners in enterprise. And second, we want to foster and build a community of Extended Reality professionals that work in industrial environments – everything from oil and gas to manufacturing to automotive to logistics. The idea is to get us together to share ideas and best practices.

AREA: So your focus is really complementary to what the AREA focuses on. We’re both serving the enterprise, but XR at Work is more exclusively targeting industrial companies.

Scott: Yeah, I think that’s a fair assessment.

AREA: Where do interested people go to check out XR at Work?

Scott: We have two main places where people can connect with us. Number one is LinkedIn. We have an XR at Work company page where we invite folks to follow us. On that LinkedIn page, we will post when we have a new podcast up or we speak somewhere or we see new opportunities. The second place is YouTube.

AREA: For people who haven’t seen the podcast, what can viewers expect? What’s the range of topics discussed?

Dane: We’ve started with pragmatic discussions around core AR/VR applications and topics, such as remote assistance, guided workflows, and how to scale. More recently, we’ve started doing interviews with people who work in the industry. No offense to vendors, but our goal is to keep it community-focused around the practitioner side of the house. We want to hear from people who are already working with XR – what’s working for them, what’s not, where the field is heading, the whole metaverse concept. We’re also thinking about adding things like hardware reviews, although we want to be careful to keep it community-focused and not be beholden to somebody because they sent us a headset. That’s the key to us – to be authentic.

AREA: It sounds like the range of content really goes from helping people get started in XR to sharing tips and techniques for people who already have some proficiency. What are your long-term goals for the podcast?

Scott: In addition to the stuff Dane talked about, we’re looking at taking part in some larger events, doing a live broadcast from an event this year. We want to be seen as everyman’s XR thought leaders. We live and breathe in the factory and rugged environments, putting devices on the heads and in the hands of industrial workers. Our goal is to be seen as the go-to friendly voice in the wilderness for a community that’s trying to find real answers – not the answers they get from sizzle reels or market videos or salespeople.

AREA: I would presume you’re also hoping to learn from this – so that you can apply new ideas to your “day jobs.”

Dane: XR at Work does give us access to other people who are doing things. A lot of the stuff in the XR space is really hard. How do you manage headsets at 300 facilities like Scott’s doing? How do we go ahead as a business if our favored headset is being discontinued? There are a lot of challenges you run into as you’re managing this across a business. This gives us a chance to talk to other people who have maybe thought differently about it and we can learn from. We also like to understand what’s coming in the hardware space, so my hope is that we can be a partner to people building products to offer them insights to support product development.

Scott: We look forward to building a community and interacting more with the members of the AREA.




Masters of Pie Wants to Hear About Your XR Collaboration Experiences and Plans

Survey

The Masters of Pie team is especially interested in hearing from IT managers and C-level executives knowledgeable about the broad application of XR collaboration use cases across their businesses. They’re seeking input from leading companies in a broad range of industries, including manufacturing/engineering, construction, healthcare, defense, and energy. Even organizations that are just beginning to adopt immersive technologies are invited to participate.

 

To take part, please visit the survey site and submit your information by April 20. Thank you for helping further the AR ecosystem’s understanding of how XR collaboration is gaining traction.




AREA Member Apprentice.io Raises $100M for Pharma AR Platform


Tempo brings the transformative power of technology to an industry that is still largely paper-based. It accelerates the entire drug production lifecycle by orchestrating manufacturing across global teams and sites with one shared platform.

 

Tempo also expands Apprentice’s footprint in the AR space. It enables manufacturing operators to use AR to:

  • Reduce human error as operators follow audio or text instructions enhanced with added photo, video, or AR overlay directions that are specific to their work environment or equipment, making each workflow step clear.
  • Increase efficiency and overcome production delays by supporting cross-team collaboration and remote support through video conferencing that utilizes AR directional tools such as live drawing, arrows, and laser pointers.

 

Apprentice leverages AR headsets to empower operators and scientists in the lab and manufacturing suite to work with greater efficiency and speed, without having to reference cumbersome paper-based procedural manuals or record handwritten documentation. Using voice commands and intelligent data capture, operators can easily access their procedures using their headsets. They can intelligently collect, store or reference critical data as they go, without any interruption to their workflow. With 1,500+ devices deployed, Apprentice believes it has the largest wearables deployment in enterprise manufacturing.

 

“This recent funding is a testament to the power of Augmented Reality,” says Angelo Stracquatanio, CEO of Apprentice. “AR and wearables have long held the promise to change the way we work. With pharma manufacturing, we’ve found a meaningful application of this technology that truly helps the operator execute better – for the benefit of patients everywhere.”

 

Apprentice is also expanding into Europe and Asia and continues to grow its team, building on 12-fold revenue growth and sixfold growth in employees. Learn more here.




Jon Kies Explores the Potential of the AREA Human Factors Committee

AR and Human Factors

AREA: What does Human Factors in Augmented Reality encompass?

Kies: Human Factors is the study of humans, from both cognitive and physical perspectives. We investigate how humans interact with devices, applications, and services, and incorporate those insights into the design of systems. In the case of AR, it’s especially important because you may be wearing a device on your head, and interacting via an interface overlaid on the real world.  This is arguably one of the most challenging design problems.

 

AREA: Do we still have a lot to learn about the Human Factors implications of AR?

Kies: That’s absolutely the case. The technology is still evolving. Many current devices can’t be used for a significant amount of time. It’s going to get there, but there are some technical hurdles that need to be resolved. That’s why it’s super-important that human characteristics become part of the requirements and are factored into the device design process.

 

AREA: How much of our past user experience knowledge is relatable to AR, and how much is starting from scratch?

Kies: We’re not entirely starting from scratch. A lot of people in the field have experience designing for 2D interfaces like smartphones. But you then have to translate that to a spatial computing paradigm where everything is not only in 3D, but also superimposed on the real world. That’s unlike a smartphone or a PC, where the interface is primarily contained in a rectangle. That’s what makes AR enormously challenging compared to working with other computing platforms. But there has been a lot of research in AR and VR in the military and universities, so there’s a lot to glean from those areas, and established human-centered design processes are still relevant.

 

AREA: What’s your top priority for the AREA Human Factors Committee this year?

Kies: Our overriding goal is to identify and develop best practices to help ensure the best possible AR user experience. In pursuit of that goal, our number-one priority is to engage more with academic research labs – to invite them to share their findings with the AREA membership. They are often experimenting with or building the latest technologies and they’re learning a great deal from their studies. Another thing we’re discussing is compiling a set of unique human-centered design practices that are pertinent to AR systems. And of course, we always want to get more AREA members involved in the Committee.

 

AREA: What’s your pitch for why AREA members should get involved in the Human Factors Committee?

Kies: My bias is toward conversation. Having meetings that act as a forum where people can talk about the challenges they’re facing, the successes they’ve had, and just connect – that’s a compelling reason to participate. By participating in Human Factors Committee meetings, end-user members have an opportunity to hear about other members’ experiences and lessons learned and apply that knowledge to their own efforts. For AR solutions providers, it’s an opportunity to get direct feedback from the AR user community.  We also hope that concrete deliverables, like guidance on design, will enable AREA members to optimize their enterprise AR solutions for their target users.

 

It’s all about making connections and enabling dialogue – between users and providers, between the AR ecosystem and academic institutions – to everyone’s benefit. We’d like to build out a vibrant AR Human Factors community where people are learning from each other, contributing ideas, highlighting new discoveries, and finding solutions.

 

If you’re an AREA member and would like more information about joining the AREA Human Factors Committee, contact Jonathan Kies or AREA Executive Director Mark Sage. If you’re not yet an AREA member but interested in AR human factors and design, please consider joining; you can find member information here.

 




AREA Safety Playbook Offers Step-by-Step Guide to Protect Workers

The Augmented Reality Best Practice Safety Playbook discusses:

  • Risk factors to consider when using AR systems in work environments
  • Risk assessment tools and methods
  • Usability considerations
  • User medical evaluation criteria
  • Cleanliness and disinfection procedures
  • Safety awareness training, and more

 

“Enterprise AR often brings new devices, new working methods, and new modes of user interaction into the workplace. With that in mind, organizations adopting AR need a thorough understanding of health and safety risks and how best to mitigate them,” said Mark Sage, Executive Director, the AREA. “The playbook helps organizations avoid safety issues before they occur and helps ensure AR solutions meet an organization’s expectations for productivity and cost savings.”

 

The AREA Safety Committee provided expert input and insight to produce the playbook.

 

Download the Augmented Reality Best Practice Safety Playbook for more information and a list of contributors. To learn more about AREA membership and the work of the AREA Safety Committee, please get in touch with AREA Executive Director Mark Sage at [email protected].

 

About AREA

The Augmented Reality for Enterprise Alliance (AREA) is the only global non-profit, member-based organization dedicated to the widespread adoption of interoperable AR-enabled enterprise systems. Whether you view it as the next computing paradigm, the key to breakthroughs in manufacturing and service efficiencies, or the door to as-yet unimagined applications, AR will have an unprecedented impact on enterprises of all kinds. AREA is a managed program of Object Management Group® (OMG®). Visit https://thearea.org for more information.

Note to editors: Object Management Group and the OMG acronym are registered trademarks of the Object Management Group. For a listing of all OMG trademarks, visit https://www.omg.org/legal/tm_list.htm. All other trademarks are the property of their respective owners.

 

Media Contact:

Karen Quatromoni

[email protected]