
XR at Work Podcast is Here to Talk Shop with AR Practitioners

We got together with Scott and Dane recently to learn more about the podcast and what they hope to accomplish with it.

AREA: Before we get into XR@Work, could you tell us what you do for a living?

Scott: I’m a Principal XR Product Manager for WestRock, a global consumer packaging manufacturing company. I’m responsible for all things XR-related for our 300 factories and our customer interactions.

Dane: I’m on the business transformation team for INVISTA, a polymer manufacturing company and subsidiary of Koch Industries. I lead XR and digital twin within INVISTA and I also lead the Republic of Science, a community of practice across Koch for XR technologies.

AREA: How did you two meet up?

Dane: We were both on a panel at AWE on real-life practitioners and Scott and I hit it off really well. There’s a fair number of people looking to get into the XR space that don’t have anybody else to reach out to, other than a vendor. Scott and I had conversations about how hard it is getting started and that’s what led to the podcast.

AREA: And when did the podcast start?

Scott: I think it was November of last year.

AREA: What’s the mission of XR at Work?

Scott: What Dane said is absolutely true. New folks starting off in Extended Reality in the workplace are being asked to do something that is still emerging, that can be confusing, and that has a lot of misinformation around it. So our goal is to do two things with XR at Work. Number one, we want to provide insight and guidance to XR practitioners in enterprise. And second, we want to foster and build a community of Extended Reality professionals that work in industrial environments – everything from oil and gas to manufacturing to automotive to logistics. The idea is to get us together to share ideas and best practices.

AREA: So your focus is really complementary to what the AREA focuses on. We’re both serving the enterprise, but XR at Work is more exclusively targeting industrial companies.

Scott: Yeah, I think that’s a fair assessment.

AREA: Where do interested people go to check out XR at Work?

Scott: We have two main places where people can connect with us. Number one is LinkedIn. We have an XR at Work company page where we invite folks to follow us. On that LinkedIn page, we will post when we have a new podcast up or we speak somewhere or we see new opportunities. The second place is YouTube.

AREA: For people who haven’t seen the podcast, what can viewers expect? What’s the range of topics discussed?

Dane: We’ve started with pragmatic discussions around core AR/VR applications and topics, such as remote assistance, guided workflows, and how to scale. More recently, we’ve started doing interviews with people who work in the industry. No offense to vendors, but our goal is to keep it community-focused around the practitioner side of the house. We want to hear from people who are already working with XR – what’s working for them, what’s not, where the field is heading, the whole metaverse concept. We’re also thinking about adding things like hardware reviews, although we want to be careful to keep it community-focused and not be beholden to somebody because they sent us a headset. That’s the key to us – to be authentic.

AREA: It sounds like the range of content really goes from helping people get started in XR to sharing tips and techniques for people who already have some proficiency. What are your long-term goals for the podcast?

Scott: In addition to the stuff Dane talked about, we’re looking at taking part in some larger events, doing a live broadcast from an event this year. We want to be seen as everyman’s XR thought leaders. We live and breathe in the factory and rugged environments, putting devices on the heads and in the hands of industrial workers. Our goal is to be seen as the go-to friendly voice in the wilderness for a community that’s trying to find real answers – not the answers they get from sizzle reels or market videos or salespeople.

AREA: I would presume you’re also hoping to learn from this – so that you can apply new ideas to your “day jobs.”

Dane: XR at Work does give us access to other people who are doing things. A lot of the stuff in the XR space is really hard. How do you manage headsets at 300 facilities like Scott’s doing? How do we go ahead as a business if our favored headset is being discontinued? There are a lot of challenges you run into as you’re managing this across a business. This gives us a chance to talk to other people who may have thought about it differently and whom we can learn from. We also like to understand what’s coming in the hardware space, so my hope is that we can be a partner to people building products and offer them insights to support product development.

Scott: We look forward to building a community and interacting more with the members of the AREA.




Masters of Pie Wants to Hear About Your XR Collaboration Experiences and Plans


The Masters of Pie team is especially interested in hearing from IT managers and C-level executives knowledgeable about the broad application of XR collaboration use cases across their businesses. They’re seeking input from leading companies in a broad range of industries, including manufacturing/engineering, construction, healthcare, defense, and energy. Even organizations that are just beginning to adopt immersive technologies are invited to participate.

 

To take part, please visit the survey site and submit your information by April 20. Thank you for helping further the AR ecosystem’s understanding of how XR collaboration is gaining traction.




AREA Member Apprentice.io Raises $100M for Pharma AR Platform

Tempo brings the transformative power of technology to an industry that is still largely paper-based. It accelerates the entire drug production lifecycle by orchestrating manufacturing across global teams and sites with one shared platform.

 

Tempo also expands Apprentice’s footprint in the AR space. It enables manufacturing operators to use AR to:

  • Reduce human error as operators follow audio or text instructions enhanced with photo, video, or AR overlay directions that are specific to their work environment or equipment, making each workflow step clear.
  • Increase efficiency and overcome production delays by supporting cross-team collaboration and remote support through video conferencing that utilizes AR directional tools such as live drawing, arrows, and laser pointers.

 

Apprentice leverages AR headsets to empower operators and scientists in the lab and manufacturing to work with greater efficiency and speed, without having to reference cumbersome paper-based procedural manuals or record handwritten documentation. Using voice commands and intelligent data capture, operators can easily access their procedures using their headsets. They can intelligently collect, store or reference critical data as they go, without any interruption to their workflow. With 1,500+ devices deployed, Apprentice believes it has the largest wearables deployment in enterprise manufacturing.

 

“This recent funding is a testament to the power of Augmented Reality,” says Angelo Stracquatanio, CEO of Apprentice. “AR and wearables have long held the promise to change the way we work. With pharma manufacturing, we’ve found a meaningful application of this technology that truly helps the operator execute better – for the benefit of patients everywhere.”

 

Apprentice is also expanding into Europe and Asia and continues to grow its team to sustain its 12-fold revenue growth and sixfold growth in employees. Learn more here.




Jon Kies Explores the Potential of the AREA Human Factors Committee

AR and Human Factors

AREA: What does Human Factors in Augmented Reality encompass?

Kies: Human Factors is the study of humans, from both cognitive and physical perspectives. We investigate how humans interact with devices, applications, and services, and incorporate those insights into the design of systems. In the case of AR, it’s especially important because you may be wearing a device on your head, and interacting via an interface overlaid on the real world.  This is arguably one of the most challenging design problems.

 

AREA: Do we still have a lot to learn about the Human Factors implications of AR?

Kies: That’s absolutely the case. The technology is still evolving. Many current devices can’t be used for a significant amount of time. It’s going to get there, but there are some technical hurdles that need to be resolved. That’s why it’s super-important that human characteristics become part of the requirements and are factored into the device design process.

 

AREA: How much of our past user experience knowledge is relatable to AR, and how much is starting from scratch?

Kies: We’re not entirely starting from scratch. A lot of people in the field have experience designing for 2D interfaces like smartphones. But you then have to translate that to a spatial computing paradigm where everything is not only in 3D, but also superimposed on the real world. That’s unlike a smartphone or a PC, where the interface is primarily contained in a rectangle. That’s what makes AR enormously challenging compared to working with other computing platforms. But there has been a lot of research in AR and VR in the military and universities, so there’s a lot to glean from those areas, and established human-centered design processes are still relevant.

 

AREA: What’s your top priority for the AREA Human Factors Committee this year?

Kies: Our overriding goal is to identify and develop best practices to help ensure the best possible AR user experience. In pursuit of that goal, our number-one priority is to engage more with academic research labs – to invite them to share their findings with the AREA membership. They are often experimenting with or building the latest technologies and they’re learning a great deal from their studies. Another thing we’re discussing is compiling a set of unique human-centered design practices that are pertinent to AR systems. And of course, we always want to get more AREA members involved in the Committee.

 

AREA: What’s your pitch for why AREA members should get involved in the Human Factors Committee?

Kies: My bias is toward conversation. Having meetings that act as a forum where people can talk about the challenges they’re facing, the successes they’ve had, and just connect – that’s a compelling reason to participate. By participating in Human Factors Committee meetings, end-user members have an opportunity to hear about other members’ experiences and lessons learned and apply that knowledge to their own efforts. For AR solutions providers, it’s an opportunity to get direct feedback from the AR user community.  We also hope that concrete deliverables, like guidance on design, will enable AREA members to optimize their enterprise AR solutions for their target users.

 

It’s all about making connections and enabling dialogue – between users and providers, between the AR ecosystem and academic institutions – to everyone’s benefit. We’d like to build out a vibrant AR Human Factors community where people are learning from each other, contributing ideas, highlighting new discoveries, and finding solutions.

 

If you’re an AREA member and would like more information about joining the AREA Human Factors Committee, contact Jonathan Kies or AREA Executive Director Mark Sage. If you’re not yet an AREA member but interested in AR human factors and design, please consider joining; you can find member information here.

 




AREA Safety Playbook Offers Step-by-Step Guide to Protect Workers

The Augmented Reality Best Practice Safety Playbook discusses:

  • Risk factors to consider when using AR systems in work environments
  • Risk assessment tools and methods
  • Usability considerations
  • User medical evaluation criteria
  • Cleanliness and disinfection procedures
  • Safety awareness training, and more

 

“Enterprise AR often brings new devices, new working methods, and new modes of user interaction into the workplace. With that in mind, organizations adopting AR need a thorough understanding of health and safety risks and how best to mitigate them,” said Mark Sage, Executive Director, the AREA. “The playbook helps organizations avoid safety issues before they occur and helps ensure AR solutions meet an organization’s expectations for productivity and cost savings.”

 

The AREA Safety Committee provided expert input and insight to produce the playbook.

 

Download the Augmented Reality Best Practice Safety Playbook for more information and a list of contributors. To learn more about AREA membership and the work of the AREA Safety Committee, please get in touch with AREA Executive Director Mark Sage at [email protected].

 

About AREA

The Augmented Reality for Enterprise Alliance (AREA) is the only global non-profit, member-based organization dedicated to the widespread adoption of interoperable AR-enabled enterprise systems. Whether you view it as the next computing paradigm, the key to breakthroughs in manufacturing and service efficiencies, or the door to as-yet unimagined applications, AR will have an unprecedented impact on enterprises of all kinds. AREA is a managed program of Object Management Group® (OMG®). Visit https://thearea.org for more information.

Note to editors: Object Management Group and the OMG acronym are registered trademarks of the Object Management Group. For a listing of all OMG trademarks, visit https://www.omg.org/legal/tm_list.htm. All other trademarks are the property of their respective owners.

 

Media Contact:

Karen Quatromoni

[email protected]




Microsoft Power Apps Make AR Part of Core Business Solutions

Moreover, Uitz and Pile expect enterprise users to add those Augmented Reality / Mixed Reality capabilities themselves – without help from AR solutions developers.

 

The key is Microsoft Power Apps.

 

“Power Apps is a low-code, no-code application platform,” explained Uitz. “It enables anyone to quickly and easily build and deploy sophisticated applications by using drag-and-drop controls to pull in data from any data source.” Introduced in 2018, Power Apps got its first Mixed Reality capabilities in 2020.

 

“We added straightforward Mixed Reality capabilities that enable you to build sophisticated, device-centric applications that leverage a phone’s built-in sensors to use MR to see images and models in a space as well as measure things,” said Uitz. “We’ve seen many customers leveraging their mission-critical Power Apps business applications for huge improvements in their workflows.”

 

According to Uitz and Pile, the AR-enhanced Power Apps applications tend to focus on three areas. The first is sales team enablement. For example, salespeople are using Power Apps’ MR capabilities to help their customers visualize their products in their environment before they buy. A consumer packaged goods company salesperson could use AR to show a retailer how their product would look when installed in their stores and what it would mean in terms of incremental sales. That visualization can help close deals.

 

The AR visualization capabilities can be useful post-sales, as well. For example, a company can provide the installation team with images from various angles showing exactly where a product – visualized at real-world scale in their customer’s site through AR – needs to be installed.

 

Microsoft is also seeing its customers embrace the new AR capabilities for measuring applications. Armed with just a mobile phone, a flooring contractor can quickly measure an area to provide an accurate estimate of the amount of flooring needed for the space. Because the measurement capability is integrated with Power Apps, Dynamics 365, and Dataverse, it can be set up to support the entire business workflow.

 

“The user can open the Power Apps application, press ‘measure,’ take the measurements, press ‘submit,’ and the pricing calculations are done automatically, captured in the system, and an estimate email automatically sent to the customer,” said Uitz.
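To make the arithmetic behind that workflow concrete, here is a minimal, hypothetical sketch in plain Python (this is not Power Apps or Power Fx, and the waste factor and unit price are invented for illustration) of how a measured area might be turned into an estimate on the back end:

```python
# Hypothetical sketch only: plain Python, not Power Apps / Power Fx.
# The waste factor and unit price below are invented for illustration.

def flooring_estimate(measured_area_m2: float,
                      unit_price_per_m2: float = 35.0,
                      waste_factor: float = 0.10) -> dict:
    """Turn an AR-measured floor area into a simple price estimate."""
    billable_area = measured_area_m2 * (1 + waste_factor)  # allow for offcuts
    return {
        "measured_area_m2": round(measured_area_m2, 2),
        "billable_area_m2": round(billable_area, 2),
        "estimated_price": round(billable_area * unit_price_per_m2, 2),
    }

# e.g. the phone's AR measurement reported 42.7 square metres
print(flooring_estimate(42.7))
```

In the scenario Uitz describes, the measurement would come from the app’s mixed reality control and the result would flow into Dataverse and an automated estimate email; the snippet above only illustrates the pricing step.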

 

Another popular measuring use case is auditing. For example, users can use the measuring capability to confirm that a building is compliant with their local building code, including enough space for egress and sufficient lines of sight. This can save hours of time and effort doing physical measurements and recording data by hand.

 

“We’re all about democratizing Mixed Reality – making it another tool in the worker’s toolbox,” said Uitz. On top of that, Power Apps provides enterprise security and enterprise scalability, so a user-developed AR-enabled application can easily ramp up from a small, local trial to an enterprise-wide deployment without difficulty.

 

The democratizing of Mixed Reality extends to the hardware requirements, as well. Power Apps’ MR capabilities do not require a HoloLens; they work with any iPhone, iPad, or Android phone that supports ARKit / ARCore.

 

“The goal is to get companies to integrate MR capabilities into their mission-critical workflows to make their lives better,” said Uitz.

 

As these capabilities become better known, Uitz and Pile are expecting more Power Apps users to take advantage of the ability to: view and manipulate 3D content; overlay 3D content and 2D images onto the feed from the camera; measure distance, area, and volume using a device with AR; and identify spaces in the real world through an AR overlay.

 

Meanwhile, Microsoft is continuing to enhance the software to add additional industrial-strength features, and the Power Apps team is open to working with customers to add capabilities for their particular use cases.

 

“More often than not, it’s not a new thing that they want to do,” explained Pile. “It’s something that they’ve always done, but they want to do it faster, or at lower cost, or integrate into existing workflows. That’s where our primary focus is.”

 

Another key focus is getting the word out. Uitz, Pile, and the rest of the Power Apps team have been offering a variety of resources to increase awareness among customers and get them thinking about what AR capabilities can do for their operations. Readers interested in learning more can go here and here.

 

If the Power Apps team is successful, more enterprises will get their first AR experience, not from super-sophisticated “gee-whiz” AR pilots, but from AR enhancements that deliver immediate value to their everyday solutions.




Inside the AREA Requirements Committee with Brian Kononchik

The work of the AREA is largely driven by its member committees: Research, Interoperability & Standards, Safety, Human Factors, Requirements, Marketing, and Security. Each of these groups is focused on activities that contribute to the development of knowledge about the adoption of enterprise AR and the practical implementation of AR use cases. For AREA members, participation in one or more of the AREA committees is an opportunity to share expertise, interact with other experts, and make a meaningful impact on the future of enterprise AR.

 

This is the third in a series of blog articles exploring the committees and their work. Our guide to the AREA Requirements Committee was Brian Kononchik, the Committee’s chair and Director of Innovative Technologies at Boston Engineering.

 

AREA: Tell us how you got into enterprise AR.

 

Kononchik: About 12 years ago, I started my career working alongside a prominent advanced technology investor and visionary. Together, we worked as contractors and consultants to some big names in the consumer electronics and automotive space. During that time, I realized that the status quo is never good enough; there needs to be more. That’s when I shifted my focus to innovation instead of the status quo. That’s how I became an early adopter of VR technology, holographics, and eventually AR. After a while, I jumped to an engineering firm that did development work for Siemens, which is, of course, a very big name in manufacturing with equipment, PLCs, PLM, and CAD platforms.

 

While I was there, one of the largest submarine manufacturers in the world needed an innovation engineer on their advanced technology team. The assignment was to create an immersive VR experience that utilized existing CAD data so that a user could enter a submarine, navigate freely, and plan for job assignments. That led to more projects, eventually including Augmented Reality and answering the questions: How can we use AR to help shipbuilders? How can we merge AR and VR to help shipbuilders collaborate and work more effectively?

 

Then, about five years ago, the need for AR at scale became more and more prominent, and I went to work for PTC as Director of Product Management for the Spatial AR initiative, working with big-name manufacturers in the automotive, semiconductor, and manufacturing equipment space. That experience eventually brought me over to Boston Engineering, where I’m now leading the Industry 4.0 and Innovative Technology initiatives. I’m all about giving people value, and enterprise AR is delivering on that promise.

 

AREA: For our readers who aren’t familiar with it, what does the AREA Requirements Committee do?

 

Kononchik: We’re working in collaboration with a lot of big organizations involved with the AREA to help define global standards for hardware and software. When I say hardware, I am talking about mobile devices like the iPhone or Galaxy S20, Assisted Reality devices like RealWear, and fully immersive head-mounted displays like the Magic Leap and HoloLens 2. We’re trying to define a set of standards that people could build hardware against. Having universal standards will allow for increased technology adoption.

 

On the software side, we’re trying to do something similar. We’re trying to lay out a set of standards for people that want to go build AR enterprise applications. After all the requirements are finalized, the next big thing we’re going to do is build out an automated process to help someone understand the starting point for addressing their particular use case. So, you would input your industry and use case. Say you’re in the oil and gas industry and you have the challenge of individuals collaborating while they are not in the same location. That would be a remote assistance use case. You then need to input your environment. In this scenario, you specify that the work is being performed mainly outside. Those are your three starting variables: oil and gas, remote assistance, and outside. We’re working on an automated process that recommends options; the more details you provide, the better the recommendation. We’re not just trying to define requirements. We’re trying to define requirements for use cases and provide a way to streamline adoption within an organization.
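As a purely hypothetical illustration of the lookup Kononchik describes (this is not the ASoN tool, its data, or its output; the device classes and requirements below are invented), mapping the three starting variables to a recommendation might look like this:

```python
# Hypothetical illustration of mapping (industry, use case, environment)
# inputs to a starting-point recommendation. Not the actual ASoN tool or data.

RECOMMENDATIONS = {
    ("oil and gas", "remote assistance", "outside"): {
        "device_class": "assisted reality headset",
        "starting_requirements": [
            "hazardous-area / intrinsically safe rating",
            "daylight-readable display",
            "works over low-bandwidth or intermittent connectivity",
        ],
    },
    # ...more (industry, use case, environment) combinations would go here
}

def recommend(industry: str, use_case: str, environment: str) -> dict:
    key = (industry.lower(), use_case.lower(), environment.lower())
    return RECOMMENDATIONS.get(
        key,
        {"device_class": "unknown", "starting_requirements": ["provide more detail"]},
    )

print(recommend("Oil and Gas", "Remote Assistance", "Outside"))
```

The real tool is intended to refine its recommendation as more detail is provided; the dictionary above only shows the shape of the idea.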

 

AREA: It sounds like you’re working two sides of the equation here. You’re pushing for standards on one end and providing guidance to adopters on the other.

 

Kononchik: Correct. We have our set of standards being developed, and then we have the AREA Statement of Needs, or ASoN, tool. It’s designed to help others identify the AR setups related to their use cases. These setups can be actions taken right away to get AR implemented into their organizations.

 

AREA: On the standards side, are you looking at work that standards bodies are already doing and making recommendations about which standards to implement?

 

Kononchik: It’s a combination. For example, we break hardware down into many different categories: wearability, sensors, communications, audio, and so forth. For each category, you have a breakdown of different device types. Take sensors, for example. You have mobile, assisted, and fully immersive head-mounted devices (HMDs). One requirement being developed is that a fully immersive HMD should have no fewer than two world cameras, because that helps compute 3D world maps. A device should also have at least a single RGB camera, because that helps with QR code recognition and remote-assist scenarios. So, those are some of the hardware-type standards being developed.
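As a small, hypothetical sketch of how minimums like those could be encoded and checked against a device spec (only the camera counts come from the interview; the field names and the sample device are invented):

```python
# Hypothetical sketch of checking a device spec against draft minimums.
# Only the camera counts come from the interview; everything else is invented.

from dataclasses import dataclass

@dataclass
class DeviceSpec:
    name: str
    world_cameras: int   # cameras used for 3D world mapping / tracking
    rgb_cameras: int     # color cameras for QR recognition and remote assist

HMD_MINIMUMS = {"world_cameras": 2, "rgb_cameras": 1}

def unmet_requirements(spec: DeviceSpec) -> list:
    """Return the requirements the spec fails (an empty list means it passes)."""
    gaps = []
    for field, minimum in HMD_MINIMUMS.items():
        actual = getattr(spec, field)
        if actual < minimum:
            gaps.append(f"{field}: has {actual}, requires at least {minimum}")
    return gaps

sample = DeviceSpec(name="example-hmd", world_cameras=4, rgb_cameras=1)
print(unmet_requirements(sample) or "meets the example minimums")
```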

 

When it comes to safety, though, we reference the industry standards. There are already existing environmental standards, say regarding operational temperature range, that hardware manufacturers must follow, so we just reference those. And then, where standards don’t exist, we’re trying to collectively understand what can be done today and where the market is going, in order to establish new standards that organizations do not necessarily follow today.

 

AREA: The ASoN tool has been available for a while now. What’s the latest on that?

 

Kononchik: It is currently running on an older platform, and we are in the process of upgrading the application and migrating the data over to a new platform. People can use it and benefit from it today, but we are looking to clean up the tool and eliminate some bloat that has accumulated over time. It is accessible and fully operational today, and we encourage people to use it.

 

AREA: Do you have a wish list of things you’d love to see the Requirements Committee get done in the near future?

 

Kononchik: Right now, I want to see us go through and overhaul the hardware and software requirements and collectively agree on what they should be. We’re in the process of that now. Beyond that, my goal is to make the ASoN tool everything it’s promising to be. I would love to see it work so that, if, say, I were an automotive industry member and entered just a few parameters, I would get a full spec readout of what I need to do next and why. It’s not there yet, but I would love to see that.

 

AREA: Is that doable in the near future?

 

Kononchik: I would say within the next year, probably yes. That granularity will be well on its way. We have a lot of great minds working with us on the Requirements Committee – a lot of industry experience and industry knowledge, and not just specific to hardware building and software development. We also have input from community members who focus on consulting with organizations, and they really understand what customers are looking for. When we join all this knowledge together, it comprises the three pillars of successful AR implementation and development.

 

AREA: What kind of people are you looking for to become part of the Requirements Committee? Are there certain skills you need or certain types of people that you’re hoping can join the group?

 

Kononchik: We’re looking for multiple types of skillsets. First off, we’re looking for members of the hardware and software communities – smaller startups, enterprises, and places with new, innovative solutions. And within those sectors, we’re looking for product leaders – leaders in product strategy and product management, as well as technical areas – to help build a good understanding of market demands. We really want those feasibility, viability, and desirability people: your business leaders, your product leaders, your design leaders, and your technical leaders from both hardware and software. And we’re looking for the “go-doers” – the people who go out to a customer and work with them for a year to implement AR. We want to understand the pain points of implementation, adoption, and scalability.

 

The onset of the pandemic really escalated the adoption of AR, and a lot of companies are seeing challenges adopting and scaling it. Those companies are making do despite the pain of adoption. The Requirements Committee can make the challenges adopters are facing more visible to hardware creators and software development companies. That visibility allows these organizations to develop a strategy that fits market demands, satisfying their business and, more importantly, the customer.

 

If you’re an AREA member and would like more information about joining the AREA Requirements Committee, contact Brian Kononchik or AREA Executive Director Mark Sage. If you’re not yet an AREA member but care about shaping enterprise AR requirements, please consider joining; you can find member information here.

 

 




AREA and Digital Twin Consortium Establish Strategic Relationship

On December 3rd, the AREA took a big step toward facilitating that convergence and maximizing those benefits by signing a Memorandum of Understanding (MoU) with the Digital Twin Consortium® (DTC). Known as The Authority in Digital Twin, DTC coalesces industry, government, and academia to drive consistency in vocabulary, architecture, security, and interoperability of Digital Twin technology.

Under the MoU, AREA Steering Committee member Christine Perey will serve as the liaison between the two organizations to build a deep and multi-faceted strategic relationship. DTC and the AREA will collaborate to:

  • define industry requirements;
  • increase the degree to which enabling technology components interoperate;
  • align work underway to accelerate adoption of Digital Twins and Augmented Reality in many vertical domains and use cases; and
  • develop proof-of-concept projects and programs, and joint marketing efforts.

“We’re excited about the many opportunities our liaison with DTC will enable,” said Mark Sage, Executive Director of the AREA. “For AREA members, it’s a chance to help ensure that enterprise AR and Digital Twin technologies move forward in sync to accelerate adoption, reduce costs, and reap new benefits.”

“Augmented Reality and Digital Twins are an ideal combination, as together they allow users to visualize the invisible,” said Dan Isaacs, CTO, Digital Twin Consortium. “This includes enhancing situational awareness and event intelligence for training in assembly, installation, maintenance, operation, compliance assurance, and safety. For infrastructure projects, this combination enables visibility inside the walls of a building or structure, or the power, connectivity, or water piping and conduits under streets and roadways, and much more.”

As the two organizations begin working together, the enterprise AR ecosystem can look forward to a growing number of jointly developed projects, events, and resources, including webinars, white papers, and presentations.




Press Release – AREA and Digital Twin Consortium Establish Strategic Relationship

Under the MoU, DTC and the AREA will collaborate to:

  • Define industry requirements
  • Increase the degree to which enabling technology components interoperate
  • Align work underway to accelerate adoption of digital twins and Augmented Reality in many vertical domains and use cases
  • Develop proof-of-value projects and programs, and joint marketing efforts

“We’re excited about the many opportunities our liaison with DTC will enable,” said Mark Sage, Executive Director of the AREA. “For AREA members, it’s a chance to help ensure that enterprise AR and digital twin technologies move forward in sync to accelerate adoption, reduce costs, and reap new benefits.”

“Augmented reality and digital twins are an ideal combination as together, they allow users to visualize the invisible,” said Dan Isaacs, CTO, Digital Twin Consortium. “This includes enhancing situational awareness and event intelligence for training in assembly, installation, maintenance, operation, compliance assurance, and safety. For infrastructure projects, this combination enables visibility inside the walls of a building or structure, or the power, connectivity, or water piping and conduits under streets and roadways, and much more.”

As the two organizations begin working together, Digital Twin Consortium and the AREA ecosystems can look forward to a growing number of jointly developed projects, events, and resources, including webinars, white papers, and presentations.

About the AREA

The Augmented Reality for Enterprise Alliance (AREA) is the only global non-profit, member-based organization that is dedicated to the widespread adoption of interoperable AR-enabled enterprise systems. Whether you view it as the next computing paradigm, the key to breakthroughs in manufacturing and service efficiencies, or the door to as-yet unimagined applications, AR will have an unprecedented impact on enterprises of all kinds. The AREA is a program of OMG. Visit https://thearea.org for more information.

About Digital Twin Consortium

Digital Twin Consortium is The Authority in Digital Twin. It coalesces industry, government, and academia to drive consistency in vocabulary, architecture, security, and interoperability of digital twin technology. It advances the use of digital twin technology in many industries, from aerospace to natural resources. Digital Twin Consortium is a program of Object Management Group (OMG). For more information about Digital Twin Consortium, please visit our website at https://www.digitaltwinconsortium.org/.

Note to editors: Digital Twin Consortium is a registered trademark of OMG. For a listing of all OMG trademarks, visit https://www.omg.org/legal/tm_list.htm. All other trademarks are the property of their respective owners.

 




Join Boeing’s Real-World Factory Floor XR Team

While some organizations’ AR/XR efforts are still limited to experiments and proofs of concept, Boeing is moving forward with fully-deployed and industrialized factory floor solutions.

As part of that effort, Boeing’s Product Systems Information Technology & Data Analytics organization is staffing up and seeking mid-level Factory XR software developers to join its Factory XR team.

If you’re qualified and eager to help scale up real-world XR solutions at an industry leader, go here to learn more and apply.