
The Great Resignation in Manufacturing

A recent article published by The Washington Post shows some shocking figures on the number of Americans leaving their jobs over the past year. It’s no surprise that hotel and restaurant workers are resigning in high numbers due to the pandemic, but what is surprising is that the manufacturing industry has been hit the hardest, with “a nearly 60 percent jump” compared to pre-pandemic numbers. This “Great Resignation in Manufacturing” is the largest of any industry, including hospitality, retail, and restaurants, which have seen about a 30% jump in resignations.

However, if you dig deeper, this trend isn’t new. The recent increase in job quitting in manufacturing has simply magnified a problem that had been brewing for years, even before the start of the pandemic. In fact, in the four years prior to the pandemic (2015-2019), the average tenure rate in manufacturing decreased by 20% (US Bureau of Labor Statistics).

This accelerating workforce crisis is placing increased pressure on manufacturers and creating significant operational problems. A sector already stressed by a tight labor market, a rapidly retiring baby-boomer generation, and a growing skills gap now faces an increasingly unpredictable and diverse workforce. This variability is making it difficult, if not impossible, to meet safety and quality standards or productivity goals.

Manufacturing leaders’ new normal consists of shorter tenures, an unpredictable workforce, and the struggle to fill an unprecedented number of jobs. Leaders in the sector are facing this reality and looking for ways to build a flexible, safe, and appealing workforce. As a result, managers are being forced to rethink traditional onboarding and training processes. In fact, the entire “Hire to Retire” process needs to be re-imagined. It’s not the workforce our grandfathers experienced, and it’s time for a change.

The Augmented, Flexible Workforce of the Future

The reality is that this problem is not going away. The Great Resignation in manufacturing has created a permanent shift, and manufacturers must begin to think about adapting their hiring, onboarding, and training processes to support the future workforce in manufacturing – an Augmented, Flexible Workforce.

What does this mean?

  • It means adopting new software tools to support a more efficient “hire to retire” process to enable companies to operate in a more flexible and resilient manner.
  • It means starting to understand your workforce at an individual level and using data to intelligently close skills gaps at the moment of need and enable autonomous work.
  • And it means taking advantage of data. More specifically, real-time workforce intelligence that can provide insights into training, guidance, and support needs.

Investing in AI-powered connected worker technology is one way to boost this operational resiliency. Many manufacturing companies are using digital Connected Worker technology and AI to transform how they hire, onboard, train, and deliver on-the-job guidance and support. AI-based connected worker software provides a data-driven approach that helps train, guide, and support today’s dynamic workforces by combining digital work instructions, remote collaboration, and advanced on-the-job training capabilities. 

As workers become more connected, manufacturers gain access to a rich new source of activity, execution, and tribal data; with the proper AI tools, they can gain insights into where the largest improvement opportunities exist. Artificial intelligence lays a data-driven foundation for continuous improvement in performance support, training, and workforce development, setting the stage to address the needs of today’s constantly changing workforce. Today’s workers embrace change and expect technology, support, and modern tools to help them do their jobs.

AR Smart Glasses: XR Today Expert Round Table with Qualcomm, Arvizio and Singulos Research

One of the fastest-growing sectors in extended reality (XR) has inarguably become augmented reality (AR), which is used extensively among enterprises to conduct remote collaboration and inspections.

The AR industry has seen several crucial advancements in eye and hand tracking, gestures, deployment platforms, and greater interoperability for components and software, leading to huge developments for use cases and technological innovations.

As the Metaverse moves from ‘virtual’ reality to the next significant communications platform, combining spatial computing with the Internet, AR will become a key component of enterprise solutions.

For our XR Today round table, we are pleased to welcome:

  • Hugo Swart, Vice-President and General Manager of XR and Metaverse of Qualcomm Technologies
  • Jonathan Reeves, Chief Executive and Founder of Arvizio
  • Dr Brad Quinton, Chief Executive and CTO of Singulos Research

Our esteemed panellists have discussed the role of their AR solutions in the greater XR market, ongoing trends shaping the industry, as well as their views on the future of the Metaverse.

XR Today: What sets your AR solution apart from the competition? What has your company considered when designing hardware and software solutions for devices?

Hugo Swart: Qualcomm has a unique role in enabling and supporting the entire ecosystem as a horizontal player, which sets us apart.

We deliver best-in-class system-on-a-chip (SoC) platforms that power over 50 XR devices and offer the software and perception algorithms needed to enable XR experiences. We also provide reference device hardware that lets our customers go to market quickly, along with many other ecosystem initiatives.


In terms of considerations when designing hardware, we work very closely with all of our partners to assess the end-user needs and build a platform that will meet and surpass those requirements.

On the software and developer ecosystem side, I would like to point to our Snapdragon Spaces XR Developer Platform born from Qualcomm’s commitment to helping enable and scale head-worn AR, especially at the dawn of the Metaverse.

We wanted to help reduce developer friction and provide a uniform set of AR features independent of device manufacturers or distribution methods.

Jonathan Reeves: Arvizio is an AR software provider with solutions that operate across a range of AR devices, including mobile devices and AR smart glasses.

We believe the market requires a cross-platform approach for software solutions that can operate with a variety of smart glasses and mobile devices. This avoids scenarios where the customer is locked into a single vendor for AR smart glasses.

Dr Brad Quinton: We deliberately designed the Perceptus Platform to support a diverse set of hardware platforms. From Android and iOS mobile phones and tablets to AR glasses, the Perceptus Platform can provide an understanding of objects in their 3D environment in a consistent framework for AR application developers.

Our key considerations were creating a scalable training process that allowed AR designers to quickly and easily define their objects of interest while also making sure our solution could run in real-time using only edge hardware, avoiding the need to transfer sensitive user data to the cloud.

XR Today: Which trends do you see taking shape in the AR sector, and which aspects of AR do you believe are more advanced and which are lagging?

Hugo Swart: A trend we see taking shape and would like to accelerate is the shift from smartphone AR to head-worn AR, which is the intent with Snapdragon Spaces.

The open ecosystem approach allows Snapdragon Spaces developers to build their head-worn AR experience once and have it scale to a range of devices and content distribution channels. Once Snapdragon Spaces becomes available to all developers, we think this will help spur a new trend and era of head-worn AR experiences spanning entertainment, gaming, education, training, health and beyond!

Jonathan Reeves: AR glasses typically fall into two key categories based on their ability to provide spatial mapping. Devices such as Microsoft’s HoloLens 2 and Magic Leap can scan a room and use advanced Simultaneous Localization and Mapping (SLAM) algorithms to anchor AR content in place with a high degree of accuracy.

This means that when the wearer moves their head, the content remains anchored in a fixed position. Other AR smart glasses lack spatial mapping and may not provide the degree of accuracy required for enterprise use cases.


To date, achieving accurate spatial mapping has relied on depth-sensing cameras to build a 3D mesh of the space, much like LiDAR sensors in the iPhone Pro and iPad Pro have demonstrated.

To reduce the cost and weight of AR smart glasses, vendors are actively working on SLAM-based tracking approaches using stereoscopic cameras to deliver accurate tracking at a reduced cost.

This is challenging to achieve across a broad range of lighting conditions, but will lead to a significant reduction in cost, size, and weight.

A second key requirement for widespread adoption is hand gesture recognition. Devices such as HoloLens 2 have set the bar for this type of mixed reality (MR) interaction, and low-cost devices entering the market will need to offer a similar level of hands-free operation.

Dr Brad Quinton: The trends we see taking place in the AR sector are that many of the underlying AR hardware challenges are rapidly being resolved with maturing optics, high-speed wireless connectivity and high-quality virtual object rendering.

Where we see AR lagging is in the use of artificial intelligence (AI) to understand the context of the user’s AR experience, to provide high-value, contextually aware experiences and applications.

All modern mobile processors have high-performance neural accelerators, but for the most part, they have yet to be deployed in a meaningful way for AR because of the lack of appropriate tools, platforms and software.

XR Today: Why is interoperability a key component of tailoring your AR solutions for multiple purposes? How has your company accommodated versatility for your clients, both for deployment and continued support?

Hugo Swart: There are many facets to interoperability; our chips, for example, are designed to interoperate with multiple display types and technologies.

Another interoperability angle is support for OpenXR, as we want to make it as frictionless as possible for developers to create immersive experiences. Snapdragon Spaces is also designed to leverage existing 3D content creation tools, such as the Unity and Epic game engines.

Jonathan Reeves: Arvizio software solutions for AR have been designed to work across a variety of AR devices, including AR smart glasses and mobile devices.

We currently support HoloLens 2, Magic Leap, and iOS and Android devices, and expect to add additional devices supported by Qualcomm’s Snapdragon Spaces initiative in the coming months.

Remote collaboration has been a key driver of AR use during the ongoing COVID-19 pandemic, and the crisis has made it necessary for business continuity.

Arvizio offers two solutions: the Immerse 3D software solution and AR Instructor. Our Immerse 3D software allows multiple users to work with 3D models across locations for design reviews and stakeholder collaboration. Additionally, our AR Instructor offers step-by-step work instruction and remote expert “see-what-I-see” video sharing for additional guidance and work validation.


Dr Brad Quinton: Interoperability is key for us because there is still no de-facto standard for AR hardware. We believe it will be important to support a variety of hardware and operating systems in the near- to medium-term as users and application developers learn which hardware works best for them and their usage scenarios.

XR Today: What are your company’s thoughts on the Metaverse? When do you expect a solid foundation for the platform, and what would it look like?

Hugo Swart: We truly believe in the potential of the Metaverse and that Qualcomm is your ticket to it. Qualcomm has been investing in the underlying and core technologies to enable the Metaverse for over a decade, and we will continue to do so to help all our partners build and realize its full potential.

We are enabling our customers’ different Metaverse ecosystems and deploying our own with Snapdragon Spaces, so we believe the foundation is being built and something will come to fruition in the not-so-distant future.

Jonathan Reeves: We do not see a single Metaverse meeting the needs of all, but rather a set of Metaverse categories with several approaches being offered in each.

We believe four categories of Metaverse will emerge — Industrial, Business, Social, and Gaming — and in each category, there will be a variety of solutions and vendors, each vying for leadership. We believe this is a far more likely outcome than a single, dominant Metaverse platform.

Dr Brad Quinton: We believe that the Metaverse will be fundamentally personal and anchored in our own physical spaces. We see a continuity between AR and immersive VR, where users will select the minimum amount of immersion to achieve the task and experience they want, merging the value of the Metaverse with the comfort of physical reality.

Rather than having to pay the cost of immersion as an entry fee to the Metaverse, they will instead move through an AR-first Metaverse that transitions to immersive experiences when it makes sense.

We believe that mobile processors with advanced AI hardware coupled to 5G networks will be the platform for AR-first Metaverse in the next 1-3 years.


Development of an AR-based process management system: The case of a natural gas power plant

Since the beginning of the Industry 4.0 era, Augmented Reality (AR) has gained significant popularity. In production industries especially, AR has proven itself an innovative technology renovating traditional production activities, making operators more productive, and helping companies save across different expense items.

Despite these findings, its adoption rate is surprisingly low, especially in production industries, due to various organizational and technical limitations. Various AR platforms have been proposed to close this gap; however, there is still no widely accepted framework for such a tool.

This research presents the reasons behind the low adoption rate of AR in production industries and analyzes the existing AR frameworks. Based on the findings from these analyses and a field study, the authors propose a cloud-based AR framework that provides tools for creating AR applications without any coding, along with features for managing, monitoring, and improving industrial processes.

The design and development phases are presented together with the evaluation of the platform in a real-world industrial scenario.

This work was supported by the Scientific Research Unit (BAP) of Istanbul Technical University under Grant number MGA-2018-41553; the Scientific and Technological Research Council of Turkey (TUBITAK TEYDEB) under Grant number 7170742.

Readers can find out more by visiting the following link.




Appearition Now Supports 5G Edge Powered by AWS Wavelength

SmartConnect uses AWS Wavelength to help provide the capability to develop and scale digital solutions utilizing edge computing on 5G networks.

 

A headless, composable, application programming interface (API)-driven platform, Appearition enables enterprises to design and implement AR, VR, and XR applications without having to develop the backend architecture. It is device-agnostic, supporting all headset-mounted devices or mobile devices using Web AR.

 

The platform serves as a low-code solution for the development of a variety of AR, VR, and XR applications across multiple industry verticals such as retail, education, connected workforce, tourism, and property development.

 

“5G networks, with their high bandwidth, speed, and low latency, drive exciting new innovations and will be the catalysts of wider adoption of immersive technologies. Appearition is excited to leverage AWS Wavelength to help launch this exciting new solution that will enable enterprises and developers to build digital solutions using edge technologies,” said Raji Sivakumar, Co-founder and Chief Operating Officer at Appearition.

Enterprises looking to create immersive experiences through AR, VR, or XR applications will now be able to do so faster, more cost-effectively, and at scale. SmartConnect allows Appearition users to take full advantage of the platform’s edge computing capabilities, which can significantly reduce the resource demand on headset hardware and enable enterprises to rapidly prototype AR/VR/XR concepts and launch software-as-a-service (SaaS)-based products quickly.

High-intensity video and graphics processes that would typically require considerable CPU power on user devices can now be delegated to edge computing, which does the heavy lifting. This removes the limits many AR/VR/XR applications face from being restricted to the minimal resources a headset can offer.

SmartConnect now allows applications to access more processing power and additional computational resources. This expands the capabilities of applications, removes constraints imposed by limited on-device computing, and delivers consistent immersive experiences to end users regardless of device type.




The Price of Drugs: Exploring New Realities in Pharma

In response, large, mainline biotech firms like Pfizer and Novartis, smaller CMOs (contract manufacturing organizations), equipment manufacturers and others involved in the highly fragmented pharmaceutical sector are looking to emerging technologies to improve efficiency, speed up research and production, widen margins, and guarantee quality and safety.

The author describes what it takes to develop a drug and changes in Pharma, then goes on to discuss immersive wearable tech in pharma:

If you can’t raise prices, then you need to cut costs elsewhere. For pharma companies, this means spending less time and money on R&D and going to market faster. As the drug pipeline shifts to meet demand for personalized medicine (targeted biologics), pharma companies are feeling the pressure to revamp their product lines, factories, and processes to become more streamlined and cost-efficient.

AR/VR for drug discovery

R&D spending in pharma has been rising in parallel with the growing complexity of drug development, leading forward-thinking companies to explore AR/VR as a tool for discovering new drugs faster (and therefore cheaper). If VR-trained surgeons are able to complete procedures faster than non-VR-trained surgeons, it follows that pharma researchers would innovate faster with VR than they currently can using computer-aided design (CAD) graphics and static models of molecules made of wooden balls and wires. Indeed, whether in the classroom or the lab, virtual reality is proving effective for visualizing and conveying difficult concepts, while augmented reality can put interactive complex molecules into the scientist’s real-world environment.

Wearing a VR headset, drug developers can step inside a molecule or compound to see how it responds to different stimuli and quickly simulate complex drug interactions. Wearing AR smart glasses or a mixed reality headset, researchers can manipulate molecules and chemical structures in space – folding, knotting, and changing the shape of the molecules right before their eyes – and tweak a drug’s chemical makeup so it bonds to the protein in question, altering its function to the desired effect. AR/VR decreases the number of errors in the years-long process of drug discovery, which is essentially one of trial and error, by helping “drug hunters” iterate and improve (get to the right shape) faster. As a result, companies are able to develop better drugs with fewer side effects. Immersive tech can also improve collaboration among researchers around the world, eliminating barriers like distance and language by allowing two or more scientists to walk through the same chemical structure together from separate locations.

For manufacturing

Training and education

In other manufacturing sectors, augmented and virtual reality are allowing new workers to learn on the job without making mistakes, as well as safely practice operating equipment before using a real machine. Likewise, AR/VR can significantly improve training outcomes for pharmaceutical workers. In addition to “practice runs” on complex pharmaceutical manufacturing equipment before even entering a facility, a process engineer wearing safety smart glasses can learn on the job while still meeting high levels of control and quality: she can access step-by-step instructions and other multimedia support for troubleshooting and repairing a machine right in her field of view, or connect via livestream to a remote expert for guidance and support. Operators and scientists can also use VR to learn the principles of aseptic technique and the proper procedures for different laboratory and production environments (e.g., the specialized containment and personal protection requirements for HPAPIs). Beyond production, AR/VR can help explain new treatments to doctors and patients, and train nurses to administer a new drug or therapy.

Heads-up, hands-free information and documentation

In manufacturing in general, data from connected machines is unlocking predictive maintenance, saving manufacturers millions of dollars in downtime. A systems engineer wearing smart glasses in a pharmaceutical plant could likewise receive real-time, heads-up, hands-free notifications about, say, a location that will soon need replenishment or an instrument that’s predicted to fail, allowing him to catch and address issues in advance, thereby improving efficiency, speeding up production, and lowering costs. Anywhere along the production cycle, digital information can be beamed in this way to augment an engineer’s view and intuitively show him or her what to do. For instance, an engineer could use smart glasses to scan the QR code on a piece of equipment, automatically bringing up work instructions or an interactive diagram tailored to that machine. Engineers could access batch records heads-up and hands-free and record values and videos via voice command, never needing to take their hands or attention away from a process. This is also an easy and effective method for audit readiness.
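The QR-code workflow described above amounts to a lookup from a scanned equipment identifier to that machine’s tailored content. A minimal sketch, assuming a simple in-memory registry (all identifiers, paths, and function names here are hypothetical illustrations, not any vendor’s actual API):

```python
# Hypothetical sketch: map a scanned QR payload (an equipment ID) to the
# work instructions and diagram tailored to that machine.
# All IDs and paths below are illustrative placeholders.

EQUIPMENT_DOCS = {
    "PUMP-014": {
        "instructions": "docs/pump-014-sop.pdf",
        "diagram": "models/pump-014.gltf",
    },
    "MIXER-031": {
        "instructions": "docs/mixer-031-sop.pdf",
        "diagram": "models/mixer-031.gltf",
    },
}

def resolve_qr_payload(payload: str) -> dict:
    """Return the content bundle for a scanned equipment ID,
    or an error entry if the ID is not registered."""
    docs = EQUIPMENT_DOCS.get(payload)
    if docs is None:
        return {"error": f"unknown equipment: {payload}"}
    return docs
```

In a real deployment this registry would live in a document-management or MES backend rather than in memory, but the flow is the same: scan, resolve the ID, render the matched content in the wearer’s field of view.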

Remote support

All of this instant, hands-free access to information, presented heads-up and in context, is designed to enable users to work faster and more accurately. But it’s not just the challenges of visualizing complex drugs and the use of incorrect, out-of-date paper procedures, manuals, and documentation that slow down time to market; the need to fly specialists into a pharmaceutical facility when something goes wrong is another contributor to what has become a years-long, complicated, error-prone, and unrewarding process. Adopting AR glasses for remote support offers immediate ROI and time savings, especially when users need vendor advice. With augmented reality software, the expert can even draw on the user’s display to highlight specific buttons or connections and drop 3D arrows into her real-world environment in the facility.

Conclusion

The possibilities for AR/VR in the pharmaceutical sector are vast, and the need for them is pressing. Pharma companies should be taking cues from other advanced manufacturing sectors, which are already seeing results in training, efficiency, quality assurance, and safety through the use of AR glasses and VR headsets. Of course, pharma is a sensitive industry, and new devices open up new opportunities for hackers to steal patient data and secret drug research. Any investments in emerging technologies must be accompanied by investments in cybersecurity.




Using Augmented Reality To Teach Real Construction

Learning to make a dazzling technology practical

Augmented reality, or AR, is a way of adding digital elements to a live view, often by using the camera on a smartphone. While it has been available for years, it became popular with the creation of social media tools such as Snapchat filters and mobile games like Pokémon Go.

On the other hand, virtual reality, or VR, is an experience that seeks to place an individual in an entirely virtual world. This immersion is typically accomplished through the use of VR goggles or a headset.

Ayer says he was first introduced to augmented and virtual reality as an undergraduate architectural engineering student at Pennsylvania State University. He was working as a lab assistant managing the equipment when he says his interest took off.

“I got very dazzled by them,” says Ayer, who is a faculty member in the School of Sustainable Engineering and the Built Environment, one of the seven Fulton Schools.

Out of a desire to utilize AR and VR technologies, he says he found himself seeking problems that would fit the tools instead of the other way around, which made it difficult to measure the technologies’ impact and success.

As he entered his master’s degree program, also at Pennsylvania State University, Ayer says he had to reevaluate how he was looking at technology and the role it plays both in a classroom and in the real world.

“Through grad school, and certainly when I got to ASU, the shift was pretty polar opposite: Don’t start with the technology, start with the human and the problem that the human has, and how that technology is supporting them,” Ayer says.

This perspective is something he attributes to a subtle comment made to him during his doctoral studies by a mentor and co-adviser, Chimay Anumba, who is now at the University of Florida.

“In a very sort of understated way, I remember him just saying to me, ‘Sometimes when you have a hammer, the whole world looks like a nail,’” Ayer says.

Addressing the problems

Over the years, Ayer has come to identify two major challenges that AR and VR technology can help students overcome when it comes to construction education and entering the workplace.

The first is visualizing design concepts from two-dimensional plans that represent a three-dimensional space.

“We take this 3D concept; we have this building around us. And the way we communicate that is we dumb it down to flat paper plans,” Ayer says. “Instead, we can give them augmented reality glasses with the idea of saying, let’s make it easy to understand the design. They just see the model show up almost like it was there, but it’s virtual.”

Ayer says once students can get past the dazzled phase, they can dial in and learn the underlying construction competencies they need to be successful on the job.

For example, Ayer says he conducted a study a few years ago aimed at helping students explore buildings as if they were the end-users, like a facilities manager tasked with keeping a building up and running. He says they gave one group of students an augmented reality setup and the other group a computer setup. Both were given the same task of exploring the building to find flaws in the design. He says both groups could identify the flaws, but the group utilizing AR was able to come up with ways to improve the design and correct the flaws.

“The students using the computer setup, which was still a 3D model on the screen, knew something had to be considered, but couldn’t effectively articulate what about the design was problematic,” Ayer says.

Kieren McCord, a construction management doctoral student in the Fulton Schools, says while doing research with Ayer for her dissertation, she was inspired by the use of AR and the ability it gives students to visualize designs.

“Physical builds are a great way to learn, but they can be extremely cost-prohibitive to bring to a classroom. So, a virtual simulation can be a valuable, cost-effective alternative,” McCord says.

She says there are far fewer physical restraints on virtual environments, meaning if you can dream it, you can create it in a virtual environment.

The second challenge Ayer says he wants to improve is job site safety.

What makes people change behavior is when they see or experience a bad thing.

— Steven Ayer, associate professor of construction engineering

“We see a lot of times where we use very antiquated modes of teaching safety courses that are ‘chalk-and-talk’ lecture-style learning, which by almost any accounts have been ineffective, and, by empirical data on sites, still don’t stop injuries,” Ayer says.

“People from industry will say, ‘I didn’t care about safety until …’ and they’ll tell you a story of when they saw someone hurt, or someone lost a life,” he says. “And when they’re the one that makes the phone call to the husband or wife saying, ‘Your spouse isn’t coming home today,’ it hits them.”

With that impactful moment in mind, Ayer says he sought to create an experience for students that balanced real-life decisions with the dangerous outcomes created by mistakes.

“What makes people change behavior is when they see or experience a bad thing,” Ayer says. “What we’re doing with virtual reality is putting students and even industry personnel into this environment. But, unlike most virtual reality training environments that give a report card when something goes unrecognized and they fail to identify the hazard, we will show them the impact of their decision.”

Ayer says showing the impact is accomplished through the use of slow-motion video or animations. In addition, the negative effects never impact the VR user, but another character within the virtual environment.

“The situation would be to see if we can have a virtual artificial stimulus, the VR experience, trigger a real psychological response,” Ayer says. “So, now students or industry professionals can say, ‘I didn’t care about safety until I had this really impactful training experience that didn’t actually harm anyone.’”

He says the biggest challenge is overcoming how students and industry professionals first react when they experience these technologies. Many people find it “cool,” he says, but this impression isn’t what he wants to see.

“What are the metrics you would track to know if this provided a return on investment, or saved lives, or reduced rework, or whatever the underlying value is? How we get them back to thinking about that can be a challenge,” Ayer says.

It’s a challenge that he’s willing to take on because he says technology is something that education and industry need to take seriously, as it can be the solution to several problems, not only in construction, but in society as a whole.

“I think in the future, as technology becomes more prevalent, the role the human plays may be more critical because we will be slightly more out of the loop in terms of decision-making tasks and that kind of thing,” Ayer says. “So getting the human to interact with those technologies really well, for the time that they do, will be even more critical.”

 




How is Augmented Reality used in the Construction Industry?

R&D expenses in the construction industry are often substantially lower than in other industries (they rarely exceed 1 percent of revenues). That might be another reason why the sector is losing ground in the digital age and lagging behind other industries in integrating innovative technology.

According to MIT Technology Review, employing advancements in the field of augmented reality (AR) is one approach for the construction sector to enhance production quickly while ensuring long-term effects.

Augmented Reality is a technology that enhances real environments on a mobile device screen by superimposing digital material on top of them. Rather than replacing the real world entirely, as virtual reality does, AR builds on real-world footage and simply adds computer-generated data such as animations or three-dimensional objects to it.

Augmented Reality in Construction

Because of its capacity to deliver real-time information, augmented reality is being used in the construction sector to increase productivity, improve safety on building sites, maximize teamwork and collaboration, and improve the management of time, cost, and supplies.

There are many sub-processes in construction where augmented reality is of great value. Some are associated with project planning, while others relate to construction training. Whatever the case, the article discusses every plausible step in construction that can benefit from AR.

Additionally, augmented reality is one of the most promising technologies of Industry 4.0, so leveraging it across every field is important for staying ahead.

The article then goes on to cover some of the processes in construction where AR supports better implementation. These are repeated below in bullet-point form only; readers may want to read the article in full for an in-depth explanation.

 

  1. Planning of Projects
  2. Team Efforts
  3. Information of Projects
  4. Training of Safety and Precaution
  5. Measurement

 

The article then goes on to explore:

  • Restrictions of Augmented Reality in Construction
  • Successful Implementation of AR in Construction
  • Future of Augmented Reality in Construction



AR Provider Rokid Raises $160m Series C Funding to Expand Globally

Chinese startup Rokid has been through a few stages of transformation over its eight years of existence. The Temasek-backed company started out as a smart speaker maker when the vertical was all the rage in China in the mid-2010s, but in recent years it has put more focus on Augmented Reality.

This week, Rokid said it has secured a $160 million Series C round, lifting its total capital raised to $378 million.

Rokid has been exploring enterprise use cases, like enabling remote communication for field workers in the auto, oil and gas, and other traditional industries. Its X-Craft headset, for instance, is resistant to explosions, water and dust and comes with 5G and GPS capabilities.

During the COVID-19 pandemic, Rokid pitched smart glasses that could measure the temperatures of up to 200 people within two minutes.

With a team of about 380 employees, Rokid said it will spend the new proceeds on research and development as well as global expansion, so developed markets could be expecting more of Rokid’s B2B offerings. Indeed, the firm just hired an energy industry veteran to head its sales in the APAC region.

The AREA sends our congratulations on this expansion.

Find out more about Rokid on the Rokid inc AREA member profile.




Operating Room Usage of Vuzix Smart Glasses Continues to Expand Via Solution Providers Pixee Medical and Rods&Cones

Yesterday, Pixee Medical announced that its Knee+ AR computer-assisted orthopedic solution will be commercially launched in the United States, providing a perfect fit for Ambulatory Surgical Centers (ASCs). Launched in Europe and Australia early in 2021, Pixee’s solution will be formally launched in the US at the American Association of Orthopaedic Surgeons (AAOS) Annual Meeting on March 22-26 in Chicago, where the company will meet with surgeons and finalize the organization of its distribution channel.

Additionally, Pixee Medical said that it will soon be adding new features to its Knee+ platform, including soft tissue balancing, kinematic alignment, and data connectivity. It will also be expanding its portfolio with a mixed reality product for total shoulder arthroplasty and with an easy-to-use cup orientation and leg length controlling AR tool for total hip arthroplasty. Knee+ is also now compatible with surgical hoods.

Rods&Cones has been a growing consumer of Vuzix M400 smart glasses, which let staff in the operating room, ICU or other medical facility, including surgeons, instrumentalist nurses and other healthcare professionals, provide 4K broadcast quality imagery to others while interacting with patients and staff from a safe distance. To date, Rods&Cones is active in more than 600 hospitals across more than 30 countries and the company anticipates that 2022 will be the year when companies and medical providers start changing their models to prioritize remote technology at the heart of their operations.

Rods&Cones recently announced that its remote access service is compatible with the Pixee Knee+ augmented reality solution for total knee arthroplasty. The compatibility of the two solutions is expected to allow surgeons to use augmented reality during total knee arthroplasty surgeries, while also connecting remotely with other medical experts around the world.

“Firms like Pixee Medical and Rods&Cones are innovators within the healthcare sector and their solutions are facilitating communications and learning, reducing costs, and improving outcomes in operating rooms around the world with the help of Vuzix smart glasses,” said Paul Travers, President and Chief Executive Officer at Vuzix. “We look forward to working with these and other providers to help transform the healthcare industry in 2022 and beyond.”

 

 




Qualcomm Launches $100M Snapdragon Metaverse Fund

Qualcomm Incorporated announced on March 21, 2022 the launch of the Snapdragon Metaverse Fund, established to invest up to $100 million in developers and companies building unique, immersive XR experiences, as well as associated core augmented reality (AR) and related artificial intelligence (AI) technologies. The fund plans to deploy capital through a combination of venture investments in leading XR companies by Qualcomm Ventures and a grant program by Qualcomm Technologies, Inc. for developer ecosystem funding in XR experiences such as gaming, health and wellness, media, entertainment, education, and enterprise.

“We deliver the groundbreaking platform technology and experiences that will enable both the consumer and the enterprise to build and engage in the metaverse and allow the physical and digital worlds to be connected. Qualcomm is the ticket to the metaverse,” said Cristiano Amon, president and CEO of Qualcomm Incorporated. “Through the Snapdragon Metaverse Fund, we look forward to empowering developers and companies of all sizes as they push boundaries of what’s possible as we enter into this new generation of spatial computing.”

Qualcomm Technologies has been a key contributor in every major computing evolution and is a leader in core technologies such as 5G, AI and XR – all of which are critical to the metaverse. As we enter the new era of spatial computing, the Snapdragon Metaverse Fund will help enable and foster innovation across the entire ecosystem through venture investment and developer ecosystem grants for content projects.  In addition, recipients may have the opportunity to gain early access to cutting-edge XR platform technology, hardware kits, a global network of investors, and co-marketing and promotion opportunities.

Companies and developers interested in learning more can visit qualcomm.com/metaverse-fund. Applications for the Snapdragon Metaverse Fund will officially open in June.

 

About Qualcomm

Qualcomm is the world’s leading wireless technology innovator and the driving force behind the development, launch, and expansion of 5G. When we connected the phone to the internet, the mobile revolution was born. Today, our foundational technologies enable the mobile ecosystem and are found in every 3G, 4G and 5G smartphone. We bring the benefits of mobile to new industries, including automotive, the internet of things, and computing, and are leading the way to a world where everything and everyone can communicate and interact seamlessly.

Qualcomm Incorporated includes our licensing business, QTL, and the vast majority of our patent portfolio. Qualcomm Technologies, Inc., a subsidiary of Qualcomm Incorporated, operates, along with its subsidiaries, substantially all of our engineering, research, and development functions, and substantially all of our products and services businesses, including our QCT semiconductor business.