
Case Study: Augmented Reality in Construction Planning (Holo-Light)

Human ability to imagine objects that are not physically present is limited. It is even more difficult for us to mentally place them in an existing environment. How often, for example, has it happened to you that a newly purchased piece of furniture was too large for the intended space?

In construction planning and architecture, this problem is amplified. Whereas in the case of the previously mentioned piece of furniture, only a single part has to be inserted into an existing space, in architecture we are often dealing with entire buildings in which floors, rooms and objects stand in a relationship to one another; and of course, the building itself as a whole must also fit into its surroundings. In this process, our lack of imagination can lead to mistakes with far-reaching consequences.

This is where technology helps our imagination tremendously. Augmented reality (AR) in combination with Building Information Modeling (BIM) ensures that we can “actually” see all objects and relationships.

What is BIM? What is AR?

Building Information Modeling (BIM) is a digital method used throughout the entire life cycle of a building: all data and information related to the construction project are stored and mapped in BIM-enabled software.

Augmented reality (AR) is the computer-aided augmentation of how we perceive reality. In construction planning specifically, BIM models are “projected” into the real environment.
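To make the BIM idea more tangible for technical readers, here is a minimal sketch of how the data stored in a BIM model might be read programmatically. It assumes a model exported in the open IFC format and uses the ifcopenshell Python library; the file name and element types are illustrative and not taken from this case study.

```python
# Minimal sketch: reading basic information out of a BIM model (IFC format).
# Assumes ifcopenshell is installed and "building.ifc" is a hypothetical export.
import ifcopenshell

model = ifcopenshell.open("building.ifc")  # load the BIM model

# List a few element categories stored in the model and show a small sample.
for ifc_type in ("IfcWall", "IfcDoor", "IfcWindow"):
    elements = model.by_type(ifc_type)
    print(f"{ifc_type}: {len(elements)} elements")
    for element in elements[:3]:
        print("  ", element.GlobalId, element.Name)
```

An AR viewer such as the one described below consumes the same model data, rendering the geometry in place instead of printing it.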

Benefits of AR in Construction Planning

AR applications for construction planning help our imagination tremendously. They support the entire decision-making process, both on the side of the construction planner and on the side of the client. AR glasses can be used to better present and understand the planned building, so decisions in the early planning phases can be made more easily and more accurately, which reduces planning and construction costs accordingly.

“Especially in the case of existing building conversion, it is advantageous if you can visualize the superimposition between the model and reality.”

DI Dr. Timur Uzunoglu, Managing Director, convex ZT GmbH

AR Use Cases in Construction Planning at convex

At convex ZT GmbH, we use AR technology from the design phase through to operation. With Holo-Light’s AR3S software, we bring BIM planning closer to clients and enable greater planning transparency. Building owners feel more involved in the planning process during our AR-assisted planning meetings and can make better decisions. We conduct AR inspections together with the builders directly on site; these provide a real-time impression of the location and help weigh alternatives against each other. When revitalizing existing buildings, it is often challenging to bring the new structures into functional harmony with the existing fabric, and AR helps greatly there, too.

 




ThirdEye Targets EPA Green Goals for Metaverse

The solution draws on sustainability targets from the United States Environmental Protection Agency (US EPA), which aims to build a carbon-neutral future for the planet.

Citing EPA figures, ThirdEye said the COVID-19 pandemic sharply reduced global transport traffic, which was the “largest contributor to anthropogenic [US] greenhouse gas emissions at 29 [percent].”

ThirdEye’s AR/MR telepresence solutions allow companies to lower their carbon footprint by reducing the overall need for global transport, and the firm’s RemoteEye platform has cut onsite visits to allow significant cost savings, leading to a major improvement in return on investment (ROI).

Nick Cherukuri, Founder of ThirdEye, said his company’s RemoteEye platform aims to include a Carbon Footprint Score for its users to calculate the organisation’s carbon footprint with AR.

Explaining the benefits of AR technologies further, he continued:

“Not only are AR and MR teleconferencing platforms financially prudent due to traveling less, but by using this technology to share knowledge and operational workflows, there are tremendous carbon emission savings. For example, we can bring education and telehealth to underprivileged areas around the world with augmented and mixed reality.”
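To illustrate the kind of arithmetic a Carbon Footprint Score might involve, the sketch below estimates the emissions avoided when AR remote sessions replace onsite visits. The emission factor and trip figures are placeholder assumptions, not values published by ThirdEye or the EPA.

```python
# Illustrative sketch: estimating CO2 avoided when AR remote sessions replace travel.
# The emission factor and trip data below are placeholder assumptions.

def co2_avoided_kg(trips_avoided: int, km_per_trip: float,
                   kg_co2_per_km: float = 0.15) -> float:
    """Rough CO2 saving from avoided travel (emission factor is a placeholder)."""
    return trips_avoided * km_per_trip * kg_co2_per_km

# Example: 40 onsite visits replaced by remote AR sessions, 800 km round trip each.
print(f"Estimated CO2 avoided: {co2_avoided_kg(40, 800):.0f} kg")
```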

The company’s RespondEye, which complies with the US Health Insurance Portability and Accountability Act (HIPAA), also allows doctors to tackle health problems for remote patients “anytime, anywhere.” Doctors can later assign patients and carers medical diagnoses and treatment options.

Enterprises can also benefit from the introduction of 3D digital twins to reduce inventory and other digital assets, ThirdEye said, adding that doing so would reduce production emissions and costs.

The news comes as the US firm aims to expand its solutions to the Asia-Pacific with its X2 MR smart glasses and a major partnership with Go VR Immersive, a Hong Kong-based XR startup.

The smart glasses would be deployed to remote workers across China, shortly after his firm inked a major partnership with Microsoft to deploy HoloLens 2 MR head-mounted displays in the Asia-Pacific region.

 




Magic Leap’s New AR Headset Will Debut in 2022

A few things mentioned include:

  • Eye examinations can be done at a fraction of the cost
  • Magic Leap’s next-generation AR glasses are smaller, lighter and faster
  • They have a greater field of view, which has doubled in the next-gen device
  • Vertical use cases, e.g. surgery: digital content can be overlaid across the knee while the surgeon looks at virtual screens
  • A light-dimming feature brings more focus to what needs to be concentrated on (again, for surgical use)

Answering criticism about a lack of progress, Johnson argued that four healthcare companies are testing the devices right now and that other industries are working with Magic Leap at the moment. These include:

  • Health
  • Defense and Public Sector
  • Manufacturing
  • Automotive and Transport
  • Oil and Gas
  • Architecture, Engineering, Construction (AEC)

You can watch the video here 




RealWear Navigator: First Look at the Future of Assisted Reality

This offers a frontline connected worker platform for the integration of multiple assisted and augmented reality (SLAM) experiences into a high-performance industrial solution.

The RealWear Navigator™ 500 is an all-new head-mounted device product platform specifically designed to engage, empower and elevate the frontline worker for the next several years.

Building on the accumulated experience of the last four years, working with 5000 enterprise customers in 60 countries with solutions based on our HMT-1™ and HMT-1Z1™ platforms, this new product brings targeted innovation in all the key areas that matter most to achieving solid results at scale.

RealWear has been known for establishing and gaining major customer deployments for frontline worker solutions based on “assisted reality”.

The core concept of assisted reality is that it makes a different tradeoff than mixed reality. Assisted reality is better suited to the majority of industrial use cases where user safety is paramount.

The goals of assisted reality are to keep the user’s attention in the real world, with a direct line of sight, for the most part unoccluded by digital objects or “holograms” that require extra cognitive focus for humans to process.

Situational awareness of moving machinery, approaching forklifts or other vehicles, steam escape valves, slip and trip hazards and electrical and chemical hazards is key for RealWear’s customers. These are the same working environments that mandate specific personal protective equipment for safety glasses and goggles, to hard hats, hearing protection, heavy gloves and even respirators. Users in these situations mostly require both hands to be available for the use of tools and equipment, or to hold on to railings, ropework, etc.

In turn, the user interface for assisted reality cannot rely on the availability of hands to operate handheld controllers or to draw gestures in the air. RealWear’s assisted reality solutions rely on voice recognition that is field-proven in very high-noise environments, plus the minimal use of head-motion detection. The platform uses a single articulated micro-display, easily adjusted to sit below the dominant eye, that does not obstruct direct vision and provides the user a view similar to a 7-inch tablet screen at arm’s length.

A core concept of mixed reality has been the placement of virtual 3D digital objects overlaid on the physical world – such as 3D models or animations. This requires two stereoscopic see-through displays that are brought to a point of focus that typically is not in the same plane as the real-world object. The resulting vergence-accommodation conflict – where the greater convergence of the eyes when looking at near objects is in conflict with the focal distance, or accommodation of the eye’s lens needed to bring the digital image into focus – is a source of eyestrain, discomfort and in some cases headaches after extended use. In addition, in bright conditions, especially outdoors, mixed reality displays struggle to provide sufficient contrast with the real world and therefore they always either cut a significant amount of light from the real world using darkened glass or have to generate such a bright display that battery life is very short unless tethered with a cord to a separate battery pack. Both situations contribute to eyestrain with extended use.
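For readers who want the underlying geometry, the relations below illustrate the mismatch described above; they are standard textbook expressions, not taken from RealWear’s material. The eyes converge on the virtual object’s apparent distance while the lens must focus on the display’s fixed focal plane.

```latex
% Vergence: each eye rotates inward toward an object at distance d_object,
% giving a total vergence angle that depends on the interpupillary distance (IPD)
\theta_{\text{vergence}} = 2\arctan\!\left(\frac{\text{IPD}}{2\,d_{\text{object}}}\right)

% Accommodation: the eye's lens focuses at the display's focal plane (in diopters)
A_{\text{accommodation}} = \frac{1}{d_{\text{focal}}}

% The vergence-accommodation conflict arises whenever
d_{\text{object}} \neq d_{\text{focal}}
```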

However, mixed reality applications do allow information to be overlaid on the real-world asset, which in some use cases can provide an additional boost in productivity by helping identify the item to be worked on.

So how could this tradeoff be solved?   Is it possible to tag or overlay information on the real 3D world while also maintaining safety, situational awareness, low eyestrain, hands-free use and full-shift battery life?

We’ve long believed that the answer lies in amping up the amount of “assistance” in assisted reality rather than solely focusing on the amount of reality, with power-hungry, wide field of view, super bright stereoscopic, transparent and ultra-high resolution displays. With advanced camera capabilities and computer-vision processing, key information about real-world assets can be placed on the camera view shown in the single, monocular, non-see-through (opaque) display.
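The sketch below illustrates the general idea of placing asset information on a camera view shown in an opaque display, rather than on a see-through one. It uses OpenCV, and detect_asset() is a hypothetical placeholder for the computer-vision processing; it is not RealWear’s implementation.

```python
# Illustrative sketch: overlaying asset information on a camera frame shown
# on an opaque monocular display. detect_asset() stands in for real
# computer-vision processing.
import cv2

def detect_asset(frame):
    """Placeholder: return a bounding box and label for a recognized asset."""
    h, w = frame.shape[:2]
    return (w // 3, h // 3, 2 * w // 3, 2 * h // 3), "Pump P-101: due for inspection"

cap = cv2.VideoCapture(0)  # head-mounted camera assumed on device 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    (x1, y1, x2, y2), label = detect_asset(frame)
    cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)   # highlight the asset
    cv2.putText(frame, label, (x1, y1 - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)  # annotate it
    cv2.imshow("assisted-reality view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Because the overlay is drawn on the camera image rather than on transparent optics, the approach avoids the vergence-accommodation and brightness issues described above.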

Read more. 

 




Samsung and Microsoft may be working on future augmented reality hardware

A report by The Elec suggests that Microsoft and Samsung are working together on future augmented reality hardware. It is not disclosed whether this is for the consumer market, enterprise market, or both. All that is known is that the project is AR-related and may involve some sort of hardware Samsung will be producing (rather than Microsoft). Samsung’s investments in DigiLens, the company behind tech found in AR display devices, may further substantiate the idea that the former will be handling the physical gadgetry in its collaboration with Microsoft.

Apparently, several divisions of Samsung are involved in the project, with Samsung Display, Samsung Electro-Mechanics, and Samsung SDI all tied in. The AR project started in the middle of 2021 and aims to result in a commercially viable product by 2024.

 

 




RealWear Introduces RealWear Navigator™ 500 Industrial-Strength Assisted Reality Wearable for Frontline Workers

Fully optimized for hands-free use, RealWear Navigator 500 is an innovative platform solution that combines hardware, software, and cloud-ready services with a rugged design that is one-third lighter and slimmer than the previous generation, making it easier for frontline workers to wear the device for their entire shift. The hardware is designed as a modular platform with an upgradeable 48-megapixel (MP) camera system, a truly hot-swappable battery, Wi-Fi, and an optional 4G (and soon-to-be-available 5G) modem. The voice-controlled user interface includes unique noise-cancellation technology designed for high-noise environments. RealWear has more than 200 optimized partner apps supporting a variety of use cases, such as remote collaboration, guided workflow, and IoT and AI data visualization.

Assisted reality [infographic available] is a non-immersive experience and has become the preferred Extended Reality (XR) solution for frontline industrial workers, especially where high situational awareness is necessary. Assisted reality experiences are closer to the physical world, compared to virtual reality (VR) and augmented reality (AR) experiences that immerse workers in the metaverse.

With RealWear Navigator 500, RealWear has again raised the bar for how assisted reality and other XR technologies are deployed at the world’s leading industrial companies. Automotive, logistics, manufacturing, food & beverage and energy companies, among others, can use RealWear Navigator 500 to deliver real-time access to online information and expertise to the world’s more than 100 million industrial frontline workers.

“With pandemic concerns continuing to press upon the global economy, how technology is enabling a ‘new way to work’ is very much in focus, particularly for industrial frontline workers,” said Andrew Chrostowski, Chairman and CEO of RealWear. “Today we’re unveiling something far bigger than a product. The RealWear Navigator 500 delivers the next generation of work with a ‘reality-first, digital-second’ enterprise solution for remote collaboration, operational efficiency, and hybrid work in safety-critical industries. Assisted reality – more so than augmented or virtual reality – is designed specifically for the frontline worker who requires both hands for the job, striking the perfect balance of keeping workers 100% present and self-aware with the ability to safely navigate industrial surroundings. After all, nobody wants to be near hazardous equipment with their head stuck into the metaverse.”

Read the rest of the full press release here. 




Ford Technical Assistance Center Using TeamViewer Frontline Augmented Reality Solution to Streamline Customer Vehicle Repairs Worldwide

The new service is offered by Ford’s Technical Assistance Center (TAC), a centralized diagnostic troubleshooting team that provides support to all Ford and Lincoln dealerships’ technicians who diagnose and repair customer vehicles.  Dealer technicians can initially reach out to TAC specialists via a web-based portal or even on a phone.  With the new See What I See program, TAC specialists can now start a remote AR session using TeamViewer Frontline through a pair of onsite RealWear smart glasses to share, in real time, exactly what the repair technician is looking at.  TAC specialists can add on-screen annotations and additional documentation directly in the line of sight of the repair technicians, as well as zoom in, share their screen, record the session and even turn on flashlights remotely.

“My team diagnoses some of the most complex and complicated vehicle issues,” says Bryan Jenkins, TAC powertrain operations manager.  “I would frequently hear my team say that if they could only see what that technician is talking about, or what the technician is doing or how they’re completing a test, then they could solve the problem more accurately.  A picture is worth 1000 words, but sometimes that still wasn’t quite enough, and we needed a way to see something live and in action.  And that’s what really kicked this whole program off.”

Ford’s See What I See program is an additional layer of support that is already used by more than 400 dealers in the U.S., Mexico, South Africa, Thailand, Australia, New Zealand and the U.K.  Currently Ford is promoting the new program to its full network of 3,100 U.S. based dealers, with a positive response. “Feedback from the dealers has been really good,” says Jenkins.  “From the dealer technician perspective, they just turn on their smart glasses and accept an incoming call, then it is like my specialists are there looking over their shoulder to help resolve the problem.”

“We are very excited to add Ford to our growing list of forward-thinking customers that are leveraging AR solutions to improve business processes,” says Patty Nagle, president of TeamViewer Americas.  “The majority of workers globally do not sit in front of a desk.  Our goal is to enable those frontline workers with AR guided solutions to enable them to do their jobs better by digitalizing and streamlining processes.”




HP is Using HoloLens to Help Customers Remotely Repair Industrial Printers

While many AR companies are focused on building AR products, HP is making an interesting move in using the technology as an add-on to improve an existing line of its business. The company’s newly announced xRServices program promises to deliver remote AR support for its industrial printer customers.

The program employs Microsoft’s HoloLens 2 headset, which HP’s customers can use to access AR training and live guided instructions to fix issues that arise with complex, commercial-scale printers.

HP is pitching the solution as a way to allow even untrained individuals to fix issues with the help of a specialist on the other end who can guide them step-by-step through troubleshooting and repairs with AR instruction. Further, the company says the service can be used to provide AR training for various workflows and issues that may arise with the company’s industrial printers.

HP hasn’t clearly detailed exactly what software it’s running on HoloLens to facilitate xRServices, but it seems likely that it is leveraging Microsoft’s Dynamics 365 Remote Assist platform which includes many of the AR functions that HP showcased in its xRServices concept video—like augmented annotation, document visualization, and video chatting through the headset.





Hands-Free Thanks to Augmented Reality

The Bazeley Pilot Facility at the Parkville site in Melbourne has been trying out Apprentice IO, an intelligent batch execution system that includes augmented reality. CSL, the world’s third-largest biotech company, uses pilot plants to test manufacturing processes on a small scale. The company specializes in rare and serious diseases as well as influenza prevention.

Learn about a CSL Behring pilot plant in Illinois.

Working with Apprentice IO in Australia (AREA member) means a near-complete reimagining of core operations, including product development, manufacturing and supply chain solutions, said Sharon Orr, CSL’s Manager, Innovation and Technical Operations, Pilot Scale Operations. As part of the six-month test, the team is exploring alternatives to paper-based batch records, standard operating procedures and work instructions. The exercise has fundamentally changed the way operators of the system think about and approach instructions from the outset, Orr said.

When integrated with lab facilities, the augmented reality headset and linked iPad can provide on-the-spot feedback, process directives and problem-solving techniques in real time. Paper records require a four-eyes approach for calculations, checking raw material information and weights, she said. The experimental platform replaces manual cross-checking methods with automated formulas, ranges and barcoding. If results don’t match, the system flags it to the operator, saving time and ensuring compliance.
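As an illustration of the kind of automated cross-checking described above, the sketch below validates a scanned material barcode and a recorded weight against an approved recipe and flags mismatches to the operator. The recipe values and function are hypothetical and do not represent Apprentice IO’s actual API.

```python
# Illustrative sketch: automated range and barcode checks replacing the manual
# "four-eyes" review. The recipe values below are hypothetical.

RECIPE = {
    "step": "Buffer preparation",
    "expected_barcode": "MAT-000123",   # approved raw material
    "weight_range_g": (498.0, 502.0),   # acceptable weighing range
}

def check_step(scanned_barcode: str, measured_weight_g: float) -> list[str]:
    """Return a list of flags for the operator; an empty list means the step passes."""
    flags = []
    if scanned_barcode != RECIPE["expected_barcode"]:
        flags.append(f"Wrong material scanned: {scanned_barcode}")
    lo, hi = RECIPE["weight_range_g"]
    if not lo <= measured_weight_g <= hi:
        flags.append(f"Weight {measured_weight_g} g outside range {lo}-{hi} g")
    return flags

print(check_step("MAT-000123", 503.4))  # flags the out-of-range weight
```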

“Once a procedure has been augmented and approved, I can use this technology to perform a process ‘hands-free’ without having to worry about cumbersome paper-based data collection and manual checking,” Orr said.

Approved standard operating procedures can be accessed at the click of a button, Pilot Scale Operations scientist Hugh Harris said. With the new manufacturing software and augmented reality capabilities, colleagues at different sites would also be able to share data and see what’s happening in real time. CSL may bring the software system to other pilot facilities in Australia and the United States.

“Watching the team explore this new technology at the forefront of next generation manufacturing has been truly inspiring,” said Matthias Zimmermann, CSL’s Executive Director of Bioprocess Development. “They have shown a willingness to embrace the technology and incorporate it within our already existing processes. They have also worked with the software program designers to inform the next version, in some ways advancing this technology together.”

 




Visor-Ex® 01 – Collaboration between ECOM Instruments and IRISTICK

The Pepperl+Fuchs brand ECOM Instruments introduces Visor-Ex® 01 smart glasses for industrial use in hazardous areas.

The intelligent wearable, weighing just 180 g, combines high camera/display quality and reliable communication features in an ergonomic design for the user’s utmost comfort. This provides mobile workers with an optimal companion for tasks that require hands-free use as well as continuous communication, for example with a remote expert.

This product is the result of a long-term, close collaboration between ECOM Instruments and IRISTICK, bringing together ECOM’s in-depth knowledge of the requirements of hazardous areas with IRISTICK’s profound experience in smart glasses development.

Innovative tool for the mobile worker in hazardous areas

A total of three integrated cameras transform Visor-Ex® 01 into the remote expert’s bionic eye. Two 16-megapixel cameras are centrally positioned to depict the wearer’s natural field of vision – this way the remote expert views what is happening from the same angle and perspective as the mobile worker. A secondary camera offers a 6x optical zoom for zooming in without loss of quality and fast scanning of barcodes and QR codes. The system utilises the ECOM Smart-Ex® 02 smartphone for hazardous areas as a computing unit with LTE connectivity and a pocket unit with a replaceable battery for power supply, all combined in an intelligent ecosystem for a wide range of application scenarios in the industrial sector.

The distribution of functions across the individual system components helps to minimise the weight of the headset unit – without compromising on performance, connectivity or battery life.

By connecting to the Smart-Ex® 02, users can continue to use their tried-and-tested smartphone for harsh environmental conditions without restriction and benefit from all the advantages, security features and controls of the Android 11 operating system, including over-the-air updates. This leads to ease of use and a low total cost of ownership.

Visor-Ex® 01 will be certified for ATEX/IECEx Zone 1/21 and 2/22 as well as NEC/CEC Division 1 and 2 and will have protection class IP68. It can be used within a temperature range of -20 to +60 °C.

Read Iristick’s AREA member profile 

For more information: www.visor-ex.com

For sales information: [email protected] or [email protected]