
ORAU awards 35 research grants totaling $175,000 to junior faculty at its member universities; GDIT and The AREA fund single grants in new specialty areas

ORAU News

The awards recognize faculty members for their work in any of five science and technology disciplines: engineering and applied science; life sciences; mathematics and computer science; physical sciences; and policy, management or education. GDIT’s award funds research in supply chain innovation while The AREA’s award focuses on augmented reality in the workplace.

“Each year, ORAU supports the research and professional development of emerging leaders at the universities who are members of our consortium,” said Ken Tobin, ORAU chief research and university partnerships officer. “The Powe Award program is always extremely popular and very competitive. We are grateful to join with GDIT and The AREA in expanding the research focus of these awards.”

“The AREA is excited about supporting faculty research in higher education to support the use of AR in the enterprise,” said Mark Sage, AREA executive director. “Our mission is to further the adoption of interoperable AR-enabled enterprise systems.”

Alex McGuire, GDIT’s vice president and supply chain officer, added, “As a supply chain innovator, we’re honored to support ORAU grant recipients and their research to advance and apply next-generation science and technology.”

The Powe recipients, each of whom is in the first two years of a tenure track position, will receive $5,000 in seed money for the 2024-25 academic year to enhance their research during the early stages of their careers. Each recipient’s institution matches the Powe award with an additional $5,000, making the total prize worth $10,000 for each winner. Winners may use the grants to purchase equipment, continue research or travel to professional meetings and conferences.

Since the program’s inception, ORAU has awarded 910 grants totaling more than $4.55 million. Including the matching funds from member institutions, ORAU has facilitated grants worth more than $9 million.

The awards, now in their 34th year, are named for Ralph E. Powe, who served as the ORAU councilor from Mississippi State University for 16 years. Powe participated in numerous committees and special projects during his tenure and was elected chair of ORAU’s Council of Sponsoring Institutions. He died in 1996.

 

Recipients of the Ralph E. Powe Junior Faculty Enhancement Awards for the 2024-2025 academic year are listed below:

 

Member Institution – ORAU Award Recipient
Augusta University Evan Goldstein
Catholic University of America Dominick Rizk [GDIT Award]
Duke University Di Fang
Fayetteville State University Chandra Adhikari
Florida International University Asa Bluck
Iowa State University Esmat Farzana
Iowa State University Qiang Zhong
Louisiana State University Sviatoslav Baranets
Michigan Technological University Tan Chen
Oakland University Alycen Wiacek [The AREA Award]
Ohio State University Zhihui Zhu
Penn State University Tao Zhou
Purdue University Justin Andrews
Tulane University Daniel Howsmon
University of Alabama at Birmingham Rachel June Smith
University of Alabama in Huntsville Agnieszka Truszkowska
University of Arizona Kenry
University of Arizona Shang Song
University of Colorado Denver Stephanie Gilley
University of Colorado Denver Linyue Gao
University of Delaware Yan Yang
University of Florida Angelika Neitzel
University of Houston Ming Zhong
University of Memphis Yuan Gao
University of Mississippi Yi Hua
University of New Mexico Madura Pathirage
University of North Carolina at Charlotte Lin Ma
University of North Texas Linlang He
University of Oklahoma Kasun Kalhara Gunasooriya
University of Texas at El Paso Eda Koculi
University of Utah Qilei Zhu
University of Wisconsin-Madison Whitney Loo
Vanderbilt University Alexander Schuppe
Vanderbilt University Lin Meng
Virginia Tech Jingqiu Liao
Washington University in St. Louis Xi Wang
Yale University Huaijin Ken Leon Loh

 

For more information on ORAU member grant programs, visit https://orau.org/partnerships/grant-programs/index.html.

ORAU provides innovative scientific and technical solutions to advance national priorities in science, education, security and health. Through specialized teams of experts, unique laboratory capabilities and access to a consortium of more than 150 colleges and universities, ORAU works with federal, state, local and commercial customers to advance national priorities and serve the public interest. A 501(c)(3) nonprofit corporation and federal contractor, ORAU manages the Oak Ridge Institute for Science and Education for the U.S. Department of Energy. Learn more about ORAU at www.orau.org.

 

Like us on Facebook: https://www.facebook.com/OakRidgeAssociatedUniversities

Follow us on X (formerly Twitter): https://twitter.com/orau

Follow us on LinkedIn: https://www.linkedin.com/company/orau

Follow us on Instagram: https://www.instagram.com/orautogether/?hl=en

About the AR for Enterprise Alliance (AREA)
The AR for Enterprise Alliance (AREA) is the only global membership-funded alliance dedicated to accelerating the adoption of enterprise AR by supporting the growth of a comprehensive ecosystem of enterprises, providers, and research institutions. The AREA is a program of Object Management Group® (OMG®). For more information, visit the AREA website. Object Management Group and OMG are registered trademarks of the Object Management Group. For a listing of all OMG trademarks, visit https://www.omg.org/legal/tm_list.htm. All other trademarks are the property of their respective owners.

 




The evolution of delivering immersive media over 5G/Cloud

Guest blog from AREA member, Ericsson

This blog post introduces a white paper from Ericsson, an AREA Member. The full paper can be read here.

Introduction

With the growing availability of Augmented Reality (AR) and Virtual Reality (VR) headsets, people are starting to experience more realistic and interactive immersive services. Thanks to the advanced technology embedded in the headsets, devices are becoming more powerful and able to compute and render images of increasing resolution and quality. Yet the development of longer and more realistic experiences is progressing slowly, limited by battery consumption, device form factor, and heat dissipation constraints. Many service providers have started to deploy services in the cloud to address these issues. However, running the application in the cloud imposes additional challenges: latency, bandwidth, reliability, and availability of the service. 5G cloud architecture can overcome those issues with solutions that can be applied incrementally, each affecting the complexity of the application differently, but each improving the ultimate experience for the user. Additionally, the ultimate vision for 5G architecture as it applies to immersive experiences calls for new relationships among the ecosystem members: the consumer, the communications service provider, the hyperscale cloud provider, and the developer/service provider.

This paper examines the key aspects of launching an immersive service using 5G cloud infrastructure. It first reviews recent offerings and developments, then walks through a set of use cases, each offloading progressively more work to the cloud. We follow with a description of the 5G technologies that satisfy the use cases and, finally, reflect on the evolution of the stakeholders’ ecosystem in relation to the technical and commercial relationships needed to establish an immersive service using 5G.
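To make the idea of incremental offload concrete, the sketch below shows one way a client might pick a rendering tier from measured network conditions. It is purely illustrative and not drawn from the white paper; the tier names and the latency/bandwidth thresholds are assumptions chosen for readability.

```python
# Illustrative only: tier names and thresholds are assumptions,
# not figures from the Ericsson white paper.

def choose_render_tier(rtt_ms: float, bandwidth_mbps: float) -> str:
    """Pick how much rendering to offload, given measured network conditions.

    rtt_ms         -- round-trip time between device and cloud/edge site
    bandwidth_mbps -- sustained downlink bandwidth available to the device
    """
    if rtt_ms <= 20 and bandwidth_mbps >= 100:
        # Network is good enough to stream fully rendered frames.
        return "full cloud rendering"
    if rtt_ms <= 50 and bandwidth_mbps >= 25:
        # Offload heavy scene processing, keep latency-critical
        # reprojection and composition on the headset.
        return "split rendering (cloud + on-device reprojection)"
    # Fall back to rendering everything on the device.
    return "on-device rendering"


if __name__ == "__main__":
    for rtt, bw in [(10, 200), (35, 40), (90, 10)]:
        print(rtt, bw, "->", choose_render_tier(rtt, bw))
```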




Iristick announces major capital increase enabling accelerated user adoption


Iristick’s smart glasses have been used by professionals across industries in Europe and the US since the company was founded in 2015. Various large-scale proof-of-concept projects are underway and are expected to lead to accelerated adoption of smart glasses in 2023-2024.

Currently, more than 700 companies rely on Iristick to help improve operational processes, from remote assistance to inspection of mission-critical processes. Beyond the basic use case of remote assistance, Iristick develops tailored solutions in close partnership with major corporations in healthcare, oil & gas, crop inspection and logistics. In addition, Iristick supports telemedicine projects in developing countries as part of its social impact initiative, Social In Motion.

Key customers include Bayer, Siemens Energy, JBT, Houston Methodist Hospital, HG Molenaar and Aviapartner.

Through a partnership with global ATEX market leader ECOM, a Pepperl+Fuchs subsidiary, Iristick developed the breakthrough Visor-Ex® glasses for use in harsh and potentially hazardous environments, which are typical for ECOM’s oil & gas customers.

Iristick will make follow-on investments in platform development, user adoption and sales & marketing in both Europe and the United States. The company has appointed Karel Goderis, a software industry veteran, to lead this expansion in close cooperation with the existing management team.

About Iristick

Iristick, based in Antwerp and New York, is a leading producer of smart glasses that enable handsfree communication and information sharing for the deskless workforce in various industries. Iristick smart glasses are being used globally for remote assistance, step-by-step workflow guidance, pick-by-vision and video conferencing. Iristick also supports NGOs with telemedicine equipment in most underserved parts of the world, enabling teleconsultation and remote expert guidance in rural areas. The award-winning Iristick smart glasses are the most balanced and lightweight on the market, and they seamlessly connect with both iOS and Android smartphones.

Published on Feb 22, 2023




Magic Leap 2 is now commercially available

Widespread availability of the Magic Leap 2 comes after a successful Early Access Program with companies like Cisco, SentiAR, NeuroSync, Heru, Taqtile, PTC and Brainlab. During this period, Magic Leap continued to refine and improve the device for training, communication, and remote assistance use cases in clinical settings, industrial environments, defense, and retail stores.

“The Magic Leap 2 is the smallest and lightest augmented reality device built for the enterprise,” said Peggy Johnson, CEO of Magic Leap. “After working with customers across industries like healthcare, manufacturing and the public sector, we’re proud to release a device that features innovative breakthroughs critical to driving widespread adoption, including Dynamic Dimming™ technology, the industry’s largest field of view, and unparalleled image quality and text legibility. Magic Leap 2 will take the current use cases to the next level, and we can’t wait to see what our customers create.”

Magic Leap 2 integrates new innovations that address the historical barriers to widespread adoption of AR technology and that are critical to making AR a valuable tool for daily use in the healthcare, manufacturing/light industrial, retail, and defense sectors.

Key features and innovations of Magic Leap 2 include:

  • An open platform that empowers enterprises and developers with flexibility, cloud autonomy, and data privacy
  • 20% lighter and 50% smaller in volume than Magic Leap 1
  • Proprietary optics breakthroughs that enable best-in-class image quality, color fidelity, and text legibility
  • Largest field of view (up to 70° diagonal), compared to similar, currently available AR devices
  • Dynamic Dimming™ technology, a first-to-market innovation that enables Magic Leap 2 to be used more effectively in brightly lit settings with greater image solidity

Each of these advancements is designed to increase utility, comfort, and sustained use, in order to deliver what the enterprise market has been asking for — a device that can provide an immediate return on investment and can be worn for extended periods of time.

Three Commercially Available Editions

Magic Leap 2 is available in three editions:

Magic Leap 2 Base edition is best for standalone use by professionals and developers who want access to the most immersive augmented reality device available.

Magic Leap 2 Developer Pro provides access to developer tools, sample projects, enterprise-grade features, and monthly early releases for development and test purposes. It is intended only for internal use in the development and testing of applications; use in commercial deployments and production environments is not permitted.

Magic Leap 2 Enterprise is designed for environments that require flexible, large-scale IT deployments and robust enterprise features. This tier includes quarterly software releases fully manageable via enterprise UEM/MDM solutions. Use in commercial deployments and production environments is permitted. Magic Leap 2 Enterprise comes with 2 years of access to enterprise features and updates.




Theorem Solutions – Placing models using QR codes in Augmented & Mixed Reality

How to Use QR Codes in HoloLens 2 Mixed Reality

Video: Using the QR Code Offset tool in Microsoft HoloLens 2

The QR code offset feature, which uses QR Code Detection in Microsoft HoloLens 2, allows a QR code to be used as an origin point when visualizing 3D models in MR. In Theorem Solutions’ Visualization Pipeline, users can set where the digital model will appear in relation to a QR code. Then, any time that QR code is used to load the model, it will appear in the same place.

This helps put models in context and allows users to see if something will fit in a certain location. For example, when seeing if parts would fit within an automotive setup, a QR code can be used to set the origin in the center of a car and digital models of parts can be positioned using the offset feature. This allows users to be more exact with the placement of their models when working with physical objects and digital models together.

Additionally, provided the QR code isn’t moved, this feature allows users to load a model in the same place every time. This gives users greater flexibility in their work process, allowing them to look at multiple models in succession and then revisit a previous model with the assurance that it will remain exactly where it needs to be.
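Under the hood, this behaviour amounts to composing a fixed, authored offset with the pose of the detected QR code. The sketch below is a minimal illustration of that idea using 4x4 homogeneous transforms; it is not Theorem’s or Microsoft’s API, and the function name and example values are assumptions.

```python
import numpy as np

def place_model(qr_pose_world: np.ndarray, offset_from_qr: np.ndarray) -> np.ndarray:
    """Return the model's world pose given the detected QR code pose.

    Both arguments are 4x4 homogeneous transforms. The offset is authored
    once (e.g. on the desktop) and reused, so the model lands in the same
    place every time the same QR code is scanned.
    """
    return qr_pose_world @ offset_from_qr


# Example: QR code detected 2 m in front of the user; the model should sit
# 0.5 m to the side of the code and 0.1 m above it.
qr_pose = np.eye(4)
qr_pose[:3, 3] = [0.0, 0.0, 2.0]

offset = np.eye(4)
offset[:3, 3] = [0.5, 0.1, 0.0]

print(place_model(qr_pose, offset))
```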

Using QR Codes in Augmented Reality

In handheld Augmented Reality, the Image Tracking feature in Theorem-AR can be used to load a large factory layout or production line and position it over a QR code. This is ideal when you want to visualize large designs on a table top and see all the data at once. Being able to do this on a handheld device makes XR technology much easier to access for such use cases.

Use Cases

QR codes give users flexibility when digital objects interact with a physical environment, and the feature is designed to be adaptable to a wide variety of use cases. Here are some examples of use cases that are enhanced by QR codes.

Precise Placement in MR: QR codes are particularly useful in XR when consistent precision is required. If you needed to line up two holes for a bolt to go through, for instance, you could use the QR Code offset feature to position the digital model correctly and ensure everything lines up.

Scaling large data in AR: Additionally, the ability to set a point of origin in the real world is useful when visualizing large data in AR. Having a positionable point of origin makes scaling much easier, particularly with larger datasets. With QR codes you can scale the model larger or smaller, but the point of origin remains the same, as sketched below.
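The reason the origin survives rescaling is that the scale is applied about the QR-code anchor rather than about the model’s own centre. A minimal sketch of that arithmetic (illustrative only, not product code):

```python
import numpy as np

def scale_about_anchor(points: np.ndarray, anchor: np.ndarray, factor: float) -> np.ndarray:
    """Scale model points about a fixed anchor point.

    points -- (N, 3) array of model vertices in world space
    anchor -- (3,) world-space anchor, e.g. the QR-code position
    factor -- uniform scale factor
    """
    # Points move relative to the anchor; the anchor itself is unchanged.
    return anchor + factor * (points - anchor)


anchor = np.array([0.0, 0.0, 2.0])               # QR-code origin
verts = np.array([[0.5, 0.0, 2.0], [0.0, 1.0, 2.0]])
print(scale_about_anchor(verts, anchor, 2.0))    # anchor-relative distances double
```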

In Summary

QR codes can be used in Augmented and Mixed Reality in a variety of different ways depending on the use case. They allow users to set a point of origin in the real world using a QR code, and with Microsoft HoloLens 2 users can position models in relation to the QR code.

With QR codes you:

• Can load 3D models into the same position multiple times
• Gain more precision when arranging digital models alongside physical objects
• Are able to scale models more easily in AR
• Have greater flexibility with how your digital models are positioned in relation to the real world




Vuzix AR Glasses For EMTs

During the experimental program, select ambulances are given access to the Vuzix M400, lightweight smart glasses capable of projecting virtual images over the real world, which EMTs can use to convey critical information to hospitals before arrival via two-way audio and video calls.

By allowing doctors and nurses access to a patient’s vital signs, ECG readouts, and facial expressions in real time, Vuzix claims that various departments can perform examinations and preliminary medical treatment before the ambulance even arrives. Hospital staff can also advise EMTs during in-transit emergency treatments, such as a blood transfusion or surgery.

“Among their expanding healthcare uses, Vuzix smart glasses can be an important life-saving tool for EMTs that require critical interaction and support from the hospitals to which they are headed,” said Paul Travers, President and Chief Executive Officer at Vuzix, in an official release.

“Our glasses are lightweight, comfortable and completely wireless, making them ideal to be used alongside the other head-mounted equipment EMTs must wear. We look forward to seeing an expansion of this trial by its participants, as well as adoption for similar usage by other providers in Japan and around the world.”

Vuzix AR smart glasses are currently being tested in select ambulances operating out of the Shunto Izu Fire Department in Japan, with plans to expand to additional ambulances in the future. The collaborative effort is being spearheaded by Juntendo University, Shizuoka Hospital, the Shunto Izu Fire Department, and AVR Japan Co., Ltd.




Theorem-AR: Multi-Model – Visualisation using familiar devices

What is Multi-Model Loading?

We’ve recently increased our Augmented Reality (AR) capabilities to include multi-model loading, to meet the evolving industry requirements and customer needs for XR. Users can now load multiple models at once into the same scene, making the technology even more flexible.

Previously, only one model at a time could be loaded into an AR session. However, with multi-model loading, users can now visualize and mark up multiple models at once. This gives users greater flexibility in their everyday working processes, allowing them to quickly alternate between one model and another to see how they compare, line up, or fit the available space. Pre-defined digital layouts that were previously only available in Mixed Reality and Virtual Reality are now also available in Augmented Reality. With Theorem-XR supporting multiple devices and data types, and only needing to prepare data once, this additional functionality in AR is closing the gap for which devices can be used in XR use cases.
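At its simplest, multi-model loading means the AR session keeps a list of independently posed models instead of a single one. The sketch below is a generic illustration of that structure, not Theorem’s implementation; the class and field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

import numpy as np


@dataclass
class PlacedModel:
    """One model in the session, with its own world transform."""
    name: str
    pose: np.ndarray        # 4x4 homogeneous transform
    visible: bool = True


@dataclass
class ARScene:
    """An AR session that holds many models instead of a single one."""
    models: List[PlacedModel] = field(default_factory=list)

    def load(self, name: str, pose: np.ndarray) -> None:
        # Append rather than replace, so earlier models stay in the scene.
        self.models.append(PlacedModel(name, pose))


scene = ARScene()
scene.load("conveyor", np.eye(4))

robot_pose = np.eye(4)
robot_pose[:3, 3] = [3.0, 0.0, 0.0]    # second model, offset 3 m along x
scene.load("robot_cell", robot_pose)

print([m.name for m in scene.models])  # ['conveyor', 'robot_cell']
```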

How is it Used? Real Augmented Reality Examples

Factory layouts are an excellent example of a use case where Theorem-AR’s new multi-model loading is vital.

Part of a factory layout being visualized in the Theorem-AR application.

Being able to load pre-defined layouts on your smartphone or tablet enables you to work on much larger use cases such as defining shop floor plans in XR. You can visualize the relative scenery, components, and poseables, all on your handheld device.

It also gives you a good idea of how people will interact with a proposed factory layout, including identifying what is in reach from a certain position, determining whether areas are accessible, and assessing any risks. This can all be done in a re-configurable environment, allowing users to plan and adjust their layouts completely from the desktop before reviewing them in AR.

The advantage of having this feature in Augmented Reality is that you can place the equipment models in your current environment. This means that you can visualize solid models in the room the equipment is planned to be in. The ability to analyze a proposed layout in this way means users can ensure layouts are correct before attempting to implement them. And since AR doesn’t require expensive headsets, it’s easy for everyone involved to adopt.

A picture of the markup tool being used in the Theorem-AR app on the Samsung Galaxy Tab S7.

Enhance Your Design Processes

Another feature that is improved by multi-model loading is the ability to snap a digital model to a physical object. With this feature, a physical object can be used as a reference point in order to automatically overlay a digital version. Users can now also arrange other parts around the digital model on the desktop, and these will appear when using this Snap To feature in AR. This allows users to test space requirements for a collection of parts using one part as a reference.
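One way to picture the Snap To arrangement is that the snapped reference part supplies a single anchor pose and every other part is stored as an offset relative to it. The following sketch is hypothetical and not the product’s API; the names and values are assumptions.

```python
import numpy as np

def place_collection(anchor_pose: np.ndarray, relative_offsets: dict) -> dict:
    """Place a collection of parts relative to one snapped reference part.

    anchor_pose      -- 4x4 world pose obtained by snapping to the physical object
    relative_offsets -- part name -> 4x4 offset authored on the desktop
    """
    return {name: anchor_pose @ offset for name, offset in relative_offsets.items()}


anchor = np.eye(4)
anchor[:3, 3] = [1.0, 0.0, 0.5]        # pose of the snapped reference part

bracket = np.eye(4)
bracket[:3, 3] = [0.2, 0.0, 0.0]       # bracket sits 20 cm to the side

placed = place_collection(anchor, {"bracket": bracket})
print(placed["bracket"][:3, 3])        # -> [1.2 0.  0.5]
```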

This combined with existing features, such as the mark-up tool to add notes and drawings, opens up the opportunity for engineers to collaborate with each other by identifying and easily sharing obstacles or flaws within a design.

To Recap

Extended reality is an excellent tool for remotely visualizing design data from anywhere, and AR makes adoption even easier because it only requires a handheld device such as a smartphone or tablet, which we all have access to. With the addition of multi-model loading, users can now do even more with their data in AR, all while using a familiar technology that requires minimal training.

Factory layout planning is the best example of this, with users now having the ability to visualize layouts in the real world. Additionally, with design reviews, users can review multiple models from anywhere in the world.

Multi-model loading provides more options to address new use cases with AR, using devices that everyone has access to. Working around 3D design data has never been easier.




Factory layout Experience – Theorem Solutions

Optimize designs in immersive XR

The Factory Layout Experience enables a planning or layout engineer, working independently or with a group of colleagues, locally or in remote locations, to optimize factory layouts through the immersive experience of eXtended Reality (XR) technologies. Seeing your data at full scale, in context, instantly lets you see the clashes, access issues and missing items that a CAD screen cannot show.

On the shop floor there are literally thousands of pieces of equipment, much of it bought in and designed externally. Building designs may only exist as scans or in architectural CAD systems, and robot cells may be designed in specialist CAD systems. There will be libraries of hand tools, storage racks and stillage equipment designed in a range of CAD systems, and product data designed in-house in mechanical CAD. To understand the factory and assess changes, all of that has to be put together to get a full picture of where a new line, robot cell or workstation will fit.

A catalogue of 3D resources can leverage 2D factory layouts: resources can be snapped to these layouts to quickly realize a rich 3D layout. Advanced positioning makes it very easy to move, snap and align 3D data. Widely used plant and equipment is readily available, so there is no need to design it from scratch for every new layout. Simplified layout tools enable you to position, align and snap layout objects quickly, and they can be used by non-CAD experts, enabling all stakeholders to be involved in the process and improving communication.
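Snapping a catalogue item to a 2D layout essentially lifts a 2D position and heading from the floor plan into a 3D placement on the floor. A minimal, illustrative sketch of that conversion follows; the coordinate conventions are assumptions, not Theorem’s.

```python
import math
import numpy as np

def placement_from_2d(x: float, y: float, heading_deg: float) -> np.ndarray:
    """Turn a 2D floor-plan position (x, y) and heading into a 3D world pose.

    Assumes the floor is the z = 0 plane and the heading is a rotation
    about the vertical (z) axis.
    """
    h = math.radians(heading_deg)
    pose = np.eye(4)
    pose[:2, :2] = [[math.cos(h), -math.sin(h)],
                    [math.sin(h),  math.cos(h)]]
    pose[:3, 3] = [x, y, 0.0]
    return pose


print(placement_from_2d(12.5, 4.0, 90.0))   # rack at (12.5 m, 4 m), rotated 90 degrees
```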

Testing Design and Operational Factors

Human-centred operations can be analysed using mannequins that can be switched to match different characteristics. You can test design and operational aspects of a variety of human factors to determine reachability, access and injury-risk situations, ensuring compliance with safety and ergonomic standards.

The Factory Layout Experience helps companies avoid costly layout redesign by allowing all parties involved to review the layout collaboratively, make or recommend changes, and capture those decisions for later review by staff who could not attend the session.




Rokid displayed their AR glasses at AWE 2022

Liang Guan, General Manager at Rokid, enthusiastically stated:
“Numerous top tech companies are currently exploring AR, XR, or the metaverse. As early as 2016, Rokid began proactively expanding our AR product pipeline across the leading technology areas of optics, chips, smart voice, and visual imaging. Today, we have X-Craft deployed in over 70 regions, and Air Pro has been widely used in 60+ museums around the world. Moving forward, Rokid will keep delivering real value to enterprises through its line of AR products.”

Rokid products empower the frontline workforce, providing real-time analysis, views, and documents to the control center. Many media representatives and participants were pleasantly surprised after trying Rokid products, saying that the various control modes provided by Rokid AR glasses are very convenient to operate and can effectively improve work efficiency.

Rokid X-Craft, demonstrated live at AWE 2022, has officially received ATEX Zone 1 certification from TUV Rheinland Group, becoming the world’s first explosion-proof, waterproof, dustproof, 5G- and GPS-supported XR device. This is not only a great advance in AR and 5G technology but also a breakthrough for explosion-proof AR applications in industrial settings. Many users at the event said after the trial that the safety headset is comfortable to wear and a highly competitive product in the market. It not only effectively ensures the safety of frontline staff but also helps oil and gas fields increase production capacity.

Rokid Air Pro, a powerful pair of binocular AR glasses, features voice control to help you enjoy a wide variety of media including games, movies, and augmented reality experiences. Rokid Glass 2 provides real-time analysis, views, and documents to the control center, and has successfully improved traffic management and prevention to ensure the long-term stability of the city.

 

 




Enterprises head to San Diego to discuss the impact of AR, VR and MR (XR) and Metaverse technologies at the 9th Augmented Enterprise Summit

Hundreds of the world’s most profitable and well-known companies are already using augmented, virtual, and mixed reality (XR) to deliver the benefits of digitization to the modern workforce. These organizations are seeing ROI in the form of faster training, shorter design cycles, higher quality, less downtime, reduced waste, and increased customer satisfaction.

Hear from those driving XR and other emerging technologies in enterprise at the 2022 Augmented Enterprise Summit when it returns in-person October 18-20 at the newly renovated Town & Country Resort Hotel in San Diego. As always, the event will feature a world-class lineup of enterprise end users along with the largest curated expo of enterprise-ready XR solutions. Organizations at every stage of adoption will get to hear how the world’s biggest companies are leveraging XR, try out the top hardware/software, and connect across industry lines.

“[AES] has consolidated its position as the nexus of the growing enterprise XR ecosystem, with the ability to bring together both augmented reality companies and the large enterprises that are now testing and implementing XR solutions.” – Forbes

The Speakers

Leading innovators from companies like Abbott, Bank of America, Con Edison, DuPont, Ford, GM, Kohler, Marathon Petroleum, and Ulta Beauty will share insight into adopting and deploying XR and related emerging technologies for applications such as remote support, collaboration, work instructions, training, design, marketing, safety, and sales.

The Program

The comprehensive educational program includes case studies across industry verticals with deep discussions on specific immersive applications, best practices, security, enabling technologies like 5G and AI, IIoT, the Metaverse, and more.

The Exhibit

Get hands-on in the carefully curated expo of AR smart glasses, MR/VR headsets, body-worn sensors, exoskeletons, and other connected devices ready for deployment today, along with the platforms and technologies that power them.

For information and tickets, visit www.augmentedenterprisesummit.com. Early bird rates expire August 29. Attendees are encouraged to register early as space is limited.

Official Event Brochure: https://augmentedenterprisesummit.com/aes-2022-brochure/

Managed and hosted by long-standing AREA partner BrainXChange – see their partner page.