
Terminology Monsters Alive and Well

Most enterprise IT managers use dozens of acronyms and a large specialized vocabulary for communicating about their projects. The mobile industry is particularly rich with layers of terminology. Last year mobile IT professionals were studying and defining policies for BYOD. Now wearable technology is at the top of every mobile technology portal.  

Confusion in Communication

Ironically, Augmented Reality promises to deliver improved communication to the user, yet it is plagued by confusion in its own terminology. The glossaries—yes, there are several—each contain nearly 40 frequently misused terms, with only a few terms overlapping between them. Greg Babb, the AREA editor, has analyzed the differences between the AR Community Glossary v 2.2 and the glossary in the Mixed and Augmented Reality Reference Model. This analysis will be discussed with experts during the virtual meeting of the AR Community Glossary Task Force on November 24, 2014.

Who Needs a Glossary?

Simply refer to Milgram’s Continuum. There is a virtual world and a real world. The space between these two extremes is “Mixed Reality.”

[Image: Milgram's Reality-Virtuality Continuum]

It sounds and looks like a simple concept but debate about the segments within Mixed Reality can consume entire meetings.


Is the 1st & 10 line in football Augmented Reality? Not according to the debate among experts of the AR Community. And when the details of Mixed Reality need to be spelled out and implemented in a distributed computing architecture by many different people, the definitions are insufficient and the concepts blend together. This is an impediment to enterprise AR introduction and adoption.

Diminished Potential

Speaking of blending together, in early November Hewlett Packard announced spectacular plans to bring “Blended Reality” to new personal computing products in 2015. The Sprout PC replaces the keyboard and mouse with a touchscreen, a scanner and other features that let users take actual objects and easily “put” them into a PC.

Seeing a connection with Augmented Reality, the author of this Business Insider article tried to define Virtual and Augmented Reality. “That’s what you get when you put on Google Glass and it projects Google-y facts or images on the world. Or you run an app like Star Chart on your smartphone, hold it up to the sky and it superimposes the constellations on your view of the sky,” wrote Julie Bort to hundreds of thousands of readers.

Forget the fact that Google Glass does not really provide Augmented Reality and ask the executive who is running a multi-billion dollar business if they want an app to project constellations on their warehouse or factory ceiling. Augmented Reality’s potential is not only unclear; it actually gets diminished by comparisons of this nature (nevertheless, let’s not confuse this with “diminished reality,” OK?).

The fact that HP is beginning to pay attention to Blended Reality, Mixed Reality or Augmented Reality should not come as a surprise, given the integration of the Aurasma group into the company and the variety of services that could be provided on HP servers for managing and delivering AR experiences. But the new Sprout PC looks awfully similar in some ways to demonstrations of Intel’s RealSense. If these similarities are deep, then perhaps it is time for Intel and HP to invest in educating their target audiences about these new technologies. And a consistent vocabulary would come in handy.

To make sure that people do not jump to the conclusion that Blended Reality is something invented in 2014 by HP, the Business Insider article points out that Blended Reality was first introduced in 2009 by the esteemed Institute for the Future (IFTF). “The IFTF envisioned it as a sort of tech-enabled sixth sense, which will be worn or maybe even implanted into our bodies and interface with our computers,” concludes the Business Insider piece.

If that is how HP is using the term, there are even bigger problems than the definition of Augmented Reality terminology.

Mixed and Augmented Reality Reference Model

One of the solutions for this obstacle to Augmented Reality deployment is the Mixed and Augmented Reality Reference Model. The candidate specification is available for review and will be voted on within ISO/IEC JTC1 SC 29 WG 11 (MPEG) in 2015.

To learn more about the Mixed and Augmented Reality Reference Model, visit this AREA blog post.




Future for Eyewear is Bright (If Enterprise Requirements Can be Met)

The topic of hands-free displays or eyewear for Augmented Reality was popular at InsideAR 2014. It was discussed under many names (e.g., smart glasses, eyewear and HMD, to mention a few) and shown at many of the exhibition stands. On the stage, there were no fewer than six presentations focused entirely on hands-free displays. Even those speakers not focused on displays mentioned the opportunities such devices would offer once customer requirements were met.

New Report Forecasts Four Waves

During his InsideAR remarks, Ori Inbar of AugmentedReality.org described the latest addition to an already significant body of market research on the topic of hands-free AR displays. As Ori mentioned in his preface, the other reports to date do not agree on the size of the market, the terminology or how to seize the opportunities these displays will offer. It is not clear whether or how this new report addresses that uncertainty.

Entitled simply “Smart Glasses Report,” Ori’s new report compiles findings from tests and interviews conducted with companies providing products and components. The scope does not include devices designed for use with Virtual Reality content.

The purpose of the report is to answer two burning questions: Will smart glasses ever come into their own? And when will this happen? To the first question, the answer is that those contributing to the report felt it was inevitable. As to the second question, it depends.


Ori predicts there will be four waves of technology:

  1. Technology enthusiasts: 10 models of glasses will sell 1 million units within the next year or two
  2. Initial winners will emerge and sell a total of 10 million units by 2016
  3. The early majority market will be composed of fewer competitors and will sell 50-100 million units, reaching critical mass by 2018
  4. Mainstream adoption will occur between 2019 and 2023 with the shipment of one billion units


Business opportunities will depend on the wave and the type of company. Ori predicts that by 2020, there will be one 800-pound gorilla and a few challengers. He also predicts that prior to, and even during, 2016, most sales of eyewear will be for enterprise use.

That depends on those devices meeting the requirements of the enterprise customers.

Enterprise Requirements for Head-mounted Displays

In his InsideAR 2014 speech, Dr. Werner Schreiber of Volkswagen provided a very detailed analysis of the requirements that head-mounted display vendors need to meet if they are to achieve traction in the enterprise. To set the stage for his analysis, Schreiber began by saying that AR is not going to be popular in production processes until head-mounted displays meet certain requirements; the use of tablets and smartphones is simply not practical when the tasks people are doing require both hands.

One of the most important requirements described (in fact, the first of at least 10) is power consumption: the devices will need a battery life of at least 10 hours. Another requirement is field of view (FOV). In Schreiber’s analysis, the FOV must be at least 130 degrees, or a movable FOV of 50 degrees.

Of course, support for corrective lenses is non-negotiable, and so is minimizing wiring. If there must be wiring, it needs easy connectors both at the display and at any other power or computing devices that may be required.

If the hardware can be designed to meet the minimum requirements, there remain significant software challenges. Schreiber’s ideal platform would permit automatic:

  • Creation of computer data
  • Removal of unused details
  • Creation of workflow
  • Consideration of possible collisions
  • Selection of necessary application tools and materials
  • Publishing of user-generated annotations into the experience

That is a lot of requirements to meet before 2016.

Do you have a product or strategy that will address these needs? Let us know about your product or opinions on these requirements.




The IEEE Standards Association at InsideAR 2014

This is a guest post by the IEEE Standards Association (IEEE-SA), on their participation at the 2014 edition of InsideAR in Munich.

There has been a lot of hype about Augmented Reality, and concrete examples help us all to grasp how far we have come and how much is yet to be done in the field. For this reason, we at IEEE Standards Association (IEEE-SA) were delighted to see all the new technologies showcased at InsideAR 2014. We also enjoyed talking with the diverse crowd of visitors to our booth, which included folks from wearable tech, mobile software and business development, research, academia and marketing.

Many visitors were aware of IEEE-SA’s activities and were interested in knowing more about our “Bringing Standards to Life” AR app experience, showing how IEEE standards (with focus on IEEE 802® standards) impact their daily lives. Some were interested in IEEE-SA’s role in AR, and whether there were any current standards available.

IEEE believes AR is an enabling tool for a number of technologies, including the broad tech that IEEE serves. – Mary Lynne Nielsen, Global Operations and Outreach Program Manager at IEEE Standards Association

Tools for Expanding Augmented Reality Markets

At IEEE we help companies interested in AR to plan for the future. For example, we offer tools for business planning, such as our complimentary copy of IEEE Scenarios for AR in 2020. Also, the standards-development process offers a path for engineers to realize the full potential of Augmented Reality, as the adoption of open standards fosters innovation and market growth through economies of scale and wider interoperability.

https://www.youtube.com/watch?v=OszYuLx_Onk

We lead campaigns and projects to advance open and interoperable AR. This makes sense, given the scope of IEEE expertise across technology areas that contribute to AR and the proven track record of IEEE for serving as a facilitator and catalyst in widely adopted technologies, such as networking communications and the smart grid.

The IEEE-SA offers a platform for developers and users to innovate for open and interoperable AR. For example, the IEEE-SA’s standards-development process is based on broad global participation and consensus—in alignment with the “OpenStand” principles for global, open, market-driven standards. And, indeed, a wide variety of IEEE standards and projects relevant to AR already exists today.

To facilitate participation from emerging AR domains, the IEEE also explores the establishment of new study groups, projects or standards based on requirements for all segments of the AR ecosystem. To that end, an IEEE-SA Industry Connections activity has been launched to, in part, identify use cases, glossaries, and best practices in the AR technology space.

Furthermore, through participating in meetings of the AR Community and conferences like InsideAR, the IEEE-SA proactively engages with other leaders around the world to encourage global AR market growth.

Conclusion

Overall, we found InsideAR 2014 to be a well-organized and very enlightening event, shedding light on the endless opportunities in the AR space, as it relates to technology overall. There were great sessions covering a range of topics that could provide inspiration across many other industries. We’re looking forward to next year’s event.




Reality Creeps into VR

Virtual and Augmented Reality professionals are increasingly finding their projects converging. Augmented Reality projects can overlay 3D data prepared for VR environments onto the viewed physical world. Virtual Reality specialists are discovering that their skills and tools can be applied to many more use cases when they use the real world as the source for models and as the scene for new experiences.

More Realism

The convergence of AR and VR is the result of several trends. The first is the introduction of commercially ready RGB-D devices. The 3D models generated from RGB-D systems can provide 3D objects for VR. In “The Quest to Put More Reality into Virtual Reality,” an article published in the MIT Technology Review, Philip Rosedale, founder of Linden Lab and the visionary behind Second Life, describes how using the latest systems to “capture” reality can reduce the time and costs that used to be required to recreate reality using 3D graphics tools. High Fidelity, Rosedale’s latest startup, is using depth cameras and advanced facial detection and tracking algorithms to simulate the expressions of people on the faces of their avatars.

Another trend that contributes to the bleed-over between AR and VR is the re-use of digital assets. Models created for VR and simulation are increasingly useful for recognizing real world objects, especially low-texture objects such as those made from glass and steel. These materials are highly reflective, so their surface properties can trick recognition and tracking algorithms. By using the contours and edges of the model as a unique signature and comparing them with the real world properties of an object, Augmented Reality recognition systems become more efficient, less error-prone and less dependent on calibration.
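To make the contour-and-edge idea more concrete, here is a minimal sketch of shape-based matching, assuming OpenCV 4.x is available. The file names and the acceptance threshold are hypothetical; a production system would compare contours from a rendered silhouette of the CAD/VR model against live camera frames rather than static images.

```python
# Illustrative sketch only: contour-based matching of a model-derived
# silhouette against a camera frame. File names and threshold are hypothetical.
import cv2

def largest_contour(gray):
    """Return the largest external contour found in a grayscale image."""
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None

# Silhouette rendered from the VR/CAD model (hypothetical file)
model_gray = cv2.imread("model_silhouette.png", cv2.IMREAD_GRAYSCALE)
# Frame from the AR device camera (hypothetical file)
frame_gray = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

model_c = largest_contour(model_gray)
frame_c = largest_contour(frame_gray)

if model_c is not None and frame_c is not None:
    # Hu-moment comparison: lower scores mean more similar shapes
    score = cv2.matchShapes(model_c, frame_c, cv2.CONTOURS_MATCH_I1, 0.0)
    print("Object recognized" if score < 0.1 else "No match", score)
```

Because the comparison relies on shape rather than surface texture, reflections on glass or steel matter less than they would for texture-based recognition, which is the point of re-using the digital model.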

Moving About

Another reason that Virtual Reality professionals are increasingly interested in AR is the need for users to have assistive technologies when they are performing tasks in the physical world. With VR in a CAVE or on a Powerwall, users must stay in a small, confined area. Even with VR goggles, such as the Oculus Rift, a user either sits or stands with limited mobility, since cables connect the user to a computer and obstacles in the physical world, which are not dynamically introduced into the scene, can be dangerous.

By reusing procedures designed for simulation and training in Virtual Reality and adapting them to AR, the investments a company has made in high-quality digital assets have a potentially greater return. Conversely, new Augmented Reality projects may enter the test phase and reach performance objectives more quickly when their designers do not have to start from a “blank slate.”

Are you noticing these trends in your projects?

Join AREA members and others working at the convergence of AR and VR at the SAE AR/VR Symposium in Dearborn, Michigan on November 18 and 19, 2014.

[Image: SAE AR/VR Symposium banner]




It Is All About People

In his presentation on the InsideAR 2014 stage, AREA member Carl Byers of NGRAIN Corporation shared with the audience his conclusion that Augmented Reality is “all about people.” In the middle of a technology-centric event, held in the most densely AR-populated region of the world (Munich, Germany), it is important to reframe why all these activities and investments matter: Augmented Reality helps people to see digital data in context.

The “all about people” guideline applies in medicine as well. Improving patient outcomes is at the heart of Dr. Kavin Andi’s research at St. George’s Hospital at the University of London.

Dr. Andi is an oral and maxillofacial surgery consultant who also practices microvascular reconstructive facial plastic surgery. In his InsideAR presentation, Dr. Andi explained how Augmented Reality could provide value in:

  • Designing and communicating tumor removal and reconstructive processes
  • Detecting airway obstruction
  • Planning bone and tissue harvesting

The presentation also introduced some of the many tools surgeons use to achieve positive patient outcomes. Some tools are physical: scalpels, saws, clamps and hoses of many types. Others are software. In addition to the many credentials he has earned in his journey from molecular biology to reconstructive surgery, Dr. Andi has invested heavily in mastering the use of a dozen different software products in a graphics pipeline.

Beginning with scans of patient bodies, the processes he has defined permit surgeons to better prepare for surgery. By studying and planning the procedures to be performed in the operating theater in minute detail in advance, surgeons reduce the time spent in the actual theater.

Patient scanning is highly sophisticated, involving measurements of tumor and bone density, blood flow and other critical factors. But as in other fields where big data is accessible to the expert, the data needs to be accompanied by analytical tools and, in the case of surgery, by real time visualization.

The first gap Dr. Andi needs technology to address is advanced segmentation. Better segmentation of the data will separate the volume of the tumor from the surrounding area that is affected. Then, with this data superimposed on the patient, Augmented Reality can help surgeons and assistants—and ultimately the patient—to visualize the proposed treatment.

Leaving diseased tissue in the patient, or removing too much tissue due to low-accuracy visualization, can impact patient outcomes. That is why surgeons also need sub-millimeter accuracy on deforming surfaces for registration and tracking with hands-free displays.

When this can be achieved, Dr. Andi envisages that he and other surgeons will be able to perform complex procedures with 3D digital overlay on the patient instead of constantly referring to a display on the side.

To learn more, watch Dr. Andi’s InsideAR 2014 presentation below.




Augmented Reality in the Gigabit Age

Augmented Reality will be ubiquitous in the year 2025, according to one of the predictions shaped from the input of over 1,400 people and described in a new Pew Research Internet Report entitled “Killer Apps in the Gigabit Age,” released on October 9, 2014.

How Can We Capitalize on New Bandwidth?

The respondents were asked to share their views on new killer apps in the gigabit age: will there be new, distinctive, and uniquely compelling technology applications that capitalize upon significant increases in bandwidth in the US between now and 2025?

The replies were then distilled into seven themes (Figure 1).

[Figure 1: The seven themes distilled from expert responses]
Source: Pew Research Center, Sept. 2014, “Killer Apps in the Gigabit Age”

“In 2025, Augmented Reality will enhance people’s sense and understanding of their real-life surroundings and virtual reality will make some spaces, such as gaming worlds and other simulated environments, even more compelling places to hang out.”

Many of the changes described by the expert respondents will emerge from a decade of maturation of the technologies we currently refer to as the Internet of Things. There will be much more than fast networks involved. In 2025, everything will be continually connecting, capturing, storing and transmitting observations, as well as receiving data from other sources.

Augmented Reality in 2025

The three key components of Augmented Reality—hardware, software and content—are directly impacted, even redefined, by the advance of technology and bandwidth. The biggest trends in hardware for AR-assisted experiences will be miniaturization and use of harvested power. With smaller sensors and processors, there is an increased ability to embed and distribute the components of an AR solution into multiple objects, both on humans and in the environment. With the ability to harvest locally generated or locally stored power, batteries will become smaller and their capacities greater.

Surprisingly, personal display technologies—a necessary hardware component for Augmented Reality experiences—are not often discussed by the respondents of the Pew study. Perhaps there is tacit agreement that there will be personal head-worn displays; the emphasis is greater on the use of light and lasers to produce high-resolution digital objects and to represent physical world features, including people, with real time holography.


Respondents frequently describe software, the second key component of AR, as being less distinct and visible as part of computer-assisted systems than in 2014. Many experts predict that the “app economy” will be a distant memory. Software will run in the background, barely reaching the user’s awareness.

Impacts on People

In addition to technology changes, the respondents recognize that there will be enormous societal changes combined with rapidly evolving economic and cultural shifts. The study explores the human elements of life surrounded by sensors and actuators. Visions converge on many points: healthcare services will improve, people will engage more with one another at a distance, and discrete “apps,” such as those prevalent today, will disappear.

Concerning other dimensions of life in 2025, there is controversy. Some describe increased security and privacy and others the opposite. Perhaps this dichotomy does not reflect contradictions, but rather a deepening digital divide, with the more digitally empowered having greater privacy and the rest being more exposed.

Aside from how travel behavior will differ, this study does not shed light on the professional side of life in 2025, nor on how Augmented Reality and its enablers will impact business through streamlined commercial transactions, greater human productivity and lower risk to material assets.

How would you respond to the study’s key question? How will your AR-assisted business capitalize upon significant increases in bandwidth over the next decade?

 




Big Data Projects are Trickling Down

Big Data is not just for IBM. Many organizations with important requirements are also benefiting from Big Data projects to improve the quality of their products and services, detect and take advantage of new business opportunities and accelerate decision making with fewer errors.

Earlier this year, Dell Software released a study conducted by Competitive Edge Research Reports, a subsidiary of Triangle Publishing Services Co., on Big Data projects and planning. The study, “BIG DATA: Midmarket Companies See Early Success,” concludes that even medium-size firms are now able and eager to benefit from Big Data initiatives.

The report’s key take-home messages could easily apply to Augmented Reality projects in the same and other enterprises. First, Augmented Reality is not just for large organizations.

Second, the report makes a recommendation for Big Data and AR project advocates, regardless of the size of the organization: visionaries who want to use data to change their businesses must have strong senior management as well as IT department support to succeed.

A closer look at their words of wisdom also bears out the need for IT departments to partner with those who ask for big technology investments to make sure they are targeting outcomes that will push the business forward.

Lessons Learned

Partnership and collaboration are important in any business, but this study reveals that for big, unproven IT projects, the support and collaboration of senior executive stakeholders is critical to success. Executives have to be willing to go to bat for the best team, to obtain financial resources and data access and to make changes in their businesses to take full advantage of the project’s outcomes.

Another key to success, related to the first, is for the project to use real time data acquisition and analytics to improve business performance. For some, that means making an organization more agile and responsive to its customers. For others, performance is measured in improved product quality (and error reduction). Prototypes and projects in development do not show the great benefits that can come about once technologies are fully deployed and embraced. Once Big Data projects are in production, respondents report significant improvement in key business intelligence metrics. Cost reduction justifications were not among the top six metrics.


There is definitely a lot to learn when designing and putting in place Big Data projects. Much of the necessary talent for complex big data project success can be sourced from within an organization, the study revealed. The respondents said that investing in staff—both to bring specialists in-house when needed and to train existing employees—was quickly justified over outsourcing parts or all of these projects.

The Big Data study participants said they could do more and recommended that others following in their footsteps invest deeply in five enabling technology groups:

  • Real time processing
  • Predictive analytics
  • Data cleansing
  • Data dashboards
  • Visualization

AR-enabled business systems will improve data acquisition speed and quality, potentially reducing the need for data cleansing while supporting real time data visualization.

Learning from Others

In this study, 300 executives in companies with 2,000 to 5,000 employees shared how they are implementing data-driven approaches to decision making processes. Though not the focus of the study, some decisions involve physical world objects and are often made by people who lack the time and skills to request or perform data analyses. The data on which to base decisions needs to be readily available and meaningfully communicated if it is going to be useful. That is where AR-assisted systems become important.

AREA charter members are reducing the roadblocks to Augmented Reality introduction in their organizations through appropriate collaboration with their IT groups. They are also sharing lessons learned in their projects with one another. Lessons documented in the Competitive Edge Big Data study fit Augmented Reality projects perfectly and there are probably many more Big Data project lessons that will guide enterprise AR project leaders in the future.

Have you been speaking with the Big Data advocates and project leaders in your organizations? What lessons have they shared with you?

Big Data does not solve all problems in an organization. There are many myths that should be examined more carefully before making the pitch to your senior management. Read this article about Gartner Group’s view on Big Data myths.




Three Enterprise Augmented Reality Myths

Many visitors to the AREA web site seek to prepare themselves and their teams to sell the benefits of Augmented Reality to the senior executives of their organizations. The goals of their campaigns are to receive the executive stakeholders’ support for financial resources, to ensure that the project or initiative is in alignment with the primary business goals and to secure executive allegiance to resolve major (and minor) headaches when they arise in the course of the project.

To achieve these goals, they will need data and proof of others having achieved great results, as well as strong convictions. Passion and convictions may even be more important, and are certainly more readily available, than tangible results reported by others at this early stage of Augmented Reality’s evolution. But they can also plant myths in the minds of audiences, myths that are later proven wrong.

Avoid planting or maintaining illusions about these three common myths.

Augmented Reality Will Work with Anything

Conceptually, Augmented Reality works on any target in the physical world. To bring digital information into alignment with a person, place or thing, however, requires two important precursors:

  1. The target needs to have an experience associated with it, using data that has been extracted from the target. The unique set of features associated with the physical world target needs to be stored digitally
  2. The system the user has available must be able to detect the same set of unique features and those are the basis for identifying the experience

Then, to continue the AR experience, the system must also track the same target over time.
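As an illustration of these two precursors, here is a minimal sketch using ORB features, assuming OpenCV is available. The image file names and matching thresholds are hypothetical, and commercial AR SDKs use their own (often proprietary) feature pipelines; the point is only to show the extract-store-detect-match sequence described above.

```python
# Minimal sketch of the two precursors: (1) extract and store a target's
# features at authoring time, (2) detect and match them at runtime.
# File names and thresholds are hypothetical.
import cv2

orb = cv2.ORB_create(nfeatures=500)

# Step 1: authoring time -- extract features from the reference target image
target = cv2.imread("reference_target.png", cv2.IMREAD_GRAYSCALE)
target_kp, target_desc = orb.detectAndCompute(target, None)
# target_desc is what would be stored alongside the authored AR experience

# Step 2: runtime -- extract features from the live frame and match
frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)
frame_kp, frame_desc = orb.detectAndCompute(frame, None)

if target_desc is not None and frame_desc is not None:
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(target_desc, frame_desc)
    good = [m for m in matches if m.distance < 40]  # hypothetical threshold

    # Enough consistent matches -> the stored experience can be triggered;
    # a tracker would then follow the same target frame to frame.
    print("Target recognized" if len(good) > 25 else "Target not found")
```

The sketch also hints at why reflective or deformable objects are troublesome: if the extracted features change between authoring time and runtime, the match count never reaches the trigger threshold.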

In a laboratory or another controlled environment, demonstrations are based on targets that have features that can be reliably extracted and matched. Many objects are unsuitable because they are reflective, or they deform or change in one or more dimensions over time.

In the real world, many other factors can interfere with reliability of target detection and experience delivery even when the target is well known and optimal. There can be changes in lighting when the detection system relies on the camera. There can be interference from large metal objects and bodies of water when the detection system relies on a compass or orientation of the user. The network-based data may not be available for a variety of reasons.

When describing AR to senior management, or to anyone for that matter, do not promise that it will work with anything and under any condition. It simply does not and probably never will.

Augmented Reality Confers Wisdom

When describing the benefits, it is tempting to attribute new and important powers to the users of Augmented Reality-enabled systems. After all, everyone wants to use new technology, right?

While people using well-designed and delivered experiences should be able to perform their jobs better and to make data-driven decisions, the value of AR to the challenge it is designed to address relies heavily on two factors:

  1. The raw data that was used to originally design the experience. If the information shown to the AR user is incorrect or misleading, the use of AR does not make the information correct
  2. The design of the total experience from the point of view of the user’s interaction as well as the integration of the experience into a larger workflow

The data needed for the employee or customer to be able to complete a task or make a decision may simply not be available to the AR-enabled system. In this case, the user’s work is no further enhanced than it would be without the AR-enabled system. Raw data may also be too large or lack analysis, making it difficult to use.

A great deal of study and experience goes into planning the data access, ensuring data quality (e.g., fresh data is important, but so is data based on longitudinal studies) and processing the data prior to delivery to the user. Here the organization’s history and know-how with Big Data initiatives can be very valuable.

In general, the synchronization of information with the real world can accelerate business processes and decision making, but the user remains in control and must use common sense.

Augmented Reality Reduces Costs

[Image: Savings Ahead]

In general, early studies strongly suggest that when compared with traditional methods, use of AR can reduce time and lower error rates, both factors that impact production and delivery costs. At scale, the savings could be very significant and have a profound impact on business performance.

What this statement fails to take into account is the cost of designing systems, equipping people and environments, testing and implementing AR-enabled systems in the enterprise. Begin with the costs associated with testing and prototyping AR-enabled procedures. These will be far higher than simply adding a Web page or a link to the user manual.

Total cost of ownership must include everything from the beginning to the final training of users. No one has accurately measured all of these costs, although AREA members are in the process of developing the tools and systems to estimate them.

Cost reduction in IT development is never a sound argument to use with management. The better metrics pertain to the return on investment that may be measured in weeks, months or years, depending on the project size and scope.
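To illustrate how such a return-on-investment metric might be framed, here is a simple payback-period calculation. Every figure is a hypothetical placeholder, not a measurement from any AREA member project; the point is that the total cost of ownership, not just the savings, drives the timeline.

```python
# Hypothetical payback-period sketch; all figures are assumptions for
# illustration only, not measured results.
upfront_cost = 250_000.0        # authoring, devices, integration, initial training
monthly_savings = 20_000.0      # value of time saved and errors avoided per month
monthly_running_cost = 4_000.0  # content upkeep, support, device management

net_monthly_benefit = monthly_savings - monthly_running_cost
payback_months = upfront_cost / net_monthly_benefit

print(f"Net monthly benefit: {net_monthly_benefit:,.0f}")
print(f"Payback period: {payback_months:.1f} months")  # about 15.6 months here
```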

Proceed with Passion and Caution

In addition to citing the results of others who have conducted studies on the use of Augmented Reality in enterprise, advocates who seek to build support among executives for new AR projects need to use caution when describing the potential impacts of Augmented Reality. The most common myths should be avoided at all costs or the proposed project’s support may wane or completely evaporate.

What are some of the myths you’ve heard about enterprise Augmented Reality? Share them with others so they can avoid them.




Christine Perey to Speak at 8th Annual InsideAR Conference

We are excited to be a part of Metaio’s InsideAR in Munich, Germany, the largest annual Augmented Reality conference in Europe. Every year, the conference brings together international corporations and thought leaders to discuss developments in Augmented Reality and to showcase the latest innovations. Last year’s conference brought together over 800 participants, 45 speakers and more than 400 international companies.

This year, the two-day conference’s themes will be wearable computing and 3D sensors. Among the main stage speakers will be Mary Lynne Nielsen of AREA member IEEE Standards Association and AREA executive director Christine Perey. The two will present an exciting IEEE project in a talk entitled “AR in 2020: Scenarios for the Future.”

Following the AR in 2020 presentation, there will be a panel discussion about enterprise Augmented Reality, in which Christine will share her views on the opportunities and barriers ahead.

We also look forward to meeting our readers and members at the conference. We will be reporting our experiences and impressions about InsideAR on our blog, so keep an eye out for new posts!




Training as an Initial Use Case for Augmented Reality

Among use cases to which Augmented Reality brings value, an increasing portion focuses on training. Training use cases fill an immediate need for user education in a rapidly changing industrial environment and can be successfully implemented with current AR technologies.

Training requirements differ across industries and employee roles. Metrics with which organizations can track training benefits also vary widely. AR-assisted on-the-job or classroom training of assembly and maintenance personnel can ultimately reduce the cost and time of doing tasks, leading to higher operational efficiency.

Assembly

Some manual production tasks such as spot welding and riveting are still prevalent despite the fact that many assembly workflows and procedures are automated. These manual tasks may require not only particular skills, but also expertise in a particular area of assembly, as well as personal adaptability to changing production processes.

Agile manufacturing is particularly demanding on workers, as the high cost of modifying production environments means that it often falls to humans to remain flexible in the face of changes. The less automation and the more customization in a production environment, the more even experienced workers need to consult manuals, work instructions and fellow experts to perform challenging tasks, which in turn can effectively become problem-solving exercises. This puts greater emphasis on training.

Maintenance

Training is also important in maintenance organizations. While many aspects of preventive maintenance have become automated via data collection, some types of corrective interventions require resolution by experts, who might be off site. Redundant systems may be available to provide fail-safe operation and minimize the risk of service degradation, but human intervention is usually still necessary. Whether the intervention is preventive or prescribed, assessing risks, organizing a task force and scheduling the work according to escalation protocols take time.

Training for system maintenance in order to minimize downtime is a significant and time-consuming endeavor, especially in cases requiring certification and where an employee’s equipment service credentials must be maintained through repeated training. Often, cross-training is necessary to acquire the expert knowledge needed to diagnose a variety of problems.

Current Technological Limitations

Despite their potential, Augmented Reality technologies are still in their infancy. Mobile devices well suited to AR-assisted tasks, such as Amazon’s Fire Phone and Google’s upcoming Project Tango, are making their way to the mass market. However, on a shop floor or in a factory setting, the acceptance and implementation of AR faces significant barriers:

  • Workplaces are often dynamic environments, with frequent repositioning and changing of tool and equipment layouts
  • Factory floors are often fully utilized with little open space
  • Natural light enters the workplace and its intensity and angles change over the course of the day and the seasons

These realities pose challenges for AR implementations in various ways. For example:

  • Technologies for object tracking and image recognition in mainline workflows are not optimized. Although edge-based and markerless tracking capabilities are available in Augmented Reality SDKs, they are not sufficiently powerful or flexible to track reliably in all conditions. For example, edge-based tracking can work well under varying lighting conditions, but it still struggles with engineered surfaces that present repetitive edges or many smooth surfaces.
  • Marker-based tracking can produce consistent results, but fiducial position and quality are difficult to maintain or guarantee in changing and cluttered work environments (a minimal sketch of fiducial detection appears after this list).
  • Capital investment and costs for customized development of pilot projects can be high. For example, prices for dedicated AR devices possessing novel features such as multiple cameras or sensors for depth perception, sufficient processing for real time capture of camera pose in a 3D coordinate system and realistic rendering based on image-based lighting are correspondingly steep. Customization requirements such as ruggedization for specific business environments must often be added to the cost.
  • As a result of technological shortcomings in features such as tracking, some companies must explore and develop their own solutions, such as employing multiple cameras or sensors in a “six degrees of freedom” configuration in a controlled environment.
  • Form factors and user interfaces of available head-mounted displays that are powerful enough for Augmented Reality tend to be bulky and fully immersive, which can create safety risks in busy and cluttered environments.
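As referenced above, here is a minimal sketch of marker-based (fiducial) detection, assuming OpenCV 4.7 or later with its aruco module. The frame file name and dictionary choice are assumptions; enterprise deployments would typically rely on their AR SDK’s own fiducial support rather than hand-rolled detection.

```python
# Illustrative sketch of fiducial (marker) detection with OpenCV's aruco
# module (OpenCV >= 4.7 assumed); file name and dictionary are hypothetical.
import cv2

frame = cv2.imread("shop_floor_frame.png")   # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

corners, ids, _rejected = detector.detectMarkers(gray)
if ids is None:
    print("No fiducials detected; overlays cannot be anchored")
else:
    # Each detected marker id can anchor a work instruction overlay at the
    # marker's corner coordinates in the image.
    for marker_id, marker_corners in zip(ids.flatten(), corners):
        top_left = marker_corners[0][0]
        print(f"Marker {marker_id} detected near pixel {top_left}")
```

The sketch also shows why fiducial upkeep matters on a busy shop floor: if markers are occluded, damaged or moved, detection simply returns nothing and the associated overlay has no anchor.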

Although such factors introduce barriers to mainstream adoption of AR in enterprise, many companies are developing pilot projects to assess benefits at a reasonable cost. Such projects are being implemented in controlled environments, which provide opportunities to introduce AR-based training programs.

AR-based Training for Efficient Learning

Well-designed, AR-based training procedures using objects and environments that are as close as possible to real experiences can enhance learning by capitalizing on the following benefits:

  • Reduced cognitive load promotes better focus on both procedures and the object being studied
  • Overlaying AR instructions on objects synchronizes the instructions with the object’s features and visual cues, thus promoting spatial learning
  • Collaborative features of AR can connect remote experts with trainees, providing instructor-led guidance

Generally, faster learning increases worker availability and encourages better problem solving. An additional benefit is that enterprises can use pilot project facilities and set-ups for both project assessment and training purposes.

Is your organization using or contemplating AR-assisted training programs?