
The AR Market in 2017, Part 2: Shiny Objects Attract Attention

Previous: Part 1, Connect the Dots

 

There’s a great deal of attention being paid to the new wearable displays for Augmented Reality. Hardware permits us to merge the digital and physical worlds in unprecedented ways, and wearable hardware delivers AR experiences while leaving the user one or both hands free to perform tasks. The tendency to pay attention to physical objects is not unique to AR industry watchers. It is the result of natural selection: genes that gave early humans the ability to detect and respond quickly to fast-moving or bright, unusual objects helped our ancestors survive while others lacking those genes did not.

Although this post focuses on the hardware for Augmented Reality, I don’t recommend focusing exclusively on advancements in AR hardware when planning for success in 2017. The hardware is only valuable when combined with the software, content and services for AR in specific use cases.

Turning now to AR hardware, there are important trends we can’t ignore. This post highlights those that, in my opinion, are the most important at an industry-wide level and will change noticeably in 2017.

Chips accelerate changes

Modern Augmented Reality hardware benefits hugely from the continued reduction in the size and cost of hardware components for mass-market mobile computing platforms. We need to thank everyone using smartphones and smartwatches for this trend.

As semiconductor manufacturers gain experience and hard-code more dedicated vision-related computation into their silicon, the performance of complete AR display devices is improving. Intel RealSense, combined with the technology Intel recently acquired from Movidius (which will produce significant improvements in wearable display performance beyond 2017), is an example of a chip-driven technology to monitor. Other offerings will likely follow from NVIDIA and Apple in 2017.

When available in production devices, the improvements in semiconductors for wearable AR will be measurable as lower latency in recognizing a user’s environment or a target object, less frequent loss of tracking, more stable rendering of digital content, less heat and longer battery life. All of these are gradual improvements, difficult to quantify but noticeable to AR experts.

As a result of the optimization of key computationally intensive tasks (e.g., 3D capture, feature extraction, graphics rendering) in lower-cost hardware, the next 12 to 18 months will bring new models of AR display devices. Not just a few models, nor many models produced only in small batches.

These next-generation wearable display models with dedicated silicon will deliver at least a basic level of AR experience (delivery of text and simple recognition) for an entire work shift. Customers will begin to place orders for dozens and even, in a few cases, hundreds of units.

Optics become sharper

In addition to semiconductors, other components will be changing rapidly within the integrated wearable AR display. The next most important developments will be in the display optics. Signs of this key trend were already evident in 2016 – for example, when Epson announced the OLED optics designed for the Moverio BT-300.

It’s no secret that over the next few years, optics will shrink in size, drop in weight and demand less power. In 2017, the size and weight of fully functional systems based on improved optics for AR will decline. Expect smart glasses to weigh less than 80 grams. Shrinking the optics will make longer, continuous and comfortable use more likely.

Developers raised issues about color quality and fidelity when testing devices introduced in 2015 and 2016. Color distortion (such as an oil-spill rainbow effect) varies depending on the type of optics and the real-world scene at which the user is looking (the oil-spill pattern is particularly noticeable on large white surfaces). The 2017 models will offer “true” black and higher-fidelity colors in a wider range of settings. Again, the experts will feel these improvements first and “translate” them to their customers.

Another key area of improvement will be the Field of View. Some manufacturers will announce optics with 50° diagonal (a few might even reach 80° diagonal) in 2017. When combined with advanced software and content, these changes in optics will be particularly important for making AR experiences appear more realistic.

Combined with new polychromatic lens materials, lower weight and stronger materials in the supports, optics will be more tolerant of changes in environmental conditions, such as high illumination, and will fit into more ruggedized packages.

More options to choose from

Speaking of packaging, in 2016 there are three form factors for AR displays:

  • monocular “assisted reality” hardware that clips onto other supports (frames) or can be worn over a user’s ear,
  • smart glasses that sit on the user’s nose bridge and ears, and
  • head-worn displays that use straps and pads and a combination of the user’s ears, nose and skull for support.

The first form factor does not offer an immersive experience and isn’t appropriate for all use cases, but assisted reality systems have other significant advantages (e.g., lower cost, longer battery life, lighter weight, easier storage), so they will remain popular in 2017 and beyond.

At the opposite end of the spectrum, the highly immersive experiences offered by head-worn devices will also be appealing, for different reasons (e.g., depth sensing, registration accuracy, gesture-based interfaces).

We need to remember that the use cases for enterprise AR are very diverse, and the displays available to users can be equally diverse. The new wearable AR display device manufacturers entering the fray in 2017 will stay with the same three general form factors but offer more models.

In addition to diversity within these three form factors there will be extensions and accessories for existing products – for example, charging cradles, corrective lenses, high fidelity audio and materials specifically designed to tolerate adverse conditions in the workplace environment.

The results of this trend are likely to include:

  • those selling wearable displays will be challenged to clearly explain new features to their potential customers and translate these features into user benefits,
  • those integrating AR displays will be more selective about the models they support, becoming partners with only a few systems providers (usually leaning towards the bigger companies with brand recognition), and
  • buyers will need to spend more time explaining their requirements and aligning their needs with the solutions available in their budget range.

Wearable display product strategists will realize that with so many use cases, a single user could need multiple display models at their disposal. One possible consequence of this trend could be reduced emphasis on display systems that are dedicated to one user. We could see the emergence of new ways for multiple users in one company or group to reserve and share display systems in order to perform specific tasks on schedule.

Rapid personalization, calibration and security will offer new opportunities to differentiate wearable AR display offerings in 2017.

Enterprise first

All of these different form factors and options are going to be challenging to sort out. Outside enterprise settings, consumers will not be exposed to the hardware diversity in 2017. They simply will not invest the time or the money.

Instead, companies offering new hardware, even the brands that have traditionally marketed to mass-market audiences, will target their efforts toward enterprise and industrial users. Enterprises will increase their AR hardware budgets and develop controlled environments in which to compare AR displays so they can make informed decisions at the corporate level. Third-party services that perform rigorous product feature evaluations will be a new business opportunity.

While this post highlights the trends I feel are the most important when planning for success with AR hardware in 2017, there are certainly other trends on which companies could compete.

To learn more about other options and trends in wearable AR displays in 2016, download the EPRI Technology Innovation report about Smart Glasses for AR in which I offer more details.

What are the trends you think are most important in AR hardware and why do you think they will have a significant impact in 2017?

 

Next: AR software matures and moves toward standardization




The AR Market in 2017, Part 1: Connect the Dots

In your profession, you’re among those most aware of future technologies. The proof is that you have discovered Augmented Reality and decided it’s sufficiently important to dedicate at least a few minutes or hours to getting oriented and staying informed about the trends.

That’s the first step. But you know enough not to believe everything you read or see in a YouTube video.

The next step, if you haven’t done so already, is to train yourself to separate the biggest hype from the facts. This is not easy, but you should be able to hit this milestone by attending industry events where AR is being demonstrated and you can put your hands on the products in action, even under highly controlled conditions. Visiting one or more of the AREA members in their offices or inviting them to visit your facility will be even more valuable.

You’ll see some mock-ups and, if you ask tough questions, you will also see some of the weaknesses and begin to glimpse the complexity of the problems facing adoption of these technologies. Keep a log of the experiences you have with Augmented Reality and the impressions they leave on you.

If you really want to understand the strengths and weaknesses “up close” and have the budget, you can develop a project or participate in a group project that focuses on a well-defined use case.

Share what you learn

Once you’ve seen and captured notes about more than 10 live demonstrations in your journal and have personally touched AR, you can begin to “translate” for others what you’re seeing and doing.

But, wait! The insights you’ve acquired could offer a strategic advantage to your company so, why would you share them? Even if you are thinking that you should keep what you’ve gathered to yourself, I encourage you to share because AR is more than just another new technology offering you or your group a competitive advantage. This is going to be a major crowd-sourced, multi-year project. When more people are looking into AR technology, it will improve faster than when only a few are focusing on and investing in it in isolation.

Once AR is good enough to be used weekly (or daily) in more than one use case, it is going to push operational performance to new levels. Then you will be able to use it to full advantage.

AR may become as transformational to your company and industry as the Web and mobile devices have been during your professional career. But it requires more than one or two examples and adopters in an industry; reaching a threshold level of adoption in your industry will be necessary. And, to begin meaningful adoption, there need to be a few experts. We need people like you to translate the theory and potential of AR in your industry into practice and reality.

I’ve found that I can translate for others what I’m observing by breaking it down into four interrelated topics: hardware, software, content and services. For over a decade I’ve used these four legs of the AR platform to organize projects, to review the history of AR and to capture current status.

In a series of AREA blog posts, I am using this simple framework to share the developments I believe will be important for AR in 2017.

Connecting the dots around us

One observer can’t see all the details of the entire AR landscape, certainly not in all the industries where the technology will apply. Fortunately, the AREA is a network of very bright minds that are looking forward, and in other directions, at the same time.

Many AREA members who are in the trenches of Augmented Reality take on a forward-looking challenge when, at the end of each year, they begin preparing their forecasts for the following year.

I hope that these posts will help you find your place and connect, and that in your comments you will compare and contrast what you’ve observed with my experience.

If we each take a few minutes, hours or a day in this last quarter of 2016 to connect our dots together, we will all be better equipped to plan concretely for an exciting year ahead!

Next: What’s new for AR hardware in 2017?




Welcome Lockheed Martin to the AREA

The newest member of the AREA is one of the largest companies in the aerospace, defense, security, and technologies industry – and an Augmented Reality pioneer.

It’s Lockheed Martin. The Bethesda, Maryland-based company, which employs 98,000 people worldwide, joined the AREA as a Sponsor member in October. Lockheed Martin will be represented on the AREA board by Christi Fiorentini, a senior manufacturing applications engineer in Lockheed’s Aeronautics organization in Marietta, Georgia.

Fiorentini traces Lockheed’s involvement in AR back about 15 years, when the company’s research and development team began exploring opportunities for the technology. Each of Lockheed Martin’s business units — Aeronautics, Space Systems, Missiles and Fire Control, and Rotary and Mission Systems – has experimented with the technology. About five years ago, Fiorentini’s unit, Aeronautics, began looking into augmented reality for remote subject matter expert applications.

“The technology then wasn’t quite up to par for use in a production environment, so it got put on the back burner,” recalled Fiorentini. “Around October last year, Aeronautics gained a new interest in the technology when we observed many start-ups and smaller businesses bringing AR to fruition.”


Lockheed Martin is seeking to incorporate augmented reality throughout the product lifecycle, from the initial design phase all the way through sustainment, with a heavy interest in manufacturing.

“We’ve been investigating the technology, going to conferences, and developing proofs of concept to build business cases, because we need to prove that this technology can work within our own boundaries so that we can make the investment,” said Fiorentini. “If we’re going to shift into this realm of technology, it’s a big move, a big status quo change, and so while I do believe the ROI is there, we need to show that it works on our actual use cases to convince our leadership to invest in it.”

That’s why Lockheed Martin joined the AREA.

“I think more people across our business are starting to realize the potential of the technology and so we’re trying to formalize our approach across the entire enterprise,” Fiorentini noted. “We’re working to bring individual players from our different business areas together and define a more strategic approach to exploit this technology. We have some upcoming pilots that we’re working on with some of the leading AR vendors, and we’re members of DMDII, the Digital Manufacturing and Design Innovation Institute. As we engage more with these vendors and other enterprise members investigating this technology, we saw the AREA as being a good place to start pushing what we think should be best practice. We’re a big player in the aerospace and defense industry, so we’re looking at how we can use our influence to shape what the AR industry for enterprise is going to look like and the AREA is a great place to help convey that message.”




New Editor Joins the AREA

The AREA has a new editor. Jim Cassidy joined the organization in October and is tasked with supporting the research and preparation of the AREA’s content by developing content strategy, authoring original and thought leadership content, editing content from members and third parties, and producing newsletters. We sat down with Jim recently to learn more about him and his work with the AREA.

Welcome to the AREA, Jim. Tell us about your background.

Thank you, it’s a pleasure to be here. I’ve been a freelance marketing communications writer for more than 25 years, working with a wide variety of clients across many industries, from consumer packaged goods to healthcare to analytical laboratory instruments. Living and working in eastern Massachusetts, much of my work has been for information technology companies, such as NTT DATA and PTC, so I have a good foundation in both the enterprise technology environment surrounding AR and the vertical markets where it’s making an impact.

What interested you in joining the AREA?

This is a very exciting time for AR. When you look at the most promising applications – take field service, for example – many of the work processes still revolve around printed manuals and parts diagrams, even as products have become more complex and intelligent. AR will have a massive impact on productivity in just that one area. In an interview earlier this year, General Electric CEO Jeffrey Immelt said that by helping field engineers fix machinery better the first time, AR could be worth billions of dollars to industrial companies like GE.

The potential is great, new innovations are coming to market on a daily basis, and there’s a lot of interest and anticipation in the market. At the same time, we’re still struggling with how to get from the pilot stage to widespread adoption. There’s a lot of AR information out there, but before people can really use it, it needs to be put into context and disseminated. The AREA can play a central role in that process, and as a professional communicator, I saw that as a great opportunity.

What sorts of content are you hoping to bring to the AREA?

From the enterprise perspective, we need to continue to deliver specific, practical information that helps accelerate adoption – technical dos and don’ts, but also business-focused content that identifies AR opportunities and supports organizations in arguing a business case for adoption. I’d also like to see more case studies of successful AR deployments, and more forward-looking, visionary content from the strategic thought leaders among our members. The more we can foster an open forum for sharing ideas, the more vibrant the AREA community will be – and that will benefit everyone.

Is there anything you would like to say to AREA members?

I’d like everyone – AR providers and enterprises – to know that I’m here to support them in bringing their ideas and experiences to the AREA. I’m available to explore story ideas, do interviews, and act as an editor to help them shape and deliver the content they’d like to share with other AREA members. I’d also like their feedback on what they value most about our content and what we can do better.




Augmented World Expo Europe 2016: A Review

The inaugural Augmented World Expo (AWE) Europe is now history. The big conference and exhibition that for seven years has served as a showcase for all the emerging realities in the AR community in the US is now an international affair, having held its second Asian edition in China last month, followed by AWE Europe on October 18–19.


The event took place at the Berlin Congress Centre, in the heart of the city at Alexanderplatz. The beautiful venue was a fantastic match for the exhibition, with its two floors and the large convention hall that hosted two full days of speakers. The main stage saw a number of inspiring talks by names that have made history in both AR and VR, such as Bruce Sterling. The speakers’ agenda also included an impressive developers track; many providers took advantage of the event to create tutorials and demonstrations of their authoring technologies. One couldn’t help noticing the growing impact that Unity3D is having as an authoring tool for AR experiences. In fact, many of the major software vendors showcased their Unity3D plugins to the developers attending.


The exhibition hall featured more than 45 companies showcasing software solutions, optics, devices and applications. Interestingly, a large percentage of the exhibiting companies were those that focus their business models around enterprise solutions and industry-related technologies. This strengthens the belief that enterprise AR is a major driver for the success of the technology. A side hall hosted a number of startup companies promoting their innovative ideas (one of which, PuttView, won the “Best in Show” award for its golf practice solution).

Several European AREA members were represented:

  • Bosch showcased a number of solutions for AR-enabled automotive maintenance and servicing at one of the largest booths in the show.
  • Catchoom brought their image recognition and AR platforms, demonstrating both enterprise and marketing use cases.
  • Joinpad centered their demos around industrial use cases, focusing especially on smart glasses solutions for MRO scenarios involving complex equipment, developed using their Arrakis SDK for AR application authoring.


The audience attending the exhibition was a mix of tech enthusiasts and industrial customers interested in the benefits of AR and VR for their businesses. Although the audience was mostly European, many ticket holders travelled from Canada and the US to participate.


All in all, AWE Europe felt like a promising first edition for the AR-focused conference that has set trends for AR development in the States. Even compared to the US edition in June, many demos had evolved to a more mature stage, especially with the proliferation of innovative devices like the HoloLens, showing the rapid development of the market. While AWE Europe is somewhat smaller than its American counterpart, we at the AREA are convinced that it is here to stay and will become a “must go” event for those interested in the potential of this technology.

The AWE organizing committee will share many of the talks on the main stage and the developers track on the AWE YouTube Channel.




When a Developer Needs to Author AR Experiences, Part 1

There’s an established process for creating a new Web page. If it’s not already available, you begin by defining and developing the content. Then, there’s the formatting. Often there’s some scripting to provide interactivity. When the “authoring” is done, a page is published.

It’s not all that different for AR. Once an Augmented Reality project’s use case is clear, the experiences come about through an authoring process that resembles that of preparing and publishing content for the Web.


Figure 1. An AR authoring system combines trackables (created using features of the real world and a tracking library) with digital content that is encoded into presentation data and then assigned interactive functions (e.g., see more details, show relevant info, move and freeze in position, hide/close). The AR authoring system uses databases to store the scene elements – trackables, presentation data and interactions. (Source: PEREY Research & Consulting)
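
To make the scene elements in Figure 1 concrete, here is a minimal sketch of how an authoring system might represent them. The type and field names are illustrative only and are not drawn from any particular product; a real system’s schema would differ.

```typescript
// Hypothetical data model for the scene elements described in Figure 1.
// Names are illustrative; real authoring systems use their own schemas.

// A trackable: features of the real world that a tracking library can recognize.
interface Trackable {
  id: string;
  type: "marker" | "image" | "object3d" | "geolocation";
  referenceData: string;          // e.g., path to a marker image or 3D feature map
}

// Presentation data: the digital content rendered when the trackable is recognized.
interface PresentationData {
  id: string;
  mediaType: "text" | "image" | "video" | "model3d";
  uri: string;                    // where the asset is stored
  transform: { position: [number, number, number]; scale: number };
}

// Interactive functions assigned to the content (e.g., show details, move, hide).
interface Interaction {
  trigger: "tap" | "gaze" | "voice";
  action: "showDetails" | "moveAndFreeze" | "hide";
}

// A complete authored scene ties the three together for storage in a database.
interface ARScene {
  trackable: Trackable;
  content: PresentationData[];
  interactions: Interaction[];
}

// Example: a text label anchored to a printed marker, dismissed by tapping.
const exampleScene: ARScene = {
  trackable: { id: "pump-01", type: "marker", referenceData: "markers/pump-01.png" },
  content: [{
    id: "label-1",
    mediaType: "text",
    uri: "content/pump-01-instructions.txt",
    transform: { position: [0, 0.1, 0], scale: 1 },
  }],
  interactions: [{ trigger: "tap", action: "hide" }],
};
```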

Today, Content Management Systems for the Web support the steps for page development with grace. Systems like WordPress and Drupal are so easy to use and commonplace that we hardly notice their existence.

In contrast, there are many AR authoring systems from which a developer can choose and none are as mature as CMS for the Web. The choice of tool and approach depends on the project requirements, skills of the developer and the resources available.

Define the AR Project Requirements

Before choosing an AR authoring system, the requirements must be clear. An AR experience design process should generate a storyboard and, from the storyboard, the following factors are defined:

  • User settings (indoor, outdoor, noise levels, etc.)
  • Need for a user management system to provide experience personalization or tracking
  • Need for live communication with any remote experts during the experience
  • Type of recognition and tracking required (marker, 3D, SLAM, etc.)
  • Need to access device GPS and compass for geospatial context
  • Preferred display device (smartphone, tablet, smart glasses or another type of HMD)
  • Human interaction modalities (hands-free, touch, speech, gaze)

In addition to the above variables that can be deduced from the storyboard, there could be other factors to consider. For example, if the target device is connected by an IoT protocol, or if there are any supplementary files (e.g., videos or PDFs), these details and assets need to be provided to the developer as early as possible. The project manager should also specify the frequency and types of updates that may be required after the initial AR experience is introduced to users.

When these project requirements and parameters are defined, the developer can choose the tools best suited for the AR experience authoring.
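
As one illustration of how these parameters could travel with the storyboard, the sketch below captures them in a simple structured record. The field names merely mirror the checklist above; they are hypothetical, not a standard schema.

```typescript
// Hypothetical record of AR project requirements derived from a storyboard.
// The fields mirror the checklist above; they are not a standard schema.
interface ARProjectRequirements {
  userSetting: "indoor" | "outdoor" | "mixed";
  noiseLevel: "low" | "high";
  needsUserManagement: boolean;        // personalization or usage tracking
  needsRemoteExpert: boolean;          // live communication during the experience
  tracking: "marker" | "3d-model" | "slam";
  needsGeospatialContext: boolean;     // device GPS and compass
  display: "smartphone" | "tablet" | "smart-glasses" | "hmd";
  interaction: Array<"hands-free" | "touch" | "speech" | "gaze">;
  supplementaryFiles: string[];        // e.g., videos and PDFs given to the developer
  updateFrequency: "rarely" | "monthly" | "weekly";
}

// Example: a hands-free maintenance use case delivered on smart glasses.
const maintenanceProject: ARProjectRequirements = {
  userSetting: "indoor",
  noiseLevel: "high",
  needsUserManagement: true,
  needsRemoteExpert: false,
  tracking: "slam",
  needsGeospatialContext: false,
  display: "smart-glasses",
  interaction: ["hands-free", "speech"],
  supplementaryFiles: ["manuals/torque-spec.pdf"],
  updateFrequency: "monthly",
};
```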

Want to know more about your choices of authoring platforms? There’s more in the next post.




When a Developer Needs to Author AR Experiences, Part 2

This post is a continuation of the topic introduced in another post on the AREA site.

Choose a Development Environment

Someday, the choice of an AR development environment will be as easy as choosing a CMS for the Web or an engineering software package for generating 3D models. Today, it’s a lot more complicated for AR developers.

Most of the apps that have the ability to present AR experiences are created using a game development environment, such as Unity 3D. When the developer publishes an iOS, Windows 10 or Android app in Unity 3D, it is usually ready to load and will run using only local components (i.e., it contains the MAR Scene, all the media assets and the AR Execution Engine).

Although there’s a substantial learning curve with Unity, the developer community and the systems to support the community are very well developed. And, once using Unity, the developer is not limited to creating only those apps with AR features. The cost of the product for professional use is not insignificant but many are able to justify the investment.

An alternative to using a game development environment and AR plugin is to choose a purpose-built AR authoring platform. This is appropriate if the project has requirements that can’t be met with Unity 3D.

Though they are not widely known, there are over 25 software engineering platforms designed specifically for authoring AR experiences.


Table 1. Commercially Available AR Authoring Software Publishers and Solutions (Source: PEREY Research & Consulting).

The table above lists the platforms I identified in early 2016 as part of a research project. Please contact me directly if you would like to obtain more information about the study and the most current list of solutions.

Many of the AR authoring systems are very intuitive (featuring drag-and-drop actions and widgets presented through a Web-based interface); however, most remain to be proven, and their respective developer communities are relatively small.

Some developers of AR experiences won’t have to learn an entirely new platform because a few engineering software publishers have extended their platforms designed for other purposes to include authoring AR experiences as part of their larger workflow.

Or Choose a Programming Language

Finally, developers can write an AR execution engine and the components of the AR experience into an app “from scratch” in the programming language of their choice.

To take full advantage of a specific chipset or AR display and optimize AR experiences for the best possible performance, some developers work in languages that compile to native code (e.g., C++), which the AR display device can execute directly.

Many developers already using JavaScript are able to leverage their skills to access valuable resources such as WebGL, but creating an application in this language alone is slow and, depending on the platform, could fail to perform at the levels users expect.

To reduce some of the effort and build upon the work of other developers, a developer can turn to Argon.js and AWE.js, open source JavaScript frameworks for adding Augmented Reality content to Web applications.
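
For orientation only, the sketch below shows the general shape of a Web-based AR experience written in TypeScript. The tracking and rendering helpers are hypothetical placeholders standing in for whatever framework (Argon.js, AWE.js or another) or hand-written WebGL code a project actually uses; only the browser camera call is a real API.

```typescript
// Illustrative outline of a Web AR experience. The tracking and rendering
// helpers below are hypothetical stand-ins, not the APIs of Argon.js or AWE.js.

interface TrackingLibrary {
  loadTrackable(markerUrl: string): Promise<void>;
  onTrackableFound(callback: (pose: Float32Array) => void): void;
}

// Placeholder implementations; a real project would swap in its chosen framework.
function createTrackingLibrary(_video: HTMLVideoElement): TrackingLibrary {
  return {
    loadTrackable: async (_markerUrl) => { /* load reference features for the marker */ },
    onTrackableFound: (_callback) => { /* call back whenever the marker is detected */ },
  };
}

function renderOverlay(pose: Float32Array, contentUrl: string): void {
  // Stub: a real renderer (e.g., WebGL) would draw the authored content at the pose.
  console.log("render", contentUrl, "at pose", pose);
}

async function startExperience(): Promise<void> {
  // Real browser API: request access to the device camera.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.createElement("video");
  video.srcObject = stream;
  await video.play();

  // Register a trackable and react when the camera sees it.
  const tracker = createTrackingLibrary(video);
  await tracker.loadTrackable("markers/valve-panel.png");
  tracker.onTrackableFound((pose) => renderOverlay(pose, "content/valve-instructions.json"));
}

startExperience().catch((err) => console.error("AR experience failed to start:", err));
```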

Results Will Depend on the Developer’s Training and Experience

In my experience, it’s difficult to draw a direct line between the selection of an AR authoring tool or approach and the quality or richness of the final AR application. The sophistication and quality of the AR experience in an app is a function of both the tools chosen and the skills of the engineers. When those behind the scenes (a) ensure the digital content is well suited to delivery in AR mode; (b) choose the components that match requirements; and (c) design the interactions well, a project will have the best possible impact.

As with most things, the more experience the developer has with the components that the project requires, the better the outcomes will be. So, while the developer has the responsibility for choosing the best authoring tool, it is the AR project manager’s responsibility to choose the developer carefully.
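
As a closing illustration, and only that, the trade-offs discussed in this post could be written down as a first-pass screening step. The conditions below are assumptions that restate the post; they are not vendor guidance or a definitive rule.

```typescript
// Rough, illustrative screening of the three authoring approaches discussed above.
type AuthoringApproach =
  | "game engine + AR plugin"       // e.g., Unity 3D
  | "purpose-built AR platform"
  | "custom code";                  // native code (e.g., C++) or JavaScript frameworks

function suggestApproach(project: {
  requirementsFitGameEngine: boolean;   // a game engine plugin can meet the requirements
  teamHasGameEngineSkills: boolean;
  needsChipsetLevelOptimization: boolean;
}): AuthoringApproach {
  if (project.needsChipsetLevelOptimization) return "custom code";
  if (project.requirementsFitGameEngine && project.teamHasGameEngineSkills) {
    return "game engine + AR plugin";
  }
  return "purpose-built AR platform";
}

// Example: a team without game-engine experience and no low-level performance needs.
console.log(suggestApproach({
  requirementsFitGameEngine: true,
  teamHasGameEngineSkills: false,
  needsChipsetLevelOptimization: false,
})); // -> "purpose-built AR platform"
```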




Digitally Assisted Assembly at Factory 2050

In a previous article, we introduced the University of Sheffield’s Advanced Manufacturing Research Centre (AMRC), a member of the AREA that develops innovative techniques and processes for high-precision manufacturing. A subsidiary, the AMRC with Boeing, collaborates with a variety of research partners in areas such as informatics, automation, robotics and Augmented and Virtual Reality. Besides aerospace, the results of this research into manufacturing are used in the automotive, construction and other high-value industries.

Earlier this year, the AMRC opened the doors of its newest manufacturing facility, Factory 2050, a glass-walled reconfigurable factory in Sheffield Business Park. The facility investigates and showcases new technologies and processes relating to Industry 4.0, including projects to explore digitally assisted assembly technologies to fill a looming skills gap in the aerospace industry.

Augmented Reality in Digitally Assisted Assembly

The Digitally Assisted Assembly (DAA) project focuses on techniques for delivering work instructions to factory operators, including the use of optical projection AR and wearables. According to the AMRC’s digital manufacturing specialist, Chris Freeman, the project allows partner companies to experience visual work instructions through a number of delivery mediums. Research includes:

  • Optimizing AR tracking methods for effectively determining a part’s position and generating a frame of reference.
  • Designing user experiences for work instructions that are projected or overlaid onto a part within the user’s field of view. These include instructions that guide users through tasks such as gluing sequences, fastener insertion, inspection, wiring looms, complex routines and more. The aim of this research is to reduce cognitive load and optimize the user experience across a variety of delivery modes (e.g., projection AR) and devices, from tablets to smart glasses.
  • Using location-based services to add contextualized task and environmental information in relation to the user’s position or progress within a task.

With the technology still in its infancy, one of the aims of DAA is to simply demonstrate what can be achieved with it. Although smart glasses and wearables aren’t proven or certified for use in manufacturing, they are nevertheless being baselined for further research and possible future production usage. The AMRC is currently following a strategy of first identifying the “low-hanging fruit” from the current state of hardware and software, which means that research associates want to find some of the most obvious and perhaps least expensive options up front.

Skype for HoloLens

Although the AMRC is studying a variety of smart glasses brands, such as ODG and Vuzix, remote collaboration with Skype for HoloLens is an interesting application for meeting the needs of certification processes. This use case includes methods for lineside support and remote verification to complement or replace expensive quality management activities requiring the presence of a supervisor. It may even include assistance from remote colleagues when assembly or repair problems are encountered.

Freeman notes that though such use cases aren’t spectacularly advanced in terms of tracking in comparison with other scenarios such as overlaying geometric 3D models on objects being assembled, they are nevertheless disruptive of current manufacturing practices.

Projecting Work Instructions on Large-Volume Objects

Projected Augmented Reality, sometimes referred to as “spatial Augmented Reality,” features one or more optical projectors projecting a beam of light onto a specially designed work surface or even on the parts being assembled. Thus work instructions are displayed directly on surfaces to guide operators. The DAA is currently researching methods for effectively using projection AR in conjunction with both fixtures and robotic arms in work cells.

For example, an operator assembles aircraft parts with the assistance of robots to present a part at a better angle than if it were lying on a work surface. A robotic arm can swivel or position the part as needed by the operator, and projected AR is able to guide operators through a series of specific manufacturing procedures.

Defining Success

As has been discussed in other industry contexts, the return on investment of any new technology, AR included, can be challenging to define. Typical ROI calculations seek to determine the amount of savings a project can bring and when that investment will pay off. In the case of AR, relevant questions include how to quantify the value of contextualized data and geometries for use in performance metrics.
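
To make the arithmetic behind such calculations concrete, here is a minimal sketch of a conventional savings-and-payback calculation. The figures and field names are invented for illustration; they are not AMRC data.

```typescript
// Minimal, conventional ROI and payback arithmetic with invented example numbers.
interface ARInvestmentCase {
  upfrontCost: number;           // hardware, software, integration
  annualRunningCost: number;     // licenses, support, content upkeep
  annualSavings: number;         // e.g., reduced rework, faster task completion
}

// Years until cumulative net savings cover the upfront cost.
function paybackYears(c: ARInvestmentCase): number {
  const netAnnualBenefit = c.annualSavings - c.annualRunningCost;
  return netAnnualBenefit > 0 ? c.upfrontCost / netAnnualBenefit : Infinity;
}

// Simple ROI over a planning horizon (0.5 means a 50% return).
function simpleRoi(c: ARInvestmentCase, years: number): number {
  const totalBenefit = (c.annualSavings - c.annualRunningCost) * years;
  return (totalBenefit - c.upfrontCost) / c.upfrontCost;
}

// Hypothetical example: pays back in 1.25 years; 140% ROI over three years.
const example: ARInvestmentCase = { upfrontCost: 50000, annualRunningCost: 10000, annualSavings: 50000 };
console.log(paybackYears(example), simpleRoi(example, 3));
```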

Further research into AR will eventually uncover such answers, but in the near term, human factors and ergonomic studies can also quantify the technology’s effectiveness. For example, the AMRC is currently conducting AR-related training scenarios to determine a variety of metrics such as memory retention and AR’s overall effectiveness, as well as usability and operator response.

Beyond Aerospace

Although the research being conducted at Factory 2050 aims to advance the state of the art in aerospace manufacturing, many of the techniques and procedures derived by DAA and other projects will eventually be used in other industries, such as energy and construction. For example, assembly techniques for large-volume aerospace parts can also be applied to assembling prefabricated homes at a factory as part of modular building manufacture. Though it has only recently opened its doors, Factory 2050 is clearly set to have an impact on both present and future manufacturing in multiple domains for many years to come.




Calculating ROI for AR Investments: One Approach

In a field as young as AR, organizations seeking to justify investments have had little historical data available to help calculate ROI. The team at AREA member Catchoom has addressed this challenge by putting together a white paper that provides a step-by-step means of calculating ROI for its CraftAR image recognition software, based on an actual Catchoom customer in the healthcare industry.

Download the white paper from Catchoom to learn more.




New EPRI Report Offers Insights for Wearable AR Display Customers

Innovation in wearable technology continues to accelerate. Smartwatch vendors are making so many announcements that there are portals dedicated to helping customers sort through the details. There is also a portal to help customers compare the features of wearable displays for AR.

And new wearable segments are being defined. For example, Snap recently introduced its $130 Spectacles.

Is this all good?

Thinly veiled behind the shiny new products is a vicious cycle.

The continual stream of announcements confirms for readers of this blog that the wearable AR display segment is still immature. This means that those customers with limited budgets seeking to select the best hands-free AR display for their projects in 2016 are likely to be disappointed when an update or new model appears, making the model they just brought in-house out of date. Risk-averse organizations may put their resources in another product category.

On the other side of this conceptual coin, the companies developing components and building integrated solutions for wearable AR must continue to invest heavily in new platforms. These investments are producing results — but without clear customer requirements, the “sweet spot” for which the products should aim is elusive. And when customers lack clear requirements, differentiating the latest offerings while avoiding hype is a continual challenge.

Breaking the cycle with specific requirements

When customers are able to prioritize their needs and provide specific product requirements and budgets, there’s hope of breaking this cycle.

The Electric Power Research Institute (EPRI) and PEREY Research & Consulting, both AREA Founding Sponsor members, have collaborated on the preparation of a new report entitled Program on Technology Innovation: State of the Art of Wearable Enterprise Augmented Reality Displays.

Targeting the buyers of wearable technology for use when performing AR-assisted tasks in utilities (and by extension, in other enterprise and industrial environments), the report seeks to demystify the key product features that can become differentiators for wearable AR solutions.

Based on these differentiators, the first multi-feature wearable AR display classification system emerges.


Source: Program on Technology Innovation: State of the Art of Wearable Enterprise Augmented Reality Displays. EPRI, Palo Alto, CA: 2016. 3002009258.

The report also discusses challenges to widespread wearable AR display adoption in technology, user experience, financial, and regulatory/policy domains.

Descriptions of a few “lighthouse” projects in utilities companies, logistics, manufacturing, and field service provide readers valuable insight into how early adopters are making the best of what is currently available.

This report is available for download at no charge as part of the EPRI Program on Technology Innovation.

If you have comments or feedback on the report, please do not hesitate to address them to the authors, Christine Perey and John Simmins.