The AREA Balances Vision and Pragmatism

The AREA has a vision and, at the same time, we must remain pragmatic. Let me explain.

We’re all familiar with the myths about the industrial revolution: it happened overnight, right? Coal leapt out of the ground and formed coke. Iron became steel and the rest is history. Then, more than a century later, in the late 20th century, computers profoundly changed what people could do with their knowledge and, using networked computers, silicon-driven industries revolutionized how people communicate and how just about everything (human and machine) works.

Vision

In the future, businesses will experience another transformation, one that will have a big impact on workers who have spent far less time behind computer screens than knowledge workers have. Largely without the assistance of silicon-based computational devices, these workers move themselves and materials around; they build, transform, maintain, use, repair and even take apart objects in the physical world. They are pragmatic when it comes to the introduction of new technologies.

Soon, the procedures these workers need to follow will appear in their line of sight and at their fingertips, endowing them with the knowledge of those who benefited from the previous cyber revolution.

Improving Workplace Performance

Augmented Reality-assisted enterprise systems will drive significant improvements in many operations, as measured by lower costs and higher productivity. Those whose work requires guidance, decision support or collaboration concerning objects and places in the physical world will, through contextually relevant visualization of information: 

  • Be more productive
  • Operate more safely
  • Consistently comply with all policies and procedures
  • Perform tasks with the lowest possible number of errors

But first, some innovative leaders have to take risks and make investments that may, as when Matthew Boulton continued to finance the research of James Watt, appear imprudent.

Who Are We Talking About?

The steam engine and industrial revolution did not happen overnight. Only many years after entering into partnership with entrepreneur Matthew Boulton did the concepts and hard work of James Watt produce significant efficiency improvements over the earliest model steam engines.

The AREA recognizes that many investors will take risks before Augmented Reality is mature. There will also be many engineers whose brilliance of conception and practical know-how will be needed to improve the productivity of workers.

Who Are We Talking To?

We’re talking to you: the developer, the business manager, the IT group, the learning department manager, the innovation group, and the executive office.

Each of you needs different arguments to be persuaded of the value of investing in enterprise Augmented Reality. Our content and informational programs are being designed to match the needs of these diverse stakeholder groups.

Our target audiences are not limited to those in enterprises that are implementing Augmented Reality for their internal operational needs. We also recognize target audiences in organizations that provide goods and services to enterprise customers. These include the providers of core enabling technologies and vendors of enterprise IT hardware and software, as well as systems integrators of many kinds.


Pragmatic, Like Our Members

Everyone wants to achieve their AR introduction goals quickly. But hype builds unrealistic expectations, and disappointed decision makers may not shoulder the risks again.

To help all of these groups present their offerings, and to help others understand what they are acquiring or introducing into their businesses, the AREA is pragmatic.

The AREA’s programs are designed to simply and consistently:

  • Reduce the myths and mysteries associated with Augmented Reality
  • Help customers to establish reasonable expectations (where they can be met with existing technologies)

Pragmatism grounded in practical information, not hype, is as important as vision.




Introducing the AR in Strategic Enterprises Sessions

In contrast to companies that react to changing conditions without a plan, strategic enterprises systematically apply the best planning and management processes.

A strategic enterprise successfully integrates emerging and mature systems to improve processes and outcomes. Managers in strategic enterprises factor in their existing information systems development and maintenance efforts, as well as any new technology introduction when guiding their businesses towards the achievement of goals.


The AREA and AR in Strategic Enterprises

AREA members met with strategic enterprise managers in Sheffield on July 1. The focus of the event was on how to introduce and integrate AR into strategic enterprises.

Over the course of the day, AREA members shared their experiences and recommendations for choosing use cases, preparing data for use in AR experiences, choosing and training users for AR pilots and introduction activities, measuring impacts and managing risks associated with AR introduction.


The AREA’s Value Added

The sessions are a perfect example of AREA members demonstrating their thought leadership and collaborating to share knowledge with others. In addition to the valuable discussions made possible during the networking and panel sessions, the recordings of the presentations are now available for viewing on YouTube.

Through the ARise event and its sessions, the AREA and its members are accelerating AR adoption in the corporate environment. As Executive Director of the AREA, I am proud to present the 11-session series and hope you will gain additional insights into the ways Augmented Reality can benefit your enterprise.




Augmented Reality Can Increase Productivity

Technological and cultural shifts that result in enhancements in manufacturing tend to increase complexity in products and processes. In turn, this complexity increases requirements in manufacturing and puts added pressure on organizations to squeeze out inefficiencies and lower costs where and when feasible.

This trend is acute in aerospace, where complexity, quality and safety require a large portion of final assembly to be done by humans. Corporations like AREA member Boeing are finding ways to improve assembly workflows by making tasks easier and faster to perform with fewer errors.

At ARise ’15, Paul Davies of Boeing presented a wing assembly study conducted in collaboration with Iowa State University, showing dramatic differences in performance when complex tasks are performed following 2D work instructions versus Augmented Reality instructions.

A Study in Efficiency

In the study, three groups were asked to assemble parts of a wing, a task requiring over 50 steps to assemble nearly 30 different parts. Each group performed the task using one of three modes of work instruction:

  • A desktop computer screen displaying a work instruction PDF file. The computer was immobile and sat in the corner of the room away from the assembly area.
  • A mobile tablet displaying a work instruction PDF file, which participants could carry with them.
  • A mobile tablet displaying Augmented Reality software showing the work instructions as guided steps with graphical overlays. A four-camera infrared tracking system provided high-precision motion tracking for accurate alignment of the AR models with the real world.

Subjects assembled the wing twice; during the first attempt, observers measured first time quality (see below) before disassembling the wing and having participants reassemble it to measure the effectiveness of instructions on the learning curve.

Participants’ movements and activities were recorded using four webcams positioned around the work cell. In addition, they wore a plastic helmet with reflective tracker balls that allowed optical tracking of head position and orientation in order for researchers to visualize data about how tasks were fulfilled. Tracker balls were also attached to the tablet (in both AR and non-AR modes).

First Time Quality

To evaluate the ability of a novice trainee with little or no experience to perform an operation correctly the first time (“first time quality”), errors are counted and categorized. The study revealed that tablet mode yielded significantly fewer errors (on average) than desktop mode.

In the diagram above, the blue bar represents the first assembly attempt and the green bar is the second. The diagram also shows that subjects using Augmented Reality mode made zero errors on average per person, indicating the potential of AR to improve first time quality for assembly tasks.


Rapid Assembly

ARIncreaseProductivity-graph2

This diagram shows the time taken to complete tasks by mode, for both the first and second attempts. AR-assisted participants completed tasks faster the first time than those using either other mode.

Conclusions

Overall, the study found an almost 90% improvement in first time quality between desktop and Augmented Reality modes, with AR reducing the time to build the wing by around 30%. Researchers also found that when instructions are presented with Augmented Reality, people understand tasks faster and spend less time confirming that they have been performed correctly.

The bottom line: this study shows and quantifies how complex tasks performed for the first time can benefit from Augmented Reality work instructions. If a task is done faster and with fewer errors, the impact on productivity is highly significant.
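As a back-of-the-envelope check, the improvement percentages reported above can be computed with a few lines of Python. The error counts and build times below are assumed placeholder values chosen to land near the reported magnitudes; they are not figures from the Boeing/Iowa State study.

```python
# Illustrative arithmetic only: the values below are assumptions,
# not data from the Boeing/Iowa State wing assembly study.

def percent_improvement(baseline, improved):
    """Relative improvement of `improved` over `baseline`, as a percentage."""
    return (baseline - improved) / baseline * 100

# Hypothetical mean errors per person on the first assembly attempt, by mode
mean_errors = {"desktop": 10.0, "tablet": 6.0, "ar_tablet": 1.0}

# Hypothetical mean build times (minutes) for the first assembly attempt
mean_minutes = {"desktop": 60.0, "tablet": 50.0, "ar_tablet": 42.0}

error_gain = percent_improvement(mean_errors["desktop"], mean_errors["ar_tablet"])
time_gain = percent_improvement(mean_minutes["desktop"], mean_minutes["ar_tablet"])

print(f"First time quality improvement (desktop -> AR): {error_gain:.0f}%")  # 90%
print(f"Build time reduction (desktop -> AR): {time_gain:.0f}%")             # 30%
```

With these placeholder inputs, the same formula reproduces improvements of the rough size the study reports.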

Where can Augmented Reality make an impact in your organization?




Just-in-Place: The Case for Augmented Reality in AEC

This post by Dace Campbell previously appeared on the AEC industry website Beyond Design.

AR: An Extension of Lean

For decades, pundits, prophets, prognosticators, and purveyors of technology have been forecasting the fit of Augmented Reality (AR) for the AEC industry (I know, I’m one of them!). In recent years, as hardware has evolved and BIM has matured, we find ourselves on the threshold of AR solutions truly capable of extending the capabilities of architects, contractors, and owners.

Autodesk is no stranger to AR, and we continue to keep an eye on the technology, looking for the right solutions for our customers. We’ve defined (refined?) AR as the real-time display of spatially contextual information, where context is the physical environment. In today’s world of “big data,” we are seeking out ways to support the industry’s efforts to offer the right information and the right materials to the right people, in the right place, at the right time. In manufacturing and construction, we talk about just-in-time (JIT) delivery to support Lean operations and production control. With AR, we extend that conversation beyond Just-in-Time, to Just-in-Place delivery of information. That is, information is served up to the end user in an appropriate, localized, specific spatial context.
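The just-in-place idea described above can be sketched in a few lines: serve only the information anchored near the user's current position. The data model, coordinates and 2-meter radius in this sketch are assumptions for illustration, not an Autodesk API.

```python
# A minimal "just-in-place" sketch: only annotations anchored near the
# user's current position are served. All names and values are illustrative.
import math
from dataclasses import dataclass

@dataclass
class Annotation:
    label: str
    x: float  # anchor position in site coordinates (meters)
    y: float
    z: float

def nearby_annotations(annotations, user_pos, radius_m=2.0):
    """Return annotations anchored within `radius_m` of the user."""
    ux, uy, uz = user_pos
    return [
        a for a in annotations
        if math.dist((a.x, a.y, a.z), (ux, uy, uz)) <= radius_m
    ]

site_data = [
    Annotation("HVAC duct behind this wall", 1.0, 0.5, 2.4),
    Annotation("Electrical panel E-12", 10.0, 3.0, 1.5),
]

# A user standing near the first anchor sees only the relevant annotation
visible = nearby_annotations(site_data, user_pos=(1.2, 0.4, 2.0))
print([a.label for a in visible])  # -> ['HVAC duct behind this wall']
```

In a real system the user position would come from the AR tracking layer and the radius would depend on the task, but the filtering principle is the same: context (here, proximity) decides what is served.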

Is AR Ready for You?

I recently attended both the Augmented World Expo in Santa Clara and Autodesk’s NAC3 (North American Construction Customer Council) hosted by DAQRI in Los Angeles. At both events, you could all but taste the anticipation of AR solutions made ready for the AEC industry. The evolution of hardware sensors and processors, wearable form factors, and software development toolkits has bred a diverse range of AR solutions for businesses and consumers alike. DAQRI, in particular, is now offering its Smart Helmet, with a world of potential to disrupt the way we consume and process information on the construction site. On the low end, 3D-printed lenses can be clipped to your smartphone to support immersive viewing of spatially contextual information for as little as $20!

No AR solution on the market today is without its flaws, and there is plenty of room for improvement when applying AR in AEC: support for collaborative decision-making, hands-free tasks, balancing task focus and safety, and application in harsh environmental conditions. But there is a lot to like about the solutions that are here and on their way as the technology continues to improve.

AEC Use Cases for AR

At Autodesk, we’ve identified over a dozen use cases for AR in design, construction, and facilities operations and maintenance. We’ve analyzed these according to the business pain points they address, the scope or value of that pain, the potential for integration with our solutions, phase of a project, and level of effort to implement (including user’s motion area, indoor/outdoor mix, scene preparation, tracking accuracy required, display latency allowed, see-through requirements, and data-serving burden).
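An analysis like the one described above can be approximated with a simple weighted score per use case. The criteria, weights and candidate scores below are invented for illustration; they are not Autodesk's actual evaluation model.

```python
# A toy weighted-scoring sketch for ranking AR use cases.
# Weights, criteria and scores are illustrative assumptions.

WEIGHTS = {"pain_value": 0.4, "integration_fit": 0.3, "implementation_effort": 0.3}

def score(use_case):
    """Weighted score on a 0-10 scale; effort counts against a use case,
    so it is inverted before weighting."""
    return (WEIGHTS["pain_value"] * use_case["pain_value"]
            + WEIGHTS["integration_fit"] * use_case["integration_fit"]
            + WEIGHTS["implementation_effort"] * (10 - use_case["implementation_effort"]))

candidates = {
    "construction layout QC": {"pain_value": 9, "integration_fit": 8, "implementation_effort": 6},
    "design review on site": {"pain_value": 6, "integration_fit": 7, "implementation_effort": 3},
}

ranked = sorted(candidates, key=lambda name: score(candidates[name]), reverse=True)
print(ranked)  # -> ['construction layout QC', 'design review on site']
```

A real evaluation would break "effort" into the sub-factors listed above (motion area, indoor/outdoor mix, tracking accuracy, latency, and so on), but the shape of the exercise is the same.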

We’ve also asked ourselves: in which of these use cases is AR a truly unique solution, as opposed to one of several ways to solve the problem? That is: where is AR desirable, and perhaps even necessary, to eliminate specific industry pains by applying its unique characteristics?

Overlay and Compare

A skeptic can reasonably argue that, while beneficial, AR isn’t a unique or necessary solution in almost all AEC use cases. However, one condition keeps coming up over and over again, where AR can truly and uniquely solve a problem, save valuable time, and improve confidence in decision-making: real-time overlay of information onto the real world to support comparison (and contrast).

To illustrate this, think of these cartoon sketches, in which you are tasked to identify the subtle differences between them, and think of how dramatically the process could be improved simply by overlaying one image over the other in a single display:

 


Can you spot the 8 differences?


Overlay the images into a single display to easily identify the differences!
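In code, the overlay-and-compare trick amounts to flattening two images into one view in which only the differing pixels stand out. Here is a toy sketch using 1-bit pixel grids standing in for the cartoon sketches; the grids and labels are illustrative assumptions.

```python
# Toy overlay-and-compare: mark every pixel where two "images" disagree.
# The 3x3 1-bit grids below stand in for the as-designed and as-built views.

def diff_overlay(img_a, img_b):
    """Return a grid of booleans marking where the two images differ."""
    return [
        [a != b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

as_designed = [
    [0, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
]
as_built = [
    [0, 1, 0],
    [1, 0, 1],  # one element missing in the middle
    [0, 1, 0],
]

overlay = diff_overlay(as_designed, as_built)
differences = [(r, c) for r, row in enumerate(overlay)
               for c, changed in enumerate(row) if changed]
print(differences)  # -> [(1, 1)]
```

An AR display performs the same comparison continuously and in 3D: the model is registered over the physical world, and discrepancies between intended and actual conditions become visually obvious.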

Sweet Spots for AR in AEC

We perform this comparison exercise again and again in design, construction, and operations – except in AEC it’s much harder than the challenge shown in the cartoon example above. In some cases, the question isn’t simply: “can you spot the 8 differences?” Rather, it’s: “Are there any differences? If so, where? And, how many are there (and how do you correct them)?”

With this condition applied as a filter to the long list of potential AEC use cases, just a few rise to the top: the ones in which we need to perform compare-and-contrast tasks quickly and accurately. In gross terms, they exist where the real world (as-is conditions), the Building Information Model (intended conditions), and the newly constructed world (as-built conditions) each intersect with the others, as shown here:

just-in-place-3

There are three intersections between these realms. Architects deal with the intersection between “as-is” and “intended” when visualizing their designs in/at the project site. Think of AR here as a real-time, interactive, “photo-match” for studying and communicating a design in context. Next, contractors face the comparison between the “intended” design and the “as-built” project, both when performing layout on the site and again when performing quality control to confirm that their work was installed or assembled correctly (see also: Capture Reality, Recapture Time). Finally, owners contend with the overlap between the “as-built” documentation and the true “as-is” world of the facility in operations. Here, they seek to supplement their experience of the living project with meta-data and systems hidden or enclosed by architectural finishes.

In all of these cases, the key project stakeholders look for a solution in which they can view virtual data overlaid on the physical world, intuitively and in real time, to compare and contrast new and old, desired and actual, recorded and reality.

Serving up the proper information in context is key, both just-in-time and just-in-place. After all, there is a time – and a place – for everything.




ARLU—the Right Event at the Right Time

EPRI is proud to collaborate with the AREA on the first-ever Augmented Reality in Leading-Edge Utilities (ARLU) event this July, where we will lead the industry in discerning a disruptive technology and in anticipating and solving issues through collaborative effort. In fact, ours is the only industry we know of where Augmented Reality as a disruptive innovation is being openly discussed. This isn’t going unnoticed. Other industries are pointing at utilities and saying, “Hey, look what they’re doing.” Utilities are rarely perceived as having an active role in exciting new trends.

Three in One

The ARLU event is, in fact, three events in one. First, it’s a meeting where EPRI and utility industry representatives will present their Augmented Reality research and projects to vendors developing applications for the utility industry. Vendors will see where utilities are placing emphasis in their development efforts and learn about the issues they’re encountering. Requirements such as the size, weight and battery life of wearable technologies will be explored through the presentations, imparting to participants a deeper understanding of the issues facing the introduction of Augmented Reality in utilities.

Next, vendors will present their latest technologies for immediate feedback from industry experts. Not all technologies fit every utility situation, and discussions around the fit for purpose of the presented technologies will be lively and informative. Finally, a workshop on gaps in existing standards will bring multiple perspectives to the problems of creating safe, comfortable and interoperable AR experiences in the utility space.

Thought Leaders

Having subject matter experts together in one room is one of the key objectives of this meeting. As we’ve been preparing the ARLU event, we’ve invited some of the brightest people in the utilities and utility software industries to mix with thought leaders in Augmented Reality. We expect the impact to last much longer than the two days in July, because new ideas will emerge in the weeks and months that follow as the participants who meet in Charlotte continue to develop relationships.

We expect to capture some of the ideas these thought leaders generate and to share the outcomes of discussions with the broader community so that many others can also benefit.

Time is Right

We feel this is the right time for such a conference. Today, judging a technology by what it can do right now is the wrong way to look at it. Advances occur almost daily, and it’s better to first define what’s needed to build the technology’s future state. That’s where Augmented Reality is today. Practical applications are just now being introduced, but an explosion of functionality is coming. By the time the average person notices the ubiquity of Augmented Reality, many of the issues we are going to discuss in Charlotte will already have been settled.

Wearable technologies with Augmented Reality are at a stage where real utility applications are possible. At the same time, shifting demographics at utilities are bringing in younger, less experienced workers—as older, more practiced workers are leaving. There needs to be an orchestrated “changing of the guard” where institutional knowledge, gained by years of hard work and experience, is transferred to a younger, more tech-savvy generation. The technologies presented at ARLU will deliver remote expertise and put information at the fingertips of crews composed of less seasoned individuals.

The wise man says it’s better to act on a lightning flash than wait to hear the thunder. That’s why we planned this event in 2015 and look forward to seeing many of the readers of this blog at the first ARLU event.




DAQRI @ AWE 2015

This post was previously published on the DAQRI blog and posted here with permission.

As we head into Augmented World Expo 2015, we have seen this event grow and evolve alongside the industry. Within the last year, we’ve seen more mainstream conversations about Augmented Reality than ever before. As a result of this increased focus, there is now, more than ever, a need to support and encourage innovation in Augmented Reality and computer vision technologies.

This year, we are excited to be showcasing our products and to spotlight our recent acquisition of ARToolKit, the world’s most widely used augmented reality SDK.  By releasing ARToolKit professional SDKs under LGPL v3.0 for free use, DAQRI is committing its resources to the open source community in the hopes that (in the words of our founder, Brian Mullins), “we can kick off the next AR revolution and inspire a whole new generation to pick it up and make things that haven’t been imagined yet.”

On the exhibition floor, Ben Vaughan and Philip Lamb from ARToolworks will be available to discuss ARToolKit and DAQRI’s newly-created open source division that they are heading up. In addition, representatives from DAQRI will be demoing DAQRI 4D Studio and showcasing exciting technologies from Melon, our brain computer interface division.

DAQRI executives will also be presenting throughout the conference:

Monday, June 8:

  • 10:45 am – 11:30 am—DAQRI 4D Studio Tutorial
    Katherine Wiemelt, Sr. Product Director, DAQRI
  • 2:15 pm – 3:00 pm—How to Measure Enterprise AR Impacts
    Andy Lowery, President, DAQRI

Tuesday, June 9:

  • 11:30 am – 1:00 pm—Smart Glasses Introductions
    Matt Kammerait, VP Product, DAQRI
  • 2:00 pm – 3:00 pm—Entertainment, Games, and Play
    Brian Selzer, VP Business and Product Development, DAQRI
  • 7:00 pm – 8:00 pm—Auggie Awards
    Brian Mullins, Founder and CEO, DAQRI

Wednesday, June 10:

  • 2:45 pm – 3:00 pm—From Evolution to Revolution: How AR will Transform Work, in the Future
    Brian Mullins, Founder and CEO, DAQRI



Not as Easy as It Looks

In the modern world, everyone assumes that power recharges are just a “plug away”: plug your device into the nearest power outlet and your issues are over. But generating and distributing power isn’t magic. It’s an industry.

Why Do You Care about EPRI?

The Electric Power Research Institute (EPRI) works collaboratively with more than 450 utility provider members and participants internationally to identify or create the technologies that utilities will need to provide affordable, reliable, safe, efficient and environmentally responsible electric power to the world. By leveraging its research and membership, EPRI is helping utilities investigate the benefit of using AR-assisted workflows to improve worker safety and efficiency.

Why Does EPRI Care about the AREA?

Augmented Reality promises to change how electric utilities operate in the future. As in the electric utility industry, strong industry associations are needed to promote and develop Augmented Reality technologies and standards to the point where they are productive and as easy to use as plugging a device into a socket.

Our experience with introducing new systems into electric utilities is that you can’t get from here to there, from nascent to mature industry, in a single step. You have to have partners and communities. New tools and techniques gleaned from AREA members in other industries can be applied to utilities, reducing the cost of technology implementation for our members in the near term, and the cost of generating and distributing electricity in the long run. Collaborating with AREA members will result in products that better serve the utility industry and the public.

Members of the AREA represent the thought leaders in an emerging technology that EPRI and its members believe will be pivotal in increasing efficiency and safety for workers across EPRI’s utility membership.

EPRI is proud to be a Founding Sponsor member of the AREA. Joining the AREA and getting it launched successfully is only the beginning.

Bringing Together the Best and the Brightest

In close partnership with AREA, EPRI is going to bring the AR vendor community closer to the utility vendor community.

As a first step, we are organizing a special two-day workshop to be held at the EPRI Charlotte office on July 27-28, 2015. AREA members and other AR ecosystem stakeholders will present their positions in the market and their technologies. Utility customers will ask tough questions about reliability, standards and security. Together, they will go over the best use cases around field workflow and asset management.

In a matter of a few short days, these groups will be able to formulate better strategies for improving the operations of utilities without putting assets and people at risk.

We know that this is an important step towards bringing our two industries closer together. Join the AREA to learn the details of this special program and visit the event page for more information about AR in Leading-edge Utilities.




Why IEEE Joined the AREA

The Augmented Reality for Enterprise Alliance offers a central platform for the Augmented Reality ecosystem to come together in a manner that fosters growth, collaboration and market awareness and development.

This is why the AREA is important for the IEEE as well as for everyone who will use AR in the future. Prior to the AREA, no one had focused on the enterprise AR value chain. In the AREA, those who provide AR solutions and components will improve their processes and products in partnership with their future customers, and customers will be fully engaged in the process of expanding this market.

The Diverse Viewpoints of AREA Members

The Founding Sponsor members of the AREA, including IEEE, seek to bring together the diverse perspectives of the AR value chain in order to provide opportunities for working through common pain points. The resolution of these points will result in a positive outcome for customers, end users and manufacturers. By fostering this level of growth, the AREA is an enticing forum for interested parties to jointly conduct research that supports their organizations’ performance.

Beyond research, the AREA affords its members the opportunity to gain inroads with other organizations that potentially offer them new solutions with diverse stakeholders.

Personally, I’m excited about the opportunity to increase our collective knowledge and educate the marketplace with respect to the impact that AR can have on their businesses’ and customers’ product quality and experiences. From the perspective of advancing technology for humanity, the IEEE Standards Association continues to explore new areas to support technologies that have the potential to impact the world in a positive manner. Augmented Reality offers this possibility in a very important way—and members of the AREA collaborate to show their support of the technology as well as to increase their voice in the market.

Complexity in Emergence

In addition to the benefits AR offers, there are aspects of AR introduction that will be difficult to overcome. The technology is no more immune to themes of cybersecurity, privacy, and identity than other interconnected technologies. Given the pervasive nature of these themes, it is natural that we will, at some point, need to tackle these complex techno-political questions together, as partners in equilibrium with end users.

While resolving cybersecurity, privacy and identity issues is not on the AREA docket in the immediate future, you can imagine the role the AREA will play in the future as an important actor in the AR ecosystem.

Your Role in the AREA

The AREA is now open to all classes of membership. Why should you join?

I encourage you to take a few moments to learn about the AREA’s value proposition to your company and customers. Reflect on how your value chain could benefit from having technology that increases workplace safety and product quality, while reducing manufacturing and operational costs and helping to streamline workflow processes.

Then ask yourself: if your company had the chance to educate the market about the benefits of AR in the enterprise, would it benefit? What about the chance to be part of collective, exploratory research that advances the AR market? If you believe these questions make sense and your company needs to be at this table, then consider joining the IEEE and becoming an AREA member.

I look forward to meeting you at one of our many upcoming events and discussing the important issues that AREA members will be tackling for the benefit of humanity.

 




APX Labs’ Milestones in Enterprise Smart Glasses

Portions of this article were published in SAP Startup Focus in its March 26 newsletter.

Since we began in this field in 2011, countless smart glasses prototypes, working samples and production units have passed through the APX R&D lab. Predating Google Glass, we had developed rapid prototyping capabilities to build smart glasses prototypes using available components. Because we entered the smart glasses industry earlier than most, our early engineering efforts were broader than those of the enterprise software company we have become; the nascent market necessitated technical coverage spanning all aspects of hardware, software, user interface design, human-computer interaction methods and systems thinking. Dropping smart glasses device engineering and some of the low-level software from our core expertise subsequently opened a path for APX to do more with less.

Large enterprises across the globe have spent many billions of dollars over decades building out electronic knowledge bases of the information needed to get work done. This means that mission-critical data for the deskless and hands-on workforce already exists in the enterprise; the imperative now is to enable a seamless, bidirectional flow of information between the Enterprise Resource Planning (ERP) ecosystem and users, while refining user interactions in a contextually aware and intuitive manner. Our Skylight product, an enterprise software platform for smart glasses, helps bridge the gap between enterprise information systems and smart glasses users who need contextually relevant data, accessible heads-up and hands-free.

Our skill today is in keeping up with rapidly changing technology. To illustrate how challenging this can be, let’s look back at the different hardware options that have been available to the enterprise customer. I hope this visually guided tour of smart glasses marking milestone moments in APX’s history demonstrates how quickly the technology has advanced in a short period of time, and builds excitement and anticipation for a diversifying ecosystem of emerging devices continuing the next industrial revolution driven by wearable technology.

US Army Smart Glasses, Multiple Generations (2011-2013)

Our company’s history goes back to when we were originally selected to build software for smart glasses used by the United States Army. The biometrics application, nicknamed Terminator Vision, used the onboard camera to capture faces within the soldier’s field of view, send the captured data to a server to determine the identity of the person(s) and display the information in a heads-up and hands-free manner to the user.

Advanced for its time in terms of delivering a fully embedded, single-device-does-all smart glasses solution, these smart glasses featured an end-to-end exchange of field-collected data from the user’s environment, which was analyzed by a back-end system and delivered to the user in real time.

Augmented Reality Smart Glasses Prototype (Late 2012)

Smartglasses2

In 2012 we broadened our software capabilities to address the non-military market targeting global companies with a deskless and hands-on workforce. We commissioned several prototypes to learn more about the nuances of the ideal hardware for enterprise smart glasses. The ones pictured above used two display modules, each containing a microdisplay, a rudimentary 50:50 beam splitter (light from the environment and the microdisplay are mixed evenly to create visible content to the user), and an illumination source. A 3D printed and painted frame for the headset was designed in-house along with the control module enclosure.

This particular prototype allowed us to experiment with different content presentation options (2D, ultrawide 2D and stereoscopic 3D modes), sensor payloads (visible and infrared camera, motion tracker, microphone, etc.) and computing platforms. It demonstrated there is no single perfect design covering all industrial scenarios and confirmed that enterprise smart glasses follow the same paradigm as all other tools used in the workplace—the right tools or glasses for the right job. 

Epson Moverio BT-100EC Prototype (February 2013)

For APX’s first prototype based on the Epson Moverio BT-100, we added a 9-axis inertial measurement unit (accelerometer, gyroscope and magnetometer) coupled to an Arduino platform, along with a 5MP camera and microphone enclosed in a 3D-printed housing. This in turn was wired into a daughter board for Epson’s control unit containing a battery, a video signal converter and a USB hub. Finally, we used an Android phone for additional control and management.

This prototype represented a milestone at APX—we had the ability to produce devices inexpensively for our developers, partners and customers, albeit in a limited fashion (inexpensive at the time meant $3,000-5,000).

Made famous by press coverage and by demos at the YouTube Sandbox at Google I/O 2013, this device cemented our presence in the industrial sector as one of the first Epson-derived prototypes. Essentially functionally equivalent to Epson’s BT-200 smart glasses released a year later, it was the first device APX prototyped in our partnership with Epson.

Google Glass (April 2013)

The release of Google Glass was a milestone for the smart glasses industry for many reasons, not least because one of the largest technology companies in the world had introduced a fully integrated smart glasses device at the relatively modest price of $1,500. This sparked significant interest from startups, venture capital and large corporations. Overnight, smart glasses went from being exotic devices reserved for researchers and the military to publicly available goods.

The Glass product announcement in 2012 accelerated development across the nascent industry and spurred others to take a deeper look at it. Google’s entry had ripple effects in the hardware industry as well, considerably increasing the pace at which companies have introduced new devices since.

APX’s vision has always been that smart glasses will fundamentally transform the way the global workforce will build, fix and move goods, delivering enhancements in productivity, efficiency and safety. Glass’ innovations and the market presence it created represented an important step in that direction.

Glass, of course, has seen its ups and downs, recently bringing the consumer- and app developer-facing Explorer program to an end, but the Glass at Work program, of which APX was the founding partner in April 2014, continues to thrive.

Vuzix M100 (December 2013)

Vuzix is a well-known name in the smart glasses industry, having developed see-through displays since 2005 (not surprisingly, also for the military). Its M100 was the first generally available industry-targeted device, complete with an ANSI-rated safety glasses attachment, and has since paved the way alongside Google Glass in setting the standard for heads-up, monocular smart glasses.

APX and Vuzix have an official partnership; the M100 integrates the fourth release of Skylight, increasing the selection of devices available for enterprises to deploy across a diverse set of use cases.

Epson Moverio BT-200 (March 2014)

Epson’s second-generation Moverio product incorporates the sensors we had added to the BT-100EC prototype and is the first generally available pair of stereoscopic see-through smart glasses. The device also integrates Skylight for industrial AR use cases such as two-way video conferencing and workflow information in the worker’s field of view.

These use cases were demonstrated live at SAP CEO Bill McDermott’s SAPPHIRE 2014 keynote address, marking a decisive change for smart glasses in enterprise, with the technology being publicly demonstrated by a major ERP player at its largest conference. With a suggested retail price of $699, the low cost of the device provided additional incentives for enterprises to pilot and experiment with the technology for their workplace scenarios.

Sony SmartEyeglass (February 2015)

Since early 2014, Sony has showcased several iterations of the SmartEyeglass concept at multiple industry conferences. At the 2015 Consumer Electronics Show, Sony, APX and SAP Startup Focus partnered to demo an enterprise smart glasses solution. Sony provided the smart glasses hardware, SAP provided ERP data from Work Manager and HANA, and APX’s Skylight furnished the user experience that extended the data to wearable devices. This combination enabled user-context awareness, mobile device management and information security rule enforcement, and brought advanced media to and from users equipped with smart glasses.

Recon Jet (March 2015)

Although the Jet smart glasses product from Recon Instruments is produced primarily for the sports industry, we believe its design, which balances wearability, user comfort, function, robustness and price, will have a positive influence on future smart glasses designs for enterprise. The Jet has attributes that are desirable for enterprise applications: a sleek and easily wearable design that can withstand harsh outdoor environments, consumer-level pricing and availability, and an interchangeable lens design.

What’s Next?

In only four years, smart glasses technology has evolved from costly research prototypes of limited capability and availability into devices that are broadly accessible, wearable and enterprise ready. We have seen growing interest from the largest global companies in building the connected workplace for their deskless and hands-on workforce, and we believe the market for smart glasses is just getting started.

Going back to the beginning of our product timeline, we initially invested heavily in smart glasses because we recognized their potential suitability to enterprise use cases. The latest version of our Skylight product is scalable, connects to enterprise data sources and supports commercially available models of smart glasses. We are also preparing for emerging trends that will interconnect smart glasses with other mobile devices. With smartwatches taking center stage in 2015, we are extending Skylight support to a growing wearable ecosystem.

We couldn’t be more excited, both by how far we’ve come and by where we’ll go as the wearable market takes off in the enterprise. We’re also very excited to work alongside an industry full of partners, customers and research institutions as a founding member of the Augmented Reality for Enterprise Alliance (AREA). The enterprise smart glasses market requires active participation of the full value chain of enterprise mobility. Device manufacturers, software developers, system integrators, consulting agencies, and academic and research institutions will all need to collaborate to deliver on the needs of customers. There is an elevated sense of personalization that smart glasses and other wearable devices bring to users, and defining the optimized user experience will be a critical task for everyone in the industry. While improving the user experience and capabilities of our own product line, we recognize that the evolution of the entire enterprise smart glasses value chain requires contributions from an entire industry.

The insights gathered from collaborating with other AREA members will help improve the quality of the experience for our customers, developers and partners. APX is striving to work with fellow visionaries to accelerate the adoption of enterprise smart glasses technology, generating ripple effects much greater than the mere sum of the AREA members’ capabilities.




AR: A Natural Fit for Plant Floor Challenges

Much has been made recently of how Augmented Reality will soon merge our digital lives with our real ones, bestowing new powers on our social and working existence. Recent advances in technology have lulled us into believing that AR in the workplace is just around the corner. Many of us have looked forward to high-tech glasses, watches and other wearables finally delivering that promise, inspired by viral YouTube videos showing workers wearing glasses with full-field-of-view AR displays. However, this has yet to materialize.

The recent withdrawal of Google Glass and the general failure of wearables to meet expectations have pushed public perception of enterprise AR rapidly from Gartner, Inc.’s Peak of Inflated Expectations into the Trough of Disillusionment.

AR Is a Natural Fit for Solving Plant Floor Challenges

Gartner has pigeonholed AR technology into the digital marketing niche. This is possibly the result of highly visible and successful AR brand engagement campaigns, such as those run by sports teams, automobile companies and even business-to-business marketers. The Augmented Reality feature provided in the IKEA catalog companion application demonstrates how AR can be useful as well as drive consumer brand engagement. These campaigns and useful applications primarily address the outward-facing communication needs of the brands and are measured in terms of greater sales or customer loyalty.

Turning towards business operations, those of us involved in the manufacturing and automation field see AR as a way to address many plant floor challenges. Here are a few examples of common plant floor issues, which we believe are a natural fit for enhancement with mobile AR.

Plant Floor Problems and How AR Helps

1. Problem: When following a procedure, workers often spend time trying to identify the part of the machine or adjustment point that requires their attention.
   How AR helps: Visually identify and direct workers to the specific part or adjustment point that requires their attention.

2. Problem: Workers performing an unfamiliar or infrequent task spend time searching in manuals for procedures that match the task or asking co-workers for help.
   How AR helps: Provide contextual visual instructions showing workers how to correctly perform unfamiliar tasks.

3. Problem: Workers spend time searching for data and resources that uniquely identify the equipment on which they are working.
   How AR helps: Identify equipment or processes and visually display relevant data and resources.

4. Problem: Technical resources required to evaluate and efficiently respond to unplanned downtime events are not available in real time.
   How AR helps: Provide visual communication tools that give users and remote resources a common, real-time or “snapshot” view of the equipment or process.

Table: Potential AR solutions to common plant floor problems

It’s very tempting for an engineering team to develop an eye-catching AR application for a demonstration and to suggest that the technology just as easily addresses more complex problems. In practice, these solutions are usually implemented by experts using software toolkits rather than commercial off-the-shelf software, and the final implementations delivered to the customer are usually highly customized. In these cases, ROI is difficult to define. iQagent’s approach to solving plant floor problems with AR is to focus first on the problems to be solved, and then to define a good mobile AR solution to address them.

Interventions are Collaborative Endeavors

One challenge we address is #4 from the table above: technical resources required to evaluate and efficiently respond to unplanned downtime events are not available in real time.

Production downtime costs are often measured in hundreds or thousands of dollars per minute. When a production line goes down, the operator must communicate with remote technical resources in order to get production running again quickly. One factor preventing effective communication is the education gap between the operator and engineer; operators aren’t engineers, and engineers aren’t used to operating the equipment through all phases of the process. Each has specialized technical and procedural knowledge that can contribute to a solution, but traditional channels such as phone, text or e-mail aren’t perfect tools for collaboration. The operator must decide which details are important to convey to the engineer, and the engineer must find the right questions to ask in order to get a clear picture of the problem. Due to the prohibitive cost of production downtime, this effort has a very small window in which to be effective. At some point, the decision must be made to get the human resource on-site in order to return the line to normal production.

We then considered why engineers and operators are more efficient in resolving production downtime issues when collaborating in person. The operator can directly show the problem to the engineer, referring to live process values and performance indicators relevant to the process from the local automation system. The engineer can analyze the problem in real time, asking the right questions of the operator in order to resolve the problem.

A successful mobile solution duplicates the benefits of in-person collaboration, allowing each participant to effectively contribute their specialized knowledge to a common view of the process, including live data and operational details from the automation systems that are relevant to the problem.

This particular solution is a great fit for AR-enhanced software on a mobile device.

Augmented Reality with iQagent

iQagent uses the device’s video camera to identify a unique piece of equipment by scanning a QR code. The software overlays relevant process values and associated data on the camera’s displayed video feed, which can also be recorded as a snapshot or video, providing the common view of the process that remote collaboration requires. Operators can also annotate directly on the images or video, making notes and drawing attention to areas of interest for the engineer to analyze, in effect “showing” the problem. When finished, the user saves and e-mails the video to the remote technician, who now has a much more complete picture of the problem and, in many cases, can resolve the issue more efficiently.
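The scan-and-overlay flow described above can be sketched in a few lines. This is not iQagent’s implementation: the QR payload format, equipment IDs and process-value source below are all hypothetical stand-ins, and the camera’s QR decoding is assumed to have already produced a string.

```python
# Stand-in for live values from the plant automation system (hypothetical).
PROCESS_VALUES = {
    "PUMP-07": {"flow_lpm": 118.4, "pressure_bar": 2.1, "status": "RUNNING"},
}

def parse_qr_payload(payload: str) -> str:
    """Extract an equipment ID from a decoded QR string like 'EQUIP:PUMP-07'.
    The 'EQUIP:' prefix is an assumed convention for this sketch."""
    prefix, _, equipment_id = payload.partition(":")
    if prefix != "EQUIP" or not equipment_id:
        raise ValueError(f"unrecognized QR payload: {payload!r}")
    return equipment_id

def build_overlay(equipment_id: str, annotations: list[str]) -> list[str]:
    """Compose the text lines overlaid on the camera feed for one device:
    identity first, then live process values, then operator markup."""
    values = PROCESS_VALUES.get(equipment_id, {})
    lines = [f"Equipment: {equipment_id}"]
    lines += [f"  {key} = {value}" for key, value in sorted(values.items())]
    lines += [f"  note: {note}" for note in annotations]  # operator markup
    return lines

overlay = build_overlay(parse_qr_payload("EQUIP:PUMP-07"),
                        ["vibration near intake"])
```

The point of the sketch is the separation of concerns the article describes: the QR code supplies only an identity, the automation system supplies the data, and the operator’s annotations ride on top of both in a single shared view.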

We feel iQagent is a great solution to some common plant floor challenges. But having a great product isn’t an end but a beginning. To make any product a success, you have to get it in front of users who need it, and you must support and continually improve the product. This is why we joined the AR for Enterprise Alliance. The AREA enables us to collaborate with other like-minded AR solution providers, end users and customers. Through education, research and collaboration, we will help to move AR out of the Trough of Disillusionment, up the Slope of Enlightenment and onto the Plateau of Productivity.