Three Key Elements of an Enterprise AR System

Organizations seeking to increase their efficiency by means of AR can begin today with very little risk. Many of the essential ingredients for success are already in place, and additional tools are available at comparatively low cost.

It is helpful to think of AR as being a “plate” sitting upon three key elements: content (data), hardware and software.

Content

The information revolution of the last century has produced a flood of digital data. The spectrum of data suitable for AR-assisted viewing ranges from small databases of enterprise assets or resources all the way to massive, continually expanding information repositories, often referred to as “Big Data.” Sophisticated analyses must be performed on Big Data to extract benefits that can then be made accessible to AR technology.

In current enterprise information systems, digital information assets often have metadata that associates them with the real-world people, businesses, places or objects to which they pertain. For example, information about facilities or utilities typically includes a street address or latitude/longitude coordinates that enable it to be correctly displayed in a Web browser or other software when requested. A 3D model of a part in a power plant might display a barcode in an on-screen field that permits another information retrieval system to associate that part with a particular pump or compressor in need of service or replacement.
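
To make this concrete, here is a minimal sketch of such asset metadata in Python. It is illustrative only, not any particular system’s schema: the record fields, asset and barcode values, coordinates and model URL are all hypothetical.

```python
# A hypothetical asset record linking digital content to a physical part.
from dataclasses import dataclass

@dataclass
class AssetRecord:
    asset_id: str   # enterprise asset identifier (hypothetical)
    barcode: str    # code printed on the physical part
    lat: float      # latitude of the installation site
    lon: float      # longitude of the installation site
    model_url: str  # where the 3D model or work-order data lives

ASSETS = [
    AssetRecord("P-1042", "4006381333931", 47.6205, -122.3493,
                "https://example.com/models/pump-1042.glb"),
]

def lookup_by_barcode(barcode):
    """Resolve a scanned barcode to the asset record it annotates."""
    for record in ASSETS:
        if record.barcode == barcode:
            return record
    return None

print(lookup_by_barcode("4006381333931"))
```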

Hardware

Producing an AR experience requires capturing the context of the user, performing transformations, comparing the user context with triggers in a database and producing signals that present digital data, also known as “augmentations,” to the user’s senses.
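
As a rough orientation, the loop just described might be sketched like this in Python; the trigger database, pattern name, sensor stub and output are all illustrative assumptions, not a real AR runtime:

```python
# Capture user context -> compare with triggers -> present an augmentation.
TRIGGERS = {
    # recognized pattern -> augmentation to present to the user
    "pump_qr_1042": "Overlay: maintenance checklist for pump P-1042",
}

def capture_context():
    """Stand-in for real sensor input (camera frame, pose, etc.)."""
    return {"detected_pattern": "pump_qr_1042", "distance_m": 1.4}

def match_triggers(context):
    """Compare the user context with the trigger database."""
    return TRIGGERS.get(context["detected_pattern"])

context = capture_context()
augmentation = match_triggers(context)
if augmentation is not None:
    print(augmentation)  # stand-in for a visual, audio or tactile signal
```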

Capturing the user context is the task of hardware sensors. Sensors come in many sizes and shapes, and they can be mounted on, or integrated and embedded into, many objects in the enterprise. Observations generated by sensors can be filtered and fused; combined, they serve as real-time inputs to either a processor on the local device or a network-connected server. The outputs of this real-time computation are the detection and recognition of patterns produced by the physical world.
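
As a toy illustration of “filtered and fused” observations, the sketch below blends a fast-but-drifting gyroscope rate with a noisy-but-stable accelerometer tilt estimate using a complementary filter. The sample interval, blend weight and readings are invented for the example:

```python
# Complementary filter: fuse two sensors into one orientation estimate.
DT = 0.02     # 50 Hz sample interval (assumed)
ALPHA = 0.98  # weight given to the integrated gyro reading (assumed)

def fuse(angle, gyro_rate, accel_angle):
    """Blend gyro integration (fast, drifts) with accel tilt (noisy, stable)."""
    return ALPHA * (angle + gyro_rate * DT) + (1 - ALPHA) * accel_angle

angle = 0.0
for gyro_rate, accel_angle in [(5.0, 0.2), (4.8, 0.3), (5.1, 0.4)]:
    angle = fuse(angle, gyro_rate, accel_angle)
print(f"fused orientation estimate: {angle:.3f} degrees")
```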

Once patterns are recognized and matched with digital data that has been encoded for use in AR experiences, this data and its associated instructions must be delivered to a hardware device on or near the user. The hardware must produce signals that can be detected by the user’s auditory, tactile or visual senses: signals detected by hearing are sounds; signals detected by the tactile senses are vibration, pressure or temperature changes; and signals detected by sight are light waves.
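
One way software might route an augmentation to the matching output hardware is sketched below; the channel names and handler functions are illustrative, not drawn from any particular AR toolkit:

```python
# Dispatch one augmentation to the sense channel it targets.
def play_sound(payload):   print(f"[audio]  {payload}")   # sound signal
def vibrate(payload):      print(f"[haptic] {payload}")   # tactile signal
def draw_overlay(payload): print(f"[visual] {payload}")   # light signal

CHANNELS = {"audio": play_sound, "haptic": vibrate, "visual": draw_overlay}

def present(augmentation):
    """Route an augmentation to the handler for its sense channel."""
    CHANNELS[augmentation["channel"]](augmentation["payload"])

present({"channel": "visual", "payload": "Replace seal on compressor C-7"})
```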

Today, the hardware used for most enterprise AR projects consists of mass-market, consumer-grade smartphones or tablets. These platforms provide integrated hardware and software that is inexpensive, familiar to end users and easily managed. There are also devices designed specifically for enterprise AR use, such as commercially available smart glasses and tablets that integrate AR technology. This category of hardware is growing and will diversify in the future. The needs of a project will drive AR architects to choose among consumer hardware, made-for-enterprise standard hardware and custom-built hardware.

Software

All AR projects involve software in multiple ways. Software used during the preparation (design and publishing) of experiences comes in many flavors and levels of sophistication. Software is also crucial to preparing the data that will be presented to the user during the experience.

Publishing and delivering AR experiences is performed by software-controlled systems operating in and across networks.

Finally, AR experiences require software to perform four functions (sketched in the example after this list):

  • Detect patterns in sensor observations
  • Interpret user context
  • Track user changes with respect to the target and various triggers
  • Produce hardware-generated sounds, tactile signals or visible augmentations
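
The toy loop below touches all four functions: it assumes a pattern has already been detected, interprets the user’s position as context, tracks the distance to a target, and produces a (printed) augmentation. Every position, threshold and message is invented for illustration:

```python
import math

TARGET = (2.0, 0.0)  # target position in a local 2D frame (assumed)

def track(user_pos):
    """Track the user's distance to the target as they move."""
    return math.dist(user_pos, TARGET)

for user_pos in [(0.0, 0.0), (1.0, 0.0), (1.8, 0.1)]:
    distance = track(user_pos)
    if distance < 0.5:  # proximity trigger (assumed threshold)
        print("Show overlay: target reached")
    else:
        print(f"Guide user: {distance:.2f} m to target")
```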

An application may be dedicated to AR functions, or AR features can be embedded into another enterprise application.

During an AR experience there may also be capture and logging of the user’s interactions with physical and digital assets. Software generates timestamps that are associated with the readings captured by sensors and stored for many secondary applications.
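
A minimal sketch of such time-stamped logging appears below; the log file name, event name and record schema are assumptions made for the example:

```python
import json
import time

LOG_PATH = "ar_session.log"  # hypothetical log destination

def log_event(event, reading):
    """Append a sensor reading with a software-generated timestamp."""
    entry = {"ts": time.time(), "event": event, "reading": reading}
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(entry) + "\n")

log_event("pattern_recognized", {"pattern": "pump_qr_1042", "distance_m": 1.4})
```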

Conclusion

With content, hardware and software, an AR architect has the three key elements. The important task of combining them into meaningful AR experiences requires skills that can be acquired in classrooms, webinars and other learning support systems.

Combined, content, hardware and software form a three-legged stool upon which many AR-assisted scenarios and use cases can rest.
