The AR Market in 2017, Part 3: Augmented Reality Software is Here to Stay

Previous: Part 2: Shiny Objects Attract Attention

 

There are some who advocate for integrating AR directly and deeply into enterprise content management and delivery systems in order to leverage the IT systems already in place. Integrating AR features into existing IT reduces the need for a separate technology silo for AR. I fully support this school of software architecture, but we are far from having the tools for enterprise integration today. Before this becomes possible, IT groups must learn to manage software with which they are currently unfamiliar.

An AR Software Framework

Generating and presenting AR to users requires combining hardware, software and content. Software for AR serves three purposes:

  1. To extract the features of, recognize, track and “store” (manage and retrieve the data for) the unique attributes of people, places and things in the real world;
  2. To “author” interactions between the human, the digital world and real world targets found in the user’s proximity, and publish the runtime executable code that presents AR experiences; and
  3. To present the experience to, and manage the interactions with, the user while recognizing and tracking the real world.

We will see changes in all three of these segments of AR software in 2017.
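
To make the division of labor concrete, here is a minimal sketch of how the three purposes relate to one another. The class and method names are entirely hypothetical and do not correspond to any vendor's SDK; the sketch only illustrates the framework.

```python
# Hypothetical sketch of the three AR software roles described above.
# None of these names correspond to a real SDK; they only illustrate the framework.

from dataclasses import dataclass

@dataclass
class Target:
    """A real-world thing the system can recognize (purpose 1)."""
    target_id: str
    features: bytes          # stored unique attributes, e.g. an image descriptor set

@dataclass
class Augmentation:
    """Digital content authored for a target (purpose 2)."""
    target_id: str
    asset_uri: str           # 3D model, text, video, etc.
    on_select: str           # interaction to run when the user taps or gazes

class Recognizer:
    """Purpose 1: recognize and track stored targets in each camera frame."""
    def __init__(self, targets: list[Target]):
        self.targets = targets

    def detect(self, camera_frame) -> list[str]:
        # Real SDKs return recognized target IDs plus a 6-DoF pose; omitted here.
        return []

class Presenter:
    """Purpose 3: present augmentations registered to recognized targets."""
    def __init__(self, recognizer: Recognizer, augmentations: list[Augmentation]):
        self.recognizer = recognizer
        self.index = {a.target_id: a for a in augmentations}

    def render_frame(self, camera_frame):
        for target_id in self.recognizer.detect(camera_frame):
            augmentation = self.index.get(target_id)
            if augmentation:
                print(f"Overlay {augmentation.asset_uri} on {target_id}")
```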

Wait, applications are software, aren’t they? Why aren’t they on the list? Before reading further about the AR software trends I’m seeing, I recommend you read a post on LinkedIn Pulse in which I explain why the list above does not include thousands of AR applications.

Is it an AR SDK?

Unfortunately, there is very little consistency in how AR professionals refer to the three types of software in the framework above, so some definitions are in order. Many professionals simply refer to everything having to do with AR as SDKs (Software Development Kits).

In my framework, AR SDKs are tools with which developers create or improve required or optional components of AR experiences. They are used for all three of the purposes above. If the required and optional components of AR experiences are not familiar to you, I recommend reviewing the post mentioned above for a glimpse of (or watching this webinar for a full introduction to) the Mixed and Augmented Reality Reference Model.

Any software that extracts features of the physical world in a manner that captures the unique attributes of the target object or that recognizes and tracks those unique features in real time is an AR SDK. Examples include PTC Vuforia SDK, ARToolkit (Open Source SDK), Catchoom CraftAR SDK, Inglobe ARmedia, Wikitude SDK and SightPath’s EasyAR SDK. Some AR SDKs do significantly more, but that’s not the topic of this post.

Regardless of what it’s called, the technology to recognize and track real world targets is fundamental to Augmented Reality. We must have some breakthroughs in this area if we are to deliver the benefits AR has the potential to offer enterprises.

There are promising developments in the field, and I am hopeful that these will be more evident in 2017. Each year the AR research community meets at the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), and there are always exciting papers focused on tracking. At ISMAR 2016, scientists at Zhejiang University presented their Robust Keyframe-based Monocular SLAM. It appears much more tolerant of fast motion and strong rotation, which we can expect to see more frequently when people who are untrained in the challenges of visual tracking use wearable AR displays such as smart glasses.

In another ISMAR paper, a group at the German Research Center for Artificial Intelligence (DFKI) reported that they used advanced sensor fusion employing a deep learning method to improve visual-inertial pose tracking. While using acceleration and angular velocity measurements from inertial sensors to improve visual tracking has shown promising results for years, we have yet to see these benefits materialize in commercial SDKs.
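
The general idea of visual-inertial fusion can be seen without any deep learning: inertial measurements predict how the pose evolves between camera frames, and the visual estimate corrects the accumulated drift. The constant-weight blend below is a deliberately simplified, hypothetical sketch of that loop, not the DFKI method; real systems use Kalman-style filters or learned models, and handle gravity and orientation properly.

```python
import numpy as np

def integrate_imu(position, velocity, accel, dt):
    """Dead-reckon position from accelerometer samples between camera frames.
    Gravity compensation and orientation handling are omitted for brevity."""
    velocity = velocity + accel * dt
    position = position + velocity * dt
    return position, velocity

def fuse(predicted_position, visual_position, visual_confidence):
    """Blend the IMU prediction with the visual estimate.
    A fixed-weight blend stands in for the Kalman or learned fusion used in practice."""
    w = np.clip(visual_confidence, 0.0, 1.0)
    return w * visual_position + (1.0 - w) * predicted_position

# Toy usage: one IMU prediction step followed by a visual correction.
position = np.zeros(3)
velocity = np.zeros(3)
position, velocity = integrate_imu(position, velocity, np.array([0.0, 0.2, 0.0]), dt=0.01)
position = fuse(position, visual_position=np.array([0.0, 0.001, 0.0]), visual_confidence=0.8)
print(position)
```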

As with any software, the choice of an AR SDK should be based on project requirements, but in practical terms the factors most important to developers today appear to be a combination of time to market and support for Unity. I hope that, with technology transfer from projects like those presented at ISMAR 2016, improved sensor fusion can be implemented in commercial solutions (in the OS or at the hardware level) in 2017.

Unity Dominates Today

A growing number of developers are learning to author AR experiences. Many find the Unity 3D game development environment highly flexible and its rich developer ecosystem valuable. But there are other options worthy of careful consideration. In early 2016 I identified over 25 publishers of software for enterprise AR authoring, publishing and integration. For an overview of the options, I invite you to read the AREA blog post “When a Developer Needs to Author AR Experiences.”

Products in the AR authoring group will slowly mature and improve. With a few mergers and acquisitions (and some complete failures), the number of choices will decline, and I believe that by the end of 2017 fewer than 10 will hold virtually all of the market share.

By 2020 there will be a few open source solutions for general-purpose AR authoring, similar to what is available now for authoring Web content. In parallel with the general-purpose options, excellent AR authoring platforms optimized for specific industries and use cases will emerge.

Keeping Options for Presenting AR Experiences Open

Today the authoring environment defines the syntax for the presentation, so there is really little alternative for the user but to install and run the AR execution engine published by the provider of the authoring environment.

I hope that we will see a return of the browser model (or the emergence of new Web apps) so that it becomes possible to separate the content of experiences from the AR presentation software. To achieve this separation, and to lower the overhead for developers of maintaining dozens of native AR apps, there needs to be consensus on formats, metadata and workflows.
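
One way to picture that separation is an experience description that any conforming AR presentation engine could load, much as any Web browser can render HTML. The manifest below is purely hypothetical; its keys and structure are invented for illustration and follow no published specification.

```python
# Hypothetical, non-standard experience manifest illustrating the separation of
# authored content from the presentation engine. All field names are invented.

experience_manifest = {
    "experience": "pump-maintenance-step-3",
    "targets": [
        {"id": "pump-housing", "type": "object", "feature_data": "targets/pump.dat"},
    ],
    "augmentations": [
        {
            "target": "pump-housing",
            "asset": "models/torque-arrow.glb",
            "anchor": {"offset_m": [0.0, 0.12, 0.0]},
            "interaction": {"on_select": "show-instructions"},
        },
    ],
    "metadata": {"author": "example", "version": "0.1"},
}
```

With a shared description like this, the same authored content could, in principle, be presented by any engine that understands the format, instead of being locked to the authoring vendor's runtime.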

Although not in 2017, I believe some standards (it’s unclear which) will emerge to separate all presentation software from the authoring and content preparation activities.

Which software are you using in your AR projects and what are the trends you see emerging?

 

Next: Navigating the way to continuous AR delivery
