I remember the day in 1992 when I learned about this new way to get information: it was called Gopher. Via my modem on a phone line that I was using for CompuServe access, I connected my computer to a server. The next year I installed the first browser, Mosaic, on my Apple Macintosh and experienced the Web for the first time. I never asked myself whether there were standards involved but, of course, we know now that they played a role in the Web becoming what it is today. An array of standards gives you the ability to read this page on any device.
Technology standards for interoperability have a long history. The emergence of the Web, and of the World Wide Web Consortium (W3C) to define and maintain standards for it, is within memory for some of those reading this page, but many other industries had to develop standards as they went. Some adopted standards only after painful lessons. For example, the plugs and outlets that people use to connect devices to the electrical grid all conform to the power standards defined and adopted in their country. The fact that these are national standards, not international ones, was one such painful lesson. As recently as 15 years ago, we had to carry transformers and plug adapters with us if we wanted to use household appliances or computers designed for one national standard in places with different standards.
Folks with whom I was working at the time I started using the Web were developing interoperability standards for telephone networks. The regional Bell Operating Companies were trying out new technologies (not only those developed in Bell Labs and provided by “Ma Bell”). They wanted to go around AT&T to reach long distance operators in Europe and elsewhere. My exposure in the mid-1990s to the challenges they were addressing marked me. There were compelling business imperatives behind the requirement for telephone interoperability standards: standards are the basis for large, highly complex systems reaching millions of people and generating billions of dollars in revenues. The key concept here is scalability.
Jump ahead to 2009
I had been working in Augmented Reality for several years but it wasn’t until 2009 that devices capable of offering the general public their first mobile AR experiences (smartphones with cameras and GPS) hit the market. As you’ll read in this post by Thomas K. Carpenter, a lot happened in AR that year. A 2009 post on Adweek asked whether 2010 would be the “year of Augmented Reality.”
2009 is the year I heard Mike Liebhold and Damon Hernandez refer to “Open AR” in the context of a workshop co-organized by the Institute for the Future. I had an “ah ha!” moment. I felt that if AR was going to really reach its full potential, it was time to begin advocating for interoperability of components and services for Augmented Reality. In order for AR technology to scale, to integrate with everything—every person, place and thing on the planet—it cannot operate in a vacuum. It must be part of the full technology equation. It must benefit from breakthroughs in artificial intelligence and Big Data. AR must be integrated and combined with a host of other underlying technologies to become the user interface for the Internet of Things, our communications and collaboration platforms, and even our power grid. Well, the interface for the physical world. In short, AR components must be interoperable.
In conjunction with Mobile World Congress 2010 and with the support of Dan Appelquist (then at Vodafone, now at Samsung), I organized the first workshop about barriers to the growth of mobile AR. There were many barriers.
Is now the time?
Many of the obstacles we talked about in 2010 are lower today, but low interoperability remains on the list of critical barriers. Over the following years, while talking about this issue and in the context of a dozen more meetings of members of the grassroots Community for Open and Interoperable Augmented Reality, I’ve listened to hundreds of people defend proprietary technologies as part of their Augmented Reality pipeline. Many of them make strong cases for, and experience the benefits of, innovating without observing standards. But many of their technologies remain limited in reach and impact. Without interoperability, their ability to enable AR to evolve and spread to new use cases is lower.
Phone companies and energy companies have understood for decades that, due to the business model of their providers, closed technology silos have a role but are not scalable. A proprietary technology silo enables one company to define a vertically-integrated stack, suite, or system of components such that developments are entirely under the control of the provider. The provider decides when, and how much, to invest in the total system or in any part of it.
Interoperability of the data and components with which to deploy AR features in enterprise IT is a key requirement for large customers in many industries, but it is currently unmet. Due to the risk that closed AR technology silos could delay or control their future investment in AR, many large enterprises are holding back. To change their position, stakeholders in AR must take all steps necessary to accelerate the emergence and success of open and interoperable AR.
I’m not alone in expressing this position. For over seven years, I have had the pleasure of working with other people around the world who share the vision of widespread AR based on interoperability. One of those is John Simmins, a technical executive at the Electric Power Research Institute (EPRI).
In 2016, John and I asked ourselves whether all the conditions would soon align, whether we were getting closer to the AR interoperability threshold. To answer this question, we embarked on an EPRI research project designed to assess all the standards and projects to date and to inform the broader community of stakeholders about the importance of interoperability.
What’s your answer?
The first output of this project, the EPRI Enterprise Augmented Reality Vision, Interoperability Requirements, and Standards Landscape report was released last week. Like the development of standards by members of the W3C when the Web was young, this report is public and designed to serve the whole ecosystem of stakeholders.
And, similar to the work initiated by the SmartGrid Interoperability Panel (originally put in place with the assistance of NIST) to ensure the future of the Smart Grid, this report proposes an interoperability framework.
The AR Interoperability Framework (source: EPRI Enterprise Augmented Reality Vision, Interoperability Requirements, and Standards Landscape Report).
The EPRI report surveys the current landscape of relevant standards in the context of the new framework, as well as in the AR authoring, publishing, and presentation pipeline and the MAR Reference Model. Never before have all the different standards activities that have been started and could contribute to interoperable AR been cataloged and described in this level of detail. It’s not a light read. It’s a reference work.
The release of the EPRI report is a milestone, but only as important as what follows. Interoperability is still poorly understood by many stakeholders and difficult to achieve in the best of circumstances. That means that this report must be more than a catalog to capture the status of standards in early 2017. It is released at no cost and without restriction so that it can become a discussion starter and the basis for many types of workshops and meetings, and other forms of stakeholder collaboration.
The cover image we chose (which appears at the top of this post) focuses on the interoperability problem and the need for multiple stakeholders to focus and collaborate on the topic. If you feel that interoperability is important for AR to scale now or in the future, we invite you to contribute to the discussion with your colleagues. Organize your own workshops, or look for and participate in meetings organized by others. And, as new standards and activities emerge, please let us know about them so that we may include them in updates to the report.