
The AR Market in 2017, Part 4: Enterprise Content is Not Ready for AR

Previous: Part 3: Augmented Reality Software is Here to Stay

 

As I discussed in a LinkedIn Pulse post about AR apps, we cannot expect users to run a different app for each real world target they want to use with AR or one monolithic AR application for everything in the physical world. It is unscalable (i.e., far too time-consuming and costly). It’s unclear precisely when, but I’m confident that we will, one day, rely on systems that make content ready for AR presentation as a natural result of digital design processes.

The procedures or tools for automatically converting documentation or any digital content into AR experiences for enterprise use cases are not available. Nor will they emerge in the next 12 to 18 months. To begin the journey, companies must develop a path that leads from current procedures that are completely separate from AR presentation to the ideal processes for continuous AR delivery.

Leaders need to collaborate with stakeholders to focus on areas where AR can make a difference quickly.

Boiling the Ocean

There are hundreds of AR use cases in every business, and all AR project managers should maintain a catalog of possible use cases. Developing that catalog begins with identifying the challenges facing a business. As simple as this sounds, revealing challenges increases exposure and reduces confidence in existing people and systems, so most of the data for this process is buried or destroyed before it escapes. Without data on the size and type of challenges in a business unit, the AR advocate is shooting in the dark, and the risk of focusing on the wrong use cases and challenges is too high.

Initiatives are needed to help AR project managers and engineers focus on the problems most likely to be addressed with AR. The organizational change group would be a natural one to drive such initiatives, once its managers are themselves trained to identify the challenges best suited for AR.

In 2017, I expect that some systems integration and IT consulting companies will begin to offer programs that take a methodical approach through the AR use case development process, as part of their services to clients.

Prioritization is Key

How do stakeholders in a company agree on the highest priority content to become AR experiences for their top use cases? It depends. There must be consistent monitoring of AR technology maturity and, in parallel, the use case requirements need to be carefully defined.

To choose the best use case, priorities need to be defined. If users perceive a strong need for AR, that should weigh heavily. If content for use in the AR experience is already available, then the costs and time required to get started will be lower.

A simple method of evaluating the requirements appears below. Each company needs to define their own priorities based on internal drivers and constraints.


A simple process for prioritizing AR use cases (Source: PEREY Research & Consulting).
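As one way to make such a process concrete, here is a minimal weighted-scoring sketch in Python. The criteria names, weights and candidate use cases are all illustrative assumptions, not part of the figure; each company should substitute its own drivers and constraints.

```python
# Hypothetical weighted-scoring sketch for ranking AR use cases.
# Criteria and weights are illustrative only; replace them with your
# company's own internal drivers and constraints.

CRITERIA_WEIGHTS = {
    "user_need": 0.4,      # how strongly users perceive a need for AR
    "content_ready": 0.3,  # content for the experience already exists
    "tech_maturity": 0.2,  # required AR technology is mature enough
    "low_risk": 0.1,       # low organizational / operational risk
}

def score_use_case(ratings: dict) -> float:
    """Combine 0-5 ratings for each criterion into one weighted score."""
    return sum(CRITERIA_WEIGHTS[k] * ratings.get(k, 0) for k in CRITERIA_WEIGHTS)

def prioritize(use_cases: dict) -> list:
    """Return use-case names sorted from highest to lowest score."""
    return sorted(use_cases, key=lambda name: score_use_case(use_cases[name]),
                  reverse=True)

# Hypothetical candidates rated by stakeholders on a 0-5 scale.
candidates = {
    "remote expert support": {"user_need": 5, "content_ready": 2,
                              "tech_maturity": 4, "low_risk": 3},
    "warehouse picking":     {"user_need": 3, "content_ready": 4,
                              "tech_maturity": 3, "low_risk": 4},
    "assembly guidance":     {"user_need": 4, "content_ready": 1,
                              "tech_maturity": 2, "low_risk": 2},
}
ranking = prioritize(candidates)
```

Even a rough model like this forces stakeholders to make their priorities explicit before debating individual use cases.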

Markers Won’t Stick

One of the current trends in enterprise AR is to use markers as the targets for AR experiences. Computer vision with markers shows the user where to point their device and focus their attention, consumes less power, and can be more robust in real-world conditions than 3D tracking technologies.

However, for many enterprise objects that are subject to sun, wind and water, markers are not a strategy that will work outside the laboratory. Those companies that plan to use AR with real-world targets that can’t have markers attached need to begin developing a new content type: trackables using natural features.
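At the heart of a natural-feature trackable is descriptor matching: the SDK stores compact descriptors of an object's visual features and matches them against descriptors extracted from the live camera feed. The sketch below illustrates the idea with toy binary descriptors and Hamming distance; real SDKs use descriptors such as ORB or BRIEF extracted from images, and the data here is invented for illustration.

```python
# Conceptual sketch of marker-less matching with natural features.
# Descriptors here are toy 8-bit integers; real trackables store many
# binary descriptors (e.g., ORB/BRIEF) computed from object images.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two binary descriptors."""
    return bin(a ^ b).count("1")

def match_descriptors(reference: list, live: list, max_dist: int = 8) -> list:
    """For each live descriptor, find the closest reference descriptor.
    A match is kept only if its Hamming distance is at most max_dist."""
    matches = []
    for i, d in enumerate(live):
        j, dist = min(((k, hamming(d, r)) for k, r in enumerate(reference)),
                      key=lambda t: t[1])
        if dist <= max_dist:
            matches.append((i, j, dist))
    return matches

# A "trackable" is the stored descriptor set for a physical object.
trackable = [0b10110010, 0b01101100, 0b11110000]
frame     = [0b10110011, 0b00001111]   # one near-match, one outlier
found = match_descriptors(trackable, frame, max_dist=2)
```

Enough good matches between the stored trackable and the live frame let the SDK recognize the object and estimate its pose, with no marker attached.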

In 2017 more enterprise AR project managers will be asking for SDKs and tools to recognize and track the physical world without markers. For most, the technologies they will test will not meet their requirements. If well managed, the results of testing in 2017 will improve the SDKs as suggested in our post about AR software.

The AR Ecosystem and Technology are Immature

While the title of this post suggests that enterprise content is not yet in the formats, or associated with the metadata, needed to make AR experiences commonplace, the reverse statement is also true: not all the required AR components are ready for enterprise introduction.

Projects I’ve been involved with in 2016 have shown that while there are a few very solid technologies (e.g., tracking with markers on print), most components of AR solutions with which we are working are still very immature. The hardware for hands-free AR presentation is one area that’s changing very rapidly. The software for enterprise AR experience authoring is another. As more investments are made, improvements in the technology components will come, but let’s be clear: 2017 will not be the year when enterprise AR goes mainstream.

If you have seen the results of one or two good proofs of concept, there will be many people who need your help to learn about AR. One of the important steps in that education process is to participate in the activities of the AREA and to share with others in your company or industry how AR could improve workplace performance.

When your team is ready to introduce AR, call in your change management group. You will need all the support you can get to bring the parts of this puzzle together in a successful AR introduction project!

Do you have some predictions about what 2017 will bring enterprise AR? Please share those with us in the comments to this post. 




The AR Market in 2017, Part 3: Augmented Reality Software is Here to Stay

Previous: Part 2: Shiny Objects Attract Attention

 

There are some who advocate for integrating AR directly and deeply into enterprise content management and delivery systems in order to leverage the IT systems already in place. Integration of AR features into existing IT reduces the need for a separate technology silo for AR. I fully support this school of software architecture. But, we are far from having the tools for enterprise integration today. Before this will be possible, IT groups must learn to manage software with which they are currently unfamiliar.

An AR Software Framework

Generating and presenting AR to users requires combining hardware, software and content. Software for AR serves three purposes:

  1. To extract the features, recognize, track and “store” (manage and retrieve the data for) the unique attributes of people, places and things in the real world;
  2. To “author” interactions between the human, the digital world and real world targets found in the user’s proximity, and publish the runtime executable code that presents AR experiences; and
  3. To present the experience to, and manage the interactions with, the user while recognizing and tracking the real world.

We will see changes in all three of these segments of AR software in 2017.
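The three purposes above can be sketched as one pipeline. The function names and data shapes below are illustrative assumptions, not any vendor's API; the point is only to show how feature storage, authoring and presentation hand off to one another.

```python
# Schematic sketch of the three software purposes as one pipeline.
# All names and data shapes are illustrative assumptions.

FEATURE_STORE = {}   # purpose 1: manage unique attributes of real things

def register_target(name, features):
    """Extract and 'store' the unique attributes of a real-world target."""
    FEATURE_STORE[name] = features

def author_experience(target_name, content, on_select):
    """Purpose 2: bind digital content and interactions to a target and
    'publish' a runtime description an execution engine can present."""
    return {"target": target_name, "content": content, "on_select": on_select}

def present(experience, recognized_features):
    """Purpose 3: recognize the target in the camera feed (simplified here
    to a set intersection) and, if found, present the experience."""
    stored = FEATURE_STORE[experience["target"]]
    if stored & recognized_features:   # any shared features -> recognized
        return f"showing {experience['content']}"
    return "searching..."

register_target("pump", {"f1", "f2", "f3"})
exp = author_experience("pump", "maintenance steps", on_select="show_details")
result = present(exp, {"f2", "f9"})
```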

Wait, applications are software, aren’t they? Why aren’t they on the list? Before reading further about the AR software trends I’m seeing, I recommend you read a post on LinkedIn Pulse in which I explain why the list above does not include thousands of AR applications.

Is it an AR SDK?

Unfortunately, there is very little consistency in how AR professionals refer to the three types of software in the framework above, so some definitions are in order. A lot of professionals just refer to everything having to do with AR as SDKs (Software Development Kits).

In my framework AR SDKs are tools with which developers create or improve required or optional components of AR experiences. They are used in all three of the purposes above. If the required and optional components of AR experiences are not familiar to you, I recommend reviewing the post mentioned above for a glimpse of (or watching this webinar for a full introduction to) the Mixed and Augmented Reality Reference Model.

Any software that extracts features of the physical world in a manner that captures the unique attributes of the target object or that recognizes and tracks those unique features in real time is an AR SDK. Examples include PTC Vuforia SDK, ARToolkit (Open Source SDK), Catchoom CraftAR SDK, Inglobe ARmedia, Wikitude SDK and SightPath’s EasyAR SDK. Some AR SDKs do significantly more, but that’s not the topic of this post.

Regardless of what it’s called, the technology to recognize and track real world targets is fundamental to Augmented Reality. We must have some breakthroughs in this area if we are to deliver the benefits AR has the potential to offer enterprises.

There are promising developments in the field and I am hopeful that these will be more evident in 2017. Each year the AR research community meets at the IEEE International Symposium on Mixed and Augmented Reality (ISMAR) and there are always exciting papers focused on tracking. At ISMAR 2016, scientists at Zhejiang University presented their Robust Keyframe-based Monocular SLAM. It appears much more tolerant of fast motion and strong rotation, which we can expect to see more frequently when people who are untrained in the challenges of visual tracking use wearable AR displays such as smart glasses.

In another ISMAR paper, a group at the German Research Center for Artificial Intelligence (DFKI) reported that they have used advanced sensor fusion, employing a deep learning method, to improve visual-inertial pose tracking. While using acceleration and angular velocity measurements from inertial sensors to improve visual tracking has shown promising results for years, we have yet to see these benefits materialize in commercial SDKs.
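To make the idea of visual-inertial fusion concrete, here is a toy complementary-filter sketch, an assumption about how such fusion works in general, not a description of DFKI's deep-learning method. The camera provides accurate but slow and occasionally lost orientation readings; the gyroscope provides fast but drifting ones; blending the two each step keeps tracking stable.

```python
# Toy complementary filter fusing a visual orientation estimate with
# gyroscope rate integration. All numbers are invented for illustration.

def fuse_orientation(visual_deg, gyro_rate_deg_s, dt=0.01, alpha=0.98):
    """visual_deg: per-step visual angle estimates in degrees (None when
    visual tracking is lost); gyro_rate_deg_s: angular velocity samples.
    Returns the fused angle after each step."""
    angle = visual_deg[0] or 0.0
    fused = []
    for cam, rate in zip(visual_deg, gyro_rate_deg_s):
        predicted = angle + rate * dt          # integrate inertial data
        if cam is None:                        # visual tracking lost:
            angle = predicted                  # coast on the gyro alone
        else:                                  # otherwise blend both
            angle = alpha * predicted + (1 - alpha) * cam
        fused.append(angle)
    return fused

# One visual sample is lost (None); the gyro bridges the gap.
angles = fuse_orientation([0.0, None, 1.0, 1.2], [10.0, 10.0, 10.0, 10.0])
```

Commercial implementations are far more sophisticated (full 6-DoF poses, bias estimation, Kalman or learned filters), but the principle of letting inertial data bridge visual dropouts is the same.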

Like any software, the choice of AR SDK should be based on project requirements, but in practical terms the factors most important to developers today appear to be a combination of time to market and support for Unity. I hope that, with technology transfer from projects like those presented at ISMAR 2016, improved sensor fusion can be implemented in commercial solutions (in the OS or at the hardware level) in 2017.

Unity Dominates Today

A growing number of developers are learning to author AR experiences. Many developers find the Unity 3D game development environment highly flexible and the rich ecosystem of developers valuable. But, there are other options worthy of careful consideration. In early 2016 I identified over 25 publishers of software for enterprise AR authoring, publishing and integration. For an overview of the options, I invite you to read the AREA blog post “When a Developer Needs to Author AR Experiences.”

Products in the AR authoring group are going to slowly mature and improve. With a few mergers and acquisitions (and some complete failures), the number of choices will decline and I believe that by the end of 2017, fewer than 10 will have virtually all the market share.

By 2020 there will be a few open source solutions for general-purpose AR authoring, similar to what is available now for authoring Web content. In parallel with the general purpose options, there will emerge excellent AR authoring platforms optimized for specific industries and use cases.

Keeping Options for Presenting AR Experiences Open

Today the authoring environment defines the syntax for the presentation, so there's little alternative for the user but to install and run the AR execution engine published by the authoring environment provider.

I hope that we will see a return of the browser model (or the emergence of new Web apps) so that it will be possible to separate the content for experiences from the AR presentation software. To achieve this separation and lower the overhead for developers to maintain dozens of native AR apps, there needs to be consensus on formats, metadata and workflows.

Although not in 2017, I believe some standards (it’s unclear which) will emerge to separate all presentation software from the authoring and content preparation activities. 

Which software are you using in your AR projects and what are the trends you see emerging?

 

Next: Navigating the way to continuous AR delivery




The AR Market in 2017, Part 2: Shiny Objects Attract Attention

Previous: Part 1, Connecting the Dots

 

There’s a great deal of attention being paid to the new, wearable displays for Augmented Reality. Hardware permits us to merge the digital and physical worlds in unprecedented ways. Wearable hardware delivers AR experiences while the user is also able to use one or both hands to perform tasks. The tendency to pay attention to physical objects is not unique to AR industry watchers. It is the result of natural selection: genes that gave early humans the ability to detect and respond quickly to fast moving or bright and unusual objects helped our ancestors survive while others lacking those genes did not.

Although this post focuses on the hardware for Augmented Reality, I don’t recommend focusing exclusively on advancements in AR hardware when planning for success in 2017. The hardware is only valuable when combined with the software, content and services for AR in specific use cases.

Now, considering primarily AR hardware, there are important trends that we can’t ignore. This post only serves to highlight those that, in my opinion, are the most important at an industry-wide level and will noticeably change in 2017.

Chips accelerate changes

Modern Augmented Reality hardware benefits hugely from the continued reduction in size and cost in hardware components for mass market mobile computing platforms. We need to thank all those using smart phones and watches for this trend.

As the semiconductor manufacturers gain experience and hard-code more dedicated vision-related computation into their silicon, the performance of complete AR display devices is improving. Intel RealSense, combined with the technology Intel recently acquired from Movidius (which will produce significant improvements in wearable display performance beyond 2017), is an example of a chip-driven technology to monitor. Other offerings will likely follow from NVIDIA and Apple in 2017.

When available for production, the improvements in semiconductors for wearable AR devices will be measurable in terms of lower latency to recognize a user’s environment or a target object, less frequent loss of tracking, higher stability in the digital content that’s rendered, lower heat and longer battery life. All these are gradual improvements, difficult to quantify but noticeable to AR experts.

As a result of the optimization of key computationally-intensive tasks (e.g., 3D capture, feature extraction, graphics rendering) in lower-cost hardware, the next 12 to 18 months will bring new models of AR display devices, and not just a few models or many models produced in small batches.

These next-generation wearable display models with dedicated silicon will deliver at least a basic level of AR experience (delivery of text and simple recognition) for an entire work shift. Customers will begin to place orders for dozens and even, in a few cases, hundreds of units.

Optics become sharper

In addition to semiconductors, other components will be changing rapidly within the integrated wearable AR display. The next most important developments will be in the display optics. Signs of this key trend were already evident in 2016 – for example, when Epson announced the OLED optics designed for the Moverio BT-300.

It’s no secret that over the next few years, optics will shrink in size, drop in weight and demand less power. In 2017, the size and weight of fully functional systems based on improved optics for AR will decline. Expect smart glasses to weigh less than 80 grams. Shrinking the optics will make longer, continuous and comfortable use more likely.

Developers raised issues about color quality and fidelity when testing devices introduced in 2015 and 2016. Color distortion (such as an oil-slick rainbow effect) varies with the type of optics and the real-world scene at which the user is looking (the pattern is particularly noticeable on large white surfaces). The 2017 models will offer “true” black and higher-fidelity colors in a wider range of settings. Again, the experts will feel these improvements first and “translate” them to their customers.

Another key area of improvement will be the Field of View. Some manufacturers will announce optics with 50° diagonal (a few might even reach 80° diagonal) in 2017. When combined with advanced software and content, these changes in optics will be particularly important for making AR experiences appear more realistic.

Combined with new polychromatic materials in lenses, lower weight and stronger material in the supports, optics will be more tolerant of changes in environmental conditions, such as high illumination, and will fit in more ruggedized packages.

More options to choose from

Speaking of packaging, as of 2016 there are three form factors for AR displays:

  • monocular “assisted reality” hardware that clips onto other supports (frames) or can be worn over a user’s ear,
  • smart glasses that sit on the user’s nose bridge and ears, and
  • head-worn displays that use straps and pads and a combination of the user’s ears, nose and skull for support.

The first form factor does not offer an immersive experience and isn’t appropriate for all use cases, but assisted reality systems have other significant advantages (e.g., lower cost, longer battery life, lighter weight, easy to store) so they will remain popular in 2017 and beyond.

At the opposite end of the spectrum, the highly immersive experiences offered by head-worn devices will also be highly appealing for different reasons (e.g., depth sensing, accuracy of registrations, gesture-based interfaces).

We need to remember that the use cases for enterprise AR are very diverse, and the displays available to users can be equally diverse. The new wearable AR display device manufacturers entering the fray in 2017 will stay with the same three general form factors but offer more models.

In addition to diversity within these three form factors there will be extensions and accessories for existing products – for example, charging cradles, corrective lenses, high fidelity audio and materials specifically designed to tolerate adverse conditions in the workplace environment.

The results of this trend are likely to include:

  • those selling wearable displays will be challenged to clearly explain new features to their potential customers and translate these features into user benefits,
  • those integrating AR displays will be more selective about the models they support, becoming partners with only a few systems providers (usually leaning towards the bigger companies with brand recognition), and
  • buyers will need to spend more time explaining their requirements and aligning their needs with the solutions available in their budget range.

Wearable display product strategists will realize that with so many use cases, a single user could need to have multiple display models at their disposal. One possible consequence of this trend could be reduced emphasis on display systems that are dedicated to one user. We could see emergence of new ways for multiple users in one company or group to reserve and share display systems in order to perform specific tasks on schedule.

Rapid personalization, calibration and security will offer new opportunities to differentiate wearable AR display offerings in 2017.

Enterprise first

All of these different form factors and options are going to be challenging to sort out. Outside enterprise settings, consumers will not be exposed to the hardware diversity in 2017. They simply will not invest the time or the money.

Instead, companies offering new hardware, even the brands that have traditionally marketed to mass market audiences, will target their efforts toward enterprise and industrial users. Enterprises will increase their AR hardware budgets and develop controlled environments in which to compare AR displays before they can really make informed decisions at corporate levels. Third party services that perform rigorous product feature evaluations will be a new business opportunity.

While this post highlights the trends I feel are the most important when planning for success with AR hardware in 2017, there are certainly other trends on which companies could compete.

To learn more about other options and trends in wearable AR displays in 2016, download the EPRI Technology Innovation report about Smart Glasses for AR in which I offer more details.

What are the trends you think are most important in AR hardware and why do you think they will have a significant impact in 2017?

 

Next: AR software matures and moves toward standardization




The AR Market in 2017, Part 1: Connect the Dots

In your profession, you’re one of those who are most aware of future technologies. The proof of this fact is that you have discovered Augmented Reality and decided that it’s sufficiently important to dedicate at least a few minutes or hours to getting oriented and staying informed about the trends.

That’s the first step. But you know enough not to believe everything you read or see in a YouTube video.

The next step, if you haven’t done so already, is to train yourself to separate the biggest hype from the facts. This is not easy, but you should be able to hit this milestone by attending industry events where AR is being demonstrated and you can put your hands on the products in action, even under highly controlled conditions. Visiting one or more of the AREA members in their offices or inviting them to visit your facility will be even more valuable.

You’ll see some mock ups and, if you ask tough questions, you will also see some of the weaknesses and begin to glimpse the complexity of the problems facing adoption of these technologies. Keep a log of these experiences you have with Augmented Reality and the impressions they leave on you.

If you really want to understand the strengths and weaknesses “up close” and have budget, you can develop a project or participate in a group project that focuses on a well-defined use case.

Share what you learn

Once you’ve seen and captured notes about more than 10 live demonstrations in your journal and have personally touched AR, you can begin to “translate” for others what you’re seeing and doing.

But, wait! The insights you’ve acquired could offer a strategic advantage to your company so, why would you share them? Even if you are thinking that you should keep what you’ve gathered to yourself, I encourage you to share because AR is more than just another new technology offering you or your group a competitive advantage. This is going to be a major crowd-sourced, multi-year project. When more people are looking into AR technology, it will improve faster than when only a few are focusing on and investing in it in isolation.

Once AR is good enough to be used weekly (or daily) in more than one use case, it is going to push operational performance to new levels. Then you will be able to use it to full advantage.

AR may become as transformational to your company and industry as the Web and mobile devices have been during your professional career. But it requires more than one or two examples and adopters in an industry. Reaching a threshold level of adoption in your industry will be necessary, and to begin meaningful adoption there need to be a few experts. We need people like you to translate the theory and potential of AR in your industry into practice and reality.

I’ve found that I can translate for others what I’m observing by breaking it down into four interrelated topics: hardware, software, content and services. For over a decade I’ve used these four legs of the AR platform to organize projects, to review the history of AR and to capture current status.

In a series of AREA blog posts I am sharing developments I believe will be important in AR in 2017 using this simple framework.

Connecting the dots around us

One observer can’t see all the details of the entire AR landscape, certainly not in all industries where the technology will apply. Fortunately, the AREA is a network of very bright minds looking forward, and in other directions, at the same time.

Many AREA members in the trenches of Augmented Reality take on a forward-looking challenge when, at the end of each year, they begin preparing their forecast for the following year.

I hope that these posts will permit you to find your place and connect yourself, and that in your comments you will compare and contrast what you’ve observed with my experience.

If we each take a few minutes, hours or a day in this last quarter of 2016 to connect our dots together we will all be better equipped to concretely plan for an exciting year ahead!

Next: What’s new for AR hardware in 2017?




When a Developer Needs to Author AR Experiences, Part 1

There’s an established process for creating a new Web page. If it’s not already available, you begin by defining and developing the content. Then, there’s the formatting. Often there’s some scripting to provide interactivity. When the “authoring” is done, a page is published.

It’s not all that different for AR. Once an Augmented Reality project’s use case is clear, the experiences come about through an authoring process that resembles that of preparing and publishing content for the Web.

authoring-cycle

Figure 1. An AR authoring system combines trackables (created using features of the real world and a tracking library) with digital content that is encoded into presentation data and then assigned interactive functions (e.g., see more details, show relevant info, move and freeze in position, hide/close). The AR authoring system uses databases to store the scene elements – trackables, presentation data and interactions. (Source: PEREY Research & Consulting)
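The scene elements named in Figure 1 can be sketched as a minimal data model. All class and field names below are illustrative assumptions, not any real authoring system's schema; the point is to show how trackables, presentation data and interactions relate.

```python
# Minimal data-model sketch of the scene elements in Figure 1.
# Names and fields are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class Trackable:
    """Features of a real-world object used for recognition/tracking."""
    name: str
    feature_data: bytes

@dataclass
class Presentation:
    """Digital content encoded for display (a model, video, text...)."""
    asset_uri: str
    media_type: str

@dataclass
class Interaction:
    """An interactive function attached to the augmented content."""
    trigger: str    # e.g. "tap", "gaze"
    action: str     # e.g. "show_details", "freeze_in_position"

@dataclass
class ARScene:
    """What an authoring system stores: one trackable plus its content."""
    trackable: Trackable
    presentations: list = field(default_factory=list)
    interactions: list = field(default_factory=list)

scene = ARScene(
    trackable=Trackable("pump_housing", b"\x01\x02"),
    presentations=[Presentation("models/valve.glb", "model/gltf-binary")],
    interactions=[Interaction("tap", "show_details")],
)
```

An authoring system's databases hold many such scenes, and the execution engine retrieves them at runtime when their trackables are recognized.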

Today, Content Management Systems for the Web support the steps for page development with grace. Systems like WordPress and Drupal are so easy to use and commonplace that we hardly notice their existence.

In contrast, there are many AR authoring systems from which a developer can choose and none are as mature as CMS for the Web. The choice of tool and approach depends on the project requirements, skills of the developer and the resources available.

Define the AR Project Requirements

Before choosing an AR authoring system, the requirements must be clear. An AR experience design process should generate a storyboard and, from the storyboard, the following factors are defined:

  • User settings (indoor, outdoor, noise levels, etc.)
  • Need for a user management system to provide experience personalization or tracking
  • Need for live communication with any remote experts during the experience
  • Type of recognition and tracking required (marker, 3D, SLAM, etc.)
  • Need to access device GPS and compass for geospatial context
  • Preferred display device (smartphone, tablet, smart glasses or another type of HMD)
  • Human interaction modalities (hands-free, touch, speech, gaze)

In addition to the above variables that can be deduced from the storyboard, there could be other factors to consider. For example, if the target device is connected by an IoT protocol or if there are any supplementary files (e.g., videos, PDFs, etc.), then these need to be provided to the developer as early as possible. The project manager should also specify the frequency and types of updates that may be required after the initial AR experience is introduced to users.
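The factors above can be captured in a simple structure the team reviews before selecting a tool. This is a hedged sketch: the field names mirror the bullet list, while the values and the example consistency check are illustrative assumptions.

```python
# Sketch: storyboard-derived requirements as a reviewable structure.
# Field names follow the bullet list above; values are illustrative.

requirements = {
    "setting": {"location": "indoor", "noise_level": "high"},
    "user_management": True,        # personalization / usage tracking
    "remote_expert_calls": False,
    "tracking_type": "3D",          # marker | 3D | SLAM ...
    "needs_geolocation": False,     # GPS and compass access
    "display_device": "smart glasses",
    "interaction_modalities": ["hands-free", "speech"],
    "supplementary_files": ["manual.pdf"],
    "update_frequency": "quarterly",
}

def unmet_hands_free(req: dict) -> bool:
    """Example consistency check: hands-free interaction conflicts with
    handheld display devices that occupy the user's hands."""
    return ("hands-free" in req["interaction_modalities"]
            and req["display_device"] in ("smartphone", "tablet"))

conflict = unmet_hands_free(requirements)
```

Writing requirements down this way makes contradictions visible early, before they surface as change requests during development.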

When these project requirements and parameters are defined, the developer can choose the tools best suited for the AR experience authoring.

Want to know more about your choices of authoring platforms? There’s more in the next post.




When a Developer Needs to Author AR Experiences, Part 2

This post is a continuation of the topic introduced in another post on the AREA site.

Choose a Development Environment

Someday, the choice of an AR development environment will be as easy as choosing a CMS for the Web or an engineering software package for generating 3D models. Today, it’s a lot more complicated for AR developers.

Most of the apps that have the ability to present AR experiences are created using a game development environment, such as Unity 3D. When the developer publishes an iOS, Windows 10 or Android app in Unity 3D, it is usually ready to load and will run using only local components (i.e., it contains the MAR Scene, all the media assets and the AR Execution Engine).

Although there’s a substantial learning curve with Unity, the developer community and the systems to support the community are very well developed. And, once using Unity, the developer is not limited to creating only those apps with AR features. The cost of the product for professional use is not insignificant but many are able to justify the investment.

An alternative to using a game development environment and AR plugin is to choose a purpose-built AR authoring platform. This is appropriate if the project has requirements that can’t be met with Unity 3D.

Though they are not widely known, there are over 25 software engineering platforms designed specifically for authoring AR experiences.

authoring-landscape

Table 1. Commercially Available AR Authoring Software Publishers and Solutions (Source: PEREY Research & Consulting).

The table above lists the platforms I identified in early 2016 as part of a research project. Please contact me directly if you would like to obtain more information about the study and the most current list of solutions.

Many of the AR authoring systems are very intuitive (featuring drag-and-drop actions and widgets presented through a Web-based interface); however, most remain unproven and their respective developer communities are relatively small.

Some developers of AR experiences won’t have to learn an entirely new platform because a few engineering software publishers have extended their platforms designed for other purposes to include authoring AR experiences as part of their larger workflow.

Or Choose a Programming Language

Finally, developers can write an AR execution engine and the components of the AR experience into an app “from scratch” in the programming language of their choice.

To take advantage of a specific chip set or AR display and optimize AR experiences for the best possible performance, some developers write in a language that compiles to native machine code (e.g., C++), which the AR display device can run directly.

Many developers already using JavaScript are able to leverage their skills to access valuable resources such as WebGL, but creating an application in this language alone is slow and, depending on the platform, could fail to perform at the levels users expect.

To reduce some of the effort and build upon the work of other developers, Argon.js and AWE.js are Open Source JavaScript frameworks for adding Augmented Reality content to Web applications.

Results Will Depend on the Developer’s Training and Experience

In my experience, it’s difficult to draw a line between the selection of an AR authoring tool or approach and the quality or richness of the final AR application. The sophistication and quality of the AR experience in an app is a function of both the tools chosen and the skills of the engineers. When those behind the scenes (a) ensure the digital content is well suited to delivery in AR mode; (b) choose the components that match requirements; and (c) design the interactions well, a project will have the best possible impacts.

As with most things, the more experience the developer has with the components that the project requires, the better the outcomes will be. So, while the developer has the responsibility for choosing the best authoring tool, it is the AR project manager’s responsibility to choose the developer carefully.




How Optical Character Recognition Makes Augmented Reality Work Better

Today, companies in many industries seek to develop AR and VR applications for their needs; the range of existing Augmented Reality solutions extends from gimmicky marketing apps to B2B software. One such solution helps manufacturers train their workers on the job by augmenting service steps onto broken machines.

Augmented Reality could help designers or architects see a product while it is still in development. It could facilitate marketing and sales, because customers can already “try on” a product from a digital catalog. Or it could support warehouse systems, assisting users in the picking and sorting process.

The list of opportunities is endless, and new use cases are constantly arising. The whole point of using AR is to make processes easier and faster. While Augmented Reality and devices like smart glasses at first seemed too futuristic, new use cases make them increasingly suitable for everyday use in the workplace.

Recognizing Objects and Characters

Augmented Reality is based on a vital capability: object recognition. For a human being, recognizing a multitude of different objects is not a challenge; even objects partially obstructed from view can still be identified. For machines and devices, however, this remains difficult, and for Augmented Reality it is crucial.

A smartphone or smart glasses can’t display augmented overlays without first recognizing the object. The device has to be aware of its surroundings and adapt its display in real time to each situation, even as the camera’s viewing angle changes. Augmented Reality applications use object detection and recognition to determine what information should be added to the display. They also use object tracking to follow an object’s movements continuously rather than re-detecting it in every frame. That way the object stays in the frame of reference even as the device moves around.
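The detect-then-track pattern described above can be sketched as a simple control loop. The `detect` and `track` functions below are hypothetical stand-ins for a real computer-vision library, operating on fake frame data purely so the control flow is visible:

```python
def detect(frame):
    """Full-frame search for the target object (expensive).
    Returns its position, or None if the object is not visible."""
    return frame.get("object_pos")  # pretend detection result

def track(frame, last_pos):
    """Cheap frame-to-frame update of a known position.
    Fails (returns None) when the object jumps too far or leaves the frame."""
    pos = frame.get("object_pos")
    if pos is None or abs(pos[0] - last_pos[0]) > 50:
        return None  # tracking lost; caller must re-detect
    return pos

def ar_loop(frames):
    """For each camera frame, keep the overlay anchored to the object."""
    last_pos = None
    overlay_positions = []
    for frame in frames:
        if last_pos is None:
            last_pos = detect(frame)           # expensive full search
        else:
            last_pos = track(frame, last_pos)  # cheap incremental update
            if last_pos is None:
                last_pos = detect(frame)       # fall back to re-detection
        overlay_positions.append(last_pos)     # None -> draw no overlay
    return overlay_positions

frames = [
    {"object_pos": (10, 20)},   # first sighting: detected
    {"object_pos": (14, 22)},   # small motion: tracked
    {"object_pos": None},       # object left the frame
    {"object_pos": (100, 80)},  # re-detected
]
print(ar_loop(frames))  # → [(10, 20), (14, 22), None, (100, 80)]
```

The point of the split is cost: detection searches the whole frame, while tracking only updates a known position, which is what lets real AR systems keep overlays anchored at camera frame rates.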

Character recognition is also crucial for a device’s understanding of the environment: depending on the use case, it may need not only to recognize objects but also to “read” them. This yields an even better sense of which information is important to process.

Optical Character Recognition

Optical Character Recognition (OCR) deals with the problem of recognizing optically processed characters, such as those in the featured image above. Both handwritten and printed characters can be recognized and converted into computer-readable text; any serial number or code consisting of numbers and letters can be transformed into digital output. Put very simply, the captured image is preprocessed, then the characters are extracted and recognized. Many current applications, especially in automation and manufacturing, use this technology.
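The preprocess/extract/recognize pipeline can be illustrated with a toy template-matching recognizer. Everything here is invented for illustration: the 3x3 “font,” the pixel values and the threshold. Real OCR engines are vastly more sophisticated, but the stages are the same:

```python
GLYPHS = {  # 3x3 binary templates for three made-up characters
    "1": (0,1,0, 0,1,0, 0,1,0),
    "7": (1,1,1, 0,0,1, 0,1,0),
    "L": (1,0,0, 1,0,0, 1,1,1),
}

def binarize(pixels, threshold=128):
    """Step 1: preprocessing - reduce grayscale pixels to black (1) / white (0)."""
    return [1 if p < threshold else 0 for p in pixels]

def segment(bits, n_chars, width=3, height=3):
    """Step 2: split a (height x n_chars*width) image into per-character cells."""
    cells = []
    row_len = n_chars * width
    for c in range(n_chars):
        cell = []
        for y in range(height):
            start = y * row_len + c * width
            cell.extend(bits[start:start + width])
        cells.append(tuple(cell))
    return cells

def recognize(cell):
    """Step 3: pick the template with the fewest mismatching pixels."""
    return min(GLYPHS, key=lambda ch: sum(a != b for a, b in zip(cell, GLYPHS[ch])))

def ocr(pixels, n_chars):
    return "".join(recognize(c) for c in segment(binarize(pixels), n_chars))

# A 3x6 grayscale image containing "1" and "L" (0 = black ink, 255 = paper).
pixels = [255,   0, 255,    0, 255, 255,
          255,   0, 255,    0, 255, 255,
          255,   0, 255,    0,   0,   0]
print(ocr(pixels, 2))  # → 1L
```

Matching by minimum pixel mismatch also makes the sketch tolerant of a little noise, which hints at why preprocessing quality (lighting, thresholding, segmentation) matters so much in practice.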

What OCR doesn’t take into account is the actual nature of the object being scanned; it simply “looks” at the text to be converted. Combining Augmented Reality with OCR therefore opens new opportunities: not only is the object itself recognized, but so is the text printed on it. This increases the amount of information about the environment the device gathers, and improves the decision support offered to users.

The Potential of OCR

Data import via OCR still demands high processing power and camera resolution, and is expensive. Nevertheless, OCR offers a viable alternative to voice recognition or typed input.

Using OCR with smart glasses can improve many kinds of business processes. Imagine a warehouse worker who needs both hands free to do his job efficiently. Smart glasses that overlay virtual information on his environment already make him more effective, and the ability to scan codes printed on objects just by glancing at them frees his hands for other tasks.

Another example would be the automation of meter reading. When a device identifies the meter hanging on a wall, along with its shape and size, and then automatically scans its values, a greater number of meters can be read per day. This use case could be valuable to energy providers.

When you look around, you will realize how many numbers, letters and codes need to be either written down or typed into a system every single day. Such error-prone processes can become much less painful with OCR.




3D Studio Blomberg at Augmented World Expo 2016

Our team at 3D Studio Blomberg, along with key partners, travelled to Santa Clara, California, to attend the Augmented World Expo. The event is the largest annual conference and exhibition about Augmented Reality worldwide, with over 4000 attendees and 250 exhibitor booths. During the two days, I had the opportunity to make interesting new contacts, meet other AREA members, see and try a variety of innovative AR and VR solutions and attend the enterprise AR tracks hosted by the AREA.

Larger Players Entering the Market

Judging by the offerings on display at AWE, the ecosystem for enterprise AR products and services is expanding. Players like PTC (Vuforia), Osterhout Design Group (ODG) and Microsoft (with HoloLens) had visibly increased their footprint at the event, and even the presence of VR products at an AR show confirmed the overall trend of a growing ecosystem. Microsoft’s HoloLens, presented at the Vuforia booth, showed impressive technical capabilities. We view all this as a positive development, as it will bring increased competition and more innovative market offerings.

AR in Enterprise Sessions

The AREA-hosted AR in enterprise track featured speakers and AREA members on a diverse range of topics, from IoT to security. The sessions were interesting, but they also highlighted the array of challenges still facing companies seeking to implement Augmented Reality in the workplace. One fundamental takeaway was that widespread adoption of AR in industry isn’t solely a matter of AR technology itself; it also depends on steady improvements in the surrounding mix of technologies such as IoT and Big Data. As these enabling technologies improve, they make the value proposition of AR even more compelling.

Another insight from the sessions concerned mental models: how we imagine innovations should work often turns out to be quite different from how they work in reality. We need to avoid this pitfall when thinking about AR and the problems it solves.

Lastly, partnerships are essential for expanding the ecosystem and assuring its success. For example, ODG makes great smart glasses, but it needs partners that create virtual content in order to get the most out of its products. Together, these key ingredients will provide the lift needed to make AR a killer app.

Conclusion

AWE was a rich, rewarding experience that we and our partners in attendance enjoyed immensely. As content providers for AR-enabled enterprise systems, we appreciated the opportunity to meet a variety of potential partners to whom we can add value. We’re looking forward to turning the ideas gained from the conference into reality, and to contributing to the exciting and growing marketplace for Augmented Reality.




New Executive Director Reports on AWE ’16 and Members Meeting

As the incoming executive director of the AR for Enterprise Alliance, I was very excited to attend my first Augmented World Expo and to meet some of the 34 members of the AREA.

AWE is one of the largest and best-attended events worldwide about Augmented Reality, and typically hosts thousands of attendees and hundreds of companies. This year’s event was no exception and did not disappoint. I was pleased to meet a high number of innovative AR companies from the AREA provider segment and attend demos of their groundbreaking solutions. It’s clear to me that AR in enterprise is here to stay and the AREA occupies a strategic position in growing the entire ecosystem to the benefit of everyone.

Benefits of AR in Enterprise

The event gave me the opportunity to speak with a range of attendees from many companies and markets. It was exciting to be asked so many different and interesting questions, and one conclusion that came up time and again was the importance of AR in enterprise. The potential benefits and savings of AR are getting the attention of the C-suite, not just the innovation and technology teams. The trajectory towards real reductions in time, costs and errors is critical for companies as they look to streamline their businesses and increase their return on investment.

Enterprise AR Track at AWE ‘16

The focus on enterprise was supported by an impressive number of customers and providers presenting their experiences during the Enterprise AR track—sponsored by the AREA. I learned a lot from all the presentations but it was also instructive to listen to the members of the AREA’s customer segment. They were insightful and provided a unique perspective on the benefits and issues they experienced when implementing AR solutions within their companies. It’s clear that there are many lessons to learn and the AREA is well placed to help the AR ecosystem make effective and informed decisions based on shared knowledge and experience.

The AREA at AWE

At AWE we experienced a constant stream of people visiting our stand and asking questions. Many expressed appreciation for the AREA’s work and the benefits achieved for the ecosystem. A number of them even mentioned visiting the AREA website regularly when looking for information about AR, finding the AREA’s content insightful and informative.

For those who hadn’t heard of us, it was useful to discuss our mission, benefits, membership options and growth. Much interest was expressed and I hope new members will join based on these discussions.

AWE was my first real experience meeting the enterprise AR community, and it was a very useful and insightful one. I look forward to following up with the many attendees I met, and to helping drive the AREA’s development and its role in supporting this nascent ecosystem.

AREA Members Meeting

After AWE, we held an AREA Members Meeting in Palo Alto, California, on June 3. It was an honor to chair my first such meeting. AREA in-person meetings occur around three times a year and they’re a great opportunity to meet with members, discuss progress made, define future strategic plans to further develop the ecosystem and have some fun.

Thanks to Atheer for hosting the event at the beautiful Palo Alto Art Center.

The morning agenda items included:

  • Progress updates from the various AREA committees
  • Upcoming events in which the AREA can support its members

The afternoon included various brainstorming sessions around the content and the way the AREA positions itself to potential new members.

The day was full of insightful and interesting discussions, and from a personal perspective it was great to interact with many leaders and understand how we can work together as an alliance to support and grow the ecosystem and provide thought leadership to possible new customers and providers of AR.

If you are interested in joining the AREA, please complete this form.




Enterprise Augmented Reality at Laval Virtual 2016

This year’s Laval Virtual conference showcased innovations in Augmented Reality and introduced wide-ranging discussions on the topic of Augmented Reality in enterprise. On the second day, AREA board member Christine Perey hosted a round table session on the use of Augmented Reality to promote productivity. Participants included Manuel Asselot (Robocortex), Sebastian Knoedel (DIOTA), Marie-Julie Pecoult (Diginext), Pontus Blomberg (3D Studio Blomberg), Yann Froger (EON Reality) and Jim Novack (Talent Swarm).

Olivier Larroque of Capgemini provided his impressions and summarized the essential questions and answers discussed by the invited panelists in a post on the blog of RA’Pro, an AREA member. The original post in French is translated and provided in English here.

Which industry currently leads the way with Augmented Reality in Europe?

Among the panel participants there was agreement that aerospace is a leading industry. Adoption of AR in aerospace is driven by reuse of 3D content from product lifecycle management (PLM). AR use cases are most compelling, for example, where they can compensate for a technician’s lack of experience in performing a task, or where they can assist in risky or complex operations. These use cases arise in a highly regulated environment where a single error on the assembly line can cause the loss of an aircraft or satellite.

What are characteristics of use cases that are most compelling for investors?

Ideally, you should identify critical points of a business process where human errors generate the greatest cost. The return on investment (ROI) on AR as reported by companies such as Boeing and Newport News Shipbuilding shows how crucial it is for an enterprise to embark on such a project.

Boeing conducted a comparison of three different guidance methods on a satellite assembly procedure of 50 steps using instructions:

  • On a stationary PC
  • On a tablet (PDF)
  • Overlaid in the field of view (AR) using a tablet

The results were dramatic: during first-time assembly, the AR-enhanced tablet users with no prior experience of the steps committed one error, while those using the PC committed eight. On a second pass through the steps, those using AR committed no errors.

Laval Virtual Roundtable

Which prerequisites should be in place before a company implements AR?

Optimization of existing 3D content for Augmented Reality is actually more important than the choice of products or toolchains. A flexible, modular approach to adopting these new technologies makes it possible to move among varying hardware and software products and packages, and helps organizations stay a step ahead of the market.

What do you think of smart glasses?

The first step in adopting Augmented Reality is to use an AR-enhanced tablet for testing and then migrate to smart glasses if appropriate.

Smart glasses, of course, have compelling features (hands-free working, portability, etc.), but one should avoid falling into “shiny object syndrome,” or the desire to adopt technology at any price without first examining all its ins and outs. It’s essential to study what’s actually required, as well as the technical limitations.

What are the implications of adopting AR as a disruptive innovation?

The implications are threefold:

  • Social resistance to change: unions and conservative individuals within the company may be reluctant to change if AR is seen as an aspect of robotics. Communication should be oriented toward assisting humans and what they do best. Moreover, the employees themselves should be included at the heart of the discussions.
  • Enhancing procedures for Augmented Reality: start with simple tasks in workbooks or manuals. AR takes advantage of our visual processing; because it superimposes instructions to be followed in real time, operators tend to apply it instinctively to minimize errors.
  • Dealing with technological realities: over-the-top special effects in concept videos have instilled high expectations for AR. We should instead educate customers about the technology’s limitations in terms of hardware (field of view, tracking, etc.) and software. The technology should be thoroughly tested to ensure it matches the use cases the customer is targeting.

If the best hardware is not currently available, when is the best time to get started with AR?

Ideally right away. AR adoption is a long process with many different aspects (social, technological, security, etc.) that involves deep collaboration among all domains of an enterprise (operational, management, legal, etc.).

Conclusion

Augmented Reality will allow companies to approach the way their employees work more visually, with a new way of representing objects, learning new tasks and transmitting knowledge. Recall the progress made between the first MS-DOS screen and what we have today with personal computers. With new devices such as smart glasses and other products like Microsoft HoloLens, the DAQRI Smart Helmet or Magic Leap, the changes will also be very dramatic.