
Jay Kim of Upskill on the Pristine Acquisition and the Future of AR

Following Upskill’s acquisition of Pristine, the AREA caught up with Upskill’s Chief Strategy Officer, Jay Kim, to get his perspective on the deal and what it means for AR.

AREA: What were the motivating factors behind this acquisition?

Kim: Both companies saw capabilities in each other that would be a force multiplier, especially at this stage in the AR market. The early AR adopters are piloting solutions and starting to figure out how to scale enterprise-wide, but for every early adopter, there are probably a dozen companies that are still experimenting with a variety of different projects and providers, trying to understand where smart glasses can provide the highest level of economic impact today.

What we saw in Pristine was a best-in-class remote video collaboration solution called EyeSight, which offers a number of unique qualities. It performs robustly in challenging environments with limited connectivity and bandwidth. It is a 100-percent cloud-delivered solution. It supports a number of different smart glasses devices to enable HD-grade video streaming. It is also easy to deliver, and there is elegance in its simplicity.

Ultimately, we see this acquisition as a strategic advantage for a couple of reasons. First, it enables Upskill to engage the broader market with a lighter-touch solution, using their portfolio of apps, to drive exposure to AR technology across a number of industries. That means we’ll be able to accelerate customers’ exposure to smart glasses and the benefits of our technology. Second, it provides us with even stronger remote assistance and knowledge capture solutions than we previously had available, which can be integrated into our core product, Skylight. And finally, with the acquisition, we have brought on more than a dozen new staff in key areas where we needed to add talent. Culturally it was a good match, and it also deepened our bench of industry experts.

AREA: So, you’re giving enterprises an easier entry point and a clear migration path to taking on more AR capabilities over time?

Kim: Upskill has built a very powerful industrial AR platform – Skylight – that integrates quite nicely into large enterprise IT environments. Of course, Skylight is cloud capable, but it can be delivered in any enterprise IT environment, which for most of the customers we work with means it needs to be on premise. Pristine saw our platform as an opportunity to take some of the product capabilities they had built and gain a logical growth path that enables them to scale AR enterprise-wide and deliver far more than just remote collaboration, assistance and capture capabilities. That’s what the acquisition means for both of us – a much larger addressable market, as well as a greater range of use cases we can support.

AREA: Tell us about the use cases.

Kim: With EyeSight, we now have a product that is tailor-made to address field service applications in industries where cloud delivery of software is relatively common. There is a huge need for remote assistance and collaboration solutions. EyeSight is designed so that users can essentially self-serve: they can just turn on the system, launch EyeSight and get going. That’s very, very impactful. And there are several examples of where this type of application is best utilized, some of which we recently covered in an Upskill webinar.

We’re working with Coca-Cola, for instance, to provide remote support for technicians in one of their bottling facilities. The issue they face is that the suppliers of much of the equipment in the plant are based in Europe. Previously, when troubleshooting or repairs were needed to fix machines on the line, the only way to solve the problem was to fly a technician in from Europe, which was costly in itself. But that cost paled in comparison to the cost of downtime, which can equate to thousands of dollars for every second the line is not operating. Now, with EyeSight deployed, staff onsite can immediately initiate calls with their supplier counterparts in Europe when issues arise. The remote experts can accept the call from either a browser or a mobile device, instantly see the equipment, diagnose the issue, and walk the Coca-Cola technicians through the process of repairing it without ever having to step into the plant.

AREA: Can you give us some idea of potential new AR capabilities that will be enabled by this acquisition?

Kim: Pristine’s people are among the world leaders in engineering, product design and UX design in this industry. We’re extremely excited about that. With those sharp minds on the team, we will be able to accelerate some of the thinking around our next-generation product. That includes our ability to get into more complex augmented reality scenarios as we cement our leadership in the assisted reality category and look to a more immersive augmented reality world.

AREA: What does this acquisition say about the state of AR adoption in the enterprise?

Kim: There are two key points. One is the growing enterprise awareness around having solutions that span the entire value chain, from point solutions, such as EyeSight, to more holistic platforms like Skylight. A lot of the folks that have experimented with different point solutions are starting to evolve their line of thinking to say, “Okay. We’ve got it, it’s good. What’s next?” There is a greater need to understand how large enterprises are going to deploy their AR strategies to impact the greatest number of people in the organization. Skylight is an excellent fit for what they are trying to do. I think this acquisition signals that we’re looking at an era where point solution providers will need to understand how their solutions will scale across businesses and that is quickly becoming table stakes for providers.

AREA: Do you think this acquisition provides any clues about the future direction of the AR industry?

Kim: We see people taking multiple parallel converging paths in their approach to AR. You have a number of vendors that focus on handheld devices – the smart phone and tablet form factors – and delivering compelling, camera-based registered experiences. Then you have folks like Upskill that are entrenched in the assisted reality domain. This acquisition does signal the fact that the assisted reality domain is going to mature quickly. It centers on the fact that the fundamental value proposition of assisted reality is around delivering a hands-free user interface to the data and assets already available to industrial workers.

The third path is hands-free immersive AR solutions. You’ve got companies that are trying to do that with projection-based systems and devices like HoloLens that deliver a fully immersive registered experience. I don’t think there’s any question that the world is eventually heading towards this – immersive augmented reality solutions everywhere. We’re taking another step towards making that vision a reality. Of course, I represent the software side, so there have to be corresponding hardware advances that enable us to get there, but it is clear that augmented reality is quickly maturing into a solution that is scalable and impactful today, while laying down a foundation for even further advances in technology.




ABI Research: Enterprise Wearable Revenue to Hit $55bn by 2022

According to the latest study by ABI Research, revenue from enterprise wearable devices is forecast to hit $55 billion by 2022. This figure has risen significantly: according to appstechnews.com, the market was predicted to reach $10.5bn by the end of 2017, with a CAGR of 39%.
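As a rough back-of-the-envelope check (assuming, on our reading, that the 39% CAGR compounds annually on the $10.5bn 2017 base through 2022): $10.5bn × 1.39^5 ≈ $10.5bn × 5.19 ≈ $54.5bn, which is consistent with the roughly $55 billion figure cited for 2022.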

Stephanie Lawrence, research analyst at ABI Research, said in a statement: “Wearables have much less in-built security and authentication protocols than other devices and so require robust security platforms to ensure data safety. Supporting platforms allow managers and IT teams to determine what information the wearables have access to, monitor their usage, create customised applications, and remotely control the devices. This ultimately safeguards the data from being compromised.”

According to the article, wearable platforms require specific applications that can support a wide variety of devices; ABI Research believes ‘most’ wearable platforms can be supported by enterprise mobility management (EMM). To read the rest of the article and the full report, click here.




Augmented Reality Smart Glasses Market

An increasing number of IT companies and startups showcased AR-related products at this year’s CES and MWC. Readers may be interested in a recently released report that provides an overview of the latest developments in AR and VR Smart Glasses and examines the potential of Smart Glasses in the future.

Some of the main points that have been listed in the report include:

  • Analysis of key issues that are likely to affect the potential of Augmented Reality Smart Glasses
  • Development timeline of key vendors in different development phases
  • Latest developments in Augmented Reality Smart Glasses, including an accumulated shipment volume forecast by market sector for the period 2015-2020

Main vendors that are mentioned in this report include:

  • AREA Members: Atheer, DAQRI

Other vendors include ODG, Caputer, Epson, Facebook, Fujitsu, Glass Up, Google, Intel, Konia, Kopin and Magic Leap.

To read the report in full click here.




Finger Food’s Mixed Reality Scenographic Tool

Finger Food Studios is developing a new interactive scenographic tool for Microsoft HoloLens, according to their press release. It is being tested in a show-creation context by Cirque du Soleil. Finger Food’s innovative holographic set design tool took centre stage during the Day 2 Keynote at the Microsoft Build (MS Build) conference.

During the Keynote, HoloLens creator Alex Kipman invited Cirque du Soleil to demonstrate Finger Food’s Mixed Reality scenographic tool. This full-scale holographic tool can not only save on time and costs, but also allow for greater creative freedom and collaboration.

As demonstrated on stage yesterday, Finger Food’s scenographic tool enables the creation of interactive 3D blocks and shapes that can be quickly transformed into a full-scale rendering of a stage, including holographic performers. A tool like this would enable creative teams to work together to imagine, test and bring their ideas to life at full scale even before the first casting session or any building begins.

To read their full press release click here.




Finger Food Studios Debuts HoloBridge Framework

PRESS RELEASE: Canadian Tech Firm debuts HoloBridge IoT data visualization framework at Hannover Messe Industrial Fair

Finger Food Studios becomes OSIsoft’s first Red Carpet Incubation Program (RCIP) Mixed Reality Developer with Deschutes Brewery Demo.

Working with OSIsoft LLC and Microsoft, Finger Food has created an innovative holographic application for top US craft brewery, Deschutes.

With HoloBridge, Finger Food brought Deschutes’ data to a new level of understanding. Drawing on Finger Food’s strengths in game design and development, user interaction and collaboration, the company created a way for Deschutes to not just see, but interact with their data at scale.

Finger Food’s Mixed Reality (MR) application features life-size holographic brewery equipment with live streaming data, offering context and insights from the PI Integrator for Microsoft Azure and Microsoft Cortana Intelligence, ready to be viewed and analyzed.

To read the full press release by Finger Food Studios please click here.




When is Technology Interoperability Important? By Christine Perey

I remember the day in 1992 when I learned about this new way to get information: it was called Gopher. Via my modem on a phone line that I was using for CompuServe access, I connected my computer to a server. The next year I installed the first browser, Mosaic, on my Apple Macintosh and experienced the Web for the first time. I never asked myself whether there were standards involved but, of course, we know now that they played a role in the Web becoming what it is today. An array of standards gives you the ability to read this page on any device.

Technology standards for interoperability have a long history. The emergence of the Web, and the World Wide Web Consortium (W3C) to define and maintain standards for it, is within memory for some of those reading this page but there are many other industries that had to develop standards as they went. Some adopted standards after experiencing some painful lessons. For example, the plugs and outlets that people use to connect devices to the electrical grid all conform to the power standards defined and adopted in their country. The fact that there are national standards, and not international ones, was a painful lesson. As recently as 15 years ago, we had to carry transformers and plug adapters around with us if we wanted to use household appliances or computers designed for one national standard in places with different standards.

Folks with whom I was working at the time I started using the Web were developing interoperability standards for telephone networks. The regional Bell Operating Companies were trying out new technologies (not only those developed in Bell Labs and provided by “Ma Bell”). They wanted to go around AT&T to reach long distance operators in Europe and elsewhere. My exposure in the mid-1990s to the challenges that they were addressing left a lasting mark on me. There are compelling business imperatives behind the telephone interoperability standards requirement. Standards are the basis for large, highly complex systems reaching millions of people and billions of dollars of revenues. The key concept here: scalability.

Jump ahead to 2009

I had been working in Augmented Reality for several years, but it wasn’t until 2009 that devices capable of offering the general public their first mobile AR experiences (smartphones with cameras and GPS) hit the market. As you’ll read in this post by Thomas K. Carpenter, a lot happened in AR that year. A 2009 post on Adweek asked whether 2010 would be the “year of Augmented Reality.”

2009 is the year I heard Mike Liebhold and Damon Hernandez refer to “Open AR” in the context of a workshop co-organized by the Institute for the Future. I had an “ah ha!” moment. I felt that if AR was going to really reach its full potential, it was time to begin advocating for interoperability of components and services for Augmented Reality. In order for AR technology to scale, to integrate with everything—every person, place and thing on the planet—it cannot operate in a vacuum. It must be part of the full technology equation. It must benefit from breakthroughs in artificial intelligence and Big Data. AR must be integrated and combined with a host of other underlying technologies to become the user interface for the Internet of Things, our communications and collaboration platforms, and even our power grid. Well, the interface for the physical world. In short, AR components must be interoperable.

In conjunction with Mobile World Congress 2010 and with the support of Dan Appelquist (then at Vodafone, now at Samsung), I organized the first workshop about barriers to the growth of mobile AR. There were many barriers.

Is now the time?

Many of those obstacles we talked about in 2010 are lower today, but low interoperability remains on the list of critical barriers. Over the following years, while talking about this issue and in the context of a dozen more meetings of members of the grassroots Community for Open and Interoperable Augmented Reality, I’ve listened to hundreds of people defend proprietary technologies as part of their Augmented Reality pipeline. Many of them make strong cases for innovating without observing standards, and they experience the benefits of doing so. But many technologies remain limited in their reach and impact. Their ability to enable AR to evolve and spread to new use cases is lower without interoperability.

Phone companies and energy companies have understood for decades that, due to the business model of their providers, closed technology silos have a role but are not scalable. Proprietary technology silos enable one company to define a vertically-integrated stack, suite, or system of components such that developments are entirely under the control of the provider. The provider decides when, and how much, to invest in the total system or in any part of it.

Interoperability of data and components for deploying AR features in enterprise IT is a key requirement for large customers in many industries, but it is currently unmet. Due to the risk of closed AR technology silos delaying or controlling their future investment in AR, many large enterprises are holding back. To change their position, stakeholders in AR must take all steps necessary to accelerate the emergence and success of open and interoperable AR.

I’m not alone in expressing this position. For over seven years, I have had the pleasure of working with other people around the world who share the vision of widespread AR based on interoperability. One of those is John Simmins, a technical executive at the Electric Power Research Institute (EPRI).

In 2016, John and I asked ourselves whether all the conditions would soon align, whether we were getting closer to the AR interoperability threshold. To answer this question, we embarked on an EPRI research project designed to assess all the standards and projects to date and to inform the broader community of stakeholders about the importance of interoperability.

What’s your answer?

The first output of this project, the EPRI Enterprise Augmented Reality Vision, Interoperability Requirements, and Standards Landscape report was released last week. Like the development of standards by members of the W3C when the Web was young, this report is public and designed to serve the whole ecosystem of stakeholders.

And, similar to the work initiated by the Smart Grid Interoperability Panel (originally put in place with the assistance of NIST) to ensure the future of the Smart Grid, this report proposes an interoperability framework.

The AR Interoperability Framework (source: EPRI Enterprise Augmented Reality Vision, Interoperability Requirements, and Standards Landscape Report).

The EPRI report surveys the current landscape of relevant standards in the context of the new framework, as well as in the AR authoring, publishing and presentation pipeline and the MAR Reference Model. Never before have all the different standards activities that have been started and that could contribute to interoperable AR been cataloged and described in this level of detail. It’s not a light read. It’s a reference work.

The release of the EPRI report is a milestone, but only as important as what follows. Interoperability is still poorly understood by many stakeholders and difficult to achieve in the best of circumstances. That means that this report must be more than a catalog to capture the status of standards in early 2017. It is released at no cost and without restriction so that it can become a discussion starter and the basis for many types of workshops and meetings, and other forms of stakeholder collaboration.

Collaboration Matters

The cover image we chose (which appears at the top of this post) focuses on the interoperability problem and the need for multiple stakeholders to focus and collaborate on the topic. If you feel that interoperability is important for AR to scale now or in the future, we invite you to contribute to the discussion with your colleagues. Organize your own workshops, or look for and participate in meetings organized by others. And, so that we may include them in updates to the report, please let us know about your activities as new standards and/or activities emerge.




Caterpillar Augmented Reality

An article on Automation World tells us how smart, connected products, combined with Augmented and Virtual Reality tools, are transforming how Caterpillar services equipment and interacts with customers.

The article walks us through Caterpillar’s customer service offerings in the three industries it serves: resource industries (mining), construction, and energy/transportation (power/propulsion systems for marine, oil & gas, turbines and locomotives).

Today, service is evolving again, as it is done remotely using telematics devices that can do onboard diagnostics and send alerts and diagnostic codes back to the equipment experts. In addition, Caterpillar is now moving into the next layer of technology in the service domain which includes the Industrial Internet of Things (IIoT), as well as Augmented and Virtual Reality.

The company’s modern technology mission: “To be cheaper, smarter and faster,” said Terri Lewis, digital and technology director at Caterpillar, during her keynote presentation at the Automation Conference & Expo.

In fact, the Caterpillar IIoT initiative was launched as a vision in the 1990s with a goal of leveraging the Internet for service. Today, the company has 186 dealers and about 500,000 connected assets around the world upon which service technicians can conduct an inspection simply by sending photos for analysis via a smartphone.

“We have entire mine sites connected and interacting to make split-second decisions,” Lewis said. “Now we want to connect people and products.”

In the not-too-distant future—about three years—Caterpillar will be using Augmented Reality (AR) and Virtual Reality (VR) for sales, operations and service applications, Lewis said.

The article concludes with Lewis sharing the AR lessons learned: putting the new technology to use has meant that a number of challenges have had to be overcome.




Augmented Reality Meets Manufacturing

An article on MachineDesign.com, in its ‘industrial automation’ section, discusses how recent developments in the IIoT have created great potential for AR to revolutionise the manufacturing sector.

The article states that AR, in fact, completes the entire digital ecosystem as part of Industry 4.0. It connects workers with equipment, letting them interact with sensor data to identify the exact components that need maintenance, replacement, or upgrading.

The article goes on to give an example of how Augmented Reality is being used in manufacturing: Thyssenkrupp, a leader in elevator manufacturing, already uses AR to guide workers during maintenance using Microsoft HoloLens, a self-contained, wearable holographic computer. The smart glasses let workers inspect the elevator virtually before going to the site. Once there, they can quickly identify components that need replacement or maintenance. They can also instantly access tutorial videos or talk to trainers in real time for assistance. The company also uses AR to measure stairs and capture the digital point-cloud data needed for quickly designing and manufacturing stairlifts.

Developing lightweight 3D CAD models for AR is one area seeing significant activity. Native 3D CAD models are large and represent actual products to be manufactured, including all of the nuts and bolts. However, the detailed native CAD product model is not fit for AR visualization. For AR purposes, the models need to load quickly. And that is why re-modeling or a conversion process is needed. Although part of this conversion can be automated through AR publishing tools, it still takes a manual effort to ensure the details required in the AR environment aren’t compromised.
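As an illustration only, here is a minimal sketch of the kind of decimation step such a conversion might involve, assuming the geometry has already been exported from the native CAD format to a mesh file and that the open-source Open3D library is available (the file names and triangle budget are hypothetical, not taken from the article):

import open3d as o3d

# Load the detailed engineering mesh exported from CAD (often millions of triangles).
mesh = o3d.io.read_triangle_mesh("pump_assembly_native.obj")

# Decimate to a triangle budget that a headset can render comfortably.
TARGET_TRIANGLES = 50_000
light = mesh.simplify_quadric_decimation(
    target_number_of_triangles=TARGET_TRIANGLES)

# Clean up artifacts introduced by decimation and recompute shading normals.
light.remove_degenerate_triangles()
light.remove_duplicated_vertices()
light.compute_vertex_normals()

# Export the lightweight mesh for the AR authoring/publishing pipeline.
o3d.io.write_triangle_mesh("pump_assembly_ar.obj", light)
print(f"Reduced {len(mesh.triangles)} -> {len(light.triangles)} triangles")

In practice, as the article notes, an automated pass like this still needs manual review to make sure the details that matter in the AR environment are not lost.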

To read the full article, please click here.




THX Moves Ahead with Smartglass Certification Program

A press release dated May 30 2017 reveals that THX, a leading provider of superior audio-visual experiences, has partnered with ODG (Osterhout Design Group), a leading developer of Augmented, Virtual and Mixed Reality Smartglasses, to pave the way for excellence in headworn devices.

The driving force of the partnership is the new THX Virtual Cinema Display Certification program, which ensures a world-class viewing experience for Augmented and Virtual Reality Wearable devices.  

The ODG R-9 smartglasses will become the first THX certified product in the program, setting the standard for display excellence and a high bar for future headworn certifications. It will ensure that the R-9 glasses, and any future devices that are certified, are calibrated to the same color and resolution standards that are used in professional Hollywood studios, giving consumers access to visual content from the ‘best seat in the house’, no matter where they are.  

“Augmented reality is the future of audio and visual advancement,” explained Bill Rusitzky, CRO of THX Ltd. “We chose to partner with ODG to certify their product’s best-in-class cinematic experience because we are dedicated to establishing high standards in next-generation technologies, as we have achieved with existing technologies over the past 30 years. We are excited to continue this commitment to consumers, and come together with ODG to deliver premium AV experiences in a very new way.”

The full press release can be read here.




WSS to Unveil Augmented Reality Project

Wilhelmsen Ship Services (WSS) will be revealing their Augmented Reality project at Nor-shipping 2017. This new technology, according to Seatrade Maritime News, allows visitors to see WSS’ latest products inside and out. You can download an app onto your phone and scan a printed image to unlock the AR system.

Kjell Andre Engen, EVP marine products for WSS, says: “AR helps us illustrate what really makes our maritime specific solutions portfolio tick and we believe it will change how and why customers make their purchasing decisions.” Although it is being showcased as a marketing tool at an exhibition, WSS believes the AR system has much wider uses in the maritime industry. With the app able to combine a desk’s worth of product catalogues, manuals and data sheets, Engen said AR would become “absolutely invaluable for crew training.”

This new AR project was developed in conjunction with Imagination Computer Serv. using their Magic Lens software. Wilhelmsen Ship Services will be demonstrating the AR system at Nor-shipping 2017.