
Google Glass 2.0—Primed for the Enterprise: Foldable, Rugged and Waterproof

When it was introduced in February 2013, Google Glass 1.0 was far ahead of its time. Consumers and developers identified many issues that needed to be addressed; although some adopted the hardware, it was deemed unsuitable for widespread use by consumers or enterprise customers.

Over two years later, in early summer 2015, Google began showing key developers the next generation of Glass and sharing with the media that it is working on the device, code-named "Project Aura" and powered by Intel.

Google Glass Ready

The new device is geared for professional users. Employees using the information provided via the wearable display will be able to perform tasks with fewer human errors while enhancing productivity and operational efficiency.

The new "ruggedized" Google Glass hardware design is said to be easy to fold and more durable in work environments. Some options include the ability to clip the tiny display unit onto existing eyewear.

Perhaps Google Glass 2.0 is primed to grow in many industries such as oil and gas, warehousing, manufacturing, agriculture and mining.  The likely impacts depend on the use cases and company readiness for change.

The Benefits of Hands-Free Displays in Warehousing Operations

In April 2014, DHL published a report describing how logistics operations can be improved with the assistance of hands-free wearable devices. The use cases fell into four categories:

  • Warehouse operations
  • Transportation optimization
  • Last mile delivery
  • Service and repair and other enhanced services

Evidence is mounting to support the first use case identified in the DHL study: that warehouse picking can be improved.

Google Glass can also be used to reduce the cost of warehouse redesign and factory planning, but studies reporting metrics for these use cases are not yet available.

The Future of Google Glass

Will Google Glass 2.0 address the issues seen in the first prototype? This remains to be seen, but with several confirmed reports on the changes and improvements Google is making with Glass 2.0, it is evident that Google is all-in on changing the future of computing through wearables and, ultimately, Augmented Reality.

Have you tested Google Glass 2.0? Share your thoughts and feedback below.




Technical Communicators Must Evolve to Support Augmented Reality

As other AREA blog posts and pages on this website attest, Augmented Reality can be very beneficial but it doesn’t happen by itself. The preparation and delivery of AR experiences in professional settings involves the cooperation of many groups and investments from diverse points in a larger corporate information value chain. One of those groups is responsible for technical documentation.

As a professional technical communicator, I believe that introducing AR will also be rewarding to those people and organizations delivering their content in new, contextually driven systems. However, the development and delivery of AR-enriched content also comes with a new set of challenges.

From Topic-Based Content to Experiences

Changes in technologies, skills, priorities and procedures will be necessary. Accepting responsibility for and producing AR-enriched content will involve a shift in the mindset of technical communicators who, like most of their customers, are accustomed to developing traditional, topic-based or video content. In other words, technical communicators will have to embrace a more holistic view of content: experiences.

This means that, in addition to performing their traditional information development tasks, technical communicators will need to begin designing and supporting the delivery of content that changes in real time, based on the user’s context.
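One way to picture content that changes in real time based on the user's context is a small selection routine. The sketch below is purely illustrative; the context fields, rules and variant names are assumptions, not taken from any real authoring system.

```python
# Hypothetical sketch of context-driven content selection. Field names
# and rules are invented for illustration.
from dataclasses import dataclass

@dataclass
class UserContext:
    device: str        # e.g. "smart_glasses", "tablet"
    task_step: int     # current step in the procedure
    lighting: str      # "bright" or "dim"

def select_content(ctx: UserContext) -> dict:
    """Return the content variant best suited to the current context."""
    variant = {"step": ctx.task_step, "format": "text"}
    if ctx.device == "smart_glasses":
        # Head-worn displays favor terse overlays over long topics.
        variant["format"] = "overlay_3d"
    if ctx.lighting == "dim":
        # Switch to a high-contrast style when ambient light is poor.
        variant["style"] = "high_contrast"
    return variant

print(select_content(UserContext("smart_glasses", 3, "dim")))
```

The point is not the specific rules but the shift in mindset: the same topic is authored once and delivered differently depending on the device, task and environment.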

Crowded Display

We Need New Approaches

When content is destined for use on AR-enabled systems, our technologies will need to change. We’ll also need to adopt new approaches designed to:

  • Position and format the experience content so that it doesn't obstruct the viewer's line of sight to the real-world target or obscure other objects, which could introduce risk or errors.
  • Anticipate and correct error conditions in real time, under constantly changing light and environmental conditions.
  • Design overlay information so that it doesn't overload the user's ability to process and use the information effectively.
  • Leverage sophisticated software that produces and manages 3D models, reducing the current reliance on traditional 2D graphics and illustrations.
  • Account for the higher processing power required to render digital models, graphics or other supplementary data over the real world in real time, as well as its impact on battery life.
  • Plan for the user's device to access high-performance networks (especially when the content is in 3D format and stored on corporate servers), and for graceful behavior when those connections have high latency or are interrupted.
  • Work with the strengths and limitations of new and rapidly evolving end user hardware such as smart glasses or helmets, watches and other wearable sensors, and design new software tools suited to them.
  • Adopt still other new hardware and software to capture target objects and to develop, view and test experiences while they are under development.
  • Design to comply with new, yet-to-be-defined policies and tools for certification, data security and encryption.
  • Notify users when their actions are being captured and recorded, control this capture, and manage changing acceptance of (or resistance to) these technologies.
  • Manage the use of cameras in restricted environments to reduce the risk of confidential information being exposed or pirated.
  • Measure the benefits gained from, and the additional costs and complexity associated with, the delivery of AR experiences.
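To make one of these approaches concrete, consider planning for high-latency or interrupted connections. A minimal sketch, with invented names and an assumed latency threshold: serve the full 3D model when the network is responsive, and otherwise fall back to locally cached 2D content rather than blocking the worker's task.

```python
# Illustrative latency fallback for AR content delivery. The threshold
# and asset descriptors are assumptions, not from any real system.
CACHED_2D = {"type": "2d_illustration", "source": "local_cache"}

def choose_asset(latency_ms: float, connected: bool,
                 threshold_ms: float = 150.0) -> dict:
    """Pick the richest asset the current connection can support."""
    if not connected or latency_ms > threshold_ms:
        # Degrade gracefully instead of stalling the experience.
        return CACHED_2D
    return {"type": "3d_model", "source": "corporate_server"}

print(choose_asset(40.0, True))    # responsive link: stream the 3D model
print(choose_asset(400.0, True))   # slow link: cached 2D fallback
```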

All of these changes and new skills associated with AR-enriched content development will require many years of testing, some of it by trial and error. Eventually refinement will lead to mature and widely accepted best practices.

New Standards in Augmented Reality

I believe that these skills and best practices must also be accompanied by the development of formal standards for technical communicators to follow in AR design and development. I’m co-chairing the OASIS AR Information Products Technical Committee in order to study what’s needed for the wider adoption of AR technology and associated experience development methods by technical communicators. Over time the committee members will also work together to develop standards that will guide technical communicators and improve their ability to deliver content in AR experiences. Then, the suggested benefits of using AR-assisted systems will be achievable across a great many industries.




The Augmented Reality Provider Landscape Shifts, Again

Developers of Augmented Reality experiences select tools and technology for a project to match use case requirements. If the use case involves a page in a book or the side of a package, then in these cases 3D tracking is overkill. If the project accesses records in a company’s ERP, there must be plug-ins or a customization. If the customer needs reports (e.g., number of objects recognized, interaction of the user, etc.), then the platform needs to support their production. If the target is a movie poster, the security considerations are entirely different than if the target involves a proprietary industrial process.
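The matching of tools to use case requirements described above can be pictured as a simple scoring exercise. The vendors and capability flags below are entirely fictional, invented only to illustrate the selection process.

```python
# Toy requirements matcher for AR platform selection. Vendor names and
# capabilities are invented, not real product claims.
requirements = {"tracking": "2d_image", "erp_integration": True,
                "reporting": True}

platforms = {
    "VendorA": {"tracking": {"2d_image", "3d_object"},
                "erp_integration": True, "reporting": False},
    "VendorB": {"tracking": {"2d_image"},
                "erp_integration": True, "reporting": True},
}

def score(caps: dict) -> int:
    """Count how many project requirements a platform satisfies."""
    s = 0
    if requirements["tracking"] in caps["tracking"]:
        s += 1
    if caps["erp_integration"] == requirements["erp_integration"]:
        s += 1
    if caps["reporting"] == requirements["reporting"]:
        s += 1
    return s

best = max(platforms, key=lambda name: score(platforms[name]))
print(best)
```

In practice the requirements list is much longer (security model, reporting detail, deployment targets), but making it explicit this way is what lets developers communicate clearly with vendors.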

After five years of Metaio's dominance of the AR software provider landscape, developers' options are changing dramatically. This post reviews the recent changes in this provider landscape and how they impact developers, and suggests that those who license and purchase development tools could use this period of research and evaluation as an opportunity to communicate their project requirements more clearly to tool and technology vendors.

A Rapidly Changing Provider Landscape

In early 2015, Metaio’s ecosystem ranged from dedicated individuals producing one or two experiences, to Fortune 100 companies. Some were researchers designing prototypes; others were automotive industry giants like BMW and Audi who used Metaio’s robust tracking algorithms for precision engineering and design. Then, in mid-May 2015, a message appeared on Metaio’s website saying that it would stop selling licenses immediately, and that support for its Augmented Reality services and software technologies would end on December 15 of the same year. The mysterious announcement took the company’s global developer ecosystem by surprise.

Many, if not most, of the developers authoring experiences for enterprise and industrial projects were using Metaio's software tools. Metaio's change in direction put developers in an uncomfortable position. Many were furious. Others expressed frustration. To this day there remain many questions about the circumstances that led to the announcement. Regardless of the changes to a company that the developer ecosystem had grown to trust, serious business issues remain:

  • What will happen to the channels published in a platform operated by Metaio?
  • What will developers use in the place of Metaio’s tools?

Many developers are now doing what more of them should have been doing consistently over the previous years: investing their resources in evaluating other potential tools and technologies. The best developers will resume proposing projects to their customers once they have thoroughly tested the alternatives.

Gaps for Enterprise Augmented Reality

While there are alternative enterprise Augmented Reality technology providers with solutions and services worthy of evaluation (see table below), none offers the breadth and maturity, or the professional documentation and support, that Metaio provided for its SDK, Creator, Suite, Cloud and Continuous Visual Search matching system.

Table 1. Enterprise AR authoring providers and their products

Source: © 2014 – 2015

Company | Platform
DAQRI | 4D Studio and AR Toolkit
Wikitude | Wikitude SDK
Inglobe Technologies | AR Media (and other)
BuildAR | BuildAR
Catchoom | CraftAR (and other)
NGRAIN | Vergence (and other)
Diota | DiotaPlayer, DiotaConnect
EON Reality | EON Studio (and other)
Bitstars | Holobuilder
Fraunhofer IGD | Instant Reality
Kudan | Kudan SDK

Metaio's dominance wasn't limited to breadth of offering and AR developer mind share. Among its peers, it probably also generated the greatest revenue from licensing its software tools and providing services. To deliver value to customers and drive development of its technology suite, Metaio employed over 75 of the world's most qualified and experienced enterprise AR engineers.

Those that can have been furiously hiring engineers to write code and build out their teams and offerings, but breadth and depth like Metaio's doesn't materialize in a matter of months.

Vuforia’s Focus on Consumer Use Cases

No one knows precisely how much of the Metaio developer ecosystem overlapped with that of Qualcomm Vuforia, but anecdotal evidence suggests that developers who had use for both leveraged their qualities for entirely different projects.

Vuforia is strongly optimized for delivery to consumers on smartphones: entertainment, cultural heritage, education and marketing use cases. For this reason, developers who explored its use for their enterprise or industrial projects did not place Vuforia’s current offerings at the top of their list of preferred enterprise-ready AR tools.

In an October 12 press release, PTC, a global provider of enterprise platforms and solutions for creating, operating, and servicing connected objects, announced that it had reached an agreement to acquire the Vuforia technology, and its developer ecosystem, from Qualcomm Connected Experiences, Inc., a subsidiary of Qualcomm Incorporated.

The acquisition of Vuforia by PTC suggests that while Metaio technology is probably being integrated into a platform and tools for consumer-facing solutions, the tools most popular for consumer-facing AR experiences (i.e., the Vuforia SDK) will evolve to better meet the needs of developers seeking to address enterprise use cases.

The Landscape Continues to Evolve

The reversal of relative positions of the two popular Augmented Reality SDKs with respect to their target markets and one another is one of several trends.

First, the list of developer options is expanding. Firms that were previously quiet have the opportunity to engage with developers who are more interested in learning about their offerings. Google is getting closer to its Glass at Work 2.0 release. Microsoft is showing HoloLens and the authoring tool it has designed (aka "HoloStudio") to more developers. Some firms with significant experience and investments in enterprise Augmented Reality are becoming more attractive, or at least more visible. For example, Diotasoft, a French technology provider with loyal enterprise customers including Renault, PSA Peugeot Citroën, Total and Dassault Aviation, announced a rebranding (the company is now called "Diota") and launched a new platform for enterprise Augmented Reality.

Another trend is a shift in positioning. PTC and Vuforia’s statements in their October 12 press release emphasize where they see the greatest potential for impact. They draw a line between Augmented Reality and the need for people to visualize data stored in and managed by PTC’s Internet of Things-oriented systems. This echoes the suggestion made by Gerry Kim, professor at Korea University, in a meeting of the AR Community on October 6: Augmented Reality is the human interface for IoT.

As the number of options increases, so does the potential cost of integration. In a highly fragmented market one large enterprise could easily end up with solutions addressing different use cases based on multiple different and incompatible SDKs.

AR Data Integration

An Opportunity to Mandate Open Solutions

A unique opportunity lies in the middle of the increasing fragmentation and investment in new technology providers.

What if, instead of accepting the status quo of many competing and incompatible AR platforms, large enterprise customers and their developers were to clearly demonstrate their need for open systems?

Developers can seize the next few weeks and months to prepare a campaign describing new or existing systems with which they would prefer to create and manage enterprise content. They can document the barriers to interoperability and mount pressure on enabling technology providers. What if, prior to a purchase or licensing decision, the provider of an AR authoring platform were required to demonstrate interoperability with content generated from Metaio’s SDK?

Openness does not mean Open Source. Openness is a condition that is based on explicit or implied agreements between vendors. Providers of technologies must agree upon common data formats, and provide interfaces and APIs that are well documented and designed for interoperability with solutions of potential competitors.
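One way to picture this kind of openness in code is an adapter pattern: developers write experiences against a vendor-neutral interface, and each provider supplies an adapter for its SDK. Everything below is hypothetical; the class and method names are invented to illustrate the idea, not drawn from any real AR SDK.

```python
# Sketch of a vendor-neutral tracking interface with a per-vendor
# adapter. All names are hypothetical.
from abc import ABC, abstractmethod

class TrackerAdapter(ABC):
    """Common contract every vendor's tracking back end would implement."""

    @abstractmethod
    def load_target(self, target_file: str) -> None: ...

    @abstractmethod
    def track(self, frame: bytes) -> dict: ...

class VendorXAdapter(TrackerAdapter):
    # Wraps a (fictional) vendor SDK behind the common contract, so
    # experience content doesn't need rewriting when the vendor changes.
    def load_target(self, target_file: str) -> None:
        self.target = target_file

    def track(self, frame: bytes) -> dict:
        # A real adapter would call into the vendor SDK here.
        return {"target": self.target, "pose": (0.0, 0.0, 0.0)}

tracker: TrackerAdapter = VendorXAdapter()
tracker.load_target("engine_cover.pattern")
print(tracker.track(b"\x00" * 16))
```

Interoperable data formats play the same role for content that such interfaces play for code: they let the enterprise swap providers without rebuilding its experiences.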

Without issuing a clear mandate for AR technology providers to support a greater level of integration and interoperability with enterprise IT systems, developers should not be surprised if their options remain highly rigid and difficult to integrate. Unless forward-thinking people take action, developers and their large enterprise customers must be prepared to spend many more years investing in brittle transcoding systems or other approaches to "work around" the lack of openness and interoperability.

How are you going to respond to this rapidly shifting AR technology provider landscape? Are you taking this opportunity to share your requirements with new vendors? 




Augmented Reality Industry Leader: Bob Meads, CEO iQagent

Today Christine Perey, Executive Director of the AREA, interviews Bob Meads, CEO of iQagent and member of the AREA board. Bob is pioneering the use of mobile Augmented Reality on the plant floor to increase worker efficiency and safety.

Q. What is the level of interest in enterprise AR among people in your company?

The level of interest in this technology is high; however, we don't like to put technology first. As I have written about previously, AR is a great fit for plant floor challenges. But using AR (or any technology) for its own sake is a flawed approach if you want to sell a product. We identify the problems we want to solve, and fit the best technology to solve them elegantly. The litmus test of a great AR solution is that at first you don't notice it's an AR solution. Your attention is captured by the system's usefulness and applicability to the problem it addresses. The realization that it uses AR comes as an afterthought.

Q. How does your company, group or team plan to make an impact in enterprise Augmented Reality?

We plan to bring to the enterprise market mobile apps that solve real problems, in keeping with our “practical” approach to Augmented Reality.

Q. In your opinion, what are the greatest obstacles to the introduction of AR in enterprise?

The three barriers we encounter most frequently are inadequate infrastructure, security issues and resistance to new technology. AR used as part of a plant solution will overwhelmingly be deployed on mobile devices, so the barriers to using mobile devices become barriers to using AR on the plant floor. It can be a big investment for a plant to create a wireless infrastructure that covers the plant floor well. Many plants also haven't fully embraced the use of electronic documents versus paper ones, despite the obvious benefits. Mobile devices also tend to raise alarm bells with IT for many reasons. Then there is concern over ROI: that once the infrastructure is added, these new mobile devices and software will not actually be used or won't provide a return on investment.

Q. Are you focused on a particular industry? If so please describe it, as well as the customers with whom you work.

While we serve most industries, automotive, chemical/pharmaceutical and food & beverage are where we focus. This is because these plants have lots of automation, and, therefore, lots of data and resources that the plant staff access on a daily basis. The ROI of our product, iQagent, is very dramatic for these kinds of plants.

Q. How do you characterize the current stage in enterprise AR adoption? What will it take to go to the next stage or to accelerate adoption?

In my opinion, AR technologies are still in the trough of the Gartner Hype Cycle, but slowly climbing out. The potential for enterprise AR to help workers visualize data and resources as they relate to real-world equipment or processes is enormous. It reduces the skillsets needed to perform adjustments or repairs, reduces human error, and lessens the need for training. It's a giant win-win. So why isn't it already in widespread use? Because AR solutions tend to be highly customized and developed for specific customers. This approach is expensive, introduces risk and extends the time to ROI for the customer. This is due, in part, to the lack of standards. The breakthrough for AR in the enterprise will come when there are more off-the-shelf AR solutions that are easy to integrate and deploy and provide obvious benefits and immediate ROI. Right now most AR products are toolkits because there are no AR standards out there. If standards were created and adopted, it would be easier for AR providers to create off-the-shelf solutions. This in turn reduces risk, lowers cost and provides a well-defined ROI for the customer.

Q. We’d like some historical context for your current role. How did you get interested in or develop your role in enterprise Augmented Reality?

I have been in industrial automation software and integration for 20 years, and have always loved technology. iQuest, my automation company, specializes in using different technologies to solve plant floor problems. When the iPad was released, we began looking for ways to leverage it on the plant floor. We started with identifying common problems we could solve with a mobile app, and then developed iQagent and the concept of “practical” augmented reality, or, in the words of Ars Technica, “Just Enough AR.”

Image courtesy of iQagent

iQagent offers support for Windows 8.1




DAQRI @ AWE 2015

This post was previously published on the DAQRI blog and posted here with permission.

As we head into Augmented World Expo 2015, we have seen this event grow and evolve alongside the industry. Within the last year, we've seen more mainstream conversations about Augmented Reality than ever before. As a result of this increased focus, there is now, more than ever, a need to support and encourage innovation in Augmented Reality and computer vision technologies.

This year, we are excited to be showcasing our products and to spotlight our recent acquisition of ARToolKit, the world’s most widely used augmented reality SDK.  By releasing ARToolKit professional SDKs under LGPL v3.0 for free use, DAQRI is committing its resources to the open source community in the hopes that (in the words of our founder, Brian Mullins), “we can kick off the next AR revolution and inspire a whole new generation to pick it up and make things that haven’t been imagined yet.”

On the exhibition floor, Ben Vaughan and Philip Lamb from ARToolworks will be available to discuss ARToolKit and DAQRI’s newly-created open source division that they are heading up. In addition, representatives from DAQRI will be demoing DAQRI 4D Studio and showcasing exciting technologies from Melon, our brain computer interface division.

DAQRI executives will also be presenting throughout the conference:

Monday, June 8:

  • 10:45 am – 11:30 am—DAQRI 4D Studio Tutorial
    Katherine Wiemelt, Sr. Product Director, DAQRI
  • 2:15 pm – 3:00 pm—How to Measure Enterprise AR Impacts
    Andy Lowery, President, DAQRI

Tuesday, June 9:

  • 11:30 am – 1:00 pm—Smart Glasses Introductions
    Matt Kammerait, VP Product, DAQRI
  • 2:00 pm – 3:00 pm—Entertainment, Games, and Play
    Brian Selzer, VP Business and Product Development, DAQRI
  • 7:00 pm – 8:00 pm—Auggie Awards
    Brian Mullins, Founder and CEO, DAQRI

Wednesday, June 10:

  • 2:45 pm – 3:00 pm—From Evolution to Revolution: How AR Will Transform Work in the Future
    Brian Mullins, Founder and CEO, DAQRI



Why Augmented Reality and Collaboration Make for a Safer and Better World

Augmented Reality (AR)-enabled systems show a mechanic how to repair an engine, and perhaps in the future will guide an inexperienced surgeon through a delicate heart operation. In my opinion, it's when AR is combined with human collaboration that the magic begins. AR will soon work its way into a variety of applications that are bound to improve our lives, but more importantly, I am convinced it will become a catalyst for greater human understanding and world peace.

Augmented Reality Can Bring Us Closer

Everyone's heart raced when Jake Sully, the wheelchair-bound Marine in the movie Avatar, first connected his thoughts to those of his avatar, walked and then ran. His mission was to infiltrate the society of the natives, learn their customs and, having gathered that information, help destroy their world. Of course, we all know how the story ends: it's difficult to do harm to those we know. The first step in Hitler's campaign to eliminate those he considered unworthy was to convince his followers that the others were less than human. In fact, this is a universal technique in incitement to violence against another group. It is only when we finally get to know someone that, even if we don't agree, we can begin to understand and care about them.

Sharing Experiences

AR allows a user to see an enhanced view of reality, placing graphic images and 3D models over the real background. This will be great for building and repairing things by ourselves, but when we combine that capability with modern telecommunications, remote users will be able to participate in those processes with local users in real time, and appear to the wearer of the glasses as if standing alongside them. We won’t just see our grandkids in a Skype screen; we will take them with us on new adventures around the world or in our backyard. An astronaut in space will literally see the hand of the equipment specialist on earth pointing to the board to be replaced as they speak.

Gutenberg changed the world because the printed page could easily display the manuals that apprentices used for learning the trades that freed them from the fields. Radio and then television added sound, motion and recently 3D to the flood of information. Telecommunications has brought the cost of distributing it to practically zero. Now AR combines these capabilities and creates an infinite number of parallel worlds that you may create and visit, as well as acquire skills in from one-on-one instruction. It’s the closest thing to teleportation this side of Star Trek.

Non-verbal communication is said to account for between 55 and 97% (depending on the study) of communication between people. By enabling "belly to belly" proximity, AR will convey practically the same information. You will be able to virtually sit in a conference room and interact with other remote participants, watch a theater performance in your living room or tag along with a friend on an exotic trip to a foreign land. That friend will be able to see you, too.

New Ways of Displaying Information

Talk about disruptive. This is downright neutron bomb material. Why do you need a laptop or tablet when you can see the screen suspended in mid-air, with the glasses projecting a keyboard on any surface? Gone are large-screen TVs, where everyone sat stationary watching the game from the same angle. Why wouldn't they prefer it in perfect 3D? Forget glass cockpits in airplanes; why not have all the instruments projected in your field of view? How about infrared images of deer or pedestrians in fog or at night shown on the windshield of your car, in time to avoid hitting them?

Augmented Reality and Collaboration

But, again, collaboration use cases will take the cake. The level of empathetic bonding that occurs when you're in the room with another person will make current social messaging seem like sending smoke signals. Professionals in other countries will get to know you virtually and work with you on projects, as I am proposing with the Talent Swarm platform. Along with such proximity-enabled work will come a better understanding of other countries and cultures.

Collaboration is key, but it can't happen at scale if everyone needs to buy and use exactly the same hardware and software. Collaboration across networks and companies as diverse as the places where humans live and work builds upon deep interoperability. Interoperability with existing and future systems will require a globally agreed-upon set of open standards. We will work within the AREA to advocate strongly for interoperable systems and push for global standards together with other AREA members. Once we have collaborative AR platforms, the benefits of this technology will rapidly serve all people of the world. Becoming an AREA founding sponsor member is, for Talent Swarm, not only common sense but also a stake in the ground, demonstrating our leadership toward a more productive and peaceful world. We will avoid embarking on another wasteful battle such as VHS vs. Beta, and will not allow a single company to reduce the opportunities or lock others out. Christine Perey, Executive Director of the AREA, refers to it as our mandate: to ensure that an ecosystem of AR component and solution providers is in harmony with customers' needs, and able to deliver the diversity and innovation upon which economic success is based.

Path to the Future

With a concerted group goal centered on the advancement of AR, and with many technological developments both in the works and being introduced at an increasingly fast pace, we will one day look back to 2015 and say, how did we ever get along without Augmented Reality?




Augmented Reality at CES 2015 is Better and Bigger Than Ever

There’s something for everyone at CES. Do you need a way to store your earbuds so the cables aren’t tangled? What about printing icing on a cake?

Roger Kay, a technology analyst who writes for Forbes, recommends breaking the event up into ten parts. For me, it's not about the horrendous taxi lines or other logistical issues of dealing with so many people in a relatively small area: I walk everywhere I go, and leisurely covered twenty-four miles on the flat Las Vegas ground in four days (there are buses to and from the airport). Kay wants his topics served out in concentrated exhibition floor zones.

As for Kay, many of CES' themes lie outside my areas of interest, yet despite the headaches caused by the crowds, having the option to see and sample developments in a variety of fields is one of the reasons I return each year.

Finding what I need to see isn't a matter I treat lightly. A month before heading to Las Vegas I begin planning my assault, because the CEA's web site is horrendously inefficient and its new mobile app pathetic. Using brute force, I locate all the providers of head-mounted personal displays, the providers of hardware that is or could be AR-enabling, and the "pure" AR firms with whom I already have relationships. I also plan a long, slow visit through the innovation zones, such as Eureka Park. I know another half day will be dedicated to Intel, Samsung, Sony, LG Electronics and Qualcomm. Then I search for outliers by name.

A few days prior to the event I begin following the news feeds on social media and technology trade blogs, and scan the headlines for surprises.

Highlights of my CES 2015

For reasons that have little to do with Google Glass, vendors are making good progress in the personal display space. The first reason is that more companies are experimenting with new combinations of familiar technology components, particularly with hardware. Optinvent is recombining its optical technology with a set of headphones. Seebright is adding a remote control to your smartphone. Technical Illusions is combining reflector technology and projectors with new optics. It's like gene mixing to produce new capabilities and life forms.

Vuzix demonstrated the new waveguide technology in their optical see-through personal displays for Augmented Reality.


That’s not to say that designs for the “traditional” optical see-through display form factor are standing still. Getting new investments, such as Vuzix received from Intel, is a major accelerator. ODG’s sales of patents to Microsoft in 2014 produced sufficient revenues for the company to develop a new model of their device targeting consumers.

The second reason for the significant advances in the personal display product category is the evolution of components. I saw firsthand in many exhibits the "familiar" components these displays must include, such as motion and other sensors, eye tracking kits and optics. All are rapidly improving. For these components, "improving" means smaller packaging and lower power consumption.

It was good to focus, if only briefly, on the familiar faces of AREA members such as APX Labs and NGRAIN, who were participating in the Epson developer ecosystem booth, and to see the latest Epson products, which seem to be increasingly popular in the enterprise. I found APX again in the Sony SmartEyewear zone, where I was able to try on the Sony prototype. I also caught up with executives and saw impressive new AR demonstrations by companies I don't often see attending my events. If you're interested, I encourage you to click on these links to learn about Meta, InfinityAR, Occipital, ScopeAR, Technical Illusions, LYTE, XOeye Technologies, FOVE, Jins Company, Elvision Technologies, Avegant and Augumenta. I'm sorry if I neglected to include others that I saw at CES.

Although they were around and showing AR or AR-enabling technologies, and we may have crossed paths unknowingly, I didn’t have a chance to meet with Metaio, Lumus, Lemoptix or Leap Motion.

I spent more time than expected visiting and observing the booths of Virtual Reality headset providers who were at CES. There were several exhibition zones dedicated to Oculus VR, with the new Crescent Bay device. The lines waiting to try the new Razer OSVR (Open Source VR) system were stunningly long. It amazes me that a small company like Sulon could afford such a huge footprint in South Hall, setting up private briefing rooms for its Cortex display for AR and VR while also exhibiting openly outside.

Elsewhere there were hordes swarming at the Samsung Gear VR and the Sony Project Morpheus zones. What good are all these headsets without content? I stopped in at JauntVR, which seems to be getting a lot of attention these days. I’m sure there were dozens more showing VR development software, but VR is peripheral to my focus.

I was impressed by the NVIDIA booth’s focus on Advanced Driver Assistance Systems this year, demonstrating real time processing of six video feeds simultaneously on the Tegra K1 Visual Computing Module. There were also excellent demonstrations of enterprise use of AR in the Hewlett Packard exhibit. Intel dedicated a very significant portion of its footprint to Real Sense. And, similarly, the Vuforia zone in Qualcomm’s booth has expanded by comparison to 2014. The IEEE Standards Association offered an AR demonstration to engage people about their work.

Automotive companies were also showing Augmented Reality. I saw examples in the BMW pavilion, in Daimler’s area, the Bosch booth, and Hyundai’s prototype cars.

At the other end of the spectrum there were many exciting new products in the pico projector category. MicroVision and Celluon were both showing HD pico projectors for use with smartphones; such technology will certainly be considered for projection AR in enterprise. ZTE and Texas Instruments also introduced their latest pico projector models at CES 2015.

Digging in Deeper

Although no longer in Las Vegas, and despite my careful advance planning, I continued with my CES homework for at least a week. For example, I watched the archive of the "New Realities" panel and played back other videos covering AR and VR at CES on CNET, Engadget, Tested and Financial Times.

The IEEE published an analysis of AR at CES in Spectrum that reaches the same conclusion I drew: the "C" in CES is for Consumer, but a lot of consumer technology is going into corporate IT.

I hope I will have digested all that I gathered at CES 2015 before I begin preparations for 2016.