
Here’s What AR and Other Similar Technologies Can Do for Your Business

What is one piece of advice you’d give to businesses looking to invest in Augmented Reality (AR) technology?

It’s the simple and classic advice, really. If you are an enterprise looking to bring AR into your organization, be very clear on what business problem you are trying to solve. Companies often want to “try out” new technology, to play with the latest gadgets and see what they do rather than focusing on solving a real business problem.

There are many AR use cases that provide real benefit by improving the performance and efficiency of a company’s operations. It is important to understand your business problem, then pilot a suitable AR solution and measure the outcome. That outcome may include reduced time to complete a task, fewer errors, and/or a lower cost of interruptions, all of which improve the bottom line.
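
As a rough, hypothetical illustration (this is not an AREA tool, and the figures are invented), the back-of-the-envelope arithmetic behind such a pilot measurement might look like this:

    # Hypothetical evaluation of an AR work-instruction pilot.
    # All figures are illustrative placeholders, not AREA data.
    def annual_savings(tasks_per_year, minutes_saved_per_task, labor_cost_per_minute,
                       errors_avoided_per_year, cost_per_error):
        """Estimate yearly savings from faster task completion and fewer errors."""
        time_savings = tasks_per_year * minutes_saved_per_task * labor_cost_per_minute
        error_savings = errors_avoided_per_year * cost_per_error
        return time_savings + error_savings

    # Example pilot: 5,000 tasks/year, 12 minutes saved per task at $1.20/minute,
    # plus 40 avoided rework incidents at $450 each.
    print(annual_savings(5000, 12, 1.20, 40, 450))  # -> 90000.0

Comparing a figure like this with the cost of devices, software, and integration is what turns a pilot into a business case.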

The AREA portal offers more information on how to get started.

Can you discuss a few use cases of augmented reality for industrial professionals? Are there any barriers to adoption businesses should be aware of?

Based on my experience speaking with the many enterprises and providers in the AR ecosystem, the use cases currently getting the most traction include:

  • Remote assistance — being able to consult an expert (anywhere in the world) who uses AR technology to show you how to fix the problem.
  • Step-by-step guidance — using an AR-enabled mobile, tablet, or wearable device to show what to do, and how, when completing a task. This use case works particularly well for infrequent and complicated tasks.

In terms of barriers, the technology is still maturing and will continue to improve.

The AR for Enterprise Alliance (AREA) has also identified business barriers that it is working to overcome, including the issues that arise when moving from pilot to full deployment. Members are working to understand and address safety, security, and human issues (e.g., convincing stakeholders and ensuring that workers are involved), and to provide useful tools such as an ROI calculator and a Safety/Human Factors framework.

What is one myth surrounding this technology, or Industry 4.0 in general, you’d like to debunk for our readers?

That it is complicated and difficult to deploy! This is simply not the case. The most successful implementers of AR and Industry 4.0 have started with solutions that use IoT data and simple analysis, displayed on tablets, phones, or assisted-reality devices, to deliver actionable information that brings quick and substantial benefit to the company and the worker.

Where do you see Industry 4.0 heading?

For Industry 4.0 to continue to provide benefit to manufacturing, Internet of Things (IoT), Artificial Intelligence (AI), and AR technologies need to interact and work together better to help deliver more actionable outcomes. Benefit will also increase as the concept of the digital twin becomes commonplace, enabling designers to plan, develop, and test more efficient processes and products. These can be tested in the augmented world before being implemented in the physical.

This direction is reflected in the next AREA research project (voted for by the members), which will investigate best practices for merging IoT, AI, and AR technologies.

In the future, you can envisage a self-supporting manufacturing process that solves its own simple problems, while allowing staff to see (via AR) the issues that need timely intervention.

Link to article




AR Adds a New Dimension to Financial Trading and Analysis

AR/MR-assisted trading and data analysis platforms give traders and investors advanced fintech capable of monitoring and visualizing financial markets with new depth. Holographic visualization presents an enhanced view of dynamic data, with flat images evolving into 3D shapes and heatmaps that reveal new insights. With an AR/MR-assisted user interface, users are no longer restricted by the physical size of a computer screen, mobile device, or tablet, and can get a true 360-degree view with a wide range of applications.

Utilizing lightweight, portable MR headsets or AR smart glasses, advanced holographic representations of financial data and feeds are overlaid on, and exist alongside, the real-world view of the user’s workspace. The phrase “workplace everywhere” takes on new meaning: AR enables users to operate a laptop or smartphone, speak with a person physically in the room, and confer with a virtual colleague via videoconferencing, all at the same time.

The AR financial landscape can remain completely private (safe from inquisitive eyes), or users can share data by mirroring their views to an external laptop and even enable “spectator view” for colleagues or clients who are also using AR/MR technology. Users can even invite clients or advisors located anywhere in the world to a virtual conference room, where they can collaboratively and seamlessly analyze and interact with their financial landscape.

The technologies behind the solution

Powerful AR/MR-assisted trading and market data analysis for the finance sector can be viewed through Microsoft’s HoloLens MR headsets, as well as new technology currently in development that will look and feel like ordinary eyeglasses. Because the software is built with the Unity engine, it can be ported to other wearable hardware. While the solution largely uses HoloLens gesture recognition, voice control is also possible through the embedded Microsoft Cortana functionality, along with holographic object manipulation, which can be useful in certain scenarios.

The core of the AR/MR-assisted financial trading and market data analysis platform is built on an existing data solution called dxFeed, one of the world’s largest cloud-based, fully managed data tickerplants focused exclusively on the capital markets industry. dxFeed uses a unique technology called QD, designed and built by Devexperts, for market data distribution. The result is a powerful tool that can transform and adapt any data feed into an AR/MR-assisted virtual market data infrastructure. dxFeed gathers and stores historical data from the key exchanges in the USA, Canada, and Europe; every single change of price (tick-by-tick market data) is streamed live and can be accurately viewed and interrogated through the AR/MR headset.
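
To make the idea concrete, here is a minimal, purely illustrative Python sketch of how streaming ticks could be aggregated into a structure that a 3D renderer might display as a holographic heatmap. The field names are hypothetical; this is not the dxFeed or QD API.

    # Conceptual sketch: fold a live tick stream into per-symbol cells that a
    # 3D engine could render as a heatmap. Not the dxFeed/QD interface.
    from collections import defaultdict

    class HeatmapModel:
        def __init__(self):
            self.last_price = {}                 # symbol -> latest trade price
            self.volume = defaultdict(float)     # symbol -> cumulative traded volume

        def on_tick(self, tick):
            """Update the model for every price change (tick-by-tick)."""
            self.last_price[tick["symbol"]] = tick["price"]
            self.volume[tick["symbol"]] += tick["size"]

        def cells(self):
            """Emit (symbol, price, volume) cells; a renderer maps volume to height."""
            return [(s, self.last_price[s], self.volume[s]) for s in self.last_price]

    model = HeatmapModel()
    for tick in [{"symbol": "AAPL", "price": 150.2, "size": 100},
                 {"symbol": "AAPL", "price": 150.3, "size": 50}]:
        model.on_tick(tick)
    print(model.cells())   # [('AAPL', 150.3, 150.0)]

In the real platform, this kind of aggregation is handled by the dxFeed infrastructure described above; the sketch only illustrates the shape of the data a holographic view consumes.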

What it means for traders and analysts

The advent of AR/MR-assisted trading and data analysis delivers many benefits to financial services firms:

  • Organizations can replace multiple monitors in a fixed location with a lightweight wireless MR headset or AR smart glasses, freeing users from the physical size restrictions of computers, mobile devices, and tablets.
  • Companies can implement “workplace everywhere” – with a 360-degree view, users can work literally on any surface and even in the air.
  • Colleagues and customers can collaborate on projects from anywhere in the world via videoconferencing; point-of-view capabilities enable users to monitor and jointly analyze financial data, limiting miscommunication and strengthening decision-making.
  • Users can increase their productivity and dramatically improve market visualization with advanced holographic data representation – a key element for traders needing to make important data-driven decisions quickly.
  • A more intuitive user interface makes it easier to view, analyze and manipulate large quantities of complex data.
  • Users can gain rapid access to stored historical market data and use tick-market replay and back-testing while simultaneously keeping a sharp eye on current market activity (a minimal replay sketch follows this list).
  • Users stay better informed with streamlined integrated news feeds and financial information, aggregated from multiple providers in text view – with support for live streaming of news channels.
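
As referenced in the list above, here is a minimal, purely illustrative sketch of tick-market replay driving a toy back-test; the data and trading rule are invented for demonstration only:

    # Illustrative tick replay with a toy back-test; ticks and strategy are made up.
    ticks = [("09:30:00", 101.0), ("09:30:01", 101.4), ("09:30:02", 100.8),
             ("09:30:03", 101.9), ("09:30:04", 102.3)]

    def backtest(ticks, buy_below, sell_above):
        """Replay ticks in order, buying/selling one unit on simple price triggers."""
        position, cash = 0, 0.0
        for _ts, price in ticks:
            if position == 0 and price <= buy_below:
                position, cash = 1, cash - price      # buy one unit
            elif position == 1 and price >= sell_above:
                position, cash = 0, cash + price      # sell it back
        if position:                                  # mark any open position to the last price
            cash += ticks[-1][1]
        return cash

    print(backtest(ticks, buy_below=101.0, sell_above=102.0))   # ~1.3 profit on this replay

Pointed at stored historical ticks, the same replay loop lets a user test an idea against the past while the live market keeps streaming alongside it.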

Who are the target users?

Fintech is more than a buzzword. In order to stay ahead of the competition, banks, investment funds, hedge funds, FX desks, proprietary traders, and exchanges are adopting AR/MR technology. The driving force behind AR/MR-assisted trading and data analysis, however, is the individual traders, investors, and advisors working for financial institutions across the globe, who will find the ability to collaborate easily from anywhere hugely beneficial.

Some typical scenarios

  • An investor can invite an advisor into a virtual conference room, share their point of view, and discuss how a drop or rise affects the portfolio and what decisions can be made now.
  • A trader can take action faster as a result of a more intuitive interface highlighting hotspots and revealing opportunities.
  • An investor looking to enter new markets can accurately view historical data, use tick-market replay and back-testing and make informed decisions based on the hard facts.
  • A financial analyst required to monitor a particular stock on a major exchange can access and visualize full-depth data, explore how well the stock has performed in the past, and instantly communicate that information to a client, in the form of a holographic data representation.
  • Students or new employees learning to trade can use AR/MR-assisted fintech to study and analyze patterns using historical data and market replay, and to immerse themselves in and interact with the financial market.

Dmitry Parilov is Managing Director of Data Products at Devexperts and Simon Raven is a technical writer.




Addressing the Security Challenges of Wearable Devices for Enterprises

Read the first installment here.

The topic of security in enterprise AR environments is both under-addressed and vital. Our cybersecurity team at Brainwaive is excited about the opportunity to work with the AREA to protect companies’ information and assets through this first-ever AREA-funded research project. The objective is to develop and popularize a reliable, repeatable means of assessing security when adopting AR headsets/glassware solutions in industrial/enterprise settings.

With several weeks of R&D behind us, the Brainwaive cybersecurity team is beginning to finalize the scope and structure of an AR Security Framework and Testing Protocol. While most of our initial focus is on security threats and the defensive posture of wearable AR devices themselves, it’s important to recognize that the headset or smart glasses are just one element in an end-to-end AR “solution stack.” The Security Framework will eventually address all the unique elements of the AR stack, including wireless networking, data gateways, cloud services, applications, and more. Additionally, full enterprise protection requires development and governance of sound use policies and procedures, and training to develop end-user competence with the systems.

From a security standpoint, wearable AR devices may seem to be similar to common mobile devices like smartphones and tablet computers. However, we’ve identified multiple important factors that make AR systems unique, and we’re mapping the new trust boundaries and roles of the users. The Brainwaive team will elaborate on these in the final report and in our presentation at the upcoming Augmented World Expo. Also, in this initial project, we’re focusing only on characterizing the inherent security properties of the wearable device hardware and software. In follow-on projects, we’ll perform active penetration testing to determine the robustness of device designs and their level of defense against malicious attacks.

Knowledge is power when it comes to protecting your enterprise assets from bad actors trying to break in and steal sensitive information or disrupt your operations. Employing the AREA AR Security Framework and Testing Protocol, enterprise users will be better equipped to select and use AR headset solutions providing the proper types and levels of security for their specific use cases.

Tony Hodgson is CEO of Brainwaive LLC.




Mark Your Calendar for May 17 – AREA Webinar on AR and IoT

Imagine an aircraft service facility where the maintenance crew has the actual performance data of each plane’s engine components right at their fingertips as soon as it arrives – including identification of faulty parts and step-by-step instructions on how to replace them.

That combination of IoT data and AR visualization is incredibly powerful. It promises to reduce downtime, ensure timely and appropriate maintenance, and prevent more costly repairs. And because these technologies can guide service technicians instantly to only the areas in need of repair – and provide hard data on when to replace a worn part before it fails – they can make service technicians significantly more productive and assets more reliable (a toy sketch of this data-to-instruction mapping appears after the questions below). However, there are still questions about this integration of AR and IoT:

  • How close is that scenario to reality?
  • What technologies are essential to making it happen?
  • What obstacles stand in the way?
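
As a purely hypothetical sketch of that data-to-instruction mapping (sensor names, thresholds, and repair steps are all invented), the IoT side of such a system might reduce to logic like this, with an AR client rendering the resulting steps over the faulty component:

    # Hypothetical mapping from IoT telemetry to AR work instructions.
    # Sensor names, limits, and repair steps are illustrative only.
    THRESHOLDS = {"vibration_mm_s": 7.0, "oil_temp_c": 110.0}

    REPAIR_STEPS = {
        "vibration_mm_s": ["Isolate engine", "Inspect fan bearing", "Replace bearing if worn"],
        "oil_temp_c": ["Check oil level", "Inspect cooler lines", "Replace oil filter"],
    }

    def work_orders(telemetry):
        """Return step-by-step instructions for every reading over its limit."""
        orders = []
        for sensor, value in telemetry.items():
            limit = THRESHOLDS.get(sensor)
            if limit is not None and value > limit:
                orders.append({"sensor": sensor, "value": value, "steps": REPAIR_STEPS[sensor]})
        return orders

    # An AR headset would overlay these steps on the engine as the aircraft arrives.
    print(work_orders({"vibration_mm_s": 9.2, "oil_temp_c": 96.0}))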

To get the answers to these and other questions, you don’t want to miss the AREA’s upcoming webinar, Friends or Enemies – What is the Relationship Between Augmented Reality and IoT?  The event will be held May 17, 2017 at 8 AM Pacific/11 AM Eastern/4 PM UK/5 PM CET.

Speakers on the program include: Marc Schuetz, Director of ThingWorx Studio Product Management at PTC; Pontus Blomberg, Founder & VP, Business Development at 3D Studio Blomberg Ltd.; Carl Byers, Chief Strategy Officer of Contextere; and Giuseppe Scavo, Researcher, AR for Enterprise Alliance (AREA). AREA Executive Director Mark Sage will host.

What other use cases will benefit from the intersection of IoT and AR? What types of organizations and industries are best positioned to derive value from such solutions? Find out at the free webinar.  Register now to join us.




Here’s why P&G created guidelines for augmented reality equipment (Via Cincinnati Business Courier)

Read the full article…




Caterpillar, Lockheed Martin, P&G Lead Effort to Shape Future of Augmented Reality (Via IndustryWeek)

IndustryWeek highlights the efforts of 65 organizations to help shape the future of augmented reality in the manufacturing sector. The functional guidelines released Tuesday will help companies within the AR ecosystem to develop products and solutions for industrial enterprise users.  

Read the full article…




New Guidelines Point to an Augmented Future (via Computerworld)

A Senior Editor for Computerworld writes about the hardware and software guidelines for using augmented reality (AR) on the manufacturing floor. These guidelines were published Tuesday in a joint effort between UI Labs and the Augmented Reality for Enterprise Alliance (AREA).

Read the full article…

[Photo credit: Turkletom / Flickr]



AREA Interview: Ken Lee of VanGogh Imaging

AREA: Tell us about VanGogh Imaging and how the company started.

KEN LEE: The reason I started VanGogh Imaging was that I noticed an opportunity in the market. From 2005 to 2008, I worked in medical imaging, where we mainly used 3D models and would rarely go back to 2D images. 3D gives you so much more information and a much better visual experience than flat 2D images. But creating 3D content was a very difficult and lengthy process. That is the one huge problem we are solving at VanGogh Imaging.

We started when Microsoft first introduced its low-cost Kinect 3D sensing technology. It allowed you to map the world in three dimensions, so you could see, capture, and track objects and scenes. VanGogh Imaging started in this field around 2011, and we’ve been steadily improving our 3D capture technology for over five years, working with several clients and differentiating ourselves by delivering the highest-quality and easiest way to capture 3D models.

AREA: What is Dynamic SLAM and how does it differ from standard SLAM?

KEN LEE: Standard SLAM has been around for years. It works well when the environment is fairly static – no movement, no scene changes, no lighting changes. Dynamic SLAM is SLAM that can adjust to these factors, from moving objects and changing scenes to people walking in front of the camera and heavy occlusion.

AREA: Are there certain use cases or applications that are particularly suited to dynamic SLAM?

KEN LEE: Dynamic SLAM is perfect for real-world, real-time environments. In our case, we are using dynamic capture mostly to enhance the 3D capture capability – making 3D capture much easier while still capturing at a photorealistic level, fully automating the entire capture process, and handling any changes in the scene.

Let’s say you’re capturing a changing scene. You can update the 3D models in real time, just as you would capture 2D images with a video camera. We can do the same thing, but every output will be an updated 3D model at that given point. That’s why Dynamic SLAM is great. You can use dynamic SLAM just for tracking – for AR and VR – but that’s just one aspect. Our focus is on having the best tracking, not just for tracking purposes, but really to use that tracking capability to capture models very easily and update them in real time.
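
As a highly simplified conceptual sketch (this is not VanGogh Imaging’s algorithm, and the pose estimation and motion test are stubbed out), a dynamic capture loop might fuse each new depth frame into the running model while skipping points that appear to move:

    # Conceptual skeleton of an incremental "dynamic" capture loop.
    # Pose estimation and the static/moving test are stubs, not Dynamic SLAM itself.
    import numpy as np

    def estimate_pose(model_points, frame_points):
        """Stub: a real system would align the frame to the model with ICP or features."""
        return np.eye(4)                      # identity transform for illustration

    def is_static(prev_point, new_point, threshold=0.02):
        """Stub: treat a point as static if it barely moved between frames."""
        return np.linalg.norm(new_point - prev_point) < threshold

    class DynamicCapture:
        def __init__(self):
            self.model = np.empty((0, 3))     # accumulated 3D point cloud
            self.prev_frame = None

        def add_frame(self, points):
            """Fuse one depth frame into the model, dropping apparently moving points."""
            pose = estimate_pose(self.model, points)
            aligned = (np.c_[points, np.ones(len(points))] @ pose.T)[:, :3]
            if self.prev_frame is not None and len(self.prev_frame) == len(aligned):
                keep = np.array([is_static(p, q) for p, q in zip(self.prev_frame, aligned)])
            else:
                keep = np.ones(len(aligned), dtype=bool)
            self.model = np.vstack([self.model, aligned[keep]])
            self.prev_frame = aligned
            return self.model                 # an updated 3D model after every frame

    cap = DynamicCapture()
    frame = np.random.rand(100, 3)            # fake depth points
    cap.add_frame(frame)                      # first frame: everything is kept
    print(cap.add_frame(frame + 0.001).shape) # nearly static second frame is merged in

A real pipeline would replace the stubs with actual registration, surface reconstruction, and texturing; the point is that every frame leaves you with a usable, up-to-date 3D model.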

AREA: Once you have that model, can you use it for any number of different processes and applications?

KEN LEE: Sure. For example, you can do something as basic as creating 3D content to show people remotely. Let’s say I have a product on my desk and I want to show it to you. I can take a picture of it, or in less than a minute, I can scan that product, email it, and you immediately get a 3D model. Microsoft is updating its PowerPoint software next year so you will be able to embed 3D models.

There are other applications. You can use the 3D model for 3D printing. You can also use it for AR and VR, enabling users to visualize objects as true 3D models. One of the biggest challenges in both the VR and AR industry is content generation. It is very difficult to generate true 3D content in a fully automated process, on a real-time basis, that enables you to interact with other people using that same 3D model! That’s the massive problem we’re solving. We’re constantly working on scene capture, which we want to showcase this year, using the same Dynamic SLAM technology. Once you have that, anyone anywhere can instantly generate a 3D model. It’s almost as easy as generating a 2D image.

AREA: Does it require a lot of training to learn how to do the 3D capture?

KEN LEE: Absolutely not. You just grab the object in your hand, rotate it around and make sure all the views are okay, press the button, and then boom, you’ve got a fully-textured high-resolution 3D model. It takes less than a minute. You can teach a five-year-old to do it.

AREA: Tell us about your sales model. You are selling to companies that are embedding the technology in their products, but are you also selling directly to companies and users?

KEN LEE: Our business model is a licensing model, so we license our SDK on a per-unit basis. We want to stay with that. We want to stay as a core technology company for the time being. We don’t have any immediate plan for our own products.

AREA: Without giving away any trade secrets, what’s next in the product pipeline for VanGogh Imaging?

KEN LEE: We just filed a patent on how to stream 3D models to remote areas in real time. Basically, we’ll be able to immediately capture any object or scene, as soon as you turn on the camera, as a true 3D model streaming in real time, through a low bandwidth wireless data network.

AREA: Do you have any advice for companies that are just getting into augmented reality and looking at their options?

KEN LEE: At this moment, Augmented Reality platforms are still immature. I would recommend that companies focus, not on technology, but on solving industry problems. What are the problems that the companies are facing and where could AR add unique value? Right now, the biggest challenge in the AR industry, and the reason why it hasn’t taken off yet, is that so much money has gone into building platforms, but no one has built real solutions for companies. I think they should look for opportunity in those spaces.




Key Players in the Augmented Reality Industry

A recent article by Michael R. Blumberg examines major players in the Augmented Reality industry that have developed solutions for field service and maintenance, along with use case scenarios. A number of AREA members are included.

The article first discusses the multiple components that must be integrated to make AR applications successful: viewer technology, such as a mobile device or smart glasses; an application that lets the device process what the engineer is viewing in real time and adds extra content such as sound or graphics; and video streaming from the onsite engineer to a remote engineering expert.

AREA members APX Labs, iQagent, NGrain, Scope AR, and XMReality are all mentioned in the article as key industry players, in addition to AR Media, Epson, Fieldbit, Microsoft, and PTC. AR technology from each organisation is noted and described:

  • APX Labs – Their AR product Skylight is a platform that integrates with wearables such as smart glasses.
  • iQagent – Their mobile-based AR app scans QR codes to provide information related to maintenance.
  • NGrain – They have various AR applications, such as ProProducer, Viewer, Android Viewer, SDK, Consort, Envoy, and Scout.
  • Scope AR – Their Worklink application provides instructions as well as 3D images on the mobile or wearable screen. They also produced Remote AR, which enables onsite employees to remotely communicate with experts.
  • XMReality – Their product XM Reality Remote Guidance also enables onsite employees to receive visual instructions from remote experts.



Press Release: Lockheed Martin Joins the AREA

Lockheed Martin Joins Augmented Reality for Enterprise Alliance Board

Lockheed Martin Brings Industry Expertise to AREA Board of Directors to Help Build Augmented Reality Ecosystem and Best Practices

WAKEFIELD, Mass., USA – December 13, 2016 — The Augmented Reality for Enterprise Alliance (AREA) announced today that Lockheed Martin has joined the AREA at the Sponsor level and accepted a seat on its Board of Directors. According to AREA Executive Director Mark Sage, Lockheed Martin, along with other AREA Enterprise members such as Bosch, Boeing, Huawei, and Newport News Shipbuilding, has pledged support to drive ecosystem development and best practices for Augmented Reality (AR).

With over 30 members, the AREA is the only global membership-funded alliance helping to accelerate the adoption of Enterprise AR by creating a comprehensive ecosystem for enterprises, providers, and non-commercial institutes. It supports innovative companies aspiring to invest in AR that need a better understanding of the tools available, application possibilities, methods of implementation, and return on investment.

The AREA provides a free and open exchange of best practices, lessons learned, and technological insights that can help enterprises effectively implement AR technology, boost operational efficiency, and create long-term benefit.

“Lockheed Martin is another strong and significant addition to the AREA Board of Directors,” said Sage. “They bring long experience with, and a keen understanding of, the tools, applications, and implementations of AR in the enterprise. Their collaboration with AREA members in defining this emerging industry through research, networking, education, and best practice is a welcome addition.”

The AREA’s membership benefits include access to high-quality, vendor-neutral content and participation in various programs, a research framework to address key challenges shared by all members, discounts for fee-based events, and more. Sponsor members have a direct role in shaping the rapidly expanding AR industry and demonstrate their companies’ leadership and commitment to improving workplace performance.

About the AREA

The Augmented Reality for Enterprise Alliance (AREA) is the only global membership-funded alliance helping to accelerate the adoption of Enterprise AR by creating a comprehensive ecosystem. The organization provides high-quality, vendor-neutral content and programs. Discover the benefits of joining the AREA by visiting our membership information page.