Taqtile Focuses on Eliminating the Skills Gap for Frontline Workers

There’s a growing problem affecting multiple industries. Experienced workers are retiring at an alarming rate, while equipment, plants, and infrastructure are increasing in complexity and sophistication. These trends are converging to create a significant shortage of skilled workers. New AREA member Taqtile is addressing this challenge with its Manifest software. Manifest is designed for easy knowledge capture and reuse, enabling non-technical subject-matter experts to capture how to operate and/or repair equipment step-by-step, incorporating audio, video, and other media. Stored in the cloud, that knowledge is then available for less experienced workers to access and follow the step-by-step instructions. We spoke recently with John Mathieu, Taqtile’s European Managing Director.

AREA: Why don’t you start by giving us a quick history of Taqtile?

Mathieu: Taqtile is backed by a team with many years of experience across the industry, including in mobile, apps, and more. Manifest began with the realization that there was an entire area of business that had not been touched by digital transformation but could really use digital capabilities to make jobs faster, safer, and easier. We saw a problem we could fix, and we had a team with a passion to make a change. Manifest allows organizations to capture all the knowledge already out there and make it available in a new and compelling way that can be scaled across the entire workforce.

AREA: When you visit the Taqtile website, the first thing that leaps out is this concept of “immediately deployable,” which in the AR world sounds too good to be true. How is that possible?

Mathieu: One of the things we recognized early on is that a lot of the companies and industries that could use this technology don’t have AR/VR programmers or 3D CAD/model experts on staff ready to start creating the very specific content required. Our goal is to make everyone an expert, so we chose to provide a solution that literally anyone can use, no programming experience required, making it immediately deployable. Additionally, we deploy our solution in the cloud, which allows it to be provisioned, deployed and put to use immediately. We have enabled frontline workers to capture everything that they already know by providing an application that literally anyone can pick up and use. As soon as they put on or pick up their Mixed-Reality-capable device, they can begin documenting and capturing the expertise they have right then and there. With Manifest, I’ll do a ten-minute training session on HoloLens, Magic Leap, or even iPad, and within about five minutes, subject matter experts are creating content.

AREA: And once that initial knowledge capture is done, it all gets saved in the cloud. Can you go back and build on that knowledge base iteratively?

Mathieu: Absolutely, and that’s one of the great features of the platform; you don’t need a developer, and no one needs to touch a PC. All the content captured can be improved over time or changed as necessary. Users can put on or hold a Mixed Reality-capable device, walk up to a manufacturing device, place a QR code on a piece of paper, scan it on an area of that machine to create a spatial anchor, and then just go. If you have existing training manuals, we have the ability to use Manifest through our web portal in a copy-and-paste operation. So, if it’s step 1, you would copy and paste the text of step 1 into the text field on the web portal, and when you fire that template up, that text appears holographically. And then you click to the next step and you take some evidence. You might shoot a picture of that part of the machine or shoot a video of your hand manipulating that lever. And then once you’re done, you can go back, open that template you’ve already written, and add a step, or insert additional ancillary media.
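The capture-and-iterate workflow Mathieu describes (a template of ordered steps, each carrying instruction text and optional media, tied to a QR-code spatial anchor) can be pictured as a simple data structure. The sketch below is purely illustrative; the class and field names are assumptions, not Manifest’s actual schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Step:
    text: str                                        # instruction text shown holographically
    media: List[str] = field(default_factory=list)   # photo/video URIs captured as evidence

@dataclass
class Template:
    title: str
    anchor_id: str       # the QR-code spatial anchor the steps attach to
    steps: List[Step] = field(default_factory=list)

    def add_step(self, text: str, *media: str) -> None:
        """Append a step; authors can revisit and add media later, iteratively."""
        self.steps.append(Step(text, list(media)))

# Build a template the way an author might via the web portal
t = Template("Replace filter", anchor_id="QR-042")
t.add_step("Power down the unit")
t.add_step("Open the access panel", "photo://panel.jpg")
print(len(t.steps))  # 2
```

The point of the structure is that editing is additive: reopening the template and appending or amending a step never requires a developer.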

AREA: Are there particular vertical markets or industries where you’ve gained the most traction so far?

Mathieu: That’s a really great question. To date, we have gained the most traction in manufacturing, government and defense, and utilities. We’re working to solve a very large problem: over the next five years, up to 30 percent of the workforce is retiring across all industries all over the world. For instance, I was sitting across the table from the managing director of a major electric company, and she had calculated that 42 percent of her workforce would be retiring by 2021. If you can’t get people into a utility company to perform those tasks and literally keep the lights on, that’s going to impact an entire economy. So spatial computing solutions are absolutely huge in helping to solve these major skill and personnel gaps.

AREA: Given your title, your focus is Europe. Are you a global company at this point, or are there certain geographies that you’re focused on?

Mathieu: We’re becoming more global. I’m the Managing Director of Taqtile Europe, based in Paris. We have representation in Sydney, Australia, that provides coverage for key markets in AsiaPac, and then of course we’re all over North America, which is where our company is headquartered.

AREA: We’re at the start of a new year. What can people expect from Taqtile over the next 12 months or so?

Mathieu: Taqtile has a lot of great things coming in 2020. We will be launching a new product that will enable our customers to expand their Manifest production deployments, and further their expert training, maintenance and safety measures. Additionally, we are continuing our work on leveraging the power of spatial computing, so that we can walk into any environment and Manifest will know what machine A is, what machine B is, what machine C is, and be able to leverage the capabilities of this next generation of machines. We are excited to show it all to you soon.

AREA: Tell us why you joined the AREA and what you hope to get from being a member.

Mathieu: We’re in an exciting, growing field, with new opportunities opening every day. Being a part of the AREA allows us to dig deep into our industry, and contribute to the research the AREA is developing, as well as market development. With every interaction we have, either with a customer or a competitor, we all get an opportunity to share knowledge in a space that is changing every day. The AREA allows all of us to share best practices, to help educate customers, and to expand the reach of AR in the industry. We look forward to partnering with the AREA in outreach and market development as the AREA enables us all to benefit from sharing within the community.

How does AR fit into a company technology strategy?

Enterprise Augmented Reality (AR) offers countless opportunities to companies looking to improve the efficiency and effectiveness of their business. Many enterprises are pursuing Digital Transformation initiatives that focus on delivering technology strategies that drive innovation in support of the overall business goals.

Read on as we discuss the topic of technology strategies and how they relate to embracing enterprise AR in this, our latest AREA editorial. We’ve also created a handy, complimentary podcast (>12 mins) for you to listen to on the go.

Robust technology strategies include the following components:

  1. Executive overview of strategic objectives

This covers the question: “What are the overall business drivers and how can technology advance them?” Such drivers can be evolutionary goals (e.g., improving profitability of certain activities within the business or reducing operating costs) or more revolutionary, for example, opening new lines of business.  

  2. Situational review

The technology strategy review should include a description of the current state of the business, what technologies are being used and how well they are working. The situational review should also offer commentary on the areas of the business (or potential new opportunities) that need to be improved or offer the greatest potential. These can be specific financial objectives (e.g., “reduce costs and improve efficiency within the services business”) or may address “softer” objectives, such as reducing staff churn and thereby improving expertise transfer and retention.

  3. Technology assessment and selection

As the strategy development continues, it quickly becomes important to assess which technologies can assist in supporting the business needs. At this phase, it’s important to take an outside-in view and gain perspectives on industry trends, perhaps hiring external experts or engaging with industry affiliations such as the AREA in order to determine the selection of the most appropriate technology.

The AREA can, for example, provide a neutral and independent view on the current technology state-of-the-art, its application to specific use cases and example case studies showing how the technology is being used within various industrial sectors.

  4. Strategic planning, resourcing and leadership

Next comes the determination of the implementation plan of the technology strategy. This phase should clearly identify potential vendors, internal staffing requirements and, most importantly, the internal champions and leadership (stakeholders) necessary to ensure alignment and roll out the solutions.

It is often helpful in this section of the strategy definition to include a maturity model, providing an internal roadmap over time of what is typically a growing adoption and leverage of the technologies within the strategy.

  5. Deployment

Lastly, the strategy execution – i.e., the rollout – commences. This will often include staff training, systems integration, custom development and more. Many companies will also implement a governance model that ties key performance indicators back to the original goals defined in the strategy.

This framework is typically used to support significant technology overhauls or new implementations, but what does this mean in relationship to adopting enterprise AR technologies?

Depending upon how and where AR is to be used, one or more of the following considerations will arise:

  1. Process impacts

Often, the adoption of AR will involve changing how certain business processes are performed. This will involve IT impacts (new IT infrastructure to manage the process) and human impacts – how the “new way of working” is rolled out to the organization.

  2. New hardware implications

AR may involve the usage of new hardware technologies (e.g., digital eyewear, wearables) and therefore the IT organization must be involved in actively supporting the needs of this hardware, which, initially, may apply only to a select and small proportion of the workforce.

  3. The “content creation to consumption” pipeline

Many AR solutions require the development of new content or may incorporate reuse of existing digital assets. These may include procedural definitions (step-by-step instructions), 3D models (ideally derived from the CAD master models) and more. This data pipeline needs careful planning and architecting to ensure enterprise needs of scalability and cost-control are met.

  4. Data and systems integration

Some AR solution deployments harness AR’s unique ability to place digital content directly into the visual context of a user performing a task. As this is a unique selling point of AR, it is important to consider the architectural needs to ensure that data from enterprise business systems, such as PLM, SLM, ERP and IoT data streams, may be presented within the AR application. Ideally, the AR technology should incorporate mechanisms to complement existing technology platforms and tools by ensuring communication and display of information from these systems.
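To make the integration point concrete, here is a minimal sketch of the kind of glue layer involved: taking a reading from an enterprise data stream (IoT telemetry in this example) and mapping it onto a payload an AR client could render in the user’s visual context. All names, fields, and thresholds here are hypothetical illustrations, not any particular vendor’s API:

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    """A reading as it might arrive from an IoT or PLM data stream."""
    machine_id: str
    temperature_c: float

def to_ar_overlay(reading: Telemetry, warn_above: float = 80.0) -> dict:
    """Map a business-system reading onto an AR annotation payload.

    The AR client would attach the payload to the spatial anchor for
    the machine, so the data appears in the worker's visual context.
    """
    return {
        "anchor": reading.machine_id,                 # where to place the label
        "label": f"{reading.temperature_c:.1f} degC", # text shown to the worker
        "severity": "warning" if reading.temperature_c > warn_above else "normal",
    }

overlay = to_ar_overlay(Telemetry("press-07", 85.2))
print(overlay["severity"])  # warning
```

The architectural decision being illustrated is separation of concerns: the enterprise system remains the source of truth, and the AR layer only transforms and presents it.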

  5. Pace of change

As with any new technology domain, the pace of change can be dramatic. A robust technology strategy should be flexible in its definition in order to adapt to later developments or to offerings from new vendors, rather than be locked into a potentially obsolete technology or insolvent vendor.

  6. Human factors, safety and security

AR solutions exhibit other factors that should be incorporated into a robust technology strategy, including safety aspects (users are now watching a screen rather than their surroundings and may lose situational awareness), and security (AR devices may be delivering high-value intellectual property that must be secure against malicious acts), amongst others.

Some of these challenges may be familiar to IT executives, while others may be new.

With these points in mind, and from the perspective of determining, planning and implementing a technology strategy, what does this mean to companies wishing to embrace enterprise AR?

Given the nature of the earlier points, and the depths of integration that may be required, one might think that AR needs to be considered only as part of a ground-up technology strategy definition. However, as with many technologies, integration and planning can happen at a later stage.

Mike Campbell, Executive VP, Augmented Reality Products at PTC, comments “Augmented Reality may be new, and its impact may be disruptive, but that doesn’t mean it can’t be woven seamlessly into your existing strategies. AR can plug into and enhance your existing technology stack, improving productivity and communications, helping to modernize training, and ultimately driving more contextual insights for employees.”

Mike makes an important point. Given that AR offers new “windows” into existing data and systems and provides new process methods, it remains important that any disruption be a positive one for the business and not a negative one for the existing IT systems infrastructure. Meshing with existing infrastructure is key to enterprise adoption.

Mike Campbell continues: “Leaders in the AR industry work hard to make software and hardware scalable and simple for enterprise implementation. It can be integrated into a technology strategy to enhance the solutions you already have to offer in an efficient and engaging way to visualize information. You can leverage your existing CAD models or IoT data and extend their reach through AR, creating a strong digital thread in your organization and helping your employees access critical digital data in the context of the physical world where they’re doing their work.”

Given the fast pace of change in emerging technologies such as AR, businesses typically prefer not to be locked into the technological minutiae of specific vendors and clearly wish to leverage the investment in applications across multiple domains of their business, where it makes sense to do so.

Mike Campbell puts it this way: “Choosing a cross-platform AR technology that partners with powerful hardware, whether headsets or tablets, can give you more flexibility in how you want to deploy this information across your workforce, enabling you to provide solutions for employees in the field, on the factory floor, and even in the back office.”

AR can be considered a strategic technology initiative in its own right but the real power of AR is unleashed when it complements and supports other technology and business strategies. A common place for AR to really shine is at the intersection of Product Lifecycle Management (PLM), the Internet of Things (IoT) and, often, Service Lifecycle Management solutions.

AR is often used as an industrial sales and marketing tool, which typically requires a thin veneer of enterprise systems strategic alignment. However, the greatest value of enterprise AR comes when it is integrated with other technology strategies to be part of a larger and holistic strategic technology arsenal to transform specific business areas.

Commenting on this, Mike Campbell opines: “How exactly you choose to deploy AR will depend on your business needs. If you have existing CAD models, you can build these into AR experiences to offer immersive training, maintenance, or assembly instructions that overlay these models on top of the physical machines with which they correspond.

This can drastically improve your workforce productivity and shorten the time it takes to train someone by offering in-context information where and when it’s needed. If you have IoT data, enabling employees to visualize this data in AR can provide real-time insights into the machines they’re working on, letting them quickly and easily identify problems while on the shop floor.”

In summary, considering how technology strategies are often defined, AR can be treated as revolutionary or evolutionary, enabling businesses to try, assess, learn and expand without disrupting existing IT infrastructure.  

We’ll conclude with one final thought from Mike Campbell: “The question really isn’t ‘how does AR fit into a company’s technology strategy’, but how do you want it to fit. There are countless ways AR can bring value to your business, and AR software and hardware providers are continually improving their technology to make integration powerful and simple.”

That is exactly what we’re supporting at the AREA. We’re helping a growing community of users and vendors of AR to share knowledge and tools along with developing expertise and best practices to ensure that AR adoption continues to grow in 2020 and beyond.

Within the AREA, we have several active committees that are committed to developing and driving best practices. To find out more, please visit thearea.org.


RealWear Connect Highlights AR’s Growing Maturity

Augmented Reality is rapidly maturing, with a growing focus on stability, security, and other practical considerations. That’s one of the key takeaways from RealWear Connect, an event hosted by AREA member RealWear on December 9 in Amsterdam.

AREA Executive Director Mark Sage attended the event as guest speaker and was impressed with what he heard and saw.

“Like many vendors in the AR ecosystem, RealWear is investing in stability and security,” Mark noted. “They’re putting $80 million into increasing stability and recently named former NSA security expert Patrick Neise as their CISO.”

RealWear also highlighted its Foresight cloud platform, a tool that makes the business of deploying and managing AR programs – configuration, updating devices, app provisioning, and data management – easier.

Other speakers at the event included Ton Van Der Hoeve, innovation analyst from Shell, who shared the oil giant’s experience deploying AR hardware. Jens Mutschall of Deutsche Telekom reviewed his company’s work in creating campus networks for applications such as AR, as well as 5G device development, including Automated Guided Vehicles (AGVs).

“Manolis Koutsourelakis of the global industrial gas and engineering firm Linde gave a very informative presentation on how digitalization can improve safety,” said Mark. “Linde’s use of AR helps the company identify hazards, confirm adequate controls are in place, and enable technicians to perform non-routine tasks.”

To Mark Sage, the event showed that enterprise AR is making a real bottom-line difference to organizations around the world. All of the companies present highlighted positive ROI benefits (both tangible and intangible) and successful deployments, which can only bode well for the future.

Augumenta Receives Patent for its AR SmartPanel Machine Control

Since its founding in 2012, AREA member Augumenta has made a name for itself by delivering user-friendly interaction methods that leverage wearable devices. Now the Finland-based company is building on that heritage with a newly-patented technology that lets wearers of smart glasses control industrial machinery (and more) by interacting with a virtual panel called a SmartPanel. We spoke recently with Augumenta CEO Tero Aaltonen to learn more about this promising technology.

AREA: How would you describe what a virtual panel is and how it can benefit enterprises?

Aaltonen: It’s basically the same kind of panel that we are used to using today, but instead of being hardware-based, now it’s all done using Augmented Reality. So, it’s the same experience but done in a different way.

AREA: So instead of interacting with an industrial machine by turning physical dials and flipping switches, the user sees a virtual control panel through his or her smart glasses and is able to control the industrial machine with gestures. Talk a little bit about the advantages of that.

Aaltonen: There are many. One of the biggest benefits is that it eliminates maintenance costs because we’re only talking about black and white things on a flat surface. So, it doesn’t wear out and it never needs regular maintenance. That can be a significant savings over the lifetime of the panel. In addition, there’s no installation cost. That means we can have not just one of them, but many – as many as we wish. So, if you want to scale that interface across your factory, the price is not a limiting factor. You could have one or 10 or 100 in your factory and the cost is pretty much the same for the customer.

AREA: I would also imagine that, because modern factories change their product lines all the time, the SmartPanel enables them to reconfigure the control panel as needed, enabling them to be much more responsive to change.

Aaltonen: Exactly, and we actually have an easy-to-use design tool that’s accessible via a web browser. So, while it’s easy for us to change the panel, we made it so that the customers themselves can do all the customization without coming back to us. Also, when you have that level of flexibility in the UI, it means that you can actually customize the UI by user or user group. So, if you’re the machine operator, you will see the operational controls. If you’re the maintenance person, it will show you the maintenance controls. It knows who you are in the organization.

AREA: So, rather than looking at a hardware panel that may have 30 different dials on it for a multitude of different users, each worker sees only those controls that relate to his or her job?

Aaltonen: Yes. We can simplify the design, bringing only the controls that are needed by that specific person. Then of course, with a simpler UI, the risk of human error goes down.
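The role-based filtering Aaltonen describes can be sketched in a few lines. The panel definition and role names below are hypothetical, not Augumenta’s actual format; the idea is simply that each virtual control carries the set of roles permitted to see it, and the glasses render only the matching subset:

```python
# Hypothetical panel definition: each control lists the roles allowed to see it
PANEL = [
    {"id": "start", "type": "button", "roles": {"operator", "maintenance"}},
    {"id": "speed", "type": "dial",   "roles": {"operator"}},
    {"id": "diag",  "type": "toggle", "roles": {"maintenance"}},
]

def controls_for(role: str) -> list:
    """Return only the controls relevant to the wearer's role."""
    return [c["id"] for c in PANEL if role in c["roles"]]

print(controls_for("operator"))     # ['start', 'speed']
print(controls_for("maintenance"))  # ['start', 'diag']
```

A simpler, role-specific UI is also how the risk of human error goes down: the operator never even sees the maintenance toggle.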

AREA: Tell us how this fits into the Augumenta business strategy.

Aaltonen: Augumenta started as a computer vision company, developing different types of interaction solutions for industrial customers. In our discussions with existing customers, we discovered there was a big demand for this kind of solution. And when we looked at the marketplace, we saw no one else was doing it, because it’s actually not that easy. There were a lot of algorithms that our in-house research team had to develop.

AREA: Who are the target customers for this solution?

Aaltonen: There are at least two types of customers. One is the company that builds industrial machines. The other one is the company that is operating the machines in the factory. They are buying machines from different vendors and they might want to get this kind of interaction from us so they can manage and control all the different machines. Many of these machines are moving to standard protocols, so it’s not that difficult for us to make a panel that can talk to a variety of machines.

AREA: What do you see as the primary use cases for the SmartPanel?

Aaltonen: One case is a highly-automated factory with few employees whose jobs are not to perform the manufacturing process, but to ensure that the machines keep running and products keep coming out. What happens often in these cases is that when an issue arises somewhere in the production line, the person that needs to fix it might be on the opposite side of the factory. So instead of having to walk across the factory floor to address the issue, they can do it at the nearest SmartPanel.

Another thing that we are seeing now is that these panels are being installed in locations that are environmentally very challenging, with very harsh weather conditions, or are in a location where there’s a risk that an unauthorized person could walk up to the panel and start using it. Because the SmartPanel is a flat black-and-white surface, it’s not affected by weather or environmental factors. And only people with the smart glasses and software can access it. So even if you have people trespassing, there’s nothing they can do when they get to the panel.

AREA: So, there’s a strong security aspect to it as well. If a company were interested in adopting the SmartPanel, how much time would it take to create a working solution?

Aaltonen: If customers are using standard protocols, we could get things up and running in two to four weeks. If a customer has older machines with more old-fashioned communication protocols, it takes a bit more time.

Interview with Brian Vogelsang of Qualcomm

AREA: How would you describe Qualcomm’s role in the enterprise AR ecosystem?

Vogelsang: We’re a technology provider in the ecosystem, delivering chipsets that power AR experiences. Our Qualcomm Snapdragon platform provides the best silicon/chipset that we can customize to meet the needs of the XR enterprise ecosystem. You’ll see them in products today from customers like Vuzix and RealWear. Then there’s the Microsoft HoloLens 2 that was announced at Mobile World Congress; it uses our Snapdragon 850 Mobile Platform. Vuzix also announced at Mobile World Congress their M400 platform, which is powered by the Qualcomm Snapdragon XR1 platform. Finally, there are new, emerging OEMs, such as nreal, Realmax, Shadow Creator, and ThirdEye. Our goal is to optimize technology to put more capability in lighter weight designs that can drive more immersive experiences at the lowest possible power levels, but with full connectivity.

AREA: People might have thought that Qualcomm was getting out of AR when it sold the Vuforia business to PTC three years ago, but the company is still very much committed to VR and AR, isn’t it?

Vogelsang: That’s correct. We’ve been working for over a decade in this space. We have a long history of computer vision expertise and exploring how to build the technology and optimize it in hardware in ways that will allow more immersive experiences while running at the lowest possible power. To date, that has been predominantly on smartphones. However, our long-term vision is that within a decade, we will start transitioning from a handheld device (smartphone) to a head-worn device or a sleek AR glass that people use the whole day. And that’s really what we’re looking at: how do we accelerate that innovation and make those kinds of experiences happen – initially for enterprises, but long term for consumers.

AREA: So, you expect enterprises to be the early adopters of wearables, then the consumer market will develop from there?

Vogelsang: That’s right. Today, in the wearable form factor, there’s a spectrum of devices, from Assisted Reality devices for remote expert or guided work instructions, to full augmented or mixed reality devices like HoloLens or Magic Leap. Enterprises are willing to adopt these technologies if they solve a problem and deliver an ROI – and we’re excited about that. But long term, we think that the technology needs to get smaller, lighter weight, and more ergonomic – more like your standard eyeglasses. Because of these size requirements, that’s going to be particularly challenging technically. To deliver an immersive experience at the lowest possible power requires deep systems expertise. That’s right in Qualcomm’s wheelhouse. It’s going to take a few years for the industry to deliver mass adoption of consumer-class AR eyewear. So for the short term, the enterprise is going to be doing a lot to drive the market.

AREA: How closely do you work with wearables manufacturers?

Vogelsang: We work really closely with them on their products and roadmaps, collaborating with them to achieve their market objectives. There are always tradeoffs as OEMs balance cost, weight, form factor and ergonomics, optics and display capability, performance, and thermals, and often these impact immersiveness. And so we work really closely with them to understand their use cases and objectives and then help them with hardware, software, and support to meet their objectives. We also give them insight into future technology developments, and their future requirements inform our chipset roadmap. We can’t solve all the problems. Things like displays and optics as well as camera modules are a big part of the equation in building an AR device, and while we don’t build those technologies, we work closely with the suppliers of these components and assist OEMs with integration through our reference designs and HMD Accelerator Program, which pre-validates and qualifies components so OEMs can get to market more quickly.

AREA: It seems as if technologies are starting to converge in new ways: 5G networks, Artificial Intelligence, the Internet of Things, and AR. Do you get that impression as well?

Vogelsang: Definitely. We see 5G as the connectivity fabric that’s going to allow the mobile network to not only interconnect people, but also interconnect and control machines and objects and devices. 5G is going to deliver performance and efficiency that will enable these new experiences and connect new industries, delivering multi-gigabit-per-second rates of connectivity at ultra-low latency. Latency is hugely important when it comes to Augmented and Virtual Reality experiences. And of course, 5G means more capacity. But AI is already being used in Augmented Reality experiences, enabling things like head tracking, hand tracking, 3D reconstruction and object recognition or estimating light. AI is a really important part of that. And I think 5G also will enable some capabilities to be moved off the device to the edge of the mobile network – taking some capability and moving it to be processed at the network edge. And that ultimately will help us enable lighter weight designs with richer, more immersive graphics at that low power threshold that we need. So all three – 5G, AI and AR – are coming together. And I think IoT will be a part of AR in terms of syndicating information contextually about the environment in an enterprise to an AR experience. IoT will feed the insights, which will be bubbled up as AR experiences.

AREA: What do you hope to get out of being a member of the AREA?

Vogelsang: Qualcomm’s customers are OEMs. We don’t sell to end customers, the people who would buy those devices or experiences. However, we do need to understand what their needs are so that we can better evolve our technology roadmap to support where those end users want to go. So, one of the things that excites us about becoming a member of the AREA is to begin hearing directly from some of the end customers who are deploying wearable AR technology. We know this is a marathon and we believe XR – spanning both Augmented and Virtual Reality – will be the next computing platform. So, we’re taking a long-term view and investing now in the technology that will enable this market. As a result, we’re very interested in learning from other AREA members about how the technology is being applied today to solve concrete problems in the enterprise so we can inform our roadmap. Those learnings will help us deliver products that can accelerate the pace of innovation and grow the overall AR wearable market.

We’re doing some trials and proofs of concept and other things where we get more directly engaged with end customer use cases. So, being able to collaborate with other AREA members in that space would be really good. Also, we’d like to get involved in the committees. We have a human factors team here, and I’d like to get them engaged with the work that’s being done on the human factors side. While we don’t build end devices ourselves, we still need to understand, as we’re building out technology, how human factors such as weight, size, or thermals impact the user experience and ergonomics.

We’d also like to get involved
in requirements. We think we’d really benefit from learning more about requirements
from a horizontal cross-section of the AREA membership. And finally, I think
we’d like to get involved in the marketing side, as well. We would be
interested in using our platform to help tell the story and accelerate industry adoption.

AREA: Where do you see things headed in XR over the next three to five
years? What are the next big milestones people should be looking for?

Vogelsang: I think that we’ll
see a transition from smart glasses or Assisted Reality experiences to more
Augmented Reality or spatial immersive computing type experiences. Over the
next few years, that transition will really start to accelerate. We’re already seeing
the early promise of what’s to come with technology such as HoloLens or Magic
Leap. I’m really excited about seeing the companies who are deploying smart
glasses or Assisted Reality experiences today start to adopt Augmented Reality
or immersive computing in a much larger way.

The AREA’s Annual Workshop

The Advanced Manufacturing Research Centre (AMRC) kindly
hosted the workshop which saw more than 70 participants from a range of
industries, including energy/utilities, buildings and infrastructure,
aerospace, defence, industrial equipment, mining, automotive and consumer high
tech, converge on the shop floor of Factory 2050 for a jam-packed series of
presentations, interactive workshops, demonstrations and networking.

Day 1 was opened by AREA Executive Director Mark Sage and
AREA President Paul Davies, who delivered a high-level overview of AR,
supported by leading companies and AREA members who have deployed AR.
ExxonMobil, Welsh Water and Boeing all helped paint a detailed picture by
sharing their use cases, experiences and challenges.

We then heard from Jordi Boza of Vuzix, who shared his
thoughts on how to get started in AR, followed by a presentation by
Atheer that took attendees through a case study showing how Porsche transformed
automotive dealer services with AR.

The last session of the day was an intense, hands-on session
presented by the AREA’s Dr. Michael Rygol who helped attendees get under the
skin of AR by discussing and documenting use cases and their key requirements
in working groups. Presentations by attendees led to some healthy debate and
interesting insights. The day was finished off with an informal networking
session where participants had the opportunity to take a closer look at some of
the organisations who were there with demo tables and to connect with
colleagues both old and new.

The second day began with an early 8am start and went straight
into a presentation from Theorem Solutions on the cognitive gap and the potential
of XR technologies, followed by a lively panel discussion on workforce
challenges led by AREA Board member Christine Perey of PEREY Research &
Consulting, with representation from Boeing, ExxonMobil and VW Group UK. We then
explored the AREA’s research capability by looking at past projects
before jumping into a master class on AR human-centred design from London-based
ThreeSixtyReality. A full agenda took us into a presentation on Human Factors
and related safety challenges, a pre-recorded session on overcoming the
challenges of AR security, and a polished presentation from Microsoft on
their MR strategy and the eagerly anticipated HoloLens 2. A three-minute
provider pitch session finished off a jam-packed day before participants headed home.

In summary, the depth and range of
content and sessions provided participants with a framework within which to navigate
(or continue navigating) their own AR journeys. Among the takeaways:

  • Staying in the AR game
    is tough. Organisations should consider both the opportunities and limitations of
    the current evolving environment.
  • The AR supplier ecosystem
    is continuing to grow, offering new and varied opportunities.
  • Clearer understanding
    and definition of the barriers to adoption (including safety, security, user
    experience) and paths forward to overcome these is essential.
  • Sound, appropriate use
    cases are key to learning more about AR. The number of use cases where AR
    delivers value continues to grow (and we need to capture and share these –
    hence the ASoN initiative from the AREA).
  • Digital eyewear to
    support AR is maturing rapidly (e.g., new models from Vuzix and Microsoft).
    Ensure you stay informed on new developments.
  • There is broad interest
    in AR across a number of industries – from industrial flooring to mining.
  • Considering the business
    benefits of AR is essential to obtaining buy-in from stakeholders.
  • There may be significant
    issues around safety and security where AR is concerned. Don’t ignore them.

The AREA annual workshop is an opportunity for members and non-members to connect, learn and share more on AR. We at the AREA are fortunate to have the opportunity to do this annually and it wouldn’t be possible this year without the valuable support of AREA members and our sponsors: Theorem Solutions, PTC, Vuzix and Atheer.

Embry-Riddle Prof. Barbara Chaparro on the Human Factors Aspects of AR

AREA: Tell us how you became interested in joining the AREA.

Dr. Chaparro: I first heard
about the AREA from Brian Laughlin at Boeing. Brian was my human factors
doctoral student when I was at Wichita State and we’ve kept in touch over the
years. I’ve seen the kinds of things he’s been working on at Boeing and how they
overlapped with my research interests in human/computer interaction, usability,
and user experience. I saw an opportunity to pursue those interests further through
the AREA.

AREA: Could you tell us more about your background as it relates to AR?

Dr. Chaparro: My background is
in the area of usability and user experience. I have worked with a number of
different companies and technologies focusing on implementing design principles
to make it as easy as possible for people to use devices and tools.

I became interested in AR when
Google Glass was introduced. I could see the potential in areas such as aviation,
medicine, and consumer products. My initial interest with Glass was to use it as
a training tool for my students. I also worked with a colleague at Wichita
State to study user interactions with Glass versus a cell phone.

And then HoloLens came out, and
for a year and a half now, we have been exploring the user experience side of
HoloLens. We want to get an idea of how the average person experiences this
technology. For instance: What are some of the issues from a UX standpoint? The
gesturing, window manipulation, texting, voice input – all of these methods of
interaction bring usability and user experience issues to the human-technology
interaction. A lot of the literature is focused on the usability of a particular
app, but there is very little out there on the integration of multiple technologies,
working across a multitude of tasks at the same time, or task-switching between
the physical and augmented environment. That is my interest, and then seeing the
application of this to a variety of domains. I consult, for example, with
healthcare professionals who believe that AR has great potential. Whatever the
domain, there is going to be this core issue of usability that will determine whether
it takes off or not. Eventually, it comes down to the comfort and the seamlessness
of the user experience in the tasks that they are doing.

AREA: How do you expect to benefit from your membership in the AREA?

Dr. Chaparro: I see the AREA as
a fantastic mix of academic researchers and industries that are applying the
technology. Human factors is an applied field, so we’re always looking for
practical applications of the things we’re studying in the lab. So I see that
as a huge benefit of the AREA. Then we’ll benefit from the work of the various
committees. We’ve been participating in the Safety and the Research Committees,
and hopefully, the Human Factors Committee in the future. We need to understand
what the issues are, because any problem that an industry is having is a potential
research project for one of my students. And that’s the other benefit: to
recognize the needs of industries that will need to hire students who have
knowledge of this technology. We want to understand what those needs are so we
can build them into our curriculum if they are not already there.

AREA: Based on what you have learned so far, what do you see as the
major outstanding issue that needs to be addressed to make AR more usable to
the average person?

Chaparro: With these new
glasses and head-mounted devices, certainly comfort is an issue, especially in
industries where they will need to wear them for an extended period of time.
That’s going to be huge. And not from just a comfort standpoint but also visually
– going back and forth between the physical and augmented world and what that
experience is like.

AREA: In addition to the research projects you mentioned, what other
areas of AR are being explored at Embry-Riddle?

Chaparro: My colleague Dr. Joseph
Keebler has been conducting research related to marker-based AR in medical
training. His area of expertise is medical human factors, teams, and training, so
he is excited about the technology from both a training standpoint and as a
real-time use tool for high performing teams. The issue is that, while it
appears that this technology is great and effective, we really need more research
to demonstrate how and when it is working, and how to best integrate it into
modern day systems.

One challenge is that there’s
a novelty effect problem. For instance, there are research projects being done
that show AR is better for performing a task, but it is really hard to tease
away the novelty side of that. In other words – are people improving due to
increased learning from the AR system? Or is it simply the fact that it’s this
fascinating and visually impressive technology that is garnering people’s
interest and keeping them engaged? Joe and I are interested in how to structure
a study so that we are looking at the true effectiveness of the technology above
and beyond the effects of its potential novelty. Joe has published a few papers
on AR, including a chapter in the Cambridge Handbook of Workplace Training and
Employee Development (Keebler, Patzer, Wiltshire, & Fiore, 2017)[1].

Another one of our colleagues,
Dr. Alex Chaparro, has been working on the use of AR in transportation. For
example, AR has many applications in aviation, maintenance documentation, and
driving environments. His main interest is in the uses of AR and VR in these environments
to train individuals to perform complex tasks.

We also have a VR gaming lab.
Joe and I have also done some psychometric work on the validation of a new
satisfaction instrument for video games that we’re now trying to apply to the
AR world (Phan, Keebler, & Chaparro, 2016)[2]. We
definitely see the benefits of this technology and would like to see it
applied more widely.

[1] Keebler, J. R., Patzer, B. S., Wiltshire, T. J., & Fiore, S. M. (2017).
Augmented Reality Systems in Training. The Cambridge Handbook of Workplace
Training and Employee Development, 278.

[2] Phan, M. H., Keebler, J. R., & Chaparro, B. S. (2016). The development and
validation of the Game User Experience Satisfaction Scale (GUESS). Human
Factors, 58(8).

The AREA & NIST Survey on AR Standards for Industry

The survey takes approximately five minutes to complete and aims to provide valuable information that will help drive and inform standards development strategies for the enterprise AR industry.

Please access the survey by following this link: https://survey.zohopublic.com/zs/OwB3Gq

AREA to Lead Workshop at LiveWorx 19

LiveWorx has earned a spot on everyone’s calendar of must-attend events due to its content-packed agenda. The June conference in Boston really does deliver a year’s worth of technical learning in four days, addressing all the major technologies driving digital transformation – from AR and IIoT to blockchain and robotics.

ThirdEye Gen: The company behind “the world’s smallest Mixed Reality smart glasses”

AREA: How and when was ThirdEye Gen founded?

CHERUKURI: The company’s been around for about three years. The first two years were spent researching and designing. Then we came up with our first product, our X1 smart glasses. We launched the X1 at CES in January of 2018, and we just unveiled our new X2 product, the world’s smallest Mixed Reality smart glasses with built-in SLAM, at CES 2019.

AREA: How does the X2 compare to other smart glasses?

CHERUKURI: The main differentiator for the X2 is its field of view, which, at 42 degrees, is very wide. We designed and built the product mainly for the enterprise market, so it also features a very high brightness level and includes massive battery packs so you can wear the glasses six to eight hours at a time. And it’s based on Android, so it’s very easy to customize and create applications. The built-in SLAM (Simultaneous Localization and Mapping) we developed in-house allows for advanced tracking applications, such as 3D machine instructions.

AREA: What made you decide to get into the smart glasses business?

CHERUKURI: We had a related background in the military. In addition, our engineers had been working with this technology for the past 20 or 30 years. With the growth of AR, we saw an opportunity to expand our business into the enterprise market.

AREA: What applications or industries have you had success in?

CHERUKURI: The most popular application for the X2 so far has been the ThirdEye App Suite. We offer our own remote help software, but we also partner with third-party software companies that have their own platforms. A lot of these companies buy our glasses, load them with their software, and resell them with their value added. That’s the most common use case, but we have others. For example, we work with many companies in the healthcare space who are VARs and use the glasses for the visually impaired, surgical use cases, and other applications.

AREA: What kinds of benefits are ThirdEye customers getting from your products?

CHERUKURI: Most of our customers are still in the pilot phase, but even at that early stage, they’re seeing that, by saving the cost of sending an expert just once to the customer site to fix a problem, the glasses pay for themselves. The ROI is huge. The main challenge to mass deployment is the legacy of installed systems and the integration effort.

AREA: Do you consider that to be the greatest barrier to AR adoption?

CHERUKURI: Yes. Whenever you have a new technology, the integration required to bring that solution to the day-to-day life of a large company takes time in terms of going from a pilot program to wide-scale use.