
Emerging Tech to Drive Enterprise Digital Transformation

An article on CXO Today recently discussed the digital-heavy economy developing in India. The transformation is being driven by the reduced cost of transacting via online banking, a cost that falls even further when the channel is mobile.

Efficiency and competitiveness are the main components of digital transformation; it requires deep understanding of present and emerging business process models as well as disruptive digital technology.

Technologies enable this transformation: Augmented Reality, for example, can improve education and learning, while automated reasoning can improve transaction processing. Key practices allow digital transformation to be optimised, and the right soft skills allow it to be sustained.

In response to the increasingly digital environment, many organisations are adding roles, or refocusing existing ones, to have a digital focus. This alignment increases efficiency and can even allow companies to become something completely different. The ability to drive transformation is one of the key values of technology.

Leaders and disruptors are focusing on delivering new value to customers in order to improve the customer experience. Companies believe they are innovative because they are changing tactics, investing in new technology, and employing technology experts.

The article concludes by stating that businesses must proactively innovate in order to stay ahead in an evolving marketplace.




LetinAR New Optic Technology to Breakthrough Limits of AR

LetinAR Co Ltd, a technology startup, attended Mobile World Congress (MWC) 2018 to present its Pin Mirror Augmented Reality technology, according to a press release on BusinessWire.

Various AR glasses considered major candidates for next-generation wearables have been released over the past few years by huge companies such as Microsoft and Google. Difficulties of these for commercialisation include:

  • Over-sized form factor and weight
  • Dizziness caused during use
  • Limited resolution

These limitations remain unresolved because the complexity of the optical systems makes mass production difficult. LetinAR set out to address these problems at MWC 2018 by introducing its novel optical technology.

Advantages of the technology are said to be:

LetinAR’s technology provides a 70-degree field of view (FOV), never previously attained in a regular-sized pair of glasses.

Manufacturers can extend the FOV further by arranging more pin mirrors in a single lens. The Pin Mirror Lens projects a sharp image from a range of 25cm, enabling the device to be used for long periods without dizziness.

The Super-Thin Pin Mirror Lens allows devices to be thinner than those built with light-guides and half-mirrors. The simple structure of injection-moulded plastic lenses allows for mass production and will also help shorten the path to AR commercialisation.

LetinAR has also introduced the technology to client companies, investors, and cooperative partners with a view to developing next-generation lenses.




Dundee AR Tech Start-up improving efficiency in Oil and Gas

Mozenix, based in the Vision Building in Dundee, was established last summer to test the commercial validity of AR technology. This information is sourced from an article that appeared on EnergyVoice.com.

Led by Michael Brown and Michael Romilly, a co-founder of Dundee mobile agency Waracle, Mozenix teamed up with Aberdeen software company Return to Scene to develop a mobile app for the energy sector.

The software uses the camera on a smartphone or tablet to recognise oil and gas structures and then bring up identification tags and information about them on screen.
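
The article does not describe the app's internals beyond tag recognition, but the asset-register bullets quoted below suggest the general shape. As a rough illustrative sketch (the tag IDs, record fields, and lookup function are all hypothetical, not Mozenix's actual software), a recognised tag could be resolved against an asset register like this:

```python
# Hypothetical asset register: tag IDs and fields invented for illustration.
ASSET_REGISTER = {
    "PMP-1043": {"type": "Centrifugal pump", "last_inspection": "2017-11-02"},
    "VLV-0221": {"type": "Isolation valve", "last_inspection": "2018-01-15"},
}

def overlay_text(tag_id):
    """Return the on-screen caption for an equipment tag recognised by the camera."""
    record = ASSET_REGISTER.get(tag_id)
    if record is None:
        return f"{tag_id}: not found in asset register"
    return f"{tag_id}: {record['type']} (last inspected {record['last_inspection']})"
```

In a real deployment the register would be the operator's existing asset database rather than an in-memory dictionary.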

Return To Scene’s head of product development and support Martin Macrae, explained: “Offshore oil and gas assets are complex, adaptive structures with a constant flow of actions being undertaken by international teams.

“The systems which enable these actions are underpinned by asset registers which are represented by physical tags attached to equipment.

“The location of these tags and the ability to visualise data in a certain way, is crucially important.

“This is where AR technology, and specifically Mozenix’s unique software delivery capability, can solve a myriad of challenges.”

Since its launch, Mozenix has secured a number of contracts with blue-chip clients throughout the UK.

Mr Romilly, the company’s CEO, said: “We’re delighted to be working with R2S on such a highly innovative AR initiative.

“What we’re seeing with new mobile AR apps is very similar to what we witnessed over a decade ago when Apple launched the first iPhone.

“AR technology leverages untapped value by using the smartphone’s camera to create new immersive experiences and solve complex process efficiency challenges.

“It’s a great time for innovative companies such as R2S to be investing in these types of projects, as the rewards for early adopters in certain sectors can be significant.”

Return To Scene counts global giants such as BP and ConocoPhillips among the clients for its visual asset management and data solutions.




IBM Watson Unity SDK brings AI to enterprise AR and VR apps

On Tuesday, February 20, 2018, IBM and Unity launched the IBM Watson Unity SDK on the Unity Asset Store, allowing developers to more easily integrate Watson cloud services and artificial intelligence (AI) techniques such as visual recognition, speech to text, and language classification into their Unity applications.

It also marks another step on the journey to bring augmented reality (AR) and virtual reality (VR) applications to the enterprise.

AR and VR hold promise for business use cases, including employee training programs that teach workers how to perform a dangerous job in a virtual environment, or field work in which employees can hold up their phone or smart glasses to an object to determine if it needs to be fixed.

As AR and VR technologies mature, there is increasing interest coming from the enterprise market for innovative applications in marketing, design, engineering, manufacturing and analysis.

IBM has been exploring enterprise AR and VR applications with clients such as the Immersive Insights demo, which brings AR visualizations to data science tools. In this new partnership, IBM and Unity plan to help drive the development of the AR/VR business market with applications that bring contextual expertise and AI capabilities directly to the employee.

 




RoMA: Robotic 3D Printing and Augmented Reality Combine in Interactive Fabrication Platform

Cornell University researcher Huaishu Peng has been working on combining 3D printing, augmented reality, and robotics. The result is the Robotic Modeling Assistant (RoMA), created by Peng and his team.

Peng is interested in the technical aspects of human-computer interaction (HCI), and designs software and hardware systems to enable 3D modeling with interactive experiences, as well as making functional objects using custom fabrication machines.

Peng wrote, “I envision that in the future (1) people will design both the form and the function of everyday objects and (2) a personal fabrication machine will construct not only the 3D appearance, but also the interactivity of its prints.”

The article lists the researchers working on the project and gives details from the abstract. As a designer uses RoMA’s AR CAD editor to draw a new 3D model in the air, a 3D-printing robotic arm builds features to augment the model at the same time, in the same design volume.

Then, the partially 3D printed model can act as the designer’s physical point of reference while they continue to add elements to the design.

According to the paper, “To use the RoMA system, a designer wears an Augmented Reality (AR) headset and starts designing inside the print volume using a pair of AR controllers. As soon as a design feature is completed, the RoMA robotic arm prints the new feature onsite, starting in the back half of the design volume. At any time, the designer can bring printed features into the front half of the design volume for use as a physical reference. As she does so, the robot updates its schedule and prints another available part of the model. Once she finishes a design, the designer steps back, allowing the robotic system to take full control of the build platform to finish printing.”
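
The scheduling behaviour described in that passage — print in the back half, skip front-half features while the designer occupies them, pick another available part — can be sketched as a minimal model (the feature fields and selection rule here are assumptions for illustration, not RoMA's actual implementation):

```python
def next_printable(features, designer_in_front):
    """Pick the next completed-but-unprinted feature the robot may print.
    Features in the front half are skipped while the designer occupies it."""
    for feature in features:
        if feature["printed"] or not feature["completed"]:
            continue  # nothing to do, or the designer hasn't finished it yet
        if designer_in_front and feature["half"] == "front":
            continue  # yield the front half to the designer
        return feature["name"]
    return None  # nothing printable right now
```

When the designer steps back (`designer_in_front=False`), every completed feature becomes printable and the robot can finish the model.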

It’s almost like a 3D printing pen, but on a much larger scale, with AR technology and a robotic arm controlling the 3D printing process.

RoMA users are able to, according to the project page, “integrate real-world constraints into a design rapidly, allowing them to create well-proportioned tangible artefacts,” and even extend an object through in-situ fabrication.

There are details of the project design and inclusion of AR technology in the full article.

 




Three Additional Industries That Augmented Reality Will Disrupt – Lucyd

This post appeared on Medium, by Lucyd.

Lucyd are admittedly ‘obsessed’ with Augmented Reality. They say the massive potential for enterprise applications is what sets AR apart from VR. They then go on to look at three specific industries where AR is being used:

Construction

Despite the fact that the construction industry utilizes advanced 3D tools to design buildings, most of the time workers still refer to an actual paper blueprint. While we don’t expect paper to go out of fashion any time soon, the fact is that workers lose a massive amount of time because they need to constantly refer to the blueprints.

With AR, it would be possible to refer to the blueprint with no more than a verbal command. This would allow contractors, even the architect, to leave precise notes on a digital blueprint. There would be no need to lug around a hard copy, either.

Car Maintenance

While this application is not strictly limited to car repair, Augmented Reality will make DIY repair tasks as simple as following a tutorial.

These days, if you have a problem with your car, you need to take it to a technician. You leave it there for a few days or weeks and hope that by the time you pick it up, the problem is fixed.

Imagine if you could pop the hood of your car, load the specs for your car and get detailed instructions on how to diagnose and fix whatever was wrong with your vehicle.

Repair shops could also use this technology to hire more mechanics. An increase in mechanics would mean more competition and lower prices across the board, as the barrier to entry into this field would be reduced. It’s a win-win for everyone.

Cooking

Augmented Reality can be used for anything that requires a tutorial — so why not take advantage of the ultimate tutorial-based activity: cooking.

Every dish requires a recipe, but imagine if you received the instructions in real time. A built-in timer would tell you exactly when to flip the burger, a visual sensor would let you know if you’re burning the pancakes, and an olfactory sensor could alert you that your chocolate chip cookies had reached their ideal golden-brown texture.

At the Tokyo Institute of Technology, a team is aspiring to do those things and more. While this tech is still in its infancy, great forward strides in the AR cooking industry can be anticipated in the near future.

That said, all of these potential technologies have yet to fully bloom. At Lucyd, we believe the reason for this is the lack of a hardware standard.

Our Lenses and decentralized blockchain ecosystem aim to change all this. Similar to open-source frameworks like Linux, companies will be able to develop applications that solve all these problems and more.

For the AREA’s work on AR functional requirements please see this page.




Meta Announces New Augmented Reality Integration with SOLIDWORKS

 

Integration Overview

SOLIDWORKS “Publish to Xtended Reality” capability will allow users to export a CAD model from SOLIDWORKS to a customized version of an open-source format known as “glTF.” Once a SOLIDWORKS model has been exported to glTF, it can be viewed on Meta’s Model Viewer platform in the Meta 2 Development Kit headset.

The exported file retains key information from SOLIDWORKS, such as:

  • Display states
  • Materials/colors
  • Animations (such as exploded view animations, motion study, etc.)
  • 3D model hierarchy
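
Because glTF is a JSON-based format, the retained model hierarchy is easy to inspect. As an illustration, the sketch below walks the node tree of a minimal hand-written glTF fragment (the part names and structure are invented; a real SOLIDWORKS export carries far more data, including the customised extensions the press release mentions):

```python
import json

# Minimal, hand-written glTF fragment with an assembly of two parts.
GLTF = json.loads("""
{
  "scenes": [{"nodes": [0]}],
  "nodes": [
    {"name": "Assembly", "children": [1, 2]},
    {"name": "Bracket"},
    {"name": "Bolt"}
  ]
}
""")

def hierarchy(nodes, index, depth=0):
    """Flatten the glTF node tree into indented lines, one per part."""
    node = nodes[index]
    lines = ["  " * depth + node.get("name", f"node_{index}")]
    for child in node.get("children", []):
        lines.extend(hierarchy(nodes, child, depth + 1))
    return lines

tree = []
for root in GLTF["scenes"][0]["nodes"]:
    tree.extend(hierarchy(GLTF["nodes"], root))
print("\n".join(tree))
```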

This AR integration between SOLIDWORKS and Meta enables a simple and more natural design visualization for SOLIDWORKS customers on the Meta 2. Furthermore, the Meta 2’s wide field of view and direct hand interaction creates an easier and more immersive experience than virtual reality. Through this collaboration between Meta and Dassault Systèmes’ SOLIDWORKS brand, consumers of 3D CAD are no longer limited to viewing models on a 2D screen, and product design can become three-dimensional.

Key benefits of the Meta-SOLIDWORKS integration include:

  • Speed – the plug & play nature of the file export/import process means there is no need for a SOLIDWORKS user or developer to build models uniquely for the Meta AR headset.
  • Accessibility – the benefits of 3D CAD visualization are not limited to designers and engineers – any sales or training professional wanting to view 3D models in immersive AR can do so immediately.
  • Efficiency – viewing 3D CAD models in AR can have a significant impact on time-to-market, cost optimization and revenue by shortening the design review cycle, increasing sales conversion, and enhancing training comprehension.

The full press release can be viewed here.




Arvizio and DotProduct Partnership – Mixed Reality in Industry

Important information about Mixed Reality mentioned in the article includes:

  • At enterprise level, MR adds another dimension to visualisation and product design
  • Exploration of Augmented, Virtual, and Mixed Reality is often linked to additive manufacturing
  • MR allows for collaboration to no longer be site-specific
  • A way of managing 3D data models for view in MR is via the use of point clouds
  • Point cloud data creates an image as a collection of dots which saves a significant amount of processing power
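
One common way to achieve that saving is voxel downsampling: the cloud is divided into a grid of cells and each cell is collapsed to a single averaged point, shrinking the data while preserving the overall shape. The sketch below is a minimal illustration of the idea, not DotProduct's or Arvizio's actual pipeline:

```python
def voxel_downsample(points, voxel=0.05):
    """Collapse a point cloud to one averaged point per voxel cell.

    points: iterable of (x, y, z) tuples; voxel: cell edge length in metres.
    """
    cells = {}
    for x, y, z in points:
        # Index of the grid cell this point falls into.
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        sx, sy, sz, n = cells.get(key, (0.0, 0.0, 0.0, 0))
        cells[key] = (sx + x, sy + y, sz + z, n + 1)
    # One centroid per occupied cell.
    return [(sx / n, sy / n, sz / n) for sx, sy, sz, n in cells.values()]
```

A larger `voxel` size trades detail for an even smaller cloud, which matters when the data must render on a headset.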

Arvizio provides MR services to enterprises. The company has integrated high-definition point cloud data produced by DotProduct 3D to extend the capabilities of its proprietary MR Studio™ platform for the Microsoft HoloLens and other headsets.

Jonathan Reeves, CEO of Arvizio, is quoted as saying that this integration will enable stakeholders and team members to collaborate in highly efficient new ways.

Use cases for DotProduct’s 3D scanners mentioned in the article include:

  • Electrical utilities
  • Building renovation
  • Offshore oil facilities
  • Hardware engineering in the US Navy
  • Bridge renovation project (used by multinational construction and development company Skanska)

The partnership between Arvizio and DotProduct has produced MR products that allow customers to multitask using virtual display channels. Users can host a video chat with colleagues and simultaneously access any relevant data about the project to add to the discussion.




H-E-B deploys Vuzix Smart Glasses within manufacturing operations

  • Vuzix, a Smart Glasses and Augmented Reality (AR) technology supplier, has teamed up with H-E-B, a major U.S. grocery company, to pilot Vuzix Basics Video (VBV) glasses within the company’s manufacturing operations, according to a company press release.
  • The model in use at H-E-B is the M300, known for its strong audio and noise-cancellation performance. Employing augmented reality on the plant floor allows H-E-B to provide faster technical education and knowledge transfer.
  • The M300 is believed to offer speedy ROI and easy deployment for users.

More on the story can be read on SupplyChainDive.com.




Students create AR system that lets doctors see under patients’ skin

The system, called ProjectDR, allows medical images such as CT scans and MRI data to be displayed directly on a patient’s body in a way that moves as the patient does.  “We wanted to create a system that would show clinicians a patient’s internal anatomy within the context of the body,” explained Ian Watts, a computing science graduate student and the developer of ProjectDR.

The technology includes a motion-tracking system using infrared cameras and markers on the patient’s body, as well as a projector to display the images. But the really difficult part, Watts explained, is having the image track properly on the patient’s body even as they shift and move. The solution: custom software written by Watts that gets all of the components working together.

Vast applications

“There are lots of applications for this technology, including in teaching, physiotherapy, laparoscopic surgery and even surgical planning,” said Watts, who developed the technology with fellow graduate student Michael Fiest.

ProjectDR also has the capacity to present segmented images–for example, only the lungs or only the blood vessels–depending on what a clinician is interested in seeing.

For now, Watts is working on refining ProjectDR to improve the system’s automatic calibration and to add components such as depth sensors. The next steps are testing the program’s viability in a clinical setting, explained Pierre Boulanger, professor in the Department of Computing Science.

Next steps

“Soon, we’ll deploy ProjectDR in an operating room in a surgical simulation laboratory to test the pros and cons in real-life surgical applications,” said Boulanger. “We are also doing pilot studies to test the usability of the system for teaching chiropractic and physical therapy procedures,” added Greg Kawchuk, a co-supervisor on the project from the Faculty of Rehabilitation Medicine. Once these pilot studies are complete, the research team expects the deployment of the system in real surgical pilot studies will quickly follow.

Watts is co-supervised by Boulanger, Cisco Chair in Healthcare Solutions and professor in the Faculty of Science, and by Kawchuk, professor in the Faculty of Rehabilitation Medicine.

ProjectDR was presented last November at the Virtual Reality Software and Technology Symposium in Gothenburg, Sweden.

Information source: Eurekalert.org