
With Google Glass, Sutter Health Sees mHealth Success in Workflows

One of northern California’s largest health systems is seeing strong success with the enterprise version of the Google Glass smartglasses. And officials are crediting that success to what they’re calling “the human ROI.”

“This is not a machine,” Albert Chan, MD, chief of digital patient experience at Sutter Health, says of the high-tech wearable. “It’s a solution to workflow problems.”

And it’s making a world of difference, Chan says, to physicians who generally spend half their time with patients and the other half doing administrative tasks.

The Sacramento-based, 24-hospital, not-for-profit health system is one of the largest networks using Google Glass, with more than 100 physicians sporting the smartglasses in a wide variety of locations. Those devices are embedded with mHealth technology developed by Augmedix, one of a handful of companies designing software for smartglasses.

Among the most popular uses for the smartglasses is translating patient encounters into the medical record. This is typically done either by the doctors themselves or by scribes, who copy clinician directives – written or taped – into the patient’s record. Chan says Sutter Health now employs about five scribes for every four doctors.

With Google Glass and Augmedix technology, Chan says that process is streamlined – for both the doctor and the scribe – and made more meaningful. The patient visit is entered into the record immediately, with more accuracy, absent the vagaries of a doctor’s handwriting, an imperfect recording or time spent between the visit and data entry.

Recently, Sutter Health and Augmedix announced a collaboration with Google to add artificial intelligence tools, by connecting the smartglasses to the Google Cloud Platform.

“Our work on GCP has been accelerating as we continue to advance our capabilities using Google Cloud machine learning APIs and core HIPAA-secure cloud infrastructure services,” Ian Shakil, co-founder and CEO of Augmedix, said in an announcement released at this year’s HIMSS18 conference in Las Vegas. “As our largest deployment, Sutter represents an at-scale example of what’s possible when you bring together a technology-enabled documentation service for health systems and the power of the cloud.”

Chan, who uses Google Glass in his own family medicine practice, says the wearables aren’t meant to improve clinical outcomes – at least not directly. They’re designed to reduce the stress on clinicians by giving them an intuitive, hands-free means of completing administrative tasks, such as entering data into the patient record, collaborating with the care team and messaging.

“As AI matures, we can layer that into our (administrative) processes to make things more efficient,” he says. “We can understand patterns and anticipate needs.”

Read the full article here.




5 Ways AR and VR are Infiltrating the Enterprise

Use Cases For the Enterprise Are Becoming Reality 

The AREA isn’t the only organization providing use cases for enterprise AR, although if you’re interested in our offerings, see https://thearea.org/area-resources/use-cases/


Enterprise use cases are definitely out there – from designing machinery in a collaborative VR space to avoid collision issues on the manufacturing floor, according to a report released in March by Kaleido Insights. The report uncovered use cases exploring mixed reality’s effects within enterprises, such as engineering and design modeling, training and employee education, real-time information overlay, theft protection, B2B sales, and marketing and entertainment.


Enterprises are now starting to realize “where efficiencies can be reached that can save money.” And the use cases are not isolated to one particular industry, Kaleido Insights analyst Szymanski found. “What we found in talking with a lot of leaders and innovators and practitioners in XR spaces is that these use cases span multiple industries,” she said. “So something that might work for automotive could also work for healthcare and education.”

Let’s examine some of the ways VR and AR are creeping into the enterprise.

Engineering and Design Modeling 

Manufacturing firms are using computer-aided design and computer-aided modeling software to mock up products for manufacturing environments or automotive environments before tech hits the shop floor, according to Szymanski. It can help detect collision and safety issues and increase collaboration within organizations. “We see that as the case study that is really taking off the most for the past few years,” Szymanski added.

Training and Education

Case studies in training and education include more of the VR variety than AR. “People can be really immersed in these new training environments for situations that either are very unique or are very dangerous,” Szymanski said. Oil companies, for instance, can train employees and contractors on dangerous situations in the field. These situations “require a lot of people to collaborate,” Szymanski said. “With a VR environment, they’re able to scale training efforts.” Szymanski also cited retail trainers who can show employees what a retail sales floor should look like. “Retail employees are usually seasonal, and there’s high turnover, so you don’t really want to pull away a veteran employee to train someone that might be gone in two months after the holiday season.”

Real-Time Information Overlay

AR headsets can provide real-time data delivery to service technicians, who in turn can make repairs to products, homes and cable systems more effectively and efficiently, allowing for quality control on the spot. Typically, Szymanski said, you’d find these use cases in an industrial capacity, but there are some interesting case studies emerging in healthcare; surgeons who need some on-the-spot information, for example.

Check out this AR example of a water pump fix.

Onboarding Employees

Beam, a design and digital marketing agency, has built an AR-based onboarding program. New employees can use AR headsets during their onboarding with new organizations in order to gain knowledge about the business, the office and to connect with employees. Of course, you wouldn’t want to make the already-awkward-enough first day on the job more uncomfortable, so Beam suggests organizations encourage their new employees to use the AR headsets during off-hours. The headsets bring information such as who sits where, what they do, which clients they have and how you can connect with them. Clicking on a person will unlock a quick video where the employee says what he or she does, and how he or she can help the newbie. “Ask me about joining the new Fantasy Football League. I’m the commissioner,” “Andy” says in a video Beam produced to demonstrate its AR onboarding program.

“Onboarding is still kind of a weak link in the chain,” said James A. Gardner, who worked as a consultant with Beam on its AR onboarding program. “I don’t think it’s as bad as it used to be, but I can remember starting a job where you show up and don’t have a spot to work and a computer for the first week. It’s really unwelcoming.”

AR for onboarding can help new employees understand how the company works, what different departments do and encourage the newbie to “really get productive and acclimated quickly,” Gardner said. It goes beyond, Gardner added, just shaking hands and getting the standard, “Welcome to the team, Bill.” Ultimately, it can shorten the time it takes for a new employee to get comfortable and productive.

Customer Experiences

Of course, it’s always nice to walk in the customer’s shoes. Rori DuBoff, managing director of content innovation and extended reality for Accenture Interactive, said her company has worked to implement VR simulations that help better understand customers and their experiences. She cited the example of creating user personas with financial institutions to help train employees on what a customer journey looks like. “When a customer is trying to go through a new mortgage application what is the actual experience like for them?” DuBoff asked. “What are the questions? How can we be more empathetic and understanding to what that experience was like for the consumer?”





Volkswagen Commercial Vehicles Test Drives RealWear’s Hands-Free HMT-1 AR Wearable for Speeding up Maintenance and Repairs

“This is an exciting step forward in maximising uptime for our customers through the use of augmented reality and remote diagnostic tools,” said Paul Anderson, Service Operations Manager at Volkswagen Commercial Vehicles. “Our ultimate aim is to ensure we can keep the customers’ vehicles on the road for longer and that means reducing the time it takes to repair a vehicle when it’s in one of our centres.”

Using the state-of-the-art wearable device, Volkswagen Commercial Vehicles service technicians will be able to simply clip the head-mounted device onto a safety helmet or bump cap and connect directly with the Technical Support Centre at Volkswagen Commercial Vehicles’ head office in Milton Keynes. Anderson explained that the device is well designed for loud workshop environments and for dark, tight areas that require light, hands-free use in rugged conditions.

This virtual assistance could significantly reduce service time, as technicians will be supported throughout the diagnosis and repair of the vehicle.

“The RealWear HMT-1® device is solving one of industry’s major dilemmas by bringing the right information to every frontline worker in real time, reducing downtime and improving productivity,” said Andy Lowery, RealWear’s Co-founder and CEO in a statement. “We will work closely with Volkswagen Commercial Vehicles to ensure a successful pilot and rollout to allow them to give the best support to their customers.”

Anderson continued, “The new devices allow our team of Technical Support Agents to support our network with a virtual visit which is as close as possible to the agent being in the centre. Factors such as time out of the office, speed to booking and travel time are stripped out leaving only the value of having our technical support agents virtually in the Van Centre to support the diagnostic process. There’s clearly an environmental benefit in this approach, too.”

RealWear’s® purpose-built platform enables experts to send precise visual instructions remotely to Volkswagen Commercial Vehicles Van Centres and field technicians.

The HMT-1 device serves as a virtual technical support agent, guiding technicians through complex repairs by augmenting images, wiring diagrams and adding repair suggestions into their view while walking them through the necessary diagnostic steps.

The full press release can be read here.

RealWear’s member profile can be read here.





The Impact of AR/VR in Enterprise

AR and VR will very soon become important tools for a number of industries that want to find new and innovative ways to make their products more desirable and build more future-oriented enterprises. One area where AR and VR will have the most impact is healthcare. A higher success rate for surgeries and improvements in quality of care are just some of the advantages of using AR and VR in this industry. Because precision is paramount when performing surgery, surgeons can be more precise with the help of real-time 3D visualizations of the surgical area, bolstering the rate of success. Construction is another area where AR is expected to play a big role in enhancing the workflows of businesses. Imagine construction companies providing floor-by-floor 3D plans of buildings before construction has even started, giving customers a sense of what the buildings will look like upon completion.

The inevitable linking of AI and AR/VR will open new doors for further innovation. This will help enterprises take advantage of new applications and enriched experiences to enhance their business processes beyond what was thought possible only a few years prior.





5 Ways AR and VR are Changing the Oil and Gas Industry

As a huge and relatively old industry, oil and gas is ripe for disruption by new technologies. AR and VR have already seen application in other sectors, and there are a number of ways they could revolutionise oil and gas. So here are 5 ways that AR and VR are going to change the oil and gas industry.

There is a lot of detail in this article and whilst AR and VR are somewhat mixed in together (which can often be the case for enterprises adopting new and emerging technologies), there is a neat summary infographic toward the bottom.

In summary, the key points for how AR will be used in the oil and gas industries are as follows:

  1. Risk assessment – AR will provide real-time information about the rig’s machinery for engineers.
  2. Maintenance – AR will give engineers a live feed of the maintenance needs of their machinery.
  3. Training – AR can provide real-time tutorials and help while in the field.
  4. R&D – AR will allow designers to create and perfect their rig designs.
  5. Carbon footprint – both VR and AR will cut down on the overall travel a firm’s employees need to do, and can even be used to optimise the efficiency of older rigs, allowing them to run for longer.






Industry Pioneers Choose ThingWorx to Further Innovation Efforts and Improve Productivity – India

ThingWorx Studio is the AR platform in the ThingWorx portfolio. With ThingWorx Studio, users can leverage the richness of 3D and the insights from IoT to deliver compelling augmented reality experiences that help improve efficiencies, build better products and enable safer, more productive workers. In India, it has been adopted by the following companies:

  • ARizon Systems, a startup focusing on applications of augmented reality in manufacturing. The initiative aims to help production companies optimize their performance using cutting-edge technologies in augmented reality (AR) and artificial intelligence (AI).
  • Grind Master Machines is an emerging global player in superfinishing machines manufacturing. They are a pioneer in manufacturing customized metal finishing, deburring & robotic automation machines.

To find out more about PTC, read their member profile.




U.S. Troops to Test Augmented Reality By 2019

According to Breaking Defense, the U.S. Army is developing a helmet-mounted system designed to project important data onto a soldier’s field of view. The augmented reality concept is based on the heads-up display used on fighter planes. Introduced in the late 1970s, HUDs project key information such as speed, altitude, heading, radar mode, and available weapons onto a fighter pilot’s field of view, allowing the pilot to keep his or her eyes on the skies.

The Army wants HUD 3.0 to work similarly, keeping track of a soldier’s location, the location of friendlies, and other key information. The system would do away with the need for soldiers to use a map to figure out a unit’s location or a compass to determine direction. Speeding up the information-gathering process would allow soldiers to proceed to the next stage—making plans and carrying them out—faster than the enemy.

A previous Army project, HUD 1.0, is already in service. HUD 1.0 is the Enhanced Night Vision Goggle III, a weapon-mounted thermal sight that projects the weapon’s field of view—as well as the aiming point for an M4 carbine—onto a helmet-mounted monocle. Using HUD 1.0, soldiers can fire from behind cover without exposing themselves to enemy fire. Breaking Defense reports the service is skipping 2.0 due to the sheer technological leap that 3.0 provides.

Army engineers are partnering with an “unnamed industry partner” to develop HUD 3.0 at a rapid pace. One of the biggest potential problems is bombarding soldiers with too much data, crowding their field of vision with useless information. Another is making the helmet-mounted displays “soldier proof,” or tough enough to withstand the rigors of field use.





Augmented Reality Is A Game Changer For Oil & Gas

AR headsets with smart goggles provide on-site technicians with wireless connection directly to headquarters staff or to the most skilled experts thousands of miles away, who can guide the on-site staff through the tasks they perform via audio and video.

Analysts expect technological advances to play an increasingly important part in the continued cost cuts for oil—an industry that has already slashed costs across the board to survive and reposition itself to profit at low oil prices.

According to ABI Research, AR revenue for the energy and utilities sector will reach US$18 billion in 2022, with platform and licensing, as well as smart glasses hardware, comprising the majority of that market.

This year alone, the energy and utilities industries will account for 17 percent of global smart glasses shipments.

“AR enables better visualization of underground assets, pipelines in concrete, or complex components, which help avoid breaks while digging, detect dangerous leaks, and reduce accidents. Accordingly, employee safety will be maintained along with a decline in errors and total downtime,” Marina Lu, Senior Analyst at ABI Research, says.

For an oil industry recovering from the price crash, cutting downtime and costs and boosting efficiency is the new normal, with oil prices at half their 2014 level.

So, companies have jumped on the ‘digital disruption’ bandwagon and some are already using smart goggles and wearables to reduce downtime and increase safety.

For example, Baker Hughes, now part of GE, has been using the VRMedia Srl Smart Helmet. Recently, Baker Hughes replaced parts of a turbine at a petrochemical plant in Malaysia in five days with no travel expenses, as one on-site technician was guided by specialized U.S. engineers supervising the work remotely from a Baker Hughes site in California. The replacement would otherwise have involved at least 10 days of halted operations at the plant and US$50,000 to fly out the American team, Bloomberg reports.


Cutting downtime saves oil companies millions of dollars.

According to Deloitte, a 100,000-bpd refinery losing a single day of operations could reduce revenues by more than US$5.5 million and cut profit by US$1.4 million.
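Deloitte’s figures lend themselves to a quick back-of-the-envelope check. The sketch below is illustrative only: the $55-per-barrel price and 25% profit margin are assumptions chosen to roughly reproduce the quoted numbers, not figures from the article.

```python
# Rough sanity check of the Deloitte downtime estimate.
# Assumed inputs (hypothetical, not from the article):
# crude at ~$55/barrel and a ~25% profit margin.

def downtime_cost(barrels_per_day: int, price_per_barrel: float,
                  profit_margin: float, days_down: int):
    """Return (lost_revenue, lost_profit) for a refinery outage."""
    lost_revenue = barrels_per_day * price_per_barrel * days_down
    lost_profit = lost_revenue * profit_margin
    return lost_revenue, lost_profit

revenue, profit = downtime_cost(100_000, 55, 0.25, 1)
print(f"Lost revenue: ${revenue:,.0f}")  # Lost revenue: $5,500,000
print(f"Lost profit:  ${profit:,.0f}")   # Lost profit:  $1,375,000
```

At those assumed inputs, a single lost day lands at roughly the US$5.5 million in revenue and close to the US$1.4 million in profit that Deloitte cites.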

Oil and gas facilities shut for 27 days each year on average, Bloomberg quoted industry analyst Kimberlite International Oilfield Research as saying.

Remote AR is not the cure-all wonder tech because it needs reliable wireless networks which oil rigs in the wilderness often lack, and because headsets need to meet strict safety standards to be used close to hazardous materials.

But some of the biggest oil firms are already using some kind of AR or VR or wearables to increase safety and efficiency and cut costs.

  • Eni and MIT have created wearables to improve workplace safety in the oil and gas industry.
  • Shell is using VR for safety training procedures at a deep-water oil project offshore Malaysia.
  • BP uses AR smartglasses in its U.S. operations, alongside drones and advanced analytics.
  • Scotland-based Cyberhawk performs inspections and surveys with drones at oil and gas platforms, plants, and refineries across the world, and its clients include major oil and gas companies.

The use of sophisticated and smart technologies is spreading across the oil sector, where the biggest firms and the smaller companies that survived the downturn continue to look for further cost savings.





Butterfly Network Announces the World’s First Augmented Reality Telemedicine Technology

Butterfly iQ is a personal ultrasound device.  Using Butterfly Tele-Guidance™ technology, an ultrasound expert can remotely guide any user to acquire even the most challenging ultrasound scans, bringing medical expertise to wherever it’s needed most. Only a fraction of the 40 million healthcare workers worldwide have the expertise to capture and interpret ultrasound images. Butterfly Tele-Guidance will expand their reach and facilitate early diagnosis and preventive medicine.

“This is an important step in fulfilling the promise of bringing ultrasound to the millions that do not have access to this essential medical technology,” said founder and serial entrepreneur Dr. Jonathan M. Rothberg. “I set out to democratize ultrasound imaging as I did with DNA sequencing. Two-thirds of the world has no access to medical imaging, and even in the developed world, expense and lack of expertise limit its accessibility. We solved the expense hurdle with the Butterfly iQ, and now with Butterfly Tele-Guidance technology, we are providing a unique approach to solving the expertise shortage.”

Butterfly Network conquered the problem of ultrasound-system affordability by leveraging semiconductor technology and replacing the piezoelectric crystals with the world’s first commercial Ultrasound-on-a-Chip™. Butterfly iQ received FDA clearance for 13 indications, the broadest ever for a single ultrasound probe, making it the only whole-body ultrasound imager. Priced at under $2,000, the iQ is also the first personal ultrasound device, paving the way for all healthcare professionals to have a window into the body.




How Augmented Reality will make surgery safer

Some of the biggest medical advances of the last few decades have been in diagnostic imaging—ultrasonography, mammography, computerized tomography (CT), magnetic resonance imaging (MRI) and so on. The same forces that have propelled technology developments elsewhere—tiny cameras, smaller and faster processors, and real-time data streaming—have revolutionized how doctors use imaging in performing procedures. Almost every surgery involves some sort of a scan prior to incision. Even in emergencies, surgeons have ultrasound or CT to help guide the procedure. Imaging can now be performed in real time at the point of care during procedures, both big and small.

Yet, while imaging has radically evolved, how images are displayed is basically the same as it was in 1950. Visual data are always shown on a 2D flat screen, on displays that force health care providers to look away from the patient, and even away from their own hands while operating. Further, the images are not displayed from the perspective of the viewer, but rather from that of the imaging device: doctors have to use skill and imagination to understand and mentally project the images into the patient while they are doing procedures. Finally, different types of visual data are displayed separately, so doctors have to direct additional attention to mentally fusing multiple image types, such as angiography and CT, into a coherent representation of the patient. Acquiring this skill takes years of training.

Augmented reality (AR), a set of technologies that superimpose digital information on the physical world, has the potential to change all of this. In our research at the Maryland Blended Reality Center’s “Augmentarium,” we are prototyping AR applications in medicine, as are teams at Stanford, Duke and Johns Hopkins. In one envisioned application, a surgeon using an AR headset such as Microsoft’s HoloLens would be able to see digital images and other data directly overlaid on her field of view. In such a scenario, the headset might display a hovering echocardiogram with vital signs and data on the characteristics of the patient’s aneurysm directly above the surgical field. The surgeon needn’t look away from the patient to multiple displays to gather and interpret this information.

AR’s potential ability to concurrently display imaging data and other patient information could save lives and decrease medical errors. This is especially true for procedures done outside an operating room. The OR may be the safest place in the hospital, where one patient has an entire team of 4 to 8 dedicated doctors and nurses. Because everyone has pre-operative imaging, the procedures are generally well-planned. Anesthesiologists monitor the patient’s physiology and administer pain-controlling and life-saving medications. Surgical nurses make sure all of the necessary equipment is immediately available. Surgeons can be completely immersed in the operative task. But time in the room is extremely costly, and ORs are solidly booked with elective cases. Elective operations are an essential source of revenue for all hospitals, so there is incredible pressure to keep ORs full and flowing. Small, emergent procedures do not easily fit into this reality. As a result, many of these procedures are done outside the OR in intensive care units and emergency departments. It’s during these “bedside procedures” that patients may be most at risk and where AR could provide some of the greatest benefit.

The full article can be read here.