A new 3D approach to remote design engineering
Article by Karl Maddix, co-founder and CEO of AREA member Masters of Pie. The coronavirus pandemic has forced almost every business to adapt to new ways of working. In many cases, conferencing services have saved the day – enabling remote teams to collaborate on projects when they can’t be in the same room. But two-dimensional (2D) conferencing is a poor substitute for engineers trying to work together remotely on complex 3D data to design the latest motor vehicle or jet engine.
And trying to untangle complex problems remotely from thousands of miles away is fraught with difficulties – even when using products like Microsoft’s Remote Assist. The expert often has to resort to waving their hands around on a screen to communicate to the technician which part of a machine they should be fixing – and which parts should be left alone.
Real-time immersive 3D collaboration is now adding a new dimension to such problem solving – users can share live, complex 3D files such as CAD data, interact with them and reveal ‘hidden’ parts deep within a machine that may be causing an issue. The technology also transforms day-to-day collaboration between remote engineering team members. Design reviews, for example, can be brought to life, with participants ‘walking through’ a model, no matter where they are in the world.
The fundamental problem at the root of many of these issues until now has been that enterprise teams have lacked the ability to collaborate effectively in real time using live, complex 3D data. The solution lies in purpose-built framework technology that natively integrates real-time collaboration and immersive device support directly into legacy enterprise software packages.
The key to enabling true real-time collaboration is to start where the data ‘sits’ and ensure that this original data ‘truth’ is the same for everybody when working together, no matter where they are located or what device they wish to use. This way, everyone in the team has the correct and most up-to-date information available.
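As a rough illustration of this 'single source of truth' idea, the sketch below (in Python, with purely hypothetical names – this is not any vendor's API) keeps the authoritative model in one place and fans out only the changes, so every participant's local view stays consistent no matter when they join:

```python
# Minimal sketch of a single-source-of-truth sync loop.
# SharedModel and Client are illustrative names, not a real product API.

class Client:
    """A connected participant on any device (VR, AR, desktop, mobile)."""
    def __init__(self, name):
        self.name = name
        self.view = {}            # this client's local copy of the model

    def receive(self, delta):
        self.view.update(delta)   # apply the change to the local copy


class SharedModel:
    """Holds the authoritative data; every edit flows through here."""
    def __init__(self):
        self.state = {}
        self.clients = []

    def join(self, client):
        self.clients.append(client)
        client.receive(dict(self.state))   # late joiners get the full state

    def apply(self, delta):
        self.state.update(delta)           # update the single truth first...
        for c in self.clients:
            c.receive(delta)               # ...then fan out to everyone


model = SharedModel()
alice, bob = Client("alice"), Client("bob")
model.join(alice)
model.apply({"part_42/visible": False})    # e.g. hide a part during a review
model.join(bob)                            # bob still sees the same truth
```

The point of the pattern is that no client ever edits its own copy directly – every change passes through the authoritative state, which is what keeps remote participants from drifting out of sync.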
Whether the source is a CAD package, PLM software, an MRI scanner, robotic simulation software or a laser scanning system, many industries are becoming increasingly dependent on spatial data types and digital twins. These complex data formats are usually incompatible or just too cumbersome to use 'as is' in existing collaboration platforms such as Webex, Skype, Google Docs or Slack – all built primarily for 2D data formats such as video, text or images.
Moreover, the legacy software generating the data itself is unlikely to have any in-built real-time collaboration functionality – forcing enterprise users to resort to one of two methods. One option is to manually export the data, carry out a painful and time-consuming reformatting process, then manually import the newly crunched data into some type of third-party standalone collaboration package. The alternative is to ignore the spatial nature of the data entirely and instead screen-grab or render out 2D ‘flat’ images of the original 3D data for use in a basic PowerPoint presentation or something similar.
Neither of these methods allows teams to efficiently collaborate using a live data truth – i.e. the original data itself instead of a reformatted, already out-of-date interpretation of it. So, both methods only compound the root collaboration problem instead of helping to solve it.
The latest generation of real-time immersive 3D collaboration technology is integrated directly into the host software, grabbing the original data at source before efficiently pushing it into a real-time environment which users can access using their choice of device (VR, AR, desktop, browser or mobile) for instant and intuitive collaboration. End-to-end encryption ensures that even the most sensitive data may be confidently shared across remote locations.
The integration into the host package provides not only a live connection to the data but also a bi-directional connection, meaning that users are still connected to the host software package running in the background. The advantage of this over standalone applications is that it still gives access to core features of the host package – enabling accurate measurement of a CAD model using vertex or spline snapping to the original B-Rep held in the CAD package, for example. All the underlying metadata from the host package is also available to the immersive experience – and annotations, snapshots, action lists or spatial co-ordinate changes can be saved back into the host package.
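To make the bi-directional idea concrete, here is a minimal sketch of what such an integration could look like in Python. `HostSession` is a stand-in for the real CAD/PLM package's API (which this is not); the immersive layer reads the original data through the live connection and writes annotations straight back into it:

```python
# Sketch of a bi-directional link to a host package.
# HostSession is a hypothetical stand-in for a real CAD/PLM API.

class HostSession:
    """Stand-in for the host CAD/PLM package running in the background."""
    def __init__(self):
        self.annotations = {}     # part number -> list of saved notes

    def get_part(self, part_number):
        # A real integration would return live B-Rep geometry and
        # metadata here; a dictionary stands in for both.
        return {"part": part_number, "material": "Ti-6Al-4V"}

    def save_annotation(self, part_number, note):
        self.annotations.setdefault(part_number, []).append(note)


class ImmersiveReview:
    """The collaborative session layered on top of the host package."""
    def __init__(self, host):
        self.host = host          # live, bi-directional connection

    def inspect(self, part_number):
        return self.host.get_part(part_number)        # read the original truth

    def annotate(self, part_number, note):
        self.host.save_annotation(part_number, note)  # write back to the host


host = HostSession()
review = ImmersiveReview(host)
part = review.inspect("FAN-001")
review.annotate("FAN-001", "clearance too tight near hub")
```

The design choice worth noting is that the immersive session owns no data of its own: reads and writes both go through the host connection, which is what makes the annotation part of the host package's record rather than a detached copy.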
The new post-pandemic requirement to have a distributed workforce – in conjunction with the rollout and adoption of key technology enablers such as server-side rendering and high-capacity, low-latency connectivity – is set to accelerate the adoption and integration of real-time immersive collaboration solutions. In the future, 5G technology will also open up the potential to stream to immersive AR and VR devices – untethering the experience and facilitating factory-wide adoption of immersive solutions. For example, as ‘Industrial Internet of Things’ (IIoT) data streams from smart devices in the factory, it will be overlaid via AR glasses in the physical space. And as cloud service providers build out features such as spatial anchoring to support ever-larger physical spaces, these new services will be used within collaborative environments rich with real-time data.
Factory workers, for example, will have the ability to ‘dial an expert’ directly from a virtual panel on a smart factory device. This offsite expert will appear as a holographic colleague and bring with them live 3D data for that individual machine. Both users will have real-time IIoT data overlaid intuitively on the fully interactive 3D model to facilitate a more effective diagnosis and maintenance process.
Empowering shop-floor workers with hands-free AR and detailed 3D data will dramatically improve assembly line efficiency, with an intuitive environment where product data is fully interactive. Users will be able to move, hide, isolate and cross-section through parts, while using mark-up and voice tools to create efficient instructions for the assembly or disassembly of complex products. These instructions will be recorded and delivered as holographic guides via AR directly on the assembly line.
The next generation of real-time immersive 3D collaboration technology is even set to enable you to have a scaled-down hologram of your latest engine design sitting right in front of you on your desk. As you work on the design and refine it using your CAD software, the changes will be dynamically loaded into the hologram so that you can see the effects immediately and make any further necessary adjustments.
Meanwhile, digital sleeving – with 3D images overlaid on physical designs – will enable you to check how two parts of the engine come together, even when they are being designed by different teams in different locations. Similarly, you will be able to see how, for example, cabling will fit inside your latest aircraft seat design or where best to put the maintenance pockets for easy access.
This kind of approach adds a new dimension to the handoff between design and manufacturing. If adjustments need to be made to a fan assembly design, for example, the relevant part can be isolated within an immersive design review – and speech-to-text notes can be added to the part number and automatically become part of the change request. It’s all a far cry from endless design iterations, spreadsheets and printouts – or CAD screen shares using a 2D representation of a 3D problem.
In the post-pandemic remote world, conferencing is bringing people, video and documents together. Collaboration is now adding the fourth dimension of 3D immersive experience to complete the picture.