Facilitated User Interactions for Selecting and Manipulating 3D Models in AR

Why Is This Important?

  • AR experience developers and employees in AREA customer-segment companies have many use cases involving the selection and manipulation of 3D models. However, the acceptance and value of 3D models will remain low until there are improvements in how models are prepared, presented, and manipulated during AR experiences.
  • The proposed research topic will benefit AREA members by increasing 3D model usability, lowering cognitive load, and increasing the impact of 3D models in the workplace.

Computer-generated models and models derived from scans of objects are increasingly part of product design and testing activities. They are also valuable in diagnostics and in guiding assembly, repair, and maintenance procedures. Today, many AR experiences use only 2D targets and display 2D line drawings, text, and images. In the future, however, 3D models will increasingly be used for object recognition, tracking, and registration of experiences. Models will also be displayed over real-world objects to deliver instructions, or left “hanging” in space when two or more collaborators need to examine the same object. Users will need to manipulate and interact with these models.

Unfortunately, the tools (e.g., pointing devices) and conventions (e.g., gestures and voice commands) for the selection and manipulation of 3D models in AR are unfamiliar and challenging for users. Most models lack clearly labelled anchors, handles, and other indicators that would reduce the barriers to user interaction. When there are multiple models in a scene, users may be unable to select one for closer examination. Users must learn advanced techniques and concentrate intently to perform even simple manipulations such as open, close, and hide, not to mention advanced manipulations such as rotate, scale (enlarge and shrink), attach, move to back, move forward, and other actions that could increase the value of a model in AR experiences.
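
To make these operations concrete, the sketch below illustrates, in plain Python, one common interaction pattern: casting a ray from the user's viewpoint to select the nearest model, then rotating and scaling the selection. The bounding-sphere selection test, the dictionary-based model representation, and all values are illustrative assumptions, not any particular AR platform's API.

```python
# Illustrative sketch only: ray-based selection of one model among several,
# followed by simple rotate and scale manipulations. The model representation
# (center, radius, vertices) and all values are hypothetical.
import numpy as np

def select_model(ray_origin, ray_dir, models):
    """Return the model whose bounding sphere the ray hits first, or None."""
    closest, best_t = None, np.inf
    for m in models:
        oc = ray_origin - m["center"]
        b = np.dot(oc, ray_dir)               # assumes ray_dir is unit length
        c = np.dot(oc, oc) - m["radius"] ** 2
        disc = b * b - c                      # ray-sphere discriminant
        if disc < 0:
            continue                          # ray misses this model
        t = -b - np.sqrt(disc)                # distance to nearest intersection
        if 0 <= t < best_t:
            closest, best_t = m, t
    return closest

def rotate_y(model, angle_rad):
    """Rotate the model's vertices about the Y axis through its center."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])
    model["vertices"] = (model["vertices"] - model["center"]) @ R.T + model["center"]

def scale_model(model, factor):
    """Enlarge or shrink the model uniformly about its center."""
    model["vertices"] = (model["vertices"] - model["center"]) * factor + model["center"]
    model["radius"] *= factor

# Example: a gaze ray cast from 5 m in front of a unit-radius model.
model = {"center": np.zeros(3), "radius": 1.0,
         "vertices": np.array([[0.5, 0.5, 0.5], [-0.5, -0.5, -0.5]])}
hit = select_model(np.array([0.0, 0.0, -5.0]), np.array([0.0, 0.0, 1.0]), [model])
if hit is not None:
    rotate_y(hit, np.pi / 4)    # simple manipulation: rotate 45 degrees
    scale_model(hit, 1.5)       # enlarge by 50 percent
```

Even this minimal pattern hints at the usability questions the topic raises: how the user aims the ray, how the selected model is highlighted, and how rotate and scale are invoked without advanced training.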

This research topic focuses on the design, development, and evaluation of computer-human interactions and computer graphic techniques to improve the ease of use of 3D models in AR experiences. As research results are published in peer-reviewed journals, there will be greater diversity in solutions and more innovation, based on a deeper understanding of the options and trade-offs involved.

Stakeholders

This research is relevant to designers and managers of enterprise 3D assets, 3D asset platform publishers, AR experience developers, AR program managers, and AR authoring platform publishers.

Possible Methodologies

The research will build and publish a set of 3D models representative of those used in workplace settings where AR is deployed. The models will be designed with features such as anchors and handles. Studies will compare the approaches developed in commercial software for 2D screens with those proposed in AR and VR game engines and experience authoring platforms. In laboratory settings, users will be given tasks to perform with 3D models while wearing different models of AR displays. In addition to time-motion studies and ergonomic assessments, user testing will measure strain and cognitive effort when performing tasks that require model selection and manipulation.
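
As a minimal sketch of how such published test models could carry labelled anchors and handles, the structure below shows one possible representation. The class names, fields, and file path are hypothetical assumptions made for illustration, not a published AREA specification.

```python
# Hypothetical sketch: a test model that carries the labelled anchors and
# handles described above. All names, fields, and paths are illustrative only.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Handle:
    label: str      # text shown to the user, e.g. "rotate"
    position: Vec3  # location in the model's local coordinate frame
    action: str     # manipulation it triggers: "rotate", "scale", "attach", ...

@dataclass
class TestModel:
    name: str
    mesh_path: str                                           # e.g. a glTF file (hypothetical path)
    anchors: Dict[str, Vec3] = field(default_factory=dict)   # named attachment points
    handles: List[Handle] = field(default_factory=list)      # labelled interaction handles

# A model for an assembly task, with anchors for registration and handles
# that expose rotate and scale manipulations directly to the user.
pump = TestModel(
    name="pump_housing",
    mesh_path="models/pump_housing.glb",
    anchors={"inlet": (0.0, 0.1, 0.0), "mount_plate": (0.0, -0.2, 0.0)},
    handles=[Handle("rotate", (0.3, 0.0, 0.0), "rotate"),
             Handle("scale", (0.0, 0.3, 0.0), "scale")],
)
```

Publishing models in a form like this would let studies vary the number, placement, and labelling of handles while holding the underlying geometry constant.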

Research Program

This research topic could be combined with studies of 3D capture, computer graphics, and 3D model registration and annotation. Research pertaining to AR user interfaces and interaction could also be extended to include 3D model selection and manipulation. The study of hand tracking would contribute to exploring the use of gestures for simple and advanced model manipulation.

Miscellaneous Notes

Microsoft has published several papers and best practices on this topic for HoloLens, covering hand tracking and gestures, but these are implemented exclusively on one platform and there has not been extensive comparison using quantitative methods. This topic was proposed for consideration in the 7th AREA research project topic call and received high member interest and support.

Keywords

3D models, selection, manipulation, anchors, labels, handles, rotation, scale, attach, human computer interaction, computer human interaction, computer graphics

Research Agenda Categories

Technology, End User and User Experience

Expected Impact Timeframe

Near

Related Publications

Using the words in this topic description and Natural Language Processing analysis of publications in the AREA FindAR database, the publications with the highest number of matches to this topic can be identified and explored using the AREA FindAR research tool.

Author

Christine Perey

Last Published (yyyy-mm-dd)

2021-08-31
