Situational Awareness
This use case pertains to providing digital assets conveying current (real-time) and historical data to an AR user, in context, for informed decision making. It can be a component of other use cases, including complex assembly, collaboration, guidance, inspection, maintenance and remote assistance.
Prior to AR Adoption
Situational awareness is defined as having one or more of the following characteristics:
- Involves a user receiving real-time data and/or readings about the real world, usually on a continuous basis, for the purpose of decision support. The raw data may be shared with other users, or may be processed by artificial intelligence systems before being presented to the user as information.
- The user gains comprehension of their environment, of possible connections between historical and current data, and a projection of their future status.
- Sources of live data can be sensors on places, machines or people. Although close proximity to the data source makes the data more likely to be relevant for action, sensors can be at any distance from the user.
Situational awareness is particularly valuable when the user must make highly informed decisions without delay. Decisions may concern any action, for example, which route to take to a destination, evacuation of an area in case of risk, preventative maintenance and other time-sensitive issues. Decisions may be made by the user or by managers. Having current and highly precise data about the status of objects in the user's environment may influence the duration of a user's exposure to risk. The term is closely associated with command and control scenarios, such as firefighters or soldiers in the field.
Business Challenges AR Introduction Addresses
Those in the field who benefit from situational awareness may need all of their peripheral vision, may be carrying materials or following instructions, or may be otherwise occupied. In these situations, users are unable to divert their attention or hands from their tasks to query a smartphone interface for data, or to place a call to a remote person who is receiving real-time data on a console.
Use Case with AR
Using AR displays connected to networks, with interfaces designed for visualizing live data streaming from connected machines, remote cameras or other sensors (e.g., seismometers, pressure gauges, water flow gauges), current data can be superimposed on the user's view of the real world.
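As a minimal sketch of the idea, a live reading streamed from a networked sensor might be formatted into an AR overlay label with a status cue for decision support. The class, field names and thresholds below are illustrative assumptions, not part of this specification:

```python
from dataclasses import dataclass

# Hypothetical reading from a networked sensor; field names are illustrative.
@dataclass
class SensorReading:
    sensor_id: str
    value: float
    unit: str

def overlay_label(reading: SensorReading, warn: float, critical: float) -> str:
    """Format a live reading as an AR overlay label with a status cue.

    The thresholds are assumptions; a real system would pull them from the
    sensor's configuration or an asset-management backend.
    """
    if reading.value >= critical:
        status = "CRITICAL"
    elif reading.value >= warn:
        status = "WARN"
    else:
        status = "OK"
    return f"{reading.sensor_id}: {reading.value:.1f} {reading.unit} [{status}]"

# Example: a pressure gauge streamed over the network.
label = overlay_label(SensorReading("pressure-01", 7.2, "bar"), warn=6.0, critical=8.0)
print(label)  # pressure-01: 7.2 bar [WARN]
```

In a deployed system the label would be rendered in the AR display near the sensed object rather than printed; the threshold-based status cue lets the user act without consulting a separate interface.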
Sensors on a user's AR device can also be the source of live data for remote managers or decision makers who are not on site but need data for decision making (sometimes called a "digital dashboard"). The remote decision maker can see and hear the on-site user's circumstances and questions from the point of view of the on-site user.
The type of AR display for situational awareness depends on many factors:
- Need for use of both hands
- Availability of room, in the vicinity where the procedures are performed, for another screen pointed directly at the work space
- Support for introducing new display devices (e.g., wearable AR, projection AR)
Another capability that an AR-enabled system providing situational awareness can support is real-time capture of the issues encountered as users make decisions. The recording can be used for future training purposes, or to document successful (or incomplete) procedures.
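Such capture could be as simple as appending timestamped decision events to a structured log for later replay. The event names and fields below are hypothetical; a deployed system would write to durable storage rather than an in-memory list:

```python
import json
import time
from typing import List

def record_event(log: List[str], user: str, event: str, detail: str) -> None:
    """Append a timestamped decision/issue event as a JSON line.

    Field names are illustrative assumptions; in production the list would
    be replaced by durable storage (file, database, or message queue).
    """
    log.append(json.dumps({
        "ts": time.time(),   # capture time, seconds since the epoch
        "user": user,
        "event": event,
        "detail": detail,
    }))

session_log: List[str] = []
record_event(session_log, "tech-07", "valve_closed", "pressure exceeded warn threshold")
record_event(session_log, "tech-07", "procedure_complete", "pump restart verified")
# Each JSON line can later be replayed for training or procedure review.
print(len(session_log))  # 2
```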
Common Roles of Users
Anyone in the field/on-site who can benefit from having real-time (live) data about machines, objects and sensors in proximity or at a distance.
Business Benefits
The benefits of AR-enhanced situational awareness include providing a user with data that can be used immediately. By eliminating the need to go to another interface or another location to obtain real-time data, AR reduces downtime, and by increasing user knowledge it reduces risk. A user with live data about physical world status is also able to make decisions with less doubt and uncertainty, reducing stress and cognitive load.
Requirements
AR Hardware Requirements: On-board Storage – Augmented
AR Hardware Requirements: Inputs / Outputs: Sensors - Augmented
AR Hardware Requirements: Inputs / Outputs: Perception - Augmented
AR Hardware Requirements: Inputs / Outputs: Audio - Augmented
AR Hardware Requirements: Inputs / Outputs: Augmented
AR Hardware Requirements: Field of View - Augmented
- Augmented - The device MUST provide a 3D view (the images for both eyes are fully overlapping, just offset to provide the perception of 3D).
- Augmented - The device MUST provide a minimum 35 degree diagonal field of view where AR content can be displayed.
- Augmented - The device SHOULD provide a minimum 50 degree diagonal field of view where AR content can be displayed.
- Augmented - The device SHOULD have a variable range of operation from 40 cm to infinity, with hardware support for developers to utilize from 20 cm.
AR Hardware Requirements: Environmental - Augmented
AR Hardware Requirements: Inputs / Outputs: Controller
AR Hardware Requirements: Inputs / Outputs: Mouse / Touchpad
AR Hardware Requirements: Inputs / Outputs: Perception
AR Hardware Requirements: Inputs / Outputs: Processing
AR Hardware Requirements: Inputs / Outputs: Sensors
AR Hardware Requirements: Safety
AR Hardware Requirements: Visual Tracking - Augmented
AR Hardware Requirements: Wear Ability / Comfort - Augmented
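The diagonal field-of-view thresholds in the hardware requirements above (35 degrees MUST, 50 degrees SHOULD) can be sanity-checked against a display's horizontal and vertical FOV using the standard rectilinear-plane relation; the 30° × 20° example display below is an assumption for illustration, not a requirement:

```python
import math

def diagonal_fov_deg(h_deg: float, v_deg: float) -> float:
    """Approximate diagonal FOV from horizontal and vertical FOV,
    assuming a flat (rectilinear) display plane."""
    h = math.radians(h_deg) / 2.0
    v = math.radians(v_deg) / 2.0
    return math.degrees(2.0 * math.atan(math.hypot(math.tan(h), math.tan(v))))

# An assumed 30 x 20 degree display just clears the 35-degree MUST threshold.
print(round(diagonal_fov_deg(30.0, 20.0), 1))  # ≈ 35.6
```

Note the diagonal is computed from the tangents of the half-angles, not by naively applying Pythagoras to the angles themselves; the difference grows at wide fields of view.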
AR Software Requirements: AR Content Support
- The content generation and consumption tool MUST support open 3D model formats.
- The content generation and consumption tool SHOULD support proprietary 3D model formats.
- The content generation and consumption tool SHOULD support animations.
- The content generation and consumption tool MUST support open 2D formats.
- The content generation and consumption tool SHOULD support proprietary 2D formats.
- The content generation and consumption tool SHOULD support open video formats.
- The content generation and consumption tool SHOULD support open audio formats.
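As a sketch of how a content tool might gate assets by the format categories above, the classifier below maps file extensions to media types. The specific formats listed (glTF, OBJ, PNG, WebM, Ogg, etc.) are assumptions chosen as common open formats, not formats mandated by this document:

```python
# Illustrative mapping of file extensions to the requirement categories above.
# The specific formats chosen are assumptions, not mandated by this document.
OPEN_FORMATS = {
    "3d": {".gltf", ".glb", ".obj"},
    "2d": {".png", ".jpg", ".svg"},
    "video": {".webm"},
    "audio": {".ogg", ".wav"},
}

def classify_asset(filename: str) -> str:
    """Return the media category for a recognized open format, else 'unsupported'."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    for category, extensions in OPEN_FORMATS.items():
        if ext in extensions:
            return category
    return "unsupported"

print(classify_asset("turbine.glb"))    # 3d
print(classify_asset("schematic.PNG"))  # 2d
print(classify_asset("readings.csv"))   # unsupported
```

A tool meeting the SHOULD requirements would extend the tables with proprietary 3D and 2D formats rather than rejecting them.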