
Chemical and Radiation Sensors for AR Devices

Existing AR display devices do not have sensors to detect gases, radiation, or other hazards in the user's environment. For many use cases of interest to the oil and gas, mining, emergency responder, and chemical industries, research is needed into the types of existing chemical sensing technologies and their integration with the AR display or wearable computing system, or into the development of standard interfaces with IoT sensors so that readings can be displayed in real time.

This research topic is a concrete example of a more fundamental research topic: the detection of an AR system user's context beyond what can be done with existing sensors (cameras, microphones, IMUs, etc.). The scope of this research can be narrow or broad, depending on the support provided by commercial or public agencies.
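As a minimal sketch of what a standard sensor-to-display interface might look like, the Python fragment below converts hypothetical gas readings (JSON payloads from an IoT sensor feed) into overlay payloads that an AR renderer could display. The gas names, thresholds, and payload fields are illustrative assumptions, not part of any existing standard.

    import json

    # Hypothetical exposure thresholds (ppm), for illustration only; a real system
    # would take these values from regulatory limits and site safety policy.
    ALERT_THRESHOLDS_PPM = {"co": 35.0, "h2s": 10.0, "ch4": 1000.0}

    def to_ar_overlay(raw_reading: str) -> dict:
        """Convert one JSON sensor reading into a payload an AR renderer could show."""
        reading = json.loads(raw_reading)              # e.g. {"gas": "co", "ppm": 42.1}
        gas, ppm = reading["gas"], float(reading["ppm"])
        limit = ALERT_THRESHOLDS_PPM.get(gas)
        return {
            "label": f"{gas.upper()}: {ppm:.1f} ppm",
            "alert": limit is not None and ppm >= limit,  # drives color/prominence in AR
            "limit_ppm": limit,
        }

    # Simulated real-time feed; in practice readings would arrive over MQTT, BLE,
    # or another IoT transport from sensors worn by or placed near the user.
    for raw in ['{"gas": "co", "ppm": 42.1}', '{"gas": "h2s", "ppm": 2.3}']:
        print(to_ar_overlay(raw))

A standardized interface of this kind would let any compliant sensor drive the same AR alerting behavior, which is the interoperability question this topic raises.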

Stakeholders

Developers, operators, and employees working in places where invisible gases or radiation may pose a risk; regulatory agencies; compliance officers.

Possible Methodologies

A laboratory would need to be developed for controlled exposure to chemicals and radiation sources. The lab and testing platform would use off-the-shelf sensors, and/or the project may require the development of lightweight, power-efficient sensors that serve as effective alternatives to cameras and existing vision-based environmental capture. Testing and development of 3D interaction modes for when a user's environmental sensors detect unsafe conditions is a fundamental part of this research domain.

Research Program

This topic or theme of research overlaps with the topic of visualizing conditions in the direction of, or on the path of, a moving object in an opaque atmosphere or under water. The outcomes of this research could also be applied in non-industrial use cases (e.g., pollution sensing). The same research topic could be combined with the study of user interfaces and interaction paradigms for the visually impaired community.

Miscellaneous Notes

In 2012, the U.S. Department of Energy Office of Scientific and Technical Information (OSTI) funded research on this topic conducted at the Department of Nuclear Engineering & Radiological Sciences, University of Michigan. Preliminary results were reported in a poster about visualization of radiation in AR. A 2017 presentation about research on related topics conducted at CERN using HoloLens in particle accelerator environments is also relevant.

Keywords

Sensors, human factors, environment, oil and gas, chemical, power and energy, radiation, hazardous materials, radioactivity, explosives, chemical hazards, volatile organic compounds, aromatic compounds, gases, hydrocarbons, indicators (chemical), chemical detection, gas sensors

Research Agenda Categories

Industries, Technology

Expected Impact Timeframe

Medium

Related Publications

Using the words in this topic description and Natural Language Processing analysis of publications in the AREA FindAR database, the references below have the highest number of matches with this topic:

More publications can be explored using the AREA FindAR research tool.

Author

Christine Perey

Last Published (yyyy-mm-dd)

2021-08-31

Go to Enterprise AR Research Topic Interactive Dashboard




What Factors Influence Perceptions of Presence?

Presence is a term used to describe the “feeling of being there” in a virtual environment. It can be cognitive, in that the user’s mind is engaged in the virtual content, or it can be perceptual, in that the user’s sensory systems perceive the virtual environment to be “real.” In Augmented Reality (AR), this is represented by a seamless integration between the physical and virtual worlds.

A question that arises in this area of research is what factors of the virtual environment influence the perceptions of presence? Is there a minimum amount of virtual information needed in an AR environment? Does the level of fidelity of the virtual elements influence the reported experience of presence? For example, will an application that uses realistic 3D models result in higher levels of presence than lower fidelity (i.e., cartoon) 3D models?

In VR, the user is immersed in a completely virtual world, so presence is likely to be experienced if the application is successfully implemented. In AR environments, users are exposed to virtual elements in addition to their physical surroundings. If a user has to switch their attention between the two worlds to complete a task, the result can be disruptive and/or fatiguing due to the lack of presence. If the user can complete the task as if the virtual elements were part of the physical world, the result is a seamless, productive experience that includes presence. It is also possible that lower fidelity serves as a distraction to the user, which may result in breaks in immersion and a reminder of the artificial nature of the virtual world. This could, in turn, impact their sense of presence.

A practical example of this issue is the use of avatars in collaborative environments. How is presence affected by the appearance and customization of the avatars? If a user pays attention to a particular feature or abnormality due to low fidelity rendering, it may impact their ability to perform the task at hand. Another example may be dynamic elements in the virtual world, such as a bouncing ball or a spinning tire. How distinguishable the object is from its physical counterpart, in appearance and in behavior, may influence the level of reported presence.

The main research question in this area is “How much virtual information of what fidelity is necessary in an AR environment to produce a sense of presence?” This question is of interest at both ends of the quantity spectrum — how much is enough, and how much is too much? Too little and the user will not experience presence; too much and the user may become so immersed in the virtual elements that awareness of the physical world may be compromised.

This research topic involves the examination of AR environments with differing amounts of virtual elements at different levels of fidelity and the measurement of presence among users.

Stakeholders

Developers, operators, users of collaborative virtual environments

Possible Methodologies

Presence tends to be a self-reported measure assessed by means of a questionnaire. Physiological measures also may be explored to correlate with reported presence, engagement, and satisfaction. These measures could be systematically compared across environments of varying complexity in terms of number of virtual stimuli for a variety of tasks.
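As one minimal analysis sketch, assuming per-participant presence questionnaire scores and a physiological proxy (mean heart rate) collected in a single condition, the fragment below computes a summary score and a simple correlation. The values and variable names are placeholders for illustration, not data.

    from statistics import mean, correlation  # statistics.correlation requires Python 3.10+

    # Placeholder values for illustration only, one entry per participant.
    presence_scores = [4.2, 3.8, 5.5, 6.1, 2.9, 4.7]       # e.g., 1-7 questionnaire scale
    heart_rate_bpm = [72.0, 70.5, 78.2, 80.1, 66.3, 74.8]  # physiological proxy

    print(f"Mean reported presence: {mean(presence_scores):.2f}")
    # A systematic study would compare such correlations across environments that
    # differ in the number and fidelity of virtual elements.
    print(f"Presence / heart-rate correlation: {correlation(presence_scores, heart_rate_bpm):.2f}")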

Research Program

This topic is related to other proposed AREA Research Agenda topics on display technology and user perceptions and satisfaction.

Miscellaneous Notes

An interesting article related to the amount of physical space in which an AR application is used can be found here.

Keywords

Presence, immersion, awareness, realism, cognitive tunneling, mental workload, seamless experience, high fidelity, low fidelity, avatars, object interaction, display technology, interactive computer graphics, user experience, cognitive systems, sensory perception, computer graphics, color computer graphics, holographic displays, animation, image quality

Research Agenda Categories

End User and User Experience, Industries, Technology

Expected Impact Timeframe

Medium

Related Publications

Using the words in this topic description and Natural Language Processing analysis of publications in the AREA FindAR database, the references below have the highest number of matches with this topic:

More publications can be explored using the AREA FindAR research tool.

Author

ERAU Team

Last Published (yyyy-mm-dd)

2021-08-31

Go to Enterprise AR Research Topic Interactive Dashboard




New Power Sources for Wearable AR Displays

As the complexity and computational requirements of world capture, world analysis, scene management, rendering, and human interaction with AR experiences increase, a delicate balance must be struck to ensure that the useful life of a device between recharges does not become too short. Power management may be addressed through a combination of approaches, including increased use of low-power DSPs, off-loading some computational tasks to the edge of the network (off-device services), and increasing power storage capacity. There is another potentially powerful resource for extending the duration of wearable AR display usage. This topic focuses on the research and development of novel methods to capture power from the user or the environment and transfer it to the power storage system.

Specifically, the proposed topic will research, build, and test implementations of new methods of energy harvesting from sources that have not been used in prior wearable AR display systems. There will need to be studies of human movement (steps, arms, hands), solar sources for users who are outdoors, and chemical reactions that release energy. This topic will span a wide range of technologies, including analysis of the power requirements of each individual AR display component.
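As a back-of-the-envelope illustration of the component-level power budgeting such analysis implies, the sketch below estimates runtime with and without a harvesting contribution. All figures are invented for illustration and do not describe any real device.

    # Hypothetical per-component power draws (milliwatts); real figures would come
    # from measurements of an actual wearable AR display.
    component_draw_mw = {
        "display_engine": 900.0,
        "world_capture_cameras": 650.0,
        "soc_and_dsp": 1200.0,
        "radios": 300.0,
    }

    battery_capacity_mwh = 6000.0   # hypothetical on-board storage
    harvested_power_mw = 150.0      # hypothetical average energy-harvesting input

    total_draw_mw = sum(component_draw_mw.values())
    net_draw_mw = total_draw_mw - harvested_power_mw

    print(f"Total draw: {total_draw_mw:.0f} mW; net of harvesting: {net_draw_mw:.0f} mW")
    print(f"Estimated runtime: {battery_capacity_mwh / net_draw_mw:.2f} h "
          f"(vs {battery_capacity_mwh / total_draw_mw:.2f} h without harvesting)")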

Stakeholders

All users of wearable AR display devices and those who manage their use in the workplace will benefit from longer duration between recharges. Introducing or integrating novel power production, transfer, and storage technologies will have impacts on display costs, which could affect the number of devices purchased.

Possible Methodologies

The research will leverage developments in physics, chemistry, and other sciences pertaining to energy production and combine them with deeper studies of wearable AR display device power use. In addition to theoretical calculations, prototypes will need to be developed and tested to measure the efficacy and efficiency of power capture, storage, and use with AR displays. Finally, the introduction of new energy production, transfer, and storage methods or systems will need to be carefully studied for their safety in the workplace.

Research Program

This research topic could be studied in different environmental conditions, such as indoors, outdoors, and in different temperatures. In addition, there will need to be studies of different use cases in which some users are actively moving throughout a work shift. The results of this research would also be very valuable for non-enterprise users and display devices.

Miscellaneous Notes

Although quite dated by today’s standards, one of the first studies focusing on this topic was performed by Dr. Jannick Rolland and Dr. Henry Fuchs to examine the pros and cons of these two display options in a surgical use case. The study was published in the journal “Presence” in 2000. The topic of mitigating parallax-related registration errors is a highly active field of study, as demonstrated by this article published in December 2020 in the Frontiers in Robotics and AI journal.

Keywords

Power use, power consumption, power production, AR device energy sources, energy capture, energy production, AR device energy transfer, energy consumption, energy management, electromechanical, solar energy, friction, power storage for AR displays, human factors, weight, usability, portability, computational efficiency, low-power electronics, electric batteries, power conversion, power-aware computing

Research Agenda Categories

Displays, Technology

Expected Impact Timeframe

Long

Related Publications

Using the words in this topic description and Natural Language Processing analysis of publications in the AREA FindAR database, the references below have the highest number of matches with this topic:

More publications can be explored using the AREA FindAR research tool.

Author

Christine Perey

Last Published (yyyy-mm-dd)

2021-08-31

Go to Enterprise AR Research Topic Interactive Dashboard




AR Visualization of Body Sensors for Worker Biofeedback

During the performance of tasks or fulfillment of roles, an employee may be unaware of changes in involuntary bodily functions such as blood pressure and heart rate. When made aware of unusual metrics, the user can choose or be prompted to take appropriate action. Body-worn sensors (e.g., watches) could capture posture, head flexion and extension, whether the arms are above the shoulders, and possibly even squatting, and present this information visually and/or audibly in real time. The system could provide feedback on static posture duration and/or frequency and suggest postural changes or breaks based on criteria associated with increased risk of musculoskeletal disorders.

The sensor data can be provided to the AR display through a body-area network. This topic focuses on the low-latency transmission and processing of observations from body-worn sensors to the user's AR display and on the presentation of data and recommendations in a compact, actionable manner. The system design research should explore both automated modes (triggered when the user's measurements reach a threshold) and manually controlled modes (e.g., enabled by a user seeking to view vital statistics). There must also be research to ensure that any AR visualization system is secure and upholds all relevant user data privacy protection policies.
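As a minimal sketch of the kind of threshold-triggered, automated mode described above, the fragment below scans a simulated stream of posture samples from body-worn sensors and emits prompts that an AR display could show. The flexion limit and duration are hypothetical assumptions, not ergonomic guidance.

    from dataclasses import dataclass

    # Hypothetical ergonomic criteria for illustration only; real thresholds would
    # come from guidelines on musculoskeletal-disorder risk.
    FLEXION_LIMIT_DEG = 20.0
    MAX_STATIC_FLEXION_S = 120.0

    @dataclass
    class PostureSample:
        t: float                 # seconds since the start of the shift
        head_flexion_deg: float  # from a body-worn IMU on the body-area network

    def static_posture_prompts(samples):
        """Yield AR-displayable prompts when a static flexed posture lasts too long."""
        flexed_since = None
        for s in samples:
            if s.head_flexion_deg >= FLEXION_LIMIT_DEG:
                flexed_since = s.t if flexed_since is None else flexed_since
                if s.t - flexed_since >= MAX_STATIC_FLEXION_S:
                    yield f"t={s.t:.0f}s: consider a postural change or a short break"
                    flexed_since = s.t  # reset so the prompt is not repeated every sample
            else:
                flexed_since = None

    # Simulated feed; a real system would receive samples over the body-area network.
    feed = [PostureSample(t, 25.0 if 30 <= t <= 200 else 5.0) for t in range(0, 240, 10)]
    for prompt in static_posture_prompts(feed):
        print(prompt)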

Stakeholders

Companies monitor and manage workplaces for their suitability to employees. This research will be valuable to all workplace health, safety, and wellness professionals.

Possible Methodologies

Research on this topic includes testing and studying the features of body-worn biometric devices in professional settings. It will be necessary to design and build a body-area network supporting existing standards in order to interface with radios and technologies on AR display devices. The research will also require designing and studying the usability, efficacy, and performance of AR-enriched user interfaces for providing a user with only pertinent biometric data via an AR display.

Research Program

This research topic can be combined with other topics pertaining to automated alerting of users to risk conditions using the AR display. It can also extend research on biometrics, biofeedback, and behavior modification therapies.

Miscellaneous Notes

A peer-reviewed study of a biofeedback system combined with a VR display for managing emotional states was published in the SIGGRAPH '17 proceedings. As documented in this 2019 review of the state of the art of body-area network usage in healthcare, the technology is maturing. Combining body-area networks and AR in professional domains is an unexplored field that has high potential for impact. Similarly, biometrics and the use of biofeedback in the workplace are very large and active fields of research. However, to date, their intersection with Augmented Reality has not been documented in the peer-reviewed literature.

Keywords

Biometrics, body sensors, blood pressure, heart rate, body temperature, body-area network, biofeedback, user data privacy protection, visualization, alerts

Research Agenda Categories

End User and User Experience, Technology

Expected Impact Timeframe

Medium

Related Publications

Using the words in this topic description and Natural Language Processing analysis of publications in the AREA FindAR database, the references below have the highest number of matches with this topic:

More publications can be explored using the AREA FindAR research tool.

Author

Christine Perey

Last Published (yyyy-mm-dd)

2021-08-31

Go to Enterprise AR Research Topic Interactive Dashboard




Impact of Spatial Vision on Visual Encoding and Memory Anchoring

Some complex instructions and illustrations used in training or on the job require greater time and cognitive load for workers to understand, retain, and use when displayed in planar (2D) mode. Three-dimensional representations and the spatial vision enabled by stereoscopy have been shown to increase comprehension of spatially relevant concepts and to improve their encoding and retention in memory. Although today's technology enables spatial vision, it frequently requires some compromise in performance, wearability, or resource requirements.

This research topic focuses on measuring the impact of binocular (vs monocular) vision on short- and long-term memory encoding (i.e., the process of changing sensory inputs into forms that are stored in the brain and anchored in such a way that enables effective retrieval).
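As a minimal sketch of how such an impact might be quantified behaviorally, the fragment below compares recall scores between binocular and monocular viewing conditions using a simple effect size. The numbers are placeholders for illustration, not experimental data.

    from statistics import mean, stdev

    # Placeholder recall scores (items correctly recalled), for illustration only.
    binocular_recall = [14, 16, 15, 18, 17, 13]
    monocular_recall = [11, 13, 12, 15, 12, 10]

    def cohens_d(a, b):
        """Effect size for the difference in means, using a pooled standard deviation."""
        na, nb = len(a), len(b)
        pooled_var = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
        return (mean(a) - mean(b)) / pooled_var ** 0.5

    print(f"Binocular mean recall: {mean(binocular_recall):.1f}")
    print(f"Monocular mean recall: {mean(monocular_recall):.1f}")
    print(f"Effect size (Cohen's d): {cohens_d(binocular_recall, monocular_recall):.2f}")

Such behavioral measures would complement the neuro-analytical and biometric tools discussed under Possible Methodologies below.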

Stakeholders

User experience designers, AR experience developers, human resources professionals, AR display designers, AR display manufacturers, researchers studying cognition and performance of users in the workplace

Possible Methodologies

The research topic would need experts to develop and use a combination of existing neuro-analytical tools (tools that measure neurological brain activity) and biometric tools that infer neurological responses by proxy. The former include EEG, fMRI (functional MRI), fNIRS (functional near-infrared spectroscopy), and steady state topography (SST), all of which directly measure brain activity related to specific brain functions. For instance, SST measures the speed of electrical activity on the surface of the brain, linking changes in certain areas to specific metrics such as engagement and memory encoding. The latter include eye tracking, facial coding, and biometric data such as heart rate. The research would analyze the proxy data for broader interpretation, offering insights into the use and impacts of 3D spatial viewing with AR, and compare it to measurements made by technologies such as SST and fMRI.

Research Program

This research topic can be integrated with fundamental research on brain function. It could also be combined with studies of specific use cases in which the system recalls the users’ spatial vision strategies and enhances those selectively. User experience design would also benefit from studies of this and related neuro-analytical tools and topics.

Miscellaneous Notes

There has been research published on the topics of spatial vision and more specifically on AR and memory encoding. This 2019 article published in Frontiers in Human Neuroscience reports on research conducted using AR to assess the impact of gender on spatial vision and anxiety.

Keywords

Spatial vision, spatial memory, visual encoding, memory anchoring, spatial frequency, receptive field, modulation transfer function, high spatial frequency, threshold

Research Agenda Categories

Displays, Technology, End User and User Experience

Expected Impact Timeframe

Near

Related Publications

Using the words in this topic description and Natural Language Processing analysis of publications in the AREA FindAR database, the references below have the highest number of matches with this topic:

More publications can be explored using the AREA FindAR research tool.

Author

Peter Orban, Christine Perey

Last Published (yyyy-mm-dd)

2021-08-31

Go to Enterprise AR Research Topic Interactive Dashboard