Trade-off and Substitution: Stereoscopic Vision for Spatial Audio
Stereoscopic vision has long been considered the gold standard of visualization for matching simulated content to physical reality. While stereoscopic vision remains the goal, producing perfect 3D visualization and registration of AR assets is difficult with current technologies, and the cost of addressing the shortcomings of current AR displays may be prohibitively high, presenting a financial barrier to AR adoption in enterprises.
Leveraging advancements in Digital Signal Processing (DSP) and audiology, a new class of devices is emerging. Spatially aware audio transducers can help determine the exact position and pose of the wearer as well as generate a simulated sound field that matches the physical environment. Such systems could be combined with existing vision-centric displays to deliver high-fidelity enterprise AR experiences.
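To make the kind of DSP involved concrete, the sketch below renders a mono source at a known position relative to a tracked listener into a stereo signal using a simplified spherical-head model (interaural time and level differences plus distance attenuation). It is a minimal illustration only: production spatial audio engines use measured HRTF sets and room acoustic modelling, and every name, constant and parameter here is an assumption rather than the API of any particular device or SDK. In a real system, listener_pos and listener_yaw would come from the headset's tracking data.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # approximate human head radius in metres (assumed)
SAMPLE_RATE = 48_000     # Hz

def spatialize(mono, source_pos, listener_pos, listener_yaw):
    """Render a mono signal to stereo with a simplified ITD/ILD model.

    source_pos and listener_pos are (x, y) in metres; listener_yaw is the
    heading in radians (0 = facing +x, counter-clockwise positive).
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    distance = max(np.hypot(dx, dy), 0.1)        # clamp to avoid division by zero
    azimuth = np.arctan2(dy, dx) - listener_yaw  # 0 = straight ahead, positive = to the left

    # Interaural time difference: spherical-head approximation, converted to samples.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * np.sin(azimuth)
    delay = int(round(abs(itd) * SAMPLE_RATE))
    delayed = np.concatenate([np.zeros(delay), mono])[: len(mono)]

    # Interaural level difference approximated by constant-power panning.
    pan = np.sin(azimuth)                        # +1 fully left, -1 fully right
    gain_l = np.sqrt((1.0 + pan) / 2.0)
    gain_r = np.sqrt((1.0 - pan) / 2.0)

    # The ear farther from the source receives the delayed copy; both share 1/r attenuation.
    left, right = (mono, delayed) if itd >= 0 else (delayed, mono)
    return np.stack([left * gain_l, right * gain_r], axis=1) / distance

# Example: a 440 Hz tone placed one metre to the listener's left.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = 0.5 * np.sin(2 * np.pi * 440 * t)
stereo = spatialize(tone, source_pos=(0.0, 1.0), listener_pos=(0.0, 0.0), listener_yaw=0.0)
```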
The scope of this topic includes measuring the resource requirements of spatial audio technology and the impact of combining visual cues with spatial audio on user performance. Comparative studies of human cognitive performance aided by varying blends of spatial technology, ranging from “audio-only” to “video-only” and combinations of the two, are also in scope.
Stakeholders
AR experience designers, developers of integrated sensor and world capture components, human factors researchers
Possible Methodologies
This research topic will require visual and audio AR experiences to be developed in a highly controlled laboratory environment within which a series of experiments can be conducted and reproduced. Studies will compare AR experiences that include spatial audio with vision-only AR experiences on the basis of accuracy, speed, battery life, bandwidth requirements, processor performance, wearer comfort and pricing. In addition to user perception assessments through surveys and interviews, methods could be expanded to include time-motion studies using standardized, public and well-documented processes typical of industry verticals, use cases and horizontal use case categories.
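As one hedged illustration of how such comparative studies might capture data, the sketch below defines a per-trial record covering a few of the metrics named above and appends it to a CSV file for later statistical analysis. The condition labels, field names and example values are illustrative assumptions, not a prescribed protocol or instrument.

```python
import csv
from dataclasses import dataclass, asdict

# Assumed study conditions; a real protocol would define these precisely.
CONDITIONS = ("audio-only", "video-only", "audio+video")

@dataclass
class Trial:
    participant: str
    condition: str           # one of CONDITIONS
    task: str                # e.g. a standardized, well-documented industrial task
    completion_s: float      # task completion time (speed)
    errors: int              # accuracy proxy
    battery_drain_pct: float # device battery consumed during the trial
    comfort_rating: int      # 1-7 Likert rating from the post-trial survey

def log_trials(trials, path="trials.csv"):
    """Append trial records to a CSV file for later cross-condition comparison."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(Trial.__dataclass_fields__))
        if f.tell() == 0:          # write the header only for a new, empty file
            writer.writeheader()
        for trial in trials:
            writer.writerow(asdict(trial))

# Example: one participant performing the same task under each condition
# (placeholder measurements, for illustration only).
log_trials([
    Trial("P01", c, "valve-inspection", completion_s=42.0, errors=1,
          battery_drain_pct=3.5, comfort_rating=5)
    for c in CONDITIONS
])
```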
Research Program
This topic is at the intersection of 3D visualization and 3D audio. The methodologies and tools developed for this research could be used in the study of perception and presence, and could lead to new guidelines for AR developers and manufacturers of HMDs for enterprise AR.
Miscellaneous Notes
In 2016, the Sound of Vision consortium, which focuses on the construction of a new prototype electronic travel aid for the blind, published a report about audio-assisted vision. A peer-reviewed article presenting a novel technique for reproducing coherent audio-visual images for multiple users wearing only 3D glasses, without the use of head tracking, was published in 2011 in the Journal of the Audio Engineering Society.
Keywords
Spatial audio, effectiveness, spatial vision, 3D audio, perception, audio signal processing, acoustic waves, active noise control
Research Agenda Categories
Technology, End User and User Experience, Displays
Expected Impact Timeframe
Medium
Related Publications
Using the words in this topic description and Natural Language Processing analysis of publications in the AREA FindAR database, the references below have the highest number of matches with this topic:
- Kim, H., Remaggi, L., Jackson, P. J. B., & Hilton, A. (2019). Immersive Spatial Audio Reproduction for VR/AR Using Room Acoustic Modelling from 360° Images. 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR).
- Heller, F., & Schöning, J. (2018). NavigaTone. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems.
- Schmalstieg, D. (2018). VIS Keynote Address: When Visualization Met Augmented Reality. 2018 IEEE Conference on Visual Analytics Science and Technology (VAST).
- Erkut, C., Holfelt, J., & Serafin, S. (2018). Mobile AR In and Out: Towards Delay-Based Modeling of Acoustic Scenes. 2018 IEEE Conference on Virtual Reality and 3D User Interfaces (VR).
- Abdelrazeq, A., Kohlschein, C., & Hees, F. (2019). A Cloud Based Augmented Reality Framework – Enabling User-Centered Interactive Systems Development. Advances in Intelligent Systems and Computing, 417-422.
More publications can be explored using the AREA FindAR research tool.
Author
Peter Orban, Christine Perey
Last Published (yyyy-mm-dd)
2021-08-31