---
layout: publication
year: 2024
month: 11
selected: false
coming-soon: false
hidden: false
external: false
# link: https://dl.acm.org/doi/10.1145/3472749.3474750
pdf: https://doi.ieeecomputersociety.org/10.1109/ISMAR62088.2024.00053
title: "New Ears: An Exploratory Study of Audio Interaction Techniques for Performing Search in a Virtual Reality Environment"
authors:
- Muzhe Wu*
- Yi Fei Cheng*
- David Lindlbauer
# blog: https://interactive-structures.org/publications/2023-10-parametric-haptics/
doi: 10.1109/ISMAR62088.2024.00053
venue_location: Seattle, WA, USA
venue_url: https://ieeeismar.org/
venue_tags:
- IEEE ISMAR
type:
- Conference
tags:
- Virtual Reality
- Auditory Perception
- Interaction Techniques
venue: IEEE ISMAR
video-thumb: L1lm76vChMQ
video-30sec: L1lm76vChMQ
#video-suppl: PIUCEdw4UqA
#video-talk-5min: l9ycUrf50TE
#video-talk-15min: l9ycUrf50TE
bibtex: "@inproceedings {Wu2024NewEars, \n
author = {Wu, Muzhe and Cheng, Yi Fei and Lindlbauer, David}, \n
title = {New Ears: An Exploratory Study of Audio Interaction Techniques for Performing Search in a Virtual Reality Environment}, \n
year = {2024}, \n
publisher = {IEEE}, \n
doi = {10.1109/ISMAR62088.2024.00053}, \n
keywords = {Virtual reality, auditory perception, interaction techniques}, \n
location = {Seattle, WA, USA}, \n
series = {ISMAR '24} \n
}"
---
Efficiently searching and navigating virtual scenes is essential for performing various downstream tasks and for ensuring a positive user experience in VR. Prior VR interaction techniques for such scenarios predominantly rely on users' visual perception, which contrasts with physical reality, where people typically rely on multimodal information, especially auditory cues, to guide their spatial awareness. In this work, we explore the potential of leveraging auditory interaction techniques to enhance spatial navigation in virtual environments. Drawing inspiration from prior distant interaction techniques, we developed four approaches to augmenting how users hear in the virtual environment: Audio Teleportation, Audio Cone, Ninja Ears, and Boom Mic. In a comparative user study ($N=25$), we evaluated these approaches against a baseline teleportation technique in a search task in which participants traversed a virtual environment to locate target items. Our results suggest that several of our audio interaction techniques may enable more efficient search behaviors while enhancing the overall user experience. However, not all techniques were appreciated equally, suggesting that careful attention to their design is critical for ensuring their effectiveness. We conclude by discussing the implications of our results for the design of future audio interaction techniques.