Shaping Spatial Sound: A Psychoacoustic Approach to XR Learning

Short Title: EchoXR
Faculty Mentor:
Dr. Merate Barakat (mbarakat@iastate.edu)
Graduate mentor:
TBD (xxx@iastate.edu)
REU Interns: TBD

Project Summary
Integrating psychoacoustics into learning and gaming environments offers several significant benefits. The primary advantage is cognitive enhancement: thoughtfully designed auditory cues can boost attention, memory, and cognitive processing, ultimately improving learning outcomes. Psychoacoustics also fosters emotional and behavioral engagement, deepening emotional connection and immersion, which can improve retention of educational content.

This approach also promotes accessibility and inclusion: designing around auditory processing makes extended reality (XR) environments usable by a wider range of learners, including people with visual impairments and neurodiverse users. Realistic soundscapes further strengthen a user's sense of presence, making virtual scenarios more effective as pedagogical tools.

Research also supports the positive impact of auditory cues on mitigating cybersickness. Spatialized audio aligns visual and auditory stimuli, reinforcing a user's sense of presence and minimizing the sensory conflicts that can lead to discomfort. Auditory anchors, that is, stable and predictable sounds placed in the environment, help ground users and reduce the sensory mismatches that contribute to cybersickness. Furthermore, pleasant background music or calming sounds can serve as effective cognitive distractions, alleviating anxiety and discomfort associated with VR experiences.
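
As one concrete illustration, the Unity C# sketch below shows how such an auditory anchor might be implemented: a looping, fully spatialized ambient source fixed at one point in the scene. It relies only on stock Unity AudioSource properties; the class and clip names are placeholders, and a spatializer plugin such as Resonance Audio would be enabled separately in the project's audio settings.

```csharp
using UnityEngine;

// Illustrative "auditory anchor": a stable, looping, fully spatialized
// ambient sound fixed at one point in the scene so the listener always
// has a predictable spatial reference. Uses only stock AudioSource
// properties; the class and field names are placeholders.
[RequireComponent(typeof(AudioSource))]
public class AuditoryAnchor : MonoBehaviour
{
    [SerializeField] private AudioClip ambientLoop; // calm ambient clip, assigned in the Inspector

    void Start()
    {
        AudioSource src = GetComponent<AudioSource>();
        src.clip = ambientLoop;
        src.loop = true;                                // continuous and predictable
        src.spatialBlend = 1f;                          // fully 3D: position drives panning and attenuation
        src.spatialize = true;                          // route through the active spatializer plugin, if any
        src.dopplerLevel = 0f;                          // no pitch modulation, keeping the anchor stable
        src.rolloffMode = AudioRolloffMode.Logarithmic; // natural distance attenuation
        src.minDistance = 1f;
        src.maxDistance = 25f;
        src.Play();
    }
}
```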

Subsidiary Objectives: This subsidiary project aims to support an interdisciplinary REU experience at the intersection of soundscape design, psychoacoustics, and XR methods by: 

  1. Introducing undergraduates to auralization and spatial audio design in Unity-based XR environments.
  2. Developing immersive educational experiences that use aural cues to support learning in virtual environments, focusing on informal learning contexts (see the cue sketch after this list).
  3. Evaluating the impact of spatialized sound on presence, cognition, and comfort (including cybersickness) in XR learning scenarios.
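
The kind of aural cue envisioned in objectives 1 and 2 could, for example, be prototyped along the following lines: a hypothetical Unity C# component that plays a spatialized narration clip from an exhibit's position when the learner first approaches it. The class, field, and tag names are illustrative rather than part of any existing codebase.

```csharp
using UnityEngine;

// Hypothetical spatialized learning cue: when the learner (tagged "Player")
// enters a trigger volume around an exhibit, a narration clip plays from the
// exhibit's own position, so the sound source and the visual object coincide.
// Requires a trigger Collider on this object and a Rigidbody on the player rig.
[RequireComponent(typeof(AudioSource))]
public class SpatialLearningCue : MonoBehaviour
{
    [SerializeField] private AudioClip narrationClip; // assigned in the Inspector

    private AudioSource src;
    private bool hasPlayed;

    void Awake()
    {
        src = GetComponent<AudioSource>();
        src.playOnAwake = false;
        src.spatialBlend = 1f; // fully 3D so the cue is heard from the exhibit's position
        src.spatialize = true; // use the configured spatializer plugin, if any
    }

    void OnTriggerEnter(Collider other)
    {
        // Play the narration once, the first time the learner approaches.
        if (hasPlayed || !other.CompareTag("Player")) return;
        src.PlayOneShot(narrationClip);
        hasPlayed = true;
    }
}
```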


Methodology:
The project will incorporate: 

  • Unity and spatial audio plugins (e.g., Resonance Audio and Audio Spatializer SDKs). 
  • Binaural and ambisonic audio recording techniques (see the B-format encoding sketch after this list). 
  • Psychoacoustic testing with undergraduate-developed interactive scenes. 
  • Physiological feedback tools and perceptual testing protocols (a simple response-logging sketch also follows this list). 
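
To make the ambisonic portion concrete, the short sketch below encodes a mono sample into first-order B-format (AmbiX convention), the representation students will meet when handling ambisonic material before it is decoded binaurally. It is a teaching aid, not a replacement for a spatializer plugin, and the class and method names are illustrative.

```csharp
using System;

// Teaching sketch: encode a single mono sample into first-order ambisonic
// B-format using the AmbiX convention (ACN channel order W, Y, Z, X with
// SN3D normalization). Azimuth and elevation are in radians. This is for
// building intuition about ambisonic channels, not a production encoder.
public static class FoaEncoder
{
    public static float[] Encode(float sample, float azimuth, float elevation)
    {
        float cosEl = (float)Math.Cos(elevation);
        return new float[]
        {
            sample,                                    // W: omnidirectional component
            sample * (float)Math.Sin(azimuth) * cosEl, // Y: left-right component
            sample * (float)Math.Sin(elevation),       // Z: up-down component
            sample * (float)Math.Cos(azimuth) * cosEl  // X: front-back component
        };
    }
}
```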

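For the perceptual testing protocols, a minimal response logger along the lines sketched below could be dropped into student-built scenes to capture timestamped comfort or localization ratings; the file name, column layout, and rating scale are assumptions for illustration, not a fixed protocol.

```csharp
using System.IO;
using UnityEngine;

// Illustrative per-trial response logger: appends a timestamped row
// (e.g., a 1-5 comfort or localization-confidence rating entered by the
// participant) to a CSV file for later analysis. File name, columns, and
// rating scale are assumptions for illustration, not a fixed protocol.
public class TrialLogger : MonoBehaviour
{
    private string logPath;

    void Awake()
    {
        logPath = Path.Combine(Application.persistentDataPath, "trial_log.csv");
        if (!File.Exists(logPath))
            File.AppendAllText(logPath, "time_s,trial_id,comfort_rating\n");
    }

    // Wire this to the scene's rating UI and call it once per trial.
    public void LogResponse(int trialId, int comfortRating)
    {
        File.AppendAllText(logPath, $"{Time.time:F2},{trialId},{comfortRating}\n");
    }
}
```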

Expected Outcomes:
The project is expected to produce:

  • A suite of Unity-based auralization tutorials and modules. 
  • Student-designed XR experiences demonstrating spatial sound principles. 
  • A short paper or poster summarizing findings on user comfort and learning outcomes. 


Impact on Student Research Experiences:
Students will gain hands-on experience with spatial sound design, Unity development, and user-centered design and testing, building interdisciplinary skills in human-computer interaction (HCI), cognitive science, and immersive media.