This project will improve the simulation of musical concert listening and ensemble performance experiences in VR. The innovation relies on integrating improved computational methods for acoustic scene rendering with improved empirical methods for measuring subjective ‘presence’. The project will work towards a biofeedback system that allows controlled regulation of users’ behavioural, cognitive and affective responses within (musical) VR environments.

The objective of this project is to develop an interdisciplinary scientific framework for simulating, in VR, aesthetically rich experiences of musical concert listening and performance in multi-user environments. The research focuses on four sub-goals:

  • Improve the state of the art in computational rendering of acoustic scenes in VR. The innovation lies in combining optimised sound source positioning and room acoustics within dynamic multi-user environments.
  • Develop a new empirical procedure to objectively measure, model and recognise the subjective feeling of presence, which is foundational to the user experience in VR, taking into account behavioural, cognitive and affective dimensions of the experience.
  • Develop a biofeedback control system that balances the accuracy of the auditory rendering of musical VR environments, with the aim of optimising psychological presence in multi-user experiences (see the sketch after this list).
  • Design empirical experiments to test the efficacy of the biofeedback control system in modulating users’ behavioural, cognitive and affective responses towards optimal states.
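As a purely illustrative sketch of the third sub-goal, the loop below shows one way such a biofeedback controller could be shaped: a presence estimate derived from behavioural and physiological features drives the rendering budget of the auraliser (here, its maximum reflection order). Every name and parameter in this sketch (PresenceEstimate, estimate_presence, adapt_render_order, the feature weights) is hypothetical and only serves to make the architecture concrete; the project's actual controller may look quite different.

```python
# Hypothetical sketch of a biofeedback control loop: a presence estimate,
# derived from behavioural/physiological features, adapts the acoustic
# rendering budget (here: the maximum reflection order of the auraliser).
from dataclasses import dataclass

@dataclass
class PresenceEstimate:
    score: float  # 0.0 (no presence) .. 1.0 (full presence), assumed scale

def estimate_presence(heart_rate_var: float, head_motion_energy: float) -> PresenceEstimate:
    """Toy presence model: any real estimator would be learned from data."""
    score = max(0.0, min(1.0, 0.6 * heart_rate_var + 0.4 * (1.0 - head_motion_energy)))
    return PresenceEstimate(score)

def adapt_render_order(current_order: int, presence: PresenceEstimate,
                       target: float = 0.8, min_order: int = 1, max_order: int = 6) -> int:
    """Simple proportional rule: spend more rendering budget when presence
    falls below the target, relax it once the target is met."""
    if presence.score < target:
        return min(max_order, current_order + 1)
    return max(min_order, current_order - 1)

# Example: one step of the loop with made-up sensor features.
order = 2
presence = estimate_presence(heart_rate_var=0.5, head_motion_energy=0.3)
order = adapt_render_order(order, presence)
print(f"presence={presence.score:.2f}, new reflection order={order}")
```

The point of the sketch is the closed loop itself: rendering accuracy is no longer a fixed quality setting but a control variable that responds to the measured state of the user.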

This research will be performed by an interdisciplinary team of researchers affiliated with the Institute for Psychoacoustics and Electronic Music (IPEM) and IDLab-MEDIA, both members of the Art and Science Interaction Lab consortium.

This project is funded by the Special Research Fund of Ghent University in the context of the bi-annual project call for Interdisciplinary Research Projects.

MusiXR is an interdisciplinary research project that aims to innovate musical concert performance and listening experiences in extended reality (XR). Specifically, the project relies on mutually reinforcing developments in engineering and the humanities, aiming at:

1. Improving the computational rendering of acoustic scenes in multi-user XR environments (auralisation). The work focuses on developing accurate and computationally efficient modelling of sound wave reflections (a purely illustrative sketch follows at the end of this section).

2. Improving empirical methods for assessing the subjective experiences of musicians and listeners in musical XR. We propose an approach that takes into account behavioural, cognitive and affective dimensions of the human experience. This work relies on advances in neurocognitive science and on a dynamical-systems approach to the measurement and analysis of human behaviour and responses (see the second sketch below).
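To make point 1 more concrete, the sketch below implements the classic first-order image-source construction for early reflections in a rectangular ("shoebox") room. It is only an illustration of the kind of geometric computation involved in modelling sound wave reflections; the room dimensions, source and listener positions, wall reflection coefficient and the method itself are assumptions for the example, not the project's actual auralisation pipeline.

```python
# Illustrative sketch only: first-order image-source construction for early
# reflections in a rectangular ("shoebox") room. The project's own methods
# may differ; room size, positions and absorption are assumed values.
import math

SPEED_OF_SOUND = 343.0  # m/s

def first_order_image_sources(source, room_dims):
    """Mirror the source across each of the six walls of an axis-aligned room
    with one corner at the origin; returns the six image-source positions."""
    images = []
    for axis in range(3):
        for wall in (0.0, room_dims[axis]):
            img = list(source)
            img[axis] = 2.0 * wall - source[axis]  # reflect across the wall plane
            images.append(tuple(img))
    return images

def reflection_paths(source, listener, room_dims, wall_reflection=0.8):
    """Delay (s) and amplitude of the direct path plus six first-order
    reflections, with 1/r spreading and one frequency-independent wall gain."""
    paths = []
    for i, pos in enumerate([source] + first_order_image_sources(source, room_dims)):
        dist = math.dist(pos, listener)
        gain = (1.0 if i == 0 else wall_reflection) / max(dist, 1e-6)
        paths.append((dist / SPEED_OF_SOUND, gain))
    return paths

# Example: 8 m x 6 m x 3 m room, assumed source and listener positions.
for delay, gain in reflection_paths((2.0, 3.0, 1.5), (6.0, 2.0, 1.5), (8.0, 6.0, 3.0)):
    print(f"delay = {delay*1000:5.2f} ms, gain = {gain:.3f}")
```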
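Likewise, for point 2, the sketch below computes one simple dynamical-systems style measure, the phase-locking value between a (simulated) movement signal and a reference beat, as an example of how behavioural coupling can be quantified over time. The signals, sampling rate, noise level and the choice of measure are assumptions made for illustration only, not the empirical procedure the project will develop.

```python
# Illustrative sketch only: phase-locking value (PLV) between a simulated
# movement signal and a reference beat, as one dynamical-systems style
# measure of behavioural coupling. All parameters are assumed.
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Mean resultant length of the instantaneous phase difference (0..1):
    1 means the two signals stay perfectly phase-locked."""
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

# Example: simulated head-movement signal loosely entrained to a 2 Hz beat.
fs, duration = 100.0, 10.0                      # assumed sampling rate and length
t = np.arange(0.0, duration, 1.0 / fs)
beat = np.sin(2 * np.pi * 2.0 * t)
rng = np.random.default_rng(0)
movement = np.sin(2 * np.pi * 2.0 * t + 0.4) + 0.3 * rng.standard_normal(t.size)

print(f"PLV = {phase_locking_value(movement, beat):.2f}")
```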