Improving group musical experiences in virtual reality, using a biofeedback control system

01 October 2020 → 30 September 2024
Regional and community funding: Special Research Fund
Research disciplines
  • Humanities
    • Music performance
  • Natural sciences
    • Virtual reality and related simulation
  • Social sciences
    • Cognitive processes
  • Engineering and technology
    • Audio and speech processing
    • Pattern recognition and neural networks
Keywords: virtual reality (VR), music experience and interaction, acoustic scene rendering, biofeedback-based control
Project description

The IOP will improve the simulation of musical concert listening and ensemble performance experiences in virtual reality (VR). Innovation relies on integrating improved computational methods for acoustic scene rendering with improved empirical methods for measuring subjective ‘presence’. The project works towards a biofeedback system that allows controlled regulation of users’ behavioural, cognitive and affective responses within (musical) VR environments.
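The closed loop implied by the description (sense a physiological signal, then regulate a parameter of the VR environment) can be sketched minimally. Everything below is an illustrative assumption, not a project specification: the choice of heart rate as the signal, the smoothing constant, the bpm mapping range, and the idea of driving a single normalised rendering parameter are all hypothetical.

```python
# Hypothetical biofeedback control-loop sketch: a noisy physiological
# sample (here, heart rate in bpm) is smoothed and mapped to a 0..1
# control value that could modulate a VR rendering parameter.
# None of these names or ranges come from the project description.

def smooth(prev: float, sample: float, alpha: float = 0.1) -> float:
    """Exponential moving average to reduce sensor noise."""
    return (1 - alpha) * prev + alpha * sample

def map_to_parameter(hr: float, lo: float = 50.0, hi: float = 120.0) -> float:
    """Map heart rate (bpm) linearly onto a clamped 0..1 parameter."""
    return min(1.0, max(0.0, (hr - lo) / (hi - lo)))

def biofeedback_step(prev_hr: float, sample: float) -> tuple[float, float]:
    """One loop iteration: smooth the new sample, derive a control value."""
    hr = smooth(prev_hr, sample)
    return hr, map_to_parameter(hr)
```

In a real system the mapping would be calibrated per participant and the control value fed to the acoustic scene renderer; this sketch only shows the loop's shape.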