Spatial composition is a key aspect of contemporary acousmatic and computer music. Its history encompasses many different compositional approaches and performance tools, instruments, and interfaces. Furthermore, current developments and the increasing availability of virtual and augmented reality (XR) systems extend the possibilities in terms of sound rendering engines as well as environments and tools for creation and experience. In contrast to systems that control parameters of simulated sound fields and virtual sound sources, we present an approach to XR-based, real-time, body-controlled (motion and biofeedback sensors) sound field manipulation in the spatial domain. The approach can be applied not only to simulated sound fields but also to recorded ones, and the result can be reproduced with various spatial rendering procedures.
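One common building block of spatial-domain sound field manipulation, applicable to both simulated and recorded fields, is rotating an Ambisonics representation by an angle supplied in real time (for example from a head-tracking or motion sensor). The following sketch shows a first-order B-format yaw rotation; the function name and signal layout are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rotate_bformat_yaw(w, x, y, z, theta):
    """Rotate a first-order Ambisonics (B-format) sound field about the
    vertical axis by yaw angle theta (radians).

    The omnidirectional component W and the vertical component Z are
    unaffected by a pure yaw rotation; X and Y are mixed by a standard
    2-D rotation matrix.
    """
    c, s = np.cos(theta), np.sin(theta)
    x_rot = c * x - s * y
    y_rot = s * x + c * y
    return w, x_rot, y_rot, z

# Example: a plane wave arriving from the front (azimuth 0) encodes as
# X=1, Y=0. Rotating the whole field by 90 degrees moves the apparent
# source to the left: X ~ 0, Y ~ 1.
w, x, y, z = 1.0, 1.0, 0.0, 0.0
w2, x2, y2, z2 = rotate_bformat_yaw(w, x, y, z, np.pi / 2)
```

Because the rotation acts on the encoded sound field rather than on individual virtual sources, the same operation works regardless of whether the B-format signals were synthesized or captured with a sound field microphone, and the rotated field can then be decoded by any spatial rendering procedure.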