Room simulation based on convolution is state of the art in modern audio processing environments. Most of the systems currently available provide only a few controls for modifying the underlying room impulse responses: even in spatial reproduction systems, the sound designer can manipulate only a single set of numeric parameters. This paper describes a new approach to the interactive control of room impulse responses based on visualization and parameterization. The principle was originally developed for use in Wave Field Synthesis systems and is based on Augmented Reality user interfaces; an adaptation to conventional user interfaces and other spatial sound reproduction systems is possible. The room impulse responses are modified by direct interaction with 3D graphical representations of multi-trace room impulse responses.
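The core of convolution-based room simulation is convolving a dry input signal with a measured or synthesized room impulse response (RIR). The sketch below illustrates this principle only; the synthetic exponentially decaying noise RIR, the `rt60` value, and the impulse-like test signal are illustrative assumptions, not material from the paper.

```python
import numpy as np

fs = 48000  # assumed sample rate in Hz

# Hypothetical dry source: a unit impulse (click), so the wet output
# directly reveals the room response
dry = np.zeros(fs // 10)
dry[0] = 1.0

# Hypothetical room impulse response: exponentially decaying white noise,
# a common toy model of late reverberation (not taken from the paper)
rng = np.random.default_rng(0)
t = np.arange(fs // 2) / fs          # 0.5 s of RIR
rt60 = 0.3                           # assumed reverberation time in seconds
decay = np.exp(-6.908 * t / rt60)    # -60 dB after rt60 seconds
rir = rng.standard_normal(t.size) * decay

# Convolution-based room simulation: wet signal = dry signal * RIR
wet = np.convolve(dry, rir)
```

Because the dry signal here is a unit impulse, the wet output reproduces the RIR itself followed by silence; with real program material, long RIRs are usually convolved with FFT-based (partitioned) convolution for efficiency.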