Friday 13th February
Paper Session 6: Game Music Systems
Pete Harrison, Codemasters UK
6-1: Building Interactive Networked Musical Environments Using q3osc
Robert Hamilton, Stanford University, Stanford, CA, USA
Interactive networked musical gaming environments designed as control systems for external music and sound programming languages can be built using the q3osc Quake III/ioquake3 gaming mod. Bi-directional support for the Open Sound Control (OSC) messaging protocol compiled into the game engine allows for the real-time tracking, sonification, spatialization, and third-party control of game entities, clients, and environmental parameters. Reactive audio environments ranging from abstract multi-user musical performance spaces to representative acoustic models of physical space can be constructed using either a standard user-centric audio perspective or a potentially more immersive and inclusive space-centric perspective. Issues of space and perspective are discussed as related to the distribution of performance space and sonified environment across both local and wide-area networks.
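As a rough illustration of the transport the abstract describes, the sketch below hand-encodes an Open Sound Control message (address pattern, type-tag string, big-endian float32 arguments, all padded to 4-byte boundaries per the OSC 1.0 spec) and sends it over UDP. The address pattern `/player/1/origin` and the port 57120 are hypothetical placeholders, not q3osc's actual namespace.

```python
import socket
import struct

def _osc_string(s: str) -> bytes:
    """NUL-terminate and pad a string to a multiple of 4 bytes (OSC 1.0)."""
    data = s.encode("ascii")
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    packet = _osc_string(address)
    packet += _osc_string("," + "f" * len(args))      # type-tag string, e.g. ",fff"
    for value in args:
        packet += struct.pack(">f", value)            # big-endian float32
    return packet

def send_entity_position(x: float, y: float, z: float,
                         host: str = "127.0.0.1", port: int = 57120) -> bytes:
    """Send a (hypothetical) game-entity position to a sound server over UDP."""
    msg = osc_message("/player/1/origin", x, y, z)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(msg, (host, port))
    sock.close()
    return msg
```

A sonification patch listening on the receiving end (e.g. in a sound programming language such as SuperCollider or Pd) would unpack the same address pattern and map the coordinates to synthesis or spatialization parameters.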
6-2: Approaches to Creating Real-Time Adaptive Music in Interactive Entertainment: A Musical Perspective
Kenneth McAlpine, Matthew Bett, James Scanlan, University of Abertay Dundee, Dundee, UK
In this paper we discuss the different roles that music plays in an interactive entertainment title, suggesting both creative and procedural approaches to its creation and execution and, in particular, highlighting the importance of procedural music engines in supporting creative activity. We further suggest the role that algorithmic and procedural generation routines may play in creating music for interactive entertainment titles in the future, and the role that human composers might play in the next-generation game soundtrack.
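One common adaptive-music technique in this space is vertical layering, where composed stems fade in and out as a game-state parameter changes. The sketch below is a minimal, hypothetical illustration of that idea; the stem names and intensity thresholds are invented for the example, not taken from the paper.

```python
# Hypothetical stem list: (stem name, intensity threshold at which it enters).
# A real engine would crossfade audio buffers; here we only select stems.
LAYERS = [
    ("ambient_pad", 0.0),
    ("percussion", 0.3),
    ("bass", 0.5),
    ("lead_melody", 0.8),
]

def active_layers(intensity: float) -> list[str]:
    """Return the stems that should sound at the given game intensity (0..1)."""
    return [name for name, threshold in LAYERS if intensity >= threshold]
```

Because every stem shares the same tempo and key, any subset selected this way remains musically coherent, which is what lets the engine respond to gameplay without audible seams.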
6-3: Mapping Sounds into Commands
Giordano Cabral, Roberto Cassio Silva Jr., MusiGames Studio, Recife, Brazil
Recent years have witnessed a boom in musical games. These games associate a player's commands with musical or sound events. While the creation and editing of these associations remain a key factor for the musical game industry, digital signal processing techniques continue to evolve, providing very useful information about songs. Even though these techniques cannot yet provide a perfect transcription of a song, their output can be successfully mapped onto game commands if a proper strategy is applied. This paper discusses some strategies used by MusiGames to answer questions such as, "How can automatically retrieved information be used in a game?" and, more specifically, "How can this information determine which commands a player should hit at specific moments of a game?"
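One plausible form of the mapping the abstract describes is to snap imperfect transcription output onto a beat grid and fold detected pitches into a small number of input lanes, as a rhythm game might. The function below is a hypothetical sketch of that strategy; the eighth-note grid and pitch-modulo lane assignment are assumptions for illustration, not MusiGames' actual method.

```python
def onsets_to_commands(onsets: list[tuple[float, int]],
                       bpm: float = 120.0,
                       lanes: int = 4) -> list[tuple[float, int]]:
    """Map transcriber output to game commands.

    onsets: (time_in_seconds, midi_pitch) pairs from an automatic
            transcription step, whose timing and pitch may be noisy.
    Returns (quantized_time, lane) pairs a player must hit.
    """
    half_beat = (60.0 / bpm) / 2          # eighth-note grid spacing
    commands = []
    for t, pitch in onsets:
        grid_time = round(t / half_beat) * half_beat   # absorb timing errors
        lane = pitch % lanes                           # fold pitch into lanes
        commands.append((grid_time, lane))
    return commands
```

Quantizing to a grid makes the mapping robust to small transcription errors: a detected onset at 0.26 s in a 120 BPM song still lands on the 0.25 s eighth note, so an imperfect transcription can still yield a playable chart.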