Current studies indicate that vehicle interiors are about to change more than they have in decades, driven by several simultaneously occurring megatrends. Autonomous driving in particular allows the driver to pay less attention to the road, enabling new usage concepts that shift the focus to the interior experience and, in turn, place entirely new demands on sound systems. The availability of immersive entertainment technologies supporting new comfort functions and mobile working will be essential. With this change in the focus of attention, not only personalized sound staging itself but also its correct spatial mapping takes on new importance. This applies to a wide variety of functions such as interior staging, driving safety, well-being, and communication. In this context, the effort required to create the audio content becomes significant, calling for a new unified interface for spatial presentation in order to limit production effort. This paper describes how object-based audio (OBA) can be used as a platform technology to meet these requirements. Based on specific use cases, a new workflow is presented that has been implemented for series production. The concepts for the rendering technology, the audio tuning process, and the implementation on resource-limited hardware are described.
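The abstract does not disclose the production renderer, but the core idea of object-based rendering can be illustrated with a toy example: an audio object carries a direction as metadata, and the renderer maps it onto whatever loudspeaker layout the vehicle provides. The sketch below is a minimal constant-power pairwise panner for a horizontal speaker ring; the function name, signature, and four-speaker layout are illustrative assumptions, not the paper's method.

```python
import math

def pan_gains(azimuth_deg, speakers_deg):
    """Toy object-based renderer: distribute a point audio object at
    azimuth_deg onto the two nearest loudspeakers of a horizontal ring
    using constant-power amplitude panning (g_a^2 + g_b^2 = 1).

    speakers_deg must be sorted ascending (hypothetical layout).
    """
    az = azimuth_deg % 360.0
    n = len(speakers_deg)
    gains = [0.0] * n
    for i in range(n):
        a = speakers_deg[i] % 360.0
        b = speakers_deg[(i + 1) % n] % 360.0
        span = (b - a) % 360.0 or 360.0   # arc covered by this speaker pair
        offset = (az - a) % 360.0
        if offset < span:
            frac = offset / span
            # constant-power crossfade between the bracketing pair
            gains[i] = math.cos(frac * math.pi / 2.0)
            gains[(i + 1) % n] = math.sin(frac * math.pi / 2.0)
            return gains
    return gains

# An object halfway between the front (0 deg) and right (90 deg) speakers
# receives equal gains on both; the same object metadata would render
# correctly on any other layout passed in.
g = pan_gains(45.0, [0.0, 90.0, 180.0, 270.0])
```

The point of the sketch is the layout independence the abstract attributes to OBA: the content (object plus direction metadata) is authored once, and the renderer adapts it to each interior's speaker configuration.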