Human auditory localization strongly relies on head movements. Thus, for plausible perception of virtual acoustic scenes, the incorporation of head movements is mandatory. With loudspeaker playback this is given implicitly, as the listener can move the head relative to the scene. With binaural synthesis, however, the head movements need to be tracked and the scene rotated accordingly to achieve a stable perception of the acoustic scene. We present a low-cost, plug-and-play head-tracking device (MrHeadTracker) based on the Arduino platform and the BNO055 sensor. Its performance is compared against another low-cost device (GY-85) and an optical tracking system (OptiTrack Flex 13). The proposed MrHeadTracker outperforms the GY-85 in terms of accuracy and latency and yields results comparable to the optical tracking system.
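The scene-rotation step the abstract describes can be illustrated with a short sketch. This is not the paper's implementation; it is a minimal Python example, assuming the tracker reports head orientation as a unit quaternion (as the BNO055 can) and a right-handed, x-forward, z-up coordinate convention. Each virtual source direction is rotated by the inverse of the head orientation, so the scene stays fixed in the world while the head turns.

```python
import math

def q_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def q_conj(q):
    """Conjugate = inverse for a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def rotate(q, v):
    """Rotate vector v by unit quaternion q (computes q * v * q')."""
    p = (0.0, *v)
    _, x, y, z = q_mul(q_mul(q, p), q_conj(q))
    return (x, y, z)

# Hypothetical tracker reading: head turned +90 deg (yaw) about the vertical axis.
yaw = math.radians(90.0)
head = (math.cos(yaw / 2), 0.0, 0.0, math.sin(yaw / 2))

# Compensate: render each source direction rotated by the INVERSE head rotation.
front = (1.0, 0.0, 0.0)            # source straight ahead in the scene
rendered = rotate(q_conj(head), front)
# rendered is approximately (0, -1, 0): with the head turned left,
# the scene-fixed front source is rendered at the listener's right.
```

In a real pipeline this rotated direction would then select or interpolate the HRTF used for binaural rendering; the quaternion form avoids the gimbal-lock issues of Euler angles.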