Because the human auditory system is often well suited to detecting subtle patterns, this paper explores a novel transformation that maps numerical data into sound, a technique called acoustic sonification. In this research, datasets derived from head-related transfer functions were used to create physical objects (stainless-steel bells) whose acoustics were then presented to listeners. Listeners were able to hear differences in the pitch and timbre of bells constructed from different datasets, while bells constructed from similar datasets sounded similar. Modulating the shape of a bell with a dataset can influence its acoustic spectrum in a way that produces audible differences even when there is no apparent visual difference. Acoustic sonification thus takes advantage of auditory pattern recognition.
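The abstract does not specify how a dataset modulates bell geometry, but the general idea can be sketched roughly as follows. All names, the cosine-tapered base profile, and the perturbation scale are illustrative assumptions, not the authors' method:

```python
import math

def bell_profile(data, n_points=64, base_radius=1.0, max_perturbation=0.05):
    """Map a dataset onto radial perturbations of a bell-shaped profile.

    Hypothetical sketch: values are normalized to [-1, 1] and added as
    small bumps to a cosine-tapered base radius, so different datasets
    yield differently shaped (and hence differently sounding) bells,
    while similar datasets yield nearly identical shapes.
    """
    lo, hi = min(data), max(data)
    span = (hi - lo) or 1.0          # avoid division by zero for flat data
    radii = []
    for i in range(n_points):
        t = i / (n_points - 1)                      # 0 at rim, 1 at crown
        base = base_radius * (0.4 + 0.6 * math.cos(t * math.pi / 2))
        d = data[int(t * (len(data) - 1))]          # sample dataset along height
        norm = 2.0 * (d - lo) / span - 1.0          # normalize to [-1, 1]
        radii.append(base + max_perturbation * norm)
    return radii
```

Because the perturbations are small relative to the base radius, two such bells can look visually identical while their resonant modes, and thus their pitch and timbre, differ audibly.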