The use of visual user interfaces in smartphones and other personal media devices (PMDs) leads to decreased situational awareness, for example, in city traffic. This paper proposes that many menu navigation functions in PMDs can be replaced by an eyes-free auditory interface combined with an input device based on acoustic recognition of tactile gestures. We demonstrate, using a novel experimental setup, that the proposed auditory interface reduces reaction times to external events in comparison to a visual UI. In addition, although task completion times in menu navigation were somewhat longer with the auditory interface, the subjects were able to complete the given interaction tasks correctly within a reasonable time.