The mechanisms of human audio-visual perception are not yet fully understood. For interactive audio-visual applications running on devices with limited computational power, it is desirable to know which of the stimuli rendered in an audio-visual room simulation have the greatest impact on the perceived quality of the system. We conducted experiments to determine the effect of interaction on the precision with which test subjects can discriminate between different parameter values of auditory attributes. This paper details one of these experiments and compares different approaches for analyzing the obtained data. The results show a noticeable bias towards faulty ratings while subjects are engaged in a task, although the analyses using significance tests do not fully confirm this effect.
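The abstract does not specify which significance tests were used. As a hedged illustration of the kind of test commonly applied to discrimination data of this sort, the sketch below implements a one-sided exact binomial test against chance performance; all counts, condition names, and the chance level are hypothetical and not taken from the paper.

```python
from math import comb

def binomial_p_value(k: int, n: int, p: float = 0.5) -> float:
    """One-sided exact binomial test: probability of observing k or
    more correct responses in n trials if subjects guess at chance
    level p (p = 0.5 for a two-alternative forced-choice task)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical counts: correct discriminations out of 20 trials,
# once in a passive listening condition and once while the subject
# is engaged in an interactive task.
p_passive = binomial_p_value(17, 20)
p_interactive = binomial_p_value(12, 20)

print(f"passive:     p = {p_passive:.4f}")
print(f"interactive: p = {p_interactive:.4f}")
```

With these invented numbers the passive condition would reject the chance hypothesis at the 5% level while the interactive one would not, which is the shape of effect the abstract describes: discrimination degrades during task involvement, but the statistical evidence is not conclusive.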