Predicting the Perceived Quality of Nonlinearly Distorted Music and Speech Signals
In a previous study, perceptual experiments were reported in which subjects rated the perceived quality of speech and music that had been subjected to various forms of nonlinear distortion. The subjective ratings were compared with a physical measure of distortion, DS, based on the output spectrum of each nonlinear system in response to a 10-component multitone test signal with logarithmically spaced components. The values of DS were highly negatively correlated with the subjective ratings for stimuli that had been subjected to "artificial" distortions such as peak clipping and center clipping. However, for stimuli that had been subjected to nonlinear distortion produced by real transducers, the correlation between the DS values and the subjective ratings was only moderately negative. A new method is proposed that predicts the perceived quality of nonlinearly distorted signals based on the outputs of an array of gammatone filters in response to the original signal and the distorted signal. For each filter, the cross-correlation is calculated between the outputs in response to the original and the distorted signals for a series of brief samples (frames). The maximum value of the cross-correlation for each filter for each frame is determined, and the maximum values are summed across filters, with a weighting that depends on the magnitude of the output of each filter in response to the distorted signal. The resultant weighted cross-correlation gives a perceptually relevant measure of distortion, called Rnonlin, which can be used to predict subjective ratings. There were high correlations between the predicted ratings and the subjective ratings obtained previously, and these correlations were greater than those obtained using the DS measure. A new perceptual experiment, using a mixture of artificial and real distortions, confirmed the validity of the new measure.
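The procedure described in the abstract can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the authors' implementation: the gammatone filter is a simplified impulse-response version with ERB bandwidths, and the centre frequencies, frame length, lag range, and energy weighting below are assumptions chosen for the example.

```python
import numpy as np

def gammatone_ir(fc, fs, duration=0.025, order=4):
    # Simplified gammatone impulse response; bandwidth from the ERB scale.
    # Peak-normalised for convenience.
    t = np.arange(int(duration * fs)) / fs
    erb = 24.7 + 0.108 * fc
    ir = t ** (order - 1) * np.exp(-2 * np.pi * 1.019 * erb * t) * np.cos(2 * np.pi * fc * t)
    return ir / np.max(np.abs(ir))

def rnonlin(clean, distorted, fs, centre_freqs, frame_len=0.03, max_lag=32):
    # For each gammatone channel and each frame, take the maximum of the
    # normalised cross-correlation between the filtered clean and distorted
    # signals over a small lag range, then average those maxima across
    # channels and frames, weighted by the distorted-channel frame energy.
    n = int(frame_len * fs)
    num, den = 0.0, 0.0
    for fc in centre_freqs:
        ir = gammatone_ir(fc, fs)
        xc = np.convolve(clean, ir, mode="same")
        xd = np.convolve(distorted, ir, mode="same")
        for start in range(0, len(clean) - n + 1, n):
            a = xc[start + max_lag : start + n - max_lag]
            seg = xd[start : start + n]
            ea = np.linalg.norm(a)
            best = 0.0
            for lag in range(-max_lag, max_lag + 1):
                b = seg[max_lag + lag : n - max_lag + lag]
                eb = np.linalg.norm(b)
                if ea == 0 or eb == 0:
                    continue
                best = max(best, abs(np.dot(a, b)) / (ea * eb))
            w = np.sum(seg ** 2)  # weight by distorted-channel frame energy
            num += w * best
            den += w
    return num / den if den else 0.0
```

For identical input signals the measure is 1 by construction; nonlinear distortion decorrelates the filter outputs and lowers it:

```python
fs = 16000
t = np.arange(int(0.5 * fs)) / fs
x = np.sin(2 * np.pi * 1000 * t)
print(rnonlin(x, x, fs, [500, 1000, 2000]))                     # ≈ 1
print(rnonlin(x, np.clip(x, -0.3, 0.3), fs, [500, 1000, 2000]))  # hard clipping
```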