A study is presented of the effects of time variance on maximum-length sequence (MLS) measurements. Both interperiodic and intraperiodic effects are studied, using time stretching and an analogy with delay modulation as models. It is shown that both types of time variance cause apparent level losses, which typically represent an error with a +12-dB-per-octave characteristic relative to an ideal reference impulse response (IR). Furthermore, this error is localized around the reference IR and cannot be decreased by averaging or truncation. Intraperiodic time variance might also cause an apparent random noise component, typically with a +6-dB-per-octave characteristic. This component is spread in time and can consequently be decreased by truncation. Averaging also decreases this noise if the time variance is fast enough that the correlation of this apparent noise between MLS periods is low. The existence of such apparent noise indicates that the measured system response is distorted. Measurements were carried out in different rooms, and differences between the IRs measured there demonstrated the existence of apparent time-variance noise, increasing by approximately 6 dB per octave, in reverberant rooms.
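The measurement technique the abstract analyzes can be illustrated numerically. The sketch below (not taken from the paper) generates an MLS with a maximal-length LFSR, recovers a system's IR by circular cross-correlation, and then simulates interperiodic time variance as a small delay drift between measurement periods: averaging the drifting periods lowers the recovered peak, i.e., an apparent level loss that averaging cannot remove. The LFSR taps, the two-tap test system, and the drift values are illustrative assumptions.

```python
import numpy as np

def mls(n=10):
    # Fibonacci LFSR; taps (10, 7) give a maximal-length 10-bit sequence
    # (tap choice is a standard published value, assumed here).
    state = [1] * n
    seq = []
    for _ in range(2**n - 1):
        seq.append(state[9])              # output from stage 10
        fb = state[9] ^ state[6]          # feedback from stages 10 and 7
        state = [fb] + state[:-1]
    return np.array(seq) * 2 - 1          # map {0,1} -> {-1,+1}

L = 2**10 - 1
s = mls(10)

def measure_ir(response):
    # Circular cross-correlation with the MLS recovers the IR
    # (the MLS autocorrelation is ~L at lag 0 and -1 elsewhere).
    return np.fft.ifft(np.fft.fft(response) * np.conj(np.fft.fft(s))).real / L

# Hypothetical static system: direct sound plus one reflection.
h = np.zeros(L)
h[0], h[20] = 1.0, 0.5
y = np.fft.ifft(np.fft.fft(s) * np.fft.fft(h)).real
ir_static = measure_ir(y)

def frac_delay(x, d):
    # Fractional-sample delay via FFT phase shift (models a slow drift
    # of the propagation path between MLS periods).
    k = np.fft.fftfreq(L)
    return np.fft.ifft(np.fft.fft(x) * np.exp(-2j * np.pi * k * d)).real

# Each "period" sees a slightly different delay; average the measured IRs.
irs = [measure_ir(frac_delay(y, d)) for d in (0.0, 0.1, 0.2, 0.3)]
ir_avg = np.mean(irs, axis=0)

print(ir_static[0], ir_avg[0])  # averaged peak is lower: apparent level loss
```

Because the drift smears the peak rather than adding uncorrelated noise, the loss stays localized around the IR peak, consistent with the abstract's claim that this error cannot be decreased by averaging or truncation.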