AES Section Meeting Reports

Chicago - November 26, 2013

AES Member Bonus Material

* Differential Microphone Array Usage in Android Mobile Device Multi-media Applications - Plamen Ivanov (YouTube Video)


Plamen opened his presentation with a brief history of multiple microphones in cell phones: the first phones with two independent microphones appeared in 2004-2006; phones with two-microphone null steering (NS) algorithms followed in 2007-2009; and 2010 brought the first three-microphone cellphone (from Motorola).

Two approaches to null steering were then presented: the delay-and-sum beamformer and the differential array. The delay-and-sum beamformer introduces a delay (via a digital filter) in each sensor path to achieve coherent summation; steering is accomplished by selecting filter coefficients from a given filter bank. The differential array (DA) provides higher directivity than the delay-and-sum approach with the same number of sensors and maintains a directivity pattern that is constant with frequency, but its output is not flat, so it requires equalization. The DA is known as an end-fire array, whereas the delay-and-sum array is known as a broadside array.

Adding a time-delay element to a differential array allows for effective null steering, and various directivity patterns, such as cardioid, super-cardioid, bi-directional, and hyper-cardioid, can be obtained from two omnidirectional microphones. Multi-microphone 'hybrid' arrays process each microphone pair's signals in its optimal sub-band. This enables a phone to offer various audio attributes through modes such as 'subject', 'balanced', 'concert', or 'mic-zoom'.
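The first-order differential pattern described above can be sketched numerically. The snippet below is an illustrative sketch, not the presenter's implementation; the spacing, frequency, and function names are assumptions. It models a pair of omni microphones by subtracting a delayed rear signal from the front signal and shows that an internal delay equal to the acoustic travel time between the mics yields a cardioid with its null at the rear:

```python
import numpy as np

C = 343.0  # speed of sound, m/s

def differential_response(theta, f, d=0.01, tau=None):
    """First-order differential pair: front mic minus delayed rear mic.

    theta: arrival angle in radians (0 = on-axis, end-fire direction)
    f: frequency in Hz; d: mic spacing in m; tau: internal delay in s.
    tau = d/C yields a cardioid (null at theta = pi).
    """
    if tau is None:
        tau = d / C  # cardioid by default
    # A plane wave reaches the rear mic d*cos(theta)/C later than the front mic.
    ext = d * np.cos(theta) / C        # external (acoustic) delay
    return 1.0 - np.exp(-2j * np.pi * f * (tau + ext))

# Normalized polar magnitude at 1 kHz
theta = np.linspace(0, 2 * np.pi, 361)
mag = np.abs(differential_response(theta, 1000.0))
mag /= mag.max()
# On-axis response is maximal; the exact rear null confirms the cardioid shape.
print(round(mag[0], 3), round(mag[180], 3))  # theta = 0 and theta = pi
```

Changing `tau` steers the null: the null angle satisfies cos(theta) = -tau*C/d, which is the time-delay null steering mentioned above (e.g. `tau = 0` gives a bi-directional pattern with nulls at ±90 degrees). The unequalized magnitude also rises with frequency, which is why the DA output requires equalization.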

Implementing multiple microphones in a new phone demands attention across several disciplines. The designer needs to consider the user interface, susceptibility to wind and handling noise, and occlusion (when the user covers the mic openings with their hands or fingers). Plamen also described the implementation challenges an audio engineer faces: system interactions span mechanical and electrical requirements, algorithm and software requirements, and system integration.

Questions from the audience included:

* User testing: robotic and human tests are conducted early and often by a development team.
* Restricted bandwidth for user settings: the various algorithms give the user multiple options to choose from, and these options are not implemented simply by band-limiting the response.
* Whether users know how to use the phone in its various modes: menus and icons are continually designed, reviewed, and updated to assist the user.
* Whether proximity effect is an issue: no, it is not a design concern, since most recording use cases do not place the phone close to the talker.

The Chicago AES Section would like to extend a special thanks to Plamen Ivanov for presenting to our section and for including some rather intense math to show us that providing audio options is not as simple as today's cellphone user would anticipate.
