Differences in Sensitivity Specifications Between Analog and Digital Microphones
Sensitivity, the ratio of analog output voltage or digital output value to input sound pressure, is a key specification for any microphone. For a known input, this mapping from units in the acoustic domain to units in the electrical domain determines the amplitude of the microphone's output signal.
This article explores the differences in sensitivity specifications between analog and digital microphones, how to choose the microphone with the best sensitivity for a specific application, and why adding a bit (or more) of digital gain can enhance the microphone signal.
Analog and Digital
Microphone sensitivity is typically measured with a 1 kHz sine wave at a sound pressure level (SPL) of 94 dB, equivalent to a pressure of 1 pascal (Pa). The amplitude of the microphone's analog or digital output signal for this input stimulus is the measure of its sensitivity. This reference point characterizes only one aspect of the microphone and does not represent its full performance.
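As a quick check on this reference point: SPL relates to pressure p through SPL = 20 × log10(p/p0), where p0 = 20 µPa is the standard acoustic reference pressure, so an input of 1 Pa corresponds to 20 × log10(1 Pa / 20 µPa) ≈ 94 dB SPL.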
The sensitivity of an analog microphone is straightforward. It is generally specified in logarithmic units of dBV (decibels relative to 1 V) and states how many volts the output signal measures for a given SPL. For an analog microphone, the sensitivity in linear units of mV/Pa can be expressed logarithmically in decibels as:
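Sensitivity (dBV) = 20 × log10(Sensitivity (mV/Pa) / Output_REF)

where Output_REF is the 1 V/Pa (1000 mV/Pa) reference output ratio. For example, a microphone with a sensitivity of −38 dBV produces an output of about 12.6 mV at 94 dB SPL.

A minimal sketch of this conversion in Python (the function names and the example value are illustrative, not taken from a specific datasheet):

```python
import math

OUTPUT_REF_MV = 1000.0  # 1 V/Pa reference output ratio, expressed in mV/Pa

def mv_per_pa_to_dbv(sens_mv_pa: float) -> float:
    """Convert an analog microphone sensitivity from mV/Pa to dBV."""
    return 20.0 * math.log10(sens_mv_pa / OUTPUT_REF_MV)

def dbv_to_mv_per_pa(sens_dbv: float) -> float:
    """Convert a dBV sensitivity back to its linear mV/Pa equivalent."""
    return OUTPUT_REF_MV * 10.0 ** (sens_dbv / 20.0)

# A -38 dBV microphone outputs roughly 12.6 mV at 94 dB SPL (1 Pa).
print(f"{dbv_to_mv_per_pa(-38.0):.1f} mV/Pa")  # ~12.6 mV/Pa
print(f"{mv_per_pa_to_dbv(12.6):.1f} dBV")     # ~-38.0 dBV
```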