Can 0.1 dB really make that much difference?
Analog-to-digital converters (ADCs) are obviously mixed-signal devices, but engineers whose experience is primarily with analog circuits sometimes seem to forget about their digital nature. When choosing an amplifier, the designer knows that if the signal swing is comfortably below the 1 dB compression point (P1dB), the third-order distortion, relative to the fundamental, will increase by about 2 dB for every 1 dB increase in output amplitude. If the output amplitude increases by 0.1 dB, the third harmonic will therefore rise by only about 0.2 dB. That's negligible, and pretty hard to measure.
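This smooth 2 dB/dB behavior falls out of a simple weakly nonlinear amplifier model, y = a1·x + a3·x³, where the third harmonic scales with the cube of the amplitude. The sketch below uses illustrative coefficient values (not from any particular part's datasheet) to check the slope numerically:

```python
import numpy as np

# Weakly nonlinear amplifier model: y = a1*x + a3*x^3.
# The coefficients are illustrative assumptions, not datasheet values.
a1, a3 = 10.0, -0.2

def hd3_dbc(amp):
    """Third-harmonic level relative to the fundamental, in dBc.

    For x = A*cos(wt), expanding y = a1*x + a3*x^3 gives:
      fundamental amplitude = (a1 + (3/4)*a3*A^2) * A
      third-harmonic amplitude = (a3/4) * A^3
    """
    fund = (a1 + 0.75 * a3 * amp ** 2) * amp
    h3 = 0.25 * a3 * amp ** 3
    return 20 * np.log10(abs(h3) / abs(fund))

# Well below compression, HD3 (in dBc) rises ~2 dB per 1 dB of amplitude,
# because the harmonic grows 3 dB/dB while the fundamental grows 1 dB/dB.
a = 0.05
da = 10 ** (1 / 20)  # a 1 dB amplitude increase
print(hd3_dbc(a * da) - hd3_dbc(a))
```

Because the harmonic tracks A³ and the fundamental tracks A, the *relative* distortion changes by a predictable 2 dB per dB; that predictability is exactly what the ADC lacks.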
An ADC's transfer function is very different from an amplifier's, however, with discrete steps in “gain” between adjacent output codes. The ADC's differential nonlinearity (DNL) specification captures the magnitude of these step errors. Integrating the non-uniform steps can produce a transfer function with very strange shapes. The ADC's integral nonlinearity (INL) specification captures the deviation from an ideal transfer function, and is a good predictor of the overall level of an ADC's distortion. Unlike the predictable behavior of an amplifier, however, the INL does not tell you how the distortion products will behave vs. input amplitude: a 1 dB change in input amplitude can result in a ±5 dB change in the third harmonic! That's right; increasing the input level can cause the harmonic distortion to increase or decrease.
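Even an *ideal* quantizer shows this erratic behavior, because its distortion comes from the staircase transfer function rather than a smooth polynomial. The sketch below (an assumed 12-bit mid-tread quantizer with a ±1 V full-scale range, coherently sampled so no window is needed) sweeps the input level and reports the third harmonic; unlike the amplifier case, the result does not follow a 2 dB/dB line and can move in either direction as the sine's peaks cross different code transitions:

```python
import numpy as np

def adc(x, bits=12):
    """Ideal mid-tread quantizer with an assumed +/-1 V full-scale range."""
    lsb = 2.0 / (2 ** bits)
    codes = np.clip(np.round(x / lsb), -(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
    return codes * lsb

def hd3_dbc(amp_dbfs, bits=12, n=4096, bin_=127):
    """Third-harmonic level (dBc) of a coherently sampled, quantized sine."""
    a = 10 ** (amp_dbfs / 20)                 # amplitude relative to full scale
    t = np.arange(n)
    x = a * np.sin(2 * np.pi * bin_ * t / n)  # prime bin -> coherent sampling
    spec = np.abs(np.fft.rfft(adc(x, bits)))
    return 20 * np.log10(spec[3 * bin_] / spec[bin_])

# Sweep the input level in 0.2 dB steps; the third harmonic of a quantizer
# jumps around rather than tracking the smooth 2 dB/dB amplifier rule.
for dbfs in (-1.0, -0.8, -0.6, -0.4, -0.2):
    print(f"{dbfs:5.1f} dBFS -> HD3 = {hd3_dbc(dbfs):7.1f} dBc")
```

A real converter adds its own DNL/INL errors on top of this ideal staircase, which makes the amplitude dependence even less predictable.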
Another important difference between ADCs and amplifiers is their behavior when overdriven. An amplifier's gain compresses gracefully as its input increases. Eventually the amplifier's output reaches a maximum level and clips, producing large odd-order distortion products (the clipped signal starts to look like a square wave, whose spectral content comprises a sum of odd-order harmonics). An ADC has no such graceful behavior: when its input voltage exceeds its input range, the output clips immediately. This can result in dramatic changes in distortion. Some ADCs can maintain good performance with input amplitudes very close to full scale, but every ADC falls off a cliff when its input saturates. A designer once contacted me about what he considered an inexplicable distortion change as he monitored FFTs with ADI's VisualAnalog™ software. It turned out that the average input amplitude was set just 0.05 dB below full scale. The input varied over time, and even a 0.1 dB increase caused occasional clipping, resulting in a 40 dB shift in the odd-order distortion. I recommended that he implement a good gain-control loop!
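The cliff is easy to reproduce numerically. The sketch below (assuming a ±1 V full-scale range and coherent sampling; no real part is modeled) compares a sine 0.05 dB below full scale with one 0.05 dB above it: the first passes through untouched, while the second has its peaks hard-limited and its third harmonic leaps up by many tens of dB:

```python
import numpy as np

FS = 1.0  # assumed full-scale input range: +/-1 V (illustrative)

def hd3_dbc(x, bin_):
    """Third-harmonic level relative to the fundamental, in dBc."""
    spec = np.abs(np.fft.rfft(x))
    # Tiny floor guards against log10(0) for a numerically perfect sine.
    return 20 * np.log10((spec[3 * bin_] + 1e-12) / spec[bin_])

n, bin_ = 4096, 127                  # coherent sampling: bin 127 of 4096
t = np.arange(n)
tone = np.sin(2 * np.pi * bin_ * t / n)

# 0.05 dB below full scale: no clipping, no odd-order distortion to speak of.
clean = 10 ** (-0.05 / 20) * tone
# 0.05 dB above full scale: the front end clips the peaks abruptly.
clipped = np.clip(10 ** (+0.05 / 20) * tone, -FS, FS)

print("clean   HD3:", hd3_dbc(clean, bin_))    # near the numerical floor
print("clipped HD3:", hd3_dbc(clipped, bin_))  # far above the clean case
```

Only about 0.1 dB of extra signal separates the two cases, which is why a gain-control loop that holds the average level a safe margin below full scale is the practical fix.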