# A Low Jitter Clock is Required to Evaluate High Resolution ADCs

As a baseline, the clock input of the DC1826A-A is driven with a Rohde & Schwarz SMB100A RF generator and the analog input is provided by the Stanford Research SR1. The result is the PScope data of Figure 1, which shows an SNR of 98.247dBFS. The dBFS figure is obtained by adding the input level below full scale (-1.047dBFS) to the measured SNR. The jitter of 18.8ps RMS at the CNV input of the ADC can be measured with an Agilent Infiniium 9000 series oscilloscope or equivalent, as shown in Figure 2.

The theoretical SNR limit set by jitter and input frequency is -20*log(2*π*fIN*tjitter), where tjitter is the RMS jitter and fIN is the input frequency. Plugging in the values for this example yields -20*log(2*π*20 kHz*18.8ps) = 112.5dB. This value must then be RMS summed with the ADC's SNR (the two noise powers add) to yield an effective SNR. The LTC2389-18 data sheet gives a typical SNR of 98.8dB at 2 kHz for the demo board circuit (Figures 7a and 7b). The SNR vs input frequency curve of the data sheet shows that SNR rolls off about 0.3dB at the 20 kHz input frequency used in this experiment, so the 98.8dB figure is adjusted to 98.5dB. The RMS sum of 98.5dB and 112.5dB yields 98.3dB, which closely matches the result obtained in Figure 1.
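The baseline arithmetic above can be checked with a short script. This is a sketch, not code from the note; the function names `jitter_snr_db` and `rms_sum_db` are illustrative:

```python
import math

def jitter_snr_db(f_in_hz, t_jitter_s):
    """Theoretical jitter-limited SNR: -20*log10(2*pi*fIN*tjitter)."""
    return -20 * math.log10(2 * math.pi * f_in_hz * t_jitter_s)

def rms_sum_db(snr1_db, snr2_db):
    """Combine two SNR figures by summing their noise powers."""
    return -10 * math.log10(10 ** (-snr1_db / 10) + 10 ** (-snr2_db / 10))

# 20 kHz input, 18.8 ps RMS clock jitter
limit = jitter_snr_db(20e3, 18.8e-12)   # ~112.5 dB
combined = rms_sum_db(98.5, limit)      # ~98.3 dB effective SNR
```

The "RMS sum" converts each SNR back to a relative noise power, adds the powers, and converts the total back to dB, which is why the combined figure is always slightly below the better of the two inputs.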

Now that a baseline SNR measurement has been established, what happens if a higher-jitter clock source is used? With the XXXX-YYYYY (manufacturer and model number withheld) generator, a jitter of 76.5ps RMS is measured, as shown in Figure 3. The theoretical SNR limit with this jitter is 100.3dB, which, when RMS summed with the 98.5dB of the LTC2389-18, yields 96.3dB. The measured SNR of 96.2dBFS shown in the PScope screen capture of Figure 4 agrees closely with this result. This is a 2dB loss in SNR from less than 60ps of additional clock jitter at a relatively low input frequency of 20 kHz. At an input frequency of 100 kHz, the SNR would fall to about 86dB.
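The degraded-clock numbers can be reproduced the same way. A minimal sketch (function names are illustrative, not from the note) showing how the jitter limit overtakes the ADC's own SNR as input frequency rises:

```python
import math

def jitter_snr_db(f_in_hz, t_jitter_s):
    """Theoretical jitter-limited SNR: -20*log10(2*pi*fIN*tjitter)."""
    return -20 * math.log10(2 * math.pi * f_in_hz * t_jitter_s)

def rms_sum_db(snr1_db, snr2_db):
    """Combine two SNR figures by summing their noise powers."""
    return -10 * math.log10(10 ** (-snr1_db / 10) + 10 ** (-snr2_db / 10))

# 76.5 ps RMS clock jitter, 20 kHz input
limit_20k = jitter_snr_db(20e3, 76.5e-12)    # ~100.3 dB
snr_20k = rms_sum_db(98.5, limit_20k)        # ~96.3 dB effective SNR

# Same clock at a 100 kHz input: the jitter limit alone is ~86.4 dB
# and dominates the combined result.
limit_100k = jitter_snr_db(100e3, 76.5e-12)
```

Since the jitter limit falls 20dB per decade of input frequency, at 100 kHz it sits well below the ADC's 98dB-class SNR and effectively sets the overall figure.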