Multifunction DAQ


Solution for testing high resolution ADC?

Hello all,

 

I am looking for a test solution to characterize on the bench a 16- to 20-bit low-frequency ADC with an output sample rate of up to 1 kS/s and a differential input range of +/- 1 V centered on a common mode between 1.2 V and 2.5 V. The LSB step size is on the order of 15 uV.

 

My understanding is that this test requires a highly linear ramp source (more linear than the ADC under test by >= 3 bits) to generate a very slow, almost-DC ramp, and/or an exceptionally pure <1 kHz sine-wave source with much better distortion than the ADC under test. This applies to both code-density histogram testing and FFT-based dynamic testing.
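For reference, the code-density (histogram) analysis itself is only a few lines once the codes are captured. Below is a minimal Python/NumPy sketch, not production test code; the function name and the synthetic 4-bit capture are made up for illustration:

```python
import numpy as np

def ramp_histogram_dnl_inl(codes, n_bits):
    """Estimate DNL and INL (in LSB) from codes captured while a slow,
    highly linear ramp sweeps the full input range.  With an ideal ramp
    every code bin should collect the same number of hits, so each bin's
    relative hit count measures its actual width."""
    hist = np.bincount(codes, minlength=2**n_bits).astype(float)
    core = hist[1:-1]                 # end bins collect all out-of-range hits
    dnl = core / core.mean() - 1.0    # deviation from the ideal bin width
    inl = np.cumsum(dnl)              # INL is the running sum of DNL
    return dnl, inl

# Synthetic sanity check: a perfectly uniform 4-bit capture -> zero DNL/INL.
codes = np.repeat(np.arange(16), 100)
dnl, inl = ramp_histogram_dnl_inl(codes, 4)
```

For a real 16- to 20-bit part the ramp must dwell long enough that each of the 2^N core bins collects tens of hits, which is why the ramp ends up being almost DC.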

 

I do not have such equipment right now, and I am trying to determine the cheapest solution to invest in.

 

I have the PCIe-6251 card with 16-bit analog output and 16-bit analog input, but it clearly does not seem to be linear enough.

 

 

 

I have read the relevant forum posts on ADC linearity testing, such as:

http://forums.ni.com/t5/LabVIEW/linearity-testing-ADC-using-PCI-6289/m-p/286924

http://forums.ni.com/t5/Multifunction-DAQ/Minimizing-noise-on-PCI-6251-Analog-output-using-external/...

http://forums.ni.com/t5/Multifunction-DAQ/Wide-temperature-drift-in-6289/td-p/872999 

 

My questions are:

1) Is there some document stating the output noise spec for the analog output of the NI 625x or 628x series? I do not see this in the datasheet.

 

2) Would the PCI-4461 or the PCI-6281 be the better option for this application? Any other suggestions? I am a bit worried about the comments that the 4461 is not very good for DC accuracy and has potentially high temperature drift. Also, the 4461 is fairly expensive, so I don't want to buy it if it is not suitable for this application...

 

3) Is there some way to trade off the high output-rate capability of these DAQ cards for better low-frequency accuracy?

 

 

Background on Q1:

On my 6251, as a loopback test, I routed AO output to the AI input.

* I set the AO output range to +/- 5 V (as narrow as possible) and the AI input range to +/- 0.1 V.

* I set the AO output to 0 V DC and acquired it with the AI input (say 1000 times).

* The best 1-sigma standard deviation in the readings was only about 90 uV.

* The accuracy did not improve when I tried the various input modes (RSE, NRSE, DIFF) - not surprising, since all connections were made with short wires on the SCB-68 interface board.

* Using AO 0 as an external reference for AO 1 to reduce the AO 1 output range to below 1 V also did not help.

* However, if I physically short the AI input to ground and read the AI 1000 times, I get a 1-sigma standard deviation of about 28 uV. This tells me I am limited by the analog output rather than the analog input of the 6251. I don't think it is a zero-offset effect, since the 90 uV 1-sigma standard deviation was observed even when I set the AO to 50 mV instead of 0 V.

* One possibility is noise on the AO output... except I do not know what the spec for AO noise over, say, 0.1 Hz to 1 kHz actually is - hence the question.
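Assuming the AI and AO noise contributions are uncorrelated, the two measured standard deviations can be separated by quadrature subtraction. A quick sketch with the numbers from this post (values illustrative):

```python
import math

# 1-sigma standard deviations measured in the loopback test (volts)
loopback_sigma = 90e-6   # AO routed back into AI
ai_only_sigma = 28e-6    # AI physically shorted to ground

# Uncorrelated noise adds in variance, so the AO contribution is the
# quadrature difference of the two measurements.
ao_sigma = math.sqrt(loopback_sigma**2 - ai_only_sigma**2)

lsb = 15e-6              # LSB step size of the ADC under test
print(f"Implied AO noise: {ao_sigma * 1e6:.1f} uV "
      f"(~{ao_sigma / lsb:.1f} DUT LSBs rms)")
```

That puts the implied AO noise around 85 uV rms, i.e. several DUT LSBs, which supports the conclusion that the 6251's AO is the limiting element here.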

 

Thanks for sharing your experience

Message 1 of 4

The product to use would be a 4070/4071 DMM.  This is of course a measurement device, and not a source, so you'd have to use it as follows:  Choose a reasonably stable source, such as your 6251.  Measure it simultaneously with both the DMM and your DUT.  Then compare the readings from your DUT with the readings from the DMM.  The more simultaneous you can make the measurements, the less stable and quiet the source needs to be.  As long as the DMM and the DUT see the same signal, everything is fine.  If the measurements are simultaneous, then the signal is the same for both devices by definition.  If the measurements aren't simultaneous, then the source has to be stable for both measurements, and noise and drift become more of an issue.
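One way to sketch that comparison in analysis code (Python/NumPy; the function name and synthetic data are illustrative, not any NI API) is to fit the DUT codes against the simultaneous DMM readings and look at the residuals, which estimate the DUT's linearity independently of the source:

```python
import numpy as np

def compare_to_reference(dmm_volts, dut_codes):
    """Fit the DUT's best-fit straight line against (near-)simultaneous
    DMM readings and return the per-sample residual in LSBs.  Because
    both instruments saw the same voltage, the residuals reflect the
    DUT's nonlinearity rather than the source's."""
    dmm_volts = np.asarray(dmm_volts, dtype=float)
    dut_codes = np.asarray(dut_codes, dtype=float)
    gain, offset = np.polyfit(dmm_volts, dut_codes, 1)
    residual_lsb = dut_codes - (gain * dmm_volts + offset)
    return gain, offset, residual_lsb

# Synthetic check: an ideally linear 16-bit-style DUT -> zero residuals.
v = np.linspace(-1.0, 1.0, 201)
codes = 32768.0 * v + 32768.0
gain, offset, resid = compare_to_reference(v, codes)
```

The best-fit line also absorbs the DUT's gain and offset errors, so the residuals are directly an INL-style figure.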

Message 2 of 4

You are clearly aware of some of the challenges of those tests.

 

I remind you of the Nyquist criterion: if the fastest sampling rate is 1 kS/s, then the fastest signal you can sample is less than 500 Hz. Since you need to hit all the bins in your ADC, you will either need to test with a much lower frequency or sample many cycles of a signal that is not synchronized with the sampling rate.
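If coherent sampling is an option instead, the usual trick is to pick f_test = (M/N) * fs with M coprime to the record length N, so an N-point record holds exactly M cycles and every sample lands on a distinct phase of the sine. A small illustrative helper (the function name is made up):

```python
from math import gcd

def coherent_test_frequency(fs, n_samples, target_freq):
    """Return a sine test frequency near target_freq that is coherent
    with the sampling: f = (M/N)*fs with gcd(M, N) == 1, so an N-point
    record contains an integer number of cycles and the N samples cover
    N distinct phases of the sine."""
    m = max(1, round(target_freq * n_samples / fs))
    while gcd(m, n_samples) != 1:     # step M until coprime with N
        m += 1
    return m * fs / n_samples, m

# Example: 1 kS/s DUT, 4096-point record, aiming near 100 Hz.
f_test, m_cycles = coherent_test_frequency(1000.0, 4096, 100.0)
```

For these inputs the helper returns roughly 100.34 Hz (411 cycles in the record); 410 is rejected because it shares a factor of 2 with 4096.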

 

Another consideration is gain error in the ADC. The actual input range of the device under test may be slightly larger or smaller than exactly 1 V. You can make the signal bigger than 1 V, but then you need to detect and compensate for the out-of-range conditions.

 

There is a substantial body of literature on ADC testing in journals such as the IEEE Transactions on Instrumentation and Measurement and others.

 

Lynn

Message 3 of 4

Thanks for the responses:

 

@Chris R: Thanks for the suggestion on the DMM. Three follow-up questions:

A) Do I understand correctly that even the 7.5-digit 4071 is only accurate to ~6 digits (~20 bits) at a 1 kS/s read-out rate? This might be good enough for static DNL/INL in my case, but I just want to be sure I am reading the datasheet right...

B) I was hoping to come up with a setup that could do both static and dynamic testing. How would I do that with this DMM approach? Is the idea to digitize a sine wave produced by the 6251 using the DMM's waveform digitizer? Are there any LabVIEW examples of how this approach would work in practice?

C) Also, just to follow up on my Q1: is there really any information available on noise/spurious content vs. frequency for the AO output of the 625x, 628x or 4461 (with no external filter, at nominal temperature)? Alternatively, suppose I ask the AO to produce a 1 kHz sine wave: what is the expected THD+N at the output? Most of the DACs producing the AO are delta-sigma types, and there will be some residual high-frequency content at their output. This should ideally be specified...

 

The concern is as follows: the ADC under test is a sigma-delta modulator, and I want to avoid noise folding or driving it with high-frequency out-of-band content. I want to determine what sort of low-pass filter (or band-pass filter, for sine waves) I might need at the AO output to limit these effects.

 

@johnsold: Yes, I agree with you. I have read the IEEE 1241-2000 standard for ADC testing. For DNL/INL testing, I will likely have to use non-coherent sampling with a very low-frequency signal of known probability distribution (whether an almost-DC ramp or an extremely slow sinusoid; each possibly repeated a few times).
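For the slow-sinusoid variant, the known probability distribution is the arcsine PDF, so the expected hit count per bin is nonuniform. A minimal Python/NumPy sketch of the sine-wave histogram test (assuming, for brevity, a full-scale zero-offset sine; IEEE 1241 describes estimating the actual amplitude and offset from the clipped end bins in the general case):

```python
import numpy as np

def sine_histogram_dnl(codes, n_bits):
    """DNL (in LSB) from a sine-wave code-density test.  A sampled sine
    has an arcsine amplitude PDF, so each bin's expected hit count is
    the arcsine CDF difference across that bin, not a constant."""
    n = 2**n_bits
    hist = np.bincount(codes, minlength=n).astype(float)
    edges = np.arange(n + 1) / n * 2.0 - 1.0      # code edges mapped to [-1, 1]
    cdf = 0.5 + np.arcsin(np.clip(edges, -1.0, 1.0)) / np.pi
    expected = np.diff(cdf) * hist.sum()
    core = slice(1, n - 1)                        # drop the clipped end bins
    return hist[core] / expected[core] - 1.0

# Synthetic check: uniformly stepped phases of an ideal full-scale sine
# quantized by an ideal 6-bit converter -> near-zero DNL.
n_bits, n_samples = 6, 1 << 20
phase = (np.arange(n_samples) + 0.5) / n_samples
x = np.sin(2.0 * np.pi * phase)
codes = np.clip(((x + 1.0) / 2.0 * 2**n_bits).astype(int), 0, 2**n_bits - 1)
dnl = sine_histogram_dnl(codes, n_bits)
```

In practice the sine is made slightly larger than full scale so that every core bin is exercised, which is exactly the out-of-range compensation mentioned earlier in the thread.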

 

The ADC under test has an on-chip reference, sampling-clock generation circuit and digital decimation filter. The reference voltage is available externally on a device pin, so gain error can be calibrated out by measuring it with a highly accurate DMM. The sampling clock is derived from a PLL using a crystal reference; the on-chip sampling clock is actually 500 kHz, giving the ADC a fairly high oversampling ratio of ~250 for a 1 kHz input signal.

Message 4 of 4