High Precision Voltage Source to Calibrate 24 bit ADC

I am trying to find a cost-effective way to calibrate a 24-bit ADC with a voltage source.

The ADC has 3 differential inputs. Its ranges are 40, 20, 10, 5, 2, 1, and 0.5 Vpp.

It natively samples at 32 kHz, but I will take a 1-second sample as one reading for the calibration.

 

The source must be able to produce half of full scale for each range, i.e. 20 V down to 0.25 V differential.

The source must be able to produce the signal to +/-0.01% (i.e. 20 V +/- 2 mV down to 0.25 V +/- 25 uV).
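To make those numbers concrete, here is a quick Python sketch (just arithmetic on the figures above; the sqrt(N) comment assumes uncorrelated noise):

```python
import math

# Half-scale setpoints and the +/-0.01% tolerance at each point.
RANGES_VPP = [40, 20, 10, 5, 2, 1, 0.5]   # differential ranges, Vpp
TOLERANCE = 1e-4                          # +/-0.01% of the setpoint

for vpp in RANGES_VPP:
    half_scale = vpp / 2                  # e.g. 40 Vpp -> 20 V differential
    print(f"{vpp:>4} Vpp range: source {half_scale:6.3f} V "
          f"+/- {half_scale * TOLERANCE * 1e6:7.1f} uV")

# Averaging one second at 32 kHz (32,000 samples) reduces uncorrelated
# noise by sqrt(N) ~ 179x, i.e. roughly 7.5 extra bits.
print(f"sqrt(N) noise reduction: {math.sqrt(32_000):.0f}x")
```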

I was looking at the PXI-4132, but I would also have to buy the PXI chassis and controller.

Is there a cheaper or better solution for this task? I also looked at Keithley (~$6000) and Agilent (~$5750) solutions.

An SMU appears to be more feature-rich than what this task calls for.

 

This is my first project, so please be critical of my post so that I can improve.

Message 1 of 3

BeanBoy,

 

I do not have specific device recommendations, but I do have some questions to clarify your requirements.

 

1. Why are you only looking for +/-0.01% when one bit in 24 bits represents ~6e-6% of full scale? +/-0.01% corresponds to roughly 13-14 bits, so you are throwing away about 10 of your 24 bits (see the sketch after this list).

2. Why do you want to test at half of full scale? Depending on the accuracy of both the source and the ADC, the measurement might land just above or just below the half-scale point. Your source generally needs to reach the desired voltage (say, half scale) plus enough margin to be sure of reaching the ADC's half-scale output under the worst-case errors of both the source and the ADC.

3. Does the source need to be able to produce negative voltages?

4. What is the input impedance of the ADC?
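Here is the back-of-envelope arithmetic behind question 1 (a rough Python sketch; it assumes the accuracy figure is relative to full scale):

```python
import math

def bits_for_accuracy(rel_accuracy: float) -> float:
    """Bits resolvable at a given relative accuracy (1e-4 = 0.01%)."""
    return math.log2(1 / rel_accuracy)

print(f"1 LSB of 24 bits = {100 / 2**24:.1e} % of full scale")   # ~6e-6 %
print(f"+/-0.01% source supports ~{bits_for_accuracy(1e-4):.1f} bits")
```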

 

If you are trying to find an inexpensive way to do a limited calibration, buy a highly stable source at the highest voltage you need to generate, then set up a group of voltage dividers to obtain the lower voltages. Send the assembly to a calibration house to get it calibrated. You may need to repeat the calibration process several times to determine how stable your standard is. For this method, stability is more important than absolute accuracy, because the calibration will measure any deviation from nominal.
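As a rough illustration of the divider arithmetic (the 20 V source value is hypothetical; the real difficulty is resistor stability and loading, not the math):

```python
# Derive each half-scale point from one calibrated source via dividers.
V_REF = 20.0                               # single stable source, V
TARGETS = [20, 10, 5, 2.5, 1, 0.5, 0.25]   # half-scale points, V

for v in TARGETS:
    ratio = v / V_REF                      # required divider ratio
    # ratio = R_bottom / (R_top + R_bottom) for an unloaded divider
    print(f"{v:5.2f} V from {V_REF:.0f} V: divider ratio {ratio:.4f}")
```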

 

Lynn

Message 2 of 3

Two ways: get a reference source (have a look at the secondary market 😉 like this; I think you will need a recalibration anyway)

or get a stable source and a reference voltmeter.

 

Also have a look at how others have solved this problem, e.g. Jim Williams from Linear Technology; see

http://cds.linear.com/docs/Design%20Note/dsol11.pdf

http://cds.linear.com/docs/Application%20Note/an86f.pdf

 

However, as Lynn already pointed out, 16 bits seems to be all you need.

 

And: an ADC used ratiometrically might usefully get up to 22 bits; used non-ratiometrically, it will need a reference better than 22 bits (OK, 16 bits here, that can be done). So if your ADC will work ratiometrically in your application, all you need is a stable voltage divider 😉 since stable reference sources are available ....
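A tiny Python sketch of why ratiometric operation helps (idealized ADC transfer function, made-up numbers):

```python
def adc_code(v_in: float, v_ref: float, bits: int = 24) -> int:
    """Idealized ADC: the output code depends only on v_in / v_ref."""
    return round(v_in / v_ref * (2**bits - 1))

DIVIDER = 0.25                          # stable divider ratio under test
for v_ref in (4.999, 5.000, 5.001):     # reference drifting by +/-0.02%
    code = adc_code(v_ref * DIVIDER, v_ref)   # divider fed from the reference
    print(f"Vref={v_ref:.3f} V -> code {code}")
# Same code every time: the reference drift cancels out.
```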

 

Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


Message 3 of 3