Multifunction DAQ

Minimizing noise on PCI-6251 Analog output using external reference

Hello,
My task is to characterize a 12-bit ADC.
To do so, I want to use the 16-bit DAC of a PCI-6251 DAQ board to generate a voltage ramp as the input for my ADC. Since the reference voltage of the ADC is 1.8 V, I also want to use the external voltage reference of the analog output to maximize the resolution of the DAC.
Is there a way to isolate the reference ground of the DAC from the rest of the system, to minimize the noise induced by the DAQ board/PC power supply? Is there any other way to reduce noise and increase accuracy?

In the worst case, how much noise is induced by the DAQ board? For example, if I use a highly accurate external 1.8 V reference voltage (±0.05% accuracy), what would the output accuracy of my analog output be?

Thanks in advance,

Christoph
Message 1 of 6
 

Hi Christoph,

First, can you tell me your exact card? A 651x? When you go to a product page on the web, you will see the Resources tab on the right.

(e.g. pci 6512 https://www.ni.com/en-us/shop/model/pci-6512.html)

Click there and then on the user manual; you will find the accuracy figures there. We have no analog output card without ground references, but if you create a second signal at zero volts you can subtract the two signals, like a differential mode. Normally, though, you connect ground throughout the whole measurement setup and have no problem...

Best  Regards

AE Munich

Message 2 of 6
Hi Johann,

Thanks for the reply.
I assume that you have transposed the digits. I am using the PCI-6251 DAQ board, which you can find here:
http://sine.ni.com/nips/cds/view/p/lang/de/nid/14124

But yes, you are right: I can find the accuracy of the analog outputs there, but only for the internal voltage references. The accuracy is specified as about 2 mV for the ±10 V range and about 1 mV for the ±5 V range. However, I cannot find any information about the accuracy of the DAC when using an external reference voltage.
If I use a 2.5 V voltage reference (range: 0-2.5 V) with an accuracy of 1.25 mV, what is the resulting accuracy of the analog output?

Thanks for the idea with the second signal, I will give it a try. I need to be as accurate as possible, since I want to statically test a 12-bit ADC with an LSB of 0.46 mV, and having more than 1 mV of noise at the input would be unfavorable.
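As a quick sanity check on that LSB figure (a sketch, not from the thread: the ideal step of an N-bit converter spanning 0 to Vref is Vref / 2^N, which comes out near 0.44 mV for a 12-bit ADC with a 1.8 V reference, close to the ~0.46 mV quoted):

```python
def lsb_size(v_ref: float, n_bits: int) -> float:
    """Ideal LSB step of an N-bit converter spanning 0..v_ref volts."""
    return v_ref / (2 ** n_bits)

# 12-bit ADC, 1.8 V reference: one LSB is ~0.44 mV, so more than
# ~1 mV of stimulus noise would smear the input across several codes.
lsb_18 = lsb_size(1.8, 12)
# 12-bit span with the 2.5 V reference mentioned below: ~0.61 mV.
lsb_25 = lsb_size(2.5, 12)
```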

Regards,

Christoph
Message 3 of 6
From my colleague:
 
This answer will apply to most of the 12-bit E Series products (E-1, E-2, E-3, E-4, 602x). There are three main error sources that affect accuracy when an external reference is used: gain, linearity, and offset. The linearity of the DAC remains fixed regardless of the reference used. The offset also remains constant regardless of the reference, although a ±1 mV error on a ±10 V range is insignificant, while a ±1 mV error on a ±5 mV range is very noticeable.

The gain error, as a percentage, will also change with the external reference. When the board is calibrated, the gain error of the system is corrected; this includes errors from the DAC, the resistor networks, the buffers, and the internal reference. When an external reference is used, the signal routing is different, and a few buffers used with the internal reference are no longer in the path. The absence of the errors from these buffers is what contributes to the gain error.

If a large external reference is used, the error from these buffers may add up to several millivolts, which amounts to an error of a fraction of a percent. As the external reference value decreases, the error increases, because the denominator decreases (residual buffer error / external voltage).

It is possible to calibrate out the gain error with an external reference, but the user would have to know exactly what reference voltage is being used. Also, NI-DAQ does not do this; the user would have to write the routine himself. The other drawback is that the granularity of adjustment to the gain error remains independent of the external reference, so the board may have a rather large residual error after this calibration. For example, with an 8-bit caldac that can adjust the gain error by 0.2 mV per caldac LSB (a made-up number), an external reference of 10 mV would still leave a 2% (0.2 mV / 10 mV) error, which may be unacceptable. The granularity of adjustment varies by board.
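That last point can be sketched numerically. Using the made-up 0.2 mV caldac step from the explanation above, the residual gain error after an external-reference calibration scales inversely with the reference voltage:

```python
def residual_gain_error(caldac_step_v: float, v_ref_ext: float) -> float:
    """Smallest achievable gain-error fraction when trimming with a
    caldac whose adjustment granularity is caldac_step_v volts."""
    return caldac_step_v / v_ref_ext

# 0.2 mV caldac step (made-up number from the text):
err_10mv = residual_gain_error(0.0002, 0.010)  # 10 mV reference -> 2%
err_2v5  = residual_gain_error(0.0002, 2.5)    # 2.5 V reference -> 0.008%
```

The same fixed adjustment step that is harmless against a volt-scale reference dominates when the reference shrinks to millivolts.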
Message 4 of 6
Hi Johann,
Thanks for the fast explanation.

Christoph
Message 5 of 6
I'll add just a little to this discussion.

First, I'll point out that the PCI-6281 is really the right board for this application, rather than the PCI-6251, for the following reasons.
1. It has better accuracy than the 6251 for a given range.
2. It has more calibrated output ranges, including ±2V.
3. It has provision for output offsetting. This means you can connect your DUT ground to the APFI0 pin and add that voltage to the analog output voltage, which effectively refers the AO to your DUT ground.

It is true, however, that using anything other than the built-in ranges will result in a loss of accuracy, simply because any other range will not be calibrated.

One way to make the most of your 6251 in this application would be to use an analog input to measure (differentially) the voltage you are applying to your DUT. Even if your AO (analog output) is not producing exactly the right voltage, measuring that voltage lets your software account for the error. For example, produce a voltage sweep with the AO and measure it with both your DUT and an AI (analog input) channel of the 6251. Instead of determining the transfer function of the DUT with respect to the intended output voltage, determine it with respect to the voltage measured by the 6251 AI, which should be more accurate.
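A minimal sketch of that correction idea, with hypothetical numbers standing in for a real sweep (no NI-DAQmx calls; the lists represent values you would command to the AO, read back on the AI, and record from the DUT):

```python
# Hypothetical sweep data: the AO is commanded to these voltages...
commanded = [0.000, 0.450, 0.900, 1.350, 1.800]
# ...but the AI (measured differentially at the DUT input) reads these:
measured = [0.0012, 0.4508, 0.9011, 1.3504, 1.7998]
# ADC output codes recorded from the DUT at each step (made up):
dut_codes = [2, 1026, 2051, 3073, 4093]

# Build the transfer function against the MEASURED stimulus, so the
# AO's gain/offset error drops out of the DUT characterization:
transfer = list(zip(measured, dut_codes))
```

The pairing step is the whole trick: any fixed AO inaccuracy moves both the stimulus and its measurement together, so only the AI's (better) accuracy limits the characterization.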

HTH,
Chris
Message 6 of 6