07-17-2011 05:07 AM
Hi,
On the PCI-6229, the analog outputs are accurate to about 3.2 mV.
The range is 20 V (-10 V to +10 V) at 16 bits, meaning I can set voltage values with a resolution of about 0.3 mV.
Does that mean I should drop the last 3 bits?
Thanks in advance,
Omer Wagner
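The arithmetic behind the question can be sketched as follows. This is a minimal illustration using the numbers quoted in the thread (20 V range, 16 bits), not a call into any NI driver API:

```python
# Step size of a 16-bit DAC spanning -10 V to +10 V (values from this thread).
FULL_SCALE = 20.0           # volts, -10 V to +10 V
BITS = 16

lsb = FULL_SCALE / 2**BITS  # size of one code step, in volts
print(f"1 LSB = {lsb * 1e3:.4f} mV")  # prints "1 LSB = 0.3052 mV"
```

So the nominal step is about 0.3 mV, roughly a factor of ten finer than the 3.2 mV absolute accuracy figure, which is what prompts the question about dropping bits.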
07-18-2011 05:14 PM
Hi Omer,
Dropping the 3 bits won't provide any benefit. It will actually cause more processing and may even decrease your accuracy.
Allie
07-19-2011 03:30 AM
So,
does that mean the DAQ steps are finer than 3 mV, but the maximum error is 3 mV?
If, for example, I want to calibrate another piece of equipment, should I use 3 mV steps, or can I use finer steps?
07-19-2011 03:51 PM
Hi Omer,
It's true that the nominal resolution of the AO on the 6229 should be about 0.3 mV. The specifications guarantee a differential non-linearity (DNL) within ±1 LSB, so consecutive codes would actually be between about 0 and 0.6 mV from each other once this is taken into account.
The absolute accuracy spec is the maximum that you might deviate from the theoretical voltage output across the entire range of the device. It accounts for gain error, offset error, integral non-linearity (INL) and noise.
So, you can update the output in ~0.3 mV steps (±1 LSB), but the overall voltage might be up to 3.2 mV off from what you have specified. Calibration can reduce gain and offset error; note that the absolute accuracy spec assumes you are within 10 °C and 1 year of the last external calibration.
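The resolution-versus-accuracy distinction above can be sketched numerically. This sketch assumes simple straight-binary coding and ideal, uniform steps; it is not the 6229's actual code map or driver interface, and `nearest_code` is a hypothetical helper for illustration:

```python
# Quantize a requested voltage to the nearest ideal DAC code, then bound the
# total error: quantization (at most half an LSB) plus absolute accuracy.
V_MIN, V_MAX, BITS = -10.0, 10.0, 16
LSB = (V_MAX - V_MIN) / 2**BITS     # ~0.3052 mV per step
ABS_ACCURACY = 3.2e-3               # worst-case absolute error from the spec, volts

def nearest_code(v):
    """Nearest code for a requested voltage (illustrative straight-binary map)."""
    code = round((v - V_MIN) / LSB)
    return max(0, min(2**BITS - 1, code))

v_req = 1.2345                      # some requested output voltage
code = nearest_code(v_req)
v_ideal = V_MIN + code * LSB        # ideal output for that code
quant_err = abs(v_ideal - v_req)    # bounded by LSB/2, about 0.15 mV
worst_case = quant_err + ABS_ACCURACY  # quantization + absolute accuracy bound
```

The point of the sketch: you can command steps at the ~0.3 mV level, but any individual output may still sit up to ~3.2 mV from the ideal value, so for calibrating other equipment the absolute accuracy, not the step size, sets the uncertainty.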
Here is a link from Maxim that includes a glossary of various terms commonly used when characterizing ADCs and DACs.
Is that explanation helpful?
Best Regards,
07-20-2011 12:36 AM
Almost.
You wrote that calibration could show me regions where the accuracy might be better. How can I do this calibration?
07-20-2011 01:10 PM
Hi Omer,
NI publishes a calibration procedure if you want to perform calibration yourself. NI also offers Calibration Services.
Note that NI's published accuracy specs are valid for one year following calibration, which is the recommended calibration interval. NI does not provide accuracy specs for shorter calibration intervals, but the calibration procedure does give 24-hour limits, which should give you an idea of the amount of drift you might expect over time.
Best Regards,