08-17-2011 08:46 PM
Hi,
I am looking to use the NI USB-6008 to read differential analog input values in the millivolt range, within 0 to -1 V. In terms of accuracy, I am wondering whether there is a way to improve the reading to better than 0.5 mV, versus the 1.53 mV listed in the datasheet ("absolute accuracy at full scale, differential") for the ±1 V range. Is a simple calibration or linear fit sufficient, or is there more complicated math involved related to the circuitry inside the DAQ? I am hoping to be able to calibrate the voltage value somehow in the LabVIEW software, or even with some external circuitry. Any help is appreciated! Thanks!
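To make the question concrete, the kind of correction I had in mind is a two-point linear fit against a known reference, roughly like the sketch below (plain Python just to show the math, which I would reproduce in LabVIEW; all four reference/raw numbers are made up):

```python
# Two-point linear correction: measure two known reference voltages with a
# trusted meter (ref_lo, ref_hi) and record what the 6008 reports for them
# (raw_lo, raw_hi).  Every number used below is illustrative only.
def make_two_point_correction(ref_lo, raw_lo, ref_hi, raw_hi):
    """Return a function mapping a raw DAQ reading to a corrected voltage."""
    gain = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    offset = ref_lo - gain * raw_lo
    return lambda raw: gain * raw + offset

correct = make_two_point_correction(ref_lo=-0.9000, raw_lo=-0.9021,
                                     ref_hi=-0.1000, raw_hi=-0.1008)
print(correct(-0.5013))  # corrected estimate of a reading near -0.5 V
```

Would something like this get me below 0.5 mV, or does the hardware itself rule that out?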
08-18-2011 11:43 AM
The 6008 is only a 12-bit DAQ. Regardless of what range or signal conditioning you apply to the input, it's still only going to be 12-bit accurate.
If you need more accuracy, you'd need a better DAQ like a 621x with a 16-bit A/D.
You can choose to do some amplification or averaging to reduce the noise, but at the end of the day it's still a 12-bit measurement and that will factor into the measurement error.
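To put a number on it: one code of a 12-bit converter at the ±1 V range is about 0.49 mV, so a <0.5 mV target is already the size of a single step before gain, offset, and noise errors are counted. A quick sketch of that arithmetic, and of what averaging can and can't do (plain Python; the sample values at the end are made up):

```python
# Back-of-the-envelope numbers for a 12-bit converter at the +/-1 V range
# (nothing here is 6008-specific beyond the bit depth).
import statistics

FULL_SCALE_SPAN_V = 2.0      # the +/-1 V differential range spans 2 V
ADC_BITS = 12
lsb_v = FULL_SCALE_SPAN_V / 2 ** ADC_BITS
print(f"1 LSB at the +/-1 V range: {lsb_v * 1e3:.3f} mV")  # ~0.488 mV

# Averaging N readings shrinks *random* noise by roughly sqrt(N), but it
# does nothing for quantization, gain, or offset error.
def averaged_reading(samples):
    return statistics.mean(samples)

print(averaged_reading([-0.5012, -0.5009, -0.5011, -0.5010]))
```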
08-23-2011 05:57 PM
As it turns out, after some digging, the accuracy of the NI USB-6210 comes down to its design rather than just the size of its bit depth. As the datasheet shows, the absolute-accuracy figure is built up from factors such as temperature drift, noise, and gain nonlinearity.
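For anyone finding this later: as far as I can tell, the 621x spec sheets combine those factors as a gain term, an offset term, and a noise term summed together. A sketch of that calculation with made-up coefficients (these are placeholders, not values from the spec sheet):

```python
# Rough form of the absolute-accuracy calculation as I read the 621x spec:
# AbsoluteAccuracy = Reading*GainError + Range*OffsetError + NoiseUncertainty.
# All coefficient values below are illustrative placeholders.
import math

def absolute_accuracy(reading, vrange, gain_err_ppm, offset_err_ppm,
                      noise_uvrms, n_avg=100):
    gain_term = abs(reading) * gain_err_ppm * 1e-6
    offset_term = vrange * offset_err_ppm * 1e-6
    noise_term = 3 * noise_uvrms * 1e-6 / math.sqrt(n_avg)  # 3-sigma, averaged
    return gain_term + offset_term + noise_term

# Example: a -1 V reading on the +/-1 V range, made-up error coefficients.
print(absolute_accuracy(-1.0, 1.0, gain_err_ppm=100,
                        offset_err_ppm=30, noise_uvrms=120) * 1e3, "mV")
```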