05-23-2017 10:12 PM - edited 05-23-2017 10:15 PM
So I decided to revisit the ADC specs. The ADC should theoretically be able to resolve about +/- 5 mV of change in voltage, yet the rated accuracy is only 50 mV. How is this possible? I imagine it's because it references the power supply rather than an internal regulator or laser-trimmed voltage reference (I wish...). What confuses me is that when I measure a positive voltage the error is about +/- 45-55 mV, but when I flip the polarity it's accurate to within 5 mV (I used an AD584 to check)! What's the deal? Could I safely flip the polarity of the voltage and convert the reading back to positive if I ever wanted to use a sensor? In the positive direction I've gone past 10 V and the reading doesn't reach 9.995 V until the input is around 10.1 V; very inaccurate.
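For what it's worth, the +/- 5 mV figure comes from the converter's resolution, not its accuracy. A quick sketch of the arithmetic, assuming a 12-bit ADC over a +/- 10 V span (adjust to your device's actual input range):

```python
# Resolution vs. rated accuracy for a 12-bit ADC over a +/-10 V span
# (assumed range; change to match your device's actual input range).
full_scale_span = 20.0           # volts, -10 V to +10 V
bits = 12
lsb = full_scale_span / 2**bits  # smallest distinguishable step

print(f"1 LSB = {lsb * 1000:.2f} mV")  # ~4.88 mV -> the "+/- 5 mV" figure
print("Rated accuracy = +/- 50 mV")    # dominated by reference, offset, and
                                       # gain error, not by the step size
```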
Is there any way to use an external voltage as a reference and trim the other analog inputs against it? I know some of the more expensive NI DAQs have an EXTREF input.
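In the meantime, if the error is repeatable, one workaround is a software trim: measure a known external reference (such as the AD584) at two points and correct all subsequent readings with the resulting gain/offset. A minimal sketch; `read_adc` would be whatever driver call your board uses, and the numbers below are illustrative, not measured data:

```python
# Two-point software trim against an external reference (e.g. an AD584).
# The numbers below are illustrative, not measured data.

def make_trim(raw_lo, ref_lo, raw_hi, ref_hi):
    """Return a function mapping raw ADC readings onto the
    reference-defined scale via a linear gain/offset correction."""
    gain = (ref_hi - ref_lo) / (raw_hi - raw_lo)
    offset = ref_lo - gain * raw_lo
    return lambda raw: gain * raw + offset

# Example: the board read 2.547 V and 9.105 V when the AD584 supplied
# 2.500 V and 9.000 V (illustrative values).
trim = make_trim(2.547, 2.500, 9.105, 9.000)
print(trim(5.000))  # corrected estimate of a mid-range reading
```

This only helps with gain and offset error, of course; it can't buy back resolution or remove noise.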
I wish the ADC were more than 12 bits; it would be awesome for in-situ measurements like temperature or voltage alongside more expensive characterization machines in the materials science world.
05-31-2017 09:50 AM - last edited on 03-19-2025 04:02 PM by Content Cleaner
Hello,
I am assuming, based on the fact that you are going to 10 V, that you are talking about the MSP connector inputs, in which case the absolute accuracy is actually +/- 200 mV. It may help to take a look at this document, which explains the difference between resolution and absolute accuracy:
Specifications Explained: NI Multifunction I/O (MIO) DAQ
https://www.ni.com/en/support/documentation/supplemental/16/specifications-explained--ni-multifuncti...
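In short, absolute accuracy is built up from gain error, offset error, and noise terms rather than from the converter's step size. A rough sketch of that calculation; the coefficient values below are placeholders for illustration, not your device's datasheet specs:

```python
# Rough shape of an absolute-accuracy calculation for a DAQ input.
# Coefficient values are placeholders, not taken from any datasheet.
def absolute_accuracy(reading, range_v,
                      gain_error_ppm, offset_error_ppm, noise_uncertainty_v):
    gain_term = reading * gain_error_ppm / 1e6    # scales with the reading
    offset_term = range_v * offset_error_ppm / 1e6  # scales with the range
    return gain_term + offset_term + noise_uncertainty_v

# e.g. a 10 V reading on a 10 V range with illustrative coefficients:
print(absolute_accuracy(10.0, 10.0, 1000, 500, 0.001))  # worst-case volts
```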
I am curious about the difference you are seeing between positive and negative voltages. Do you see this always, or just at the extremities of the voltage range?