01-29-2009 12:43 PM
I am not getting the volts per bit that I expect from my PCI-6221 board.
I'm running the "Test Panel" available in Measurement & Automation Explorer v4.3. I have an NI-6221 multifunction board with a BNC-2090 interface box installed. When I run the "analog input" test, with differential input and ai0 connected to ai8, I expect 0 volts. The min/max input values are at the defaults of -10 V and +10 V. The graph shows zero ±1 to 2 bits of noise. See screenshot. The actual amplitudes reported are +0.000468, +0.000144, -0.000180, and -0.000504 V. This corresponds to 0.000324 V/bit. I expect 20 V/(2^16 bits) = 20 V/65536 bits = 0.000360 V/bit. What accounts for this 10% discrepancy? Thanks.
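In case it helps, here is the quick check I did to get the 0.000324 V/bit figure from the reported levels (a small Python sketch; the four values are the ones listed above):

# The four amplitude levels reported by the Test Panel, in volts
levels = [0.000468, 0.000144, -0.000180, -0.000504]

# Spacing between adjacent levels = effective volts per bit
steps = [a - b for a, b in zip(levels, levels[1:])]
print(steps)                    # each step is ~0.000324 V
print(sum(steps) / len(steps))  # ~0.000324 V per code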
Bill
01-29-2009 05:47 PM - edited 01-29-2009 05:48 PM
Hey Bill,
I can see a couple of things. With M-series, realize that the data returned from the ADCs are not linear, so the V/bit will vary across the range of the acquisition. We use MCal to correct for the nonlinearity; that may account for the discrepancy right there. Also, note that the actual input range is slightly larger than the nominal one: in order to calibrate across the full desired range, some of the codes fall outside the ±10 V span, generally by about 5%. This would actually push the V/bit up. It is mentioned in the M-series user manual in the Analog Input Range section. All of this is factored into the absolute accuracy specifications, so they are still valid.
Finally, when I calculate 20/65536 I get 0.0003052, which, when multiplied by 1.05, gives 0.0003204 - that looks much better. You may have swapped the 6 and 5, or Windows Calculator is fibbing to me yet again.
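To spell the arithmetic out (a quick Python check; the 1.05 factor is the roughly 5% over-range mentioned above):

# Ideal code width for a 20 V span on a 16-bit converter
span_volts = 20.0
codes = 2 ** 16
nominal_lsb = span_volts / codes       # 0.00030517578125 V per code

# M-series ranges extend roughly 5% past the nominal span for calibration
# headroom, which stretches the effective volts per code by about that amount
extended_lsb = nominal_lsb * 1.05      # ~0.0003204 V per code

# Step size observed in the Test Panel
observed_lsb = 0.000324
print(nominal_lsb, extended_lsb, observed_lsb)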
So this brings me to my question: are you just noticing the difference between theoretical and measured values and wondering what is up, or were you planning on using this info? If you're looking at reading binary (a common practice that results in questions like yours), take a look at this KB when you get to scaling to calibrated data.
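The idea behind scaling to calibrated data is that the raw codes go through a scaling polynomial rather than a single gain factor. As a rough sketch (the coefficient values below are made-up placeholders, not from any particular board; in practice you would read the board's own coefficients, e.g. via the DAQmx AI.DevScalingCoeff property, rather than hard-coding them):

# Convert raw ADC codes to calibrated volts with a scaling polynomial.
# NOTE: these coefficients are placeholders for illustration only; real values
# come from the board itself (e.g. the DAQmx AI.DevScalingCoeff property)
# and differ from device to device and range to range.
coeffs = [4.7e-4, 3.2e-4, 5.0e-12, 2.0e-17]   # volts = c0 + c1*x + c2*x^2 + c3*x^3

def raw_to_volts(code, c):
    # Evaluate the polynomial at the raw (signed 16-bit) code value
    return sum(ck * code ** k for k, ck in enumerate(c))

# Codes near mid-scale map to readings near 0 V, but the step between adjacent
# codes is set by the polynomial, not by a fixed 20 V / 65536 ratio
for raw in (-2, -1, 0, 1, 2):
    print(raw, raw_to_volts(raw, coeffs))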
Hope this helps,
Andrew S
01-29-2009 08:54 PM