Multifunction DAQ


Understanding & Improving Absolute Accuracy

Solved!

Many of the Absolute Accuracy figures calculated in the user manuals for C-Series modules use the following inputs when calculating Absolute Accuracy:

  • TempChangeFromLastExternalCal = 70 °C
  • TempChangeFromLastInternalCal = 1 °C
  • Sigma/Std-dev = 3
  • # of samples = 100

I have a few questions regarding the two TempChange values listed above:

  1. Why is TempChangeFromLastInternalCal set to such a low value of only 1 °C? Are the modules continually performing internal calibration automatically?
  2. Regarding TempChangeFromLastExternalCal, am I reading it correctly that NI calculates the accuracy assuming the hardware has seen a ±70 °C temperature delta since it was last externally calibrated; i.e. calibrated at 25 °C but operating at 95 °C?

Regarding improving Absolute Accuracy, I'm curious whether the following two approaches would help:

  1. The hardware lives in a temperature-controlled environment with a ±10 °C delta from room temperature. Can I therefore use a value of 10 for TempChangeFromLastExternalCal if the module was externally calibrated at room temperature?
  2. If I short both the AI+ and AI- terminals of an unused channel to COM and subtract the resulting measurement from all other channels' readings, have I essentially eliminated the offset error from the Absolute Accuracy formula?

For completeness, the formulas given for Absolute Accuracy are:

 

Absolute Accuracy = ±(VoltageReading*GainError + VoltageRange*OffsetError + NoiseUncertainty)
GainError = ResidualAIGainError + GainTempco*TempChangeFromLastInternalCal + ReferenceTempco*TempChangeFromLastExternalCal
OffsetError = ResidualAIOffsetError + OffsetTempco*TempChangeFromLastInternalCal + INL_Error
NoiseUncertainty = (RandomNoise*σ)/√(# of samples)
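The formulas above are straightforward to evaluate numerically. Here is a minimal Python sketch, assuming the gain and offset terms are given in ppm (of reading and of range, respectively), tempcos in ppm/°C, and RandomNoise in volts RMS; all of this is an illustrative reading of the formulas, not an official NI calculator:

```python
import math

def absolute_accuracy(voltage_reading, voltage_range,
                      residual_gain_ppm, gain_tempco_ppm, ref_tempco_ppm,
                      temp_change_internal, temp_change_external,
                      residual_offset_ppm, offset_tempco_ppm, inl_ppm,
                      random_noise_vrms, sigma=3, n_samples=100):
    """Worst-case Absolute Accuracy (± band, in volts) per the formulas above.

    Gain/offset terms are in ppm, tempcos in ppm/degC, noise in V rms.
    """
    # GainError = ResidualAIGainError + GainTempco*dT_int + ReferenceTempco*dT_ext
    gain_error_ppm = (residual_gain_ppm
                      + gain_tempco_ppm * temp_change_internal
                      + ref_tempco_ppm * temp_change_external)
    # OffsetError = ResidualAIOffsetError + OffsetTempco*dT_int + INL_Error
    offset_error_ppm = (residual_offset_ppm
                        + offset_tempco_ppm * temp_change_internal
                        + inl_ppm)
    # NoiseUncertainty = RandomNoise*sigma / sqrt(n_samples)
    noise_uncertainty = random_noise_vrms * sigma / math.sqrt(n_samples)
    return (voltage_reading * gain_error_ppm * 1e-6
            + voltage_range * offset_error_ppm * 1e-6
            + noise_uncertainty)
```

Plugging in your module's spec-sheet constants, along with the default deltas (1 °C internal, 70 °C external, 3 sigma, 100 samples), should reproduce the accuracy-table numbers.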


Message 1 of 6

Hey Sean,

 

Those numbers come from how R&D specs the device and the conditions we expect it to have seen since it was last calibrated, both internally and externally. These are also the conditions you need to stay within in order to use the accuracy summary and accuracy details tables.

 

To answer your questions regarding the TempChange values:

 

1. TempChangeFromLastInternalCal is set to such a low value because that is the condition we expect the device to be in when these calculations are done. The device does not automatically perform internal calibration, but in order to get the numbers in the charts, you need to be within 1 °C of the temperature at the last internal calibration.

 

2. For TempChangeFromLastExternalCal, NI is saying that the device should be within 70 °C of the temperature at the last external calibration.

 

For your second set of questions:

 

1. You can use a delta of 10 °C in the calculation to find the maximum error you will see. This error will obviously change based on how far the current temperature is from the temperature at the last calibration; the temperature differential since the last calibration is one of the largest factors affecting accuracy.

 

2. Shorting the AI+ and AI- terminals to COM will give you a 0 V measurement (at least it should in theory), so subtracting it from the other measurements will not change their values. The OffsetError you calculate is a property of the ADC, the amplifier, and other circuit components, and it changes based on a number of factors (listed in the table).

 

If you have any more questions or if anything is unclear, let me know and I will do my best to explain further. 

 

-KP

Kurt P
Automated Test Software R&D
Message 2 of 6

Hi Kurt,

 

Thanks for the reply. I do have some follow-up questions.

 

For each question, assume the following scenarios which happen in chronological order.

 

I have an NI 9206 C-Series module that was just sent to the metrology lab, where it was both internally and externally calibrated at 25 °C.

  • If I wanted to calculate the Absolute Accuracy of this module immediately after the internal & external calibration was done, would it be correct to set both the internal & external TempChange values to 1 °C?

Now let's assume the module was left overnight in an uncontrolled environment where the room saw a temperature delta of 10 °C.

  • If I again wanted to calculate Absolute Accuracy, would I need to set both internal and external TempChange values to 10 °C?

Finally, let's say I perform an internal calibration on the module after the 10 °C temperature delta it saw overnight.

  • To correctly calculate Absolute Accuracy, would I set just the internal TempChange value to 1 °C and leave the external TempChange value at 10 °C?

And for my last question: other than limiting the temperature delta and/or increasing the number of readings, are there any other tricks to improve Absolute Accuracy?

 

The reason I ask is that I have a requirement to read a battery cell voltage over a 0–5 V range with an accuracy of no worse than ±2 mV; additionally, the battery cells can have a common-mode voltage in the 100 V range. I'd love to use the NI 9206, which is made exactly for this purpose, but its Absolute Accuracy for the 5 V range is spec'd at ±3.2 mV, which is just out of reach of my requirement. From my calculations, with 3-sigma noise coverage and averaging 100 readings, I would need to keep the temperature delta within ±2.5 °C of the last calibration (both internal and external) to get the accuracy down to ±2 mV, and that is a tall order. Do you have any suggestions on how I can make the 9206 better fit my needs?
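As a way to sanity-check a figure like the ±2.5 °C above, the accuracy formula can be inverted for the largest common temperature delta (internal = external) that still meets a target. A sketch with placeholder spec constants, not the 9206's actual numbers:

```python
import math

def max_temp_change(target_v, reading_v, range_v,
                    residual_gain_ppm, gain_tempco_ppm, ref_tempco_ppm,
                    residual_offset_ppm, offset_tempco_ppm, inl_ppm,
                    random_noise_vrms, sigma=3, n_samples=100):
    """Largest dT (assuming dT_internal == dT_external == dT) that keeps the
    Absolute Accuracy band within target_v volts."""
    # Temperature-independent error contributions, in volts
    fixed_v = (reading_v * residual_gain_ppm * 1e-6
               + range_v * (residual_offset_ppm + inl_ppm) * 1e-6
               + random_noise_vrms * sigma / math.sqrt(n_samples))
    # Error added per degC of drift (gain + reference + offset tempcos)
    per_deg_v = (reading_v * (gain_tempco_ppm + ref_tempco_ppm) * 1e-6
                 + range_v * offset_tempco_ppm * 1e-6)
    return (target_v - fixed_v) / per_deg_v
```

A negative result would mean the target is unreachable even at zero drift, i.e. the fixed residual and noise terms alone already exceed the requirement.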

 

Thanks


Message 3 of 6
Solution
Accepted by topic author SeanDonner

Hey Sean,

 

Temperature change refers to the difference between the temperature at the last time you did each of those two calibrations and the current temperature of the device. So if you did internal and external calibration at 25 °C, you find the difference between the current temperature and 25 °C. External calibration is valid for two years (according to the spec sheet). To answer your questions:

 

1) If you both internally and externally calibrated at 25 °C, then you would set both TempChange values to 1 °C if the environment the module was in was 24 or 26 °C.

 

2) If the room was 10 °C when the measurement was taken, both the internal and external TempChange values would be 15 °C (because your calibrations were done at 25 °C).

 

3) If you then did an internal calibration at 10 °C, your internal TempChange would be 0 °C and your external TempChange would be 15 °C.
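The three answers above are just differences against the temperature at the relevant calibration; a tiny helper makes the bookkeeping explicit (a sketch, not an NI API):

```python
def temp_changes(current_c, internal_cal_c, external_cal_c):
    """TempChange values in degC: |current - temperature at each calibration|."""
    return (abs(current_c - internal_cal_c),   # TempChangeFromLastInternalCal
            abs(current_c - external_cal_c))   # TempChangeFromLastExternalCal

# Scenario 2: both calibrations done at 25 degC, device now at 10 degC
# Scenario 3: internal calibration redone at 10 degC, external still 25 degC
```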

 

As for tricks, I don't have too many. The best I can suggest is to use a lower range, with some circuit that scales the signal down into it, but I don't think that would be viable for your application. Beyond that, try to self-calibrate as close as you can to a consistent operating temperature.

 

I found a VI that calculates the Absolute Accuracy of a device. You have to enter a lot of values, but it lets you play around with them to see what you can afford. My calculations show you within your range, but you need to be absolutely sure of those temperature fluctuations.

 

https://decibel.ni.com/content/docs/DOC-9912

 

Attached is a picture of the values I used to get that error. I am pretty sure the devices are calibrated at 23 °C. Here is the manual where I found the numbers I used:

 

http://www.ni.com/pdf/manuals/374231c.pdf

 

Sorry for some of the confusion earlier. I hope this post clears things up; if it doesn't, please let me know.

 

-KP

Kurt P
Automated Test Software R&D
Message 4 of 6

Hi Kurt,

 

Thank you for your clarifications; the picture in my mind is much clearer now.

 

Regarding internally calibrating the module as close as possible to its nominal operating temperature, I'm curious whether this is possible when the C-Series module is in a CompactRIO and/or MXI-Express RIO chassis. Is there a LabVIEW FPGA interface to internally calibrate the modules? If not, could I do it via the host VI that communicates with the FPGA? I'd like each reboot of the RIO chassis to trigger an internal calibration of all my voltage-monitor cards.

 

Thanks


Message 5 of 6

Hey Sean,

 

To be honest, I am not familiar with the LabVIEW FPGA interface or the CompactRIO / MXI-Express chassis. I can tell you that an internal calibration rewrites certain values to the card's EEPROM. The EEPROM only has a certain number of write cycles before the memory breaks down, so if you do this too often, your module may fail unexpectedly.

 

I would suggest posting new threads to the CompactRIO and MXI-Express forums asking how that would be implemented.

 

Best of Luck!

 

-KP

Kurt P
Automated Test Software R&D
Message 6 of 6