LabVIEW


DAQ Assistant: Setting "signal input range" outside calibration range

I have a question concerning the calibration and measurement of thermocouples using the DAQ assistant.

 

I want to measure cryogenic temperatures (liquid nitrogen, ~-196°C).

 

To calibrate the thermocouples, I used a two-point calibration (ice water, 0°C, and liquid nitrogen, -196°C).

 

Now I cannot set my "signal input" minimum below -196°C. This causes an issue, since I cannot measure below -196°C; in particular, I cannot see the "negative" (below -196°C) noise of the signal while in liquid nitrogen.

I can set the minimum signal input to a lower temperature if I disable the calibration, but then the measurement is highly inaccurate.

 

The error I receive is Error -200077 (explained here: https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000000P7gySAC).

 

Is there an option to let LabVIEW extrapolate beyond the calibration range?

 

I couldn't find anything online, so I appreciate any help. Thank you in advance for your answers.

 

Message 1 of 4

Hi Max,

 

Where (and how) do you apply the "calibration" data in your DAQ Assistant?

 

I wouldn't apply scaling to a thermocouple reading - at least not within DAQmx.

If needed, I would apply the scaling to the temperature reading I get from DAQmx in a second step in the VI:

[Attachment: check.png - LabVIEW snippet applying the scaling after the DAQmx read]

The constant in the snippet is suitable for a perfect sensor without measurement errors, but you can apply as many coefficients as you like…
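Since LabVIEW code is graphical, the post-read two-point correction described above can only be sketched here in text form; the following Python is purely an illustration (the function name and the example sensor readings are hypothetical, not taken from this thread):

```python
def make_two_point_correction(meas_lo, meas_hi, ref_lo=-196.0, ref_hi=0.0):
    """Build a linear correction for raw DAQmx temperature readings.

    meas_lo / meas_hi: what the factory-scaled channel actually reads
    at the two reference baths (liquid nitrogen, ice water).
    ref_lo / ref_hi: the known true temperatures of those baths.
    """
    gain = (ref_hi - ref_lo) / (meas_hi - meas_lo)
    offset = ref_lo - gain * meas_lo
    # A plain linear function, so it extrapolates freely below -196°C
    return lambda t: gain * t + offset

# Hypothetical readings: sensor shows -192.0 in LN2 and +1.5 in ice water
correct = make_two_point_correction(meas_lo=-192.0, meas_hi=1.5)
ln2 = correct(-192.0)   # maps back to -196°C (within float rounding)
ice = correct(1.5)      # maps back to 0°C (within float rounding)
```

Because the correction happens after DAQmx returns the reading, the task keeps its factory thermocouple scaling and its normal input range, and noise below the LN2 reference point is still visible in the corrected data.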

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 4

Hi Gerd,

Thank you for your reply.

 

To answer your questions:

Where: I apply my calibration data / calibrate directly in the DAQ Assistant. The same approach, just in NI MAX, is described here: https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000000P80ZSAS

 

How: I place my sensors in ice water and liquid nitrogen, whose temperatures I know (0°C and -196°C).

 

So I do not generate DAQmx code from the DAQ Assistant and alter it.

 

I would have assumed that LabVIEW has the full dataset (e.g. https://www.thermocoupleinfo.com/type-k-thermocouple.htm ) for each thermocouple type (in my case type K) and uses the two calibration points to fit the characteristic thermocouple curve to each specific sensor. The mV/°C slope of each thermocouple type is known, so there should only be small differences/offsets due to the junction, right?

 

I understand your suggestion as follows: use the internal calibration / NI factory setting for the thermocouple type and read out the temperature. In a second step, apply a scaling function to calculate the actual temperature. Is that correct?

Message 3 of 4

Hi Max,

 

I would have assumed that LabVIEW has the full dataset (e.g. https://www.thermocoupleinfo.com/type-k-thermocouple.htm ) for each thermocouple type (in my case type K) and uses the two calibration points to fit the characteristic thermocouple curve to each specific sensor.

So maybe your assumption is wrong? 😄

When you apply a new calibration, you are replacing the known conversion curve - that's my assumption…

 

I understand your suggestion as follows: use the internal calibration / NI factory setting for the thermocouple type and read out the temperature. In a second step, apply a scaling function to calculate the actual temperature. Is that correct?

Yes. Either do as I have depicted above - or follow the last sentence in the KB entry you linked to:

This type of calibration can be done programmatically in LabVIEW by taking measurements at a known reference level and creating a custom scale. For more information on custom scales, please see the Related Links section below.
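A DAQmx linear custom scale is configured with a slope and a y-intercept; the arithmetic for deriving those two parameters from two reference measurements can be sketched as follows (Python used only for illustration, and the readings below are hypothetical):

```python
def linear_scale_params(point1, point2):
    """Slope and y-intercept of a linear custom scale,
    scaled = slope * prescaled + y_intercept, computed from two
    (prescaled_reading, true_value) calibration pairs.
    """
    (x1, y1), (x2, y2) = point1, point2
    slope = (y2 - y1) / (x2 - x1)
    y_intercept = y1 - slope * x1
    return slope, y_intercept

# Hypothetical: channel reads +1.5 in ice water (true 0°C)
# and -192.0 in liquid nitrogen (true -196°C)
slope, intercept = linear_scale_params((1.5, 0.0), (-192.0, -196.0))
```

These two numbers would then be entered as the slope and y-intercept of the linear custom scale; being a straight line, the scale also maps readings beyond the two calibration points.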

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 4 of 4