Calibration of pressure transducer in LabView

Solved!
Hey all, I wonder if it is possible to calibrate a pressure transducer on its own in LabVIEW? If so, any suggestions on how to proceed? I'm thinking I could collect data at 0 mmHg and 300 mmHg and then determine a linear function (y = kx + m) from that data, but I feel there should be a better way to do it.

Additional info:
- The transducer (Vedette DPT-6000) is connected to an NI-9237, which in turn is connected to a cRIO-9056.
- I'm using LabVIEW 21.0 (32-bit).
- I'm reading the data via DAQmx.

Thank you in advance.

Kind regards,
Hossein
Message 1 of 5

The specific type of calibration depends on the sensor itself. Typically, people have the sensor calibrated by an external institution and use the data from the calibration report.

 

If you know that your sensor is linear, you can use the 0 mmHg point to measure the offset error and the 300 mmHg point to measure the gain error, and apply them in your y = kx + m (this is known as the slope-intercept calibration method).

 

You can then use the k and m terms as part of a custom scale when creating the DAQmx task; every measurement returned by DAQmx Read will then be automatically converted to mmHg, corrected with the k and m factors.
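As an illustration, a minimal sketch of the two-point (slope-intercept) calculation. It is written in Python since LabVIEW code cannot be shown as text, and the raw readings are made-up example values, not data from a real DPT-6000:

```python
def two_point_cal(raw_lo, raw_hi, ref_lo=0.0, ref_hi=300.0):
    """Return (k, m) so that pressure = k * raw + m passes through both reference points."""
    k = (ref_hi - ref_lo) / (raw_hi - raw_lo)  # gain (slope)
    m = ref_lo - k * raw_lo                    # offset (intercept)
    return k, m

# Hypothetical averaged bridge readings at the two reference pressures
k, m = two_point_cal(raw_lo=0.012, raw_hi=1.512)

# Convert a new raw reading to pressure
pressure = k * 0.762 + m  # ≈ 150.0 mmHg
```

In LabVIEW, the resulting k and m would go into the slope and y-intercept of a linear custom scale assigned to the DAQmx task, as described above.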

Santhosh
Soliton Technologies

New to the forum? Please read community guidelines and how to ask smart questions

Only two ways to appreciate someone who spent their free time to reply/answer your question - give them Kudos or mark their reply as the answer/solution.

Finding it hard to source NI hardware? Try NI Trading Post
Message 2 of 5

@HosseinBakhtiari wrote:
Hey all, I wonder if it is possible to calibrate a pressure transducer alone in LabVIEW?

Well, you need an external reference or known source of pressure 😉 😄

 

If you can apply more than just two points, you gain confidence in your assumption that a linear approximation is valid, and can then use only two points later.

I remember that we used at least 10 points and fitted a square (quadratic) function to reach the required uncertainty... it depends on your needs and the equipment you have available.
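A sketch of that multi-point approach (again in Python, using synthetic data in place of real NI-9237 readings; in practice the reference pressures come from your calibration source):

```python
import numpy as np

# Hypothetical calibration points: 10 raw bridge readings and the reference
# pressures applied at each one (synthetic data with a small quadratic term)
raw = np.linspace(0.0, 1.5, 10)            # e.g. mV/V from the NI-9237
pressure_ref = 200.0 * raw + 4.0 * raw**2  # made-up reference pressures (mmHg)

# Fit a quadratic ("square function") mapping raw reading -> pressure
coeffs = np.polyfit(raw, pressure_ref, deg=2)
cal = np.poly1d(coeffs)

# Residuals at the calibration points indicate whether the model is adequate
max_residual = np.max(np.abs(pressure_ref - cal(raw)))
```

Comparing the residuals of a linear fit against a quadratic fit on the same points is one way to decide whether the two-point linear method is good enough for your uncertainty budget.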

Greetings from Germany
Henrik

LV since v3.1

“ground” is a convenient fantasy

'˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'


Message 3 of 5
Solution
Accepted by topic author HosseinBakhtiari

Sounds like you need to "scale" your incoming data. I couldn't find where I originally downloaded it, but I have attached a LabVIEW scaling example I found a while back.

 

***For some reason I'm unable to attach any files to the forum right now. (Anyone else having this problem?)

 

The example is called LabVIEW Scale Generator (2013).zip

 

 

***Edit*** After clearing Chrome's cache I was able to post correctly and upload files again.  

---------------------
Patrick Allen: FunctionalityUnlimited.ca
Message 4 of 5

Thank you for all your responses!

 

Kind regards,

Hossein

Message 5 of 5