Pressure transducer Scaling Error

Hi everybody,

I am facing an issue with my pressure sensor scaling.

The hardware I am using is a USB-6001, a 12 V DC power supply, and a 3-wire Hydac pressure sensor with:

Supply: 7-12 VDC

Signal: 0.5-4.5 V

Ground

Range: 0-400 bar

I have tried both differential and RSE input configurations.

I have used DAQmx with a custom scale to convert voltage to bar, but the readings in LabVIEW differ from my pressure gauge by about 10 bar. For example, if my pressure gauge shows 50 bar, my LabVIEW channel reads 38-40 bar. There is also a lot of fluctuation.

I need help fixing this scaling issue; my slope and offset are m = 100 and b = -50.
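For reference, the nominal slope and intercept follow directly from the sensor specs quoted above (0.5-4.5 V over 0-400 bar). A minimal sketch of the arithmetic in plain Python (LabVIEW itself is graphical, so this is only an illustration of the scale, not the actual VI):

```python
# Derive the nominal linear scale from the Hydac sensor's datasheet values:
# 0.5 V corresponds to 0 bar, 4.5 V corresponds to 400 bar.

def linear_scale(v_low, p_low, v_high, p_high):
    """Return (slope, intercept) for pressure = slope * volts + intercept."""
    slope = (p_high - p_low) / (v_high - v_low)
    intercept = p_low - slope * v_low
    return slope, intercept

m, b = linear_scale(0.5, 0.0, 4.5, 400.0)
print(m, b)  # -> 100.0 -50.0
```

So m = 100 and b = -50 are correct for an ideal sensor; a consistent 10 bar discrepancy points at calibration or wiring, not the arithmetic.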

Message 1 of 20

You have a sensor (a pressure gauge) that provides an output of its (true?) reading.  You have DAQmx which gets values from the Sensor (in volts?).  You have a method to convert voltage to Bar (pressure), and have assumed that the sensor is linear and has slope of 100 and intercept of -50.

 

And the readings are inconsistent.  At least you are testing your sensor!

 

This sounds like a question of Calibration.  Here are some questions you should ask yourself (and answer):

  • Do you have a way of producing a known input (Pressure) so that you can test the reading that the Sensor gives you (which seems to have an independent readout of Pressure, in addition to its Voltage output)?  This could tell you if you can believe the Sensor's Pressure output.
  • Do you "know" (or, better, can you test) that over the Pressure Range you will be using, the Voltage varies linearly with Pressure?
  • You should make a series of Pressure/Voltage measurements that span the range of Pressures you will be using, and plot the data.  See if it is a good approximation to a straight line.  Determine its Slope and Y Intercept, and see how well they compare with 100 and -50.  This step is called "Calibration".
  • Always believe your own Calibration!!!
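The calibration step in the bullets above amounts to a least-squares line fit. A sketch, assuming you have collected (voltage, pressure) pairs from an unscaled voltage task and your gauge; the sample data below is a hypothetical placeholder:

```python
# Least-squares fit of pressure vs. voltage calibration data.
# The (volts, bar) pairs are hypothetical -- replace them with readings
# from your own gauge and an unscaled DAQmx voltage channel.

def fit_line(volts, bars):
    """Return (slope, intercept) of the best-fit line pressure = slope*volts + intercept."""
    n = len(volts)
    mean_v = sum(volts) / n
    mean_p = sum(bars) / n
    sxx = sum((v - mean_v) ** 2 for v in volts)
    sxy = sum((v - mean_v) * (p - mean_p) for v, p in zip(volts, bars))
    slope = sxy / sxx
    intercept = mean_p - slope * mean_v
    return slope, intercept

volts = [0.5, 1.5, 2.5, 3.5, 4.5]          # measured sensor output
bars  = [0.0, 100.0, 200.0, 300.0, 400.0]  # readings from the reference gauge
m, b = fit_line(volts, bars)
print(m, b)  # compare with the nominal m = 100, b = -50
```

Also plot the residuals of the fit: if they show a systematic curve rather than random scatter, the sensor is not behaving linearly over your range.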

A sad story -- some BME Masters-level students wanted to make measurements with a triaxial accelerometer that output voltage, with an offset specified as 1.5v ± 10% and a sensitivity of 0.3 v/g ±10%.  They knew that if they took the square root of the sum of the squared X, Y, and Z accelerations, they should get 1g (1.5v), and it should be independent of the orientation of the accelerometer.  It wasn't (by about 10%)!  Why not?  [Look at that huge variation in the scale factors ...].

 

Bob Schor

Message 2 of 20

Thanks, Bob, for replying.

Answers to the above questions:

a.) I have a hydraulic setup with a digital/analog pressure gauge connected to my system, so I can compare against what I get in LabVIEW.

b.) I am using a Hydac HDA 4400 pressure sensor, whose datasheet specifies linearity within ±0.5% accuracy.

c.) Even if I take pressure and voltage measurements, how can I do so, since the channel will apply the scaling I have already provided?

I am new to LabVIEW, so I may be wrong somewhere; please help me with that.

 

Thanks

Ankit

Message 3 of 20

First, take the data without scaling, i.e. get the answer in Volts, not in "what I hope are Bars".  Take a simple example -- you put in a pressure of 0, get a voltage of 2v.  Put in 1 Bar, get a voltage of 10v.  Two points determine a line, so your slope is (1-0)/(10-2) = 0.125 Bar/v, and the Y Intercept will be -0.25 Bar.

 

Now set up the DAQ device with this Scaling factor and repeat the measurement.  When you put in a pressure that gives you 2 v, this corresponds to a Pressure of -0.25 + 0.125 * 2 = 0 Bar, which is what you should read on your Pressure Gauge.  Now pump it up to 10 v, and see if the Pressure is -0.25 + 0.125 * 10 = 1.25 - 0.25 = 1 Bar, as it should be.  If so, you have successfully experimentally determined the two Calibration Constants for a linear Scale.
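The two-point arithmetic above can be sketched in plain Python, for anyone who wants to check the numbers:

```python
# Two-point calibration from the example: 2 V reads 0 bar, 10 V reads 1 bar.

def two_point(v1, p1, v2, p2):
    """Return (slope, intercept) of the line through two (volts, bar) points."""
    slope = (p2 - p1) / (v2 - v1)
    intercept = p1 - slope * v1
    return slope, intercept

m, b = two_point(2.0, 0.0, 10.0, 1.0)
print(m, b)        # -> 0.125 -0.25  (0.125 Bar/V, -0.25 Bar)
print(m * 10 + b)  # -> 1.0  (10 V maps back to 1 Bar, as it should)
```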

 

The Slope and Offset have to be "consistent with reality" -- if you trust your measurement of voltage and pressure, then you can use it to determine Slope and Intercept.  If you trust Slope and Intercept (i.e. the Calibration), then if you don't get 0 when you have no pressure, you "can't believe your own eyes" and should give up being an Engineer!

 

Bob Schor 

Message 4 of 20

@Bob_Schor wrote:

First, take the data without scaling, i.e. get the answer in Volts, not in "what I hope are Bars".  Take a simple example -- you put in a pressure of 0, get a voltage of 2v.  Put in 1 Bar, get a voltage of 10v.  Two points determine a line, so your slope is (1-0)/(10-2) = 0.125 Bar/v, and the Y Intercept will be -0.25 Bar.

 

Now set up the DAQ device with this Scaling factor and repeat the measurement.  When you put in a pressure that gives you 2 v, this corresponds to a Pressure of -0.25 + 0.125 * 2 = 0 Bar, which is what you should read on your Pressure Gauge.  Now pump it up to 10 v, and see if the Pressure is -0.25 + 0.125 * 10 = 1.25 - 0.25 = 1 Bar, as it should be.  If so, you have successfully experimentally determined the two Calibration Constants for a linear Scale.

 

The Slope and Offset have to be "consistent with reality" -- if you trust your measurement of voltage and pressure, then you can use it to determine Slope and Intercept.  If you trust Slope and Intercept (i.e. the Calibration), then if you don't get 0 when you have no pressure, you "can't believe your own eyes" and should give up being an Engineer!

 

Bob Schor 


Note that you do all of this in NI-MAX, not LabVIEW. 

Message 5 of 20

Thanks a lot, Bob, for your help.

If I am working with DAQmx in LabVIEW, will that not work, or should I do everything in NI MAX?

If yes, please point me to a reference video or the steps for getting started in NI MAX.

Thanks for helping out a beginner.

Message 6 of 20

You can technically use LabVIEW and DAQmx, but that adds the potential complexity that you did something wrong in the programming. I find it always best to troubleshoot DAQ scaling, etc. in NI-MAX. I will often even have my scaling defined in MAX. You can find a tutorial at https://knowledge.ni.com/KnowledgeArticleDetails?id=kA03q000000YGfDCAW&l=en-US

Message 7 of 20

@AJ28 wrote:

Thanks a lot, Bob, for your help.

If I am working with DAQmx in LabVIEW, will that not work, or should I do everything in NI MAX?

If yes, please point me to a reference video or the steps for getting started in NI MAX.

Thanks for helping out a beginner.


Several years ago, I had some students who were measuring muscle twitches using 3D Accelerometers, and wanted to express things in units of "g" (gravity) instead of volts.  We knew the (approximate) bias and gain of the three channels, so I tried building a Task in DAQmx that scaled the three voltage channels to give me g's.

 

The "easiest" way to set up a Task in DAQmx, particularly during initial testing, is to create the Task in MAX and use it in your code (look up "Learn 10 Functions in NI-DAQmx and Handle 80 Percent of your Data Acquisition Applications" on the Web -- it will explain completely how to do this).  For complicated reasons, however, I wanted to programmatically set the Scaling factors (for one thing, I needed to calibrate the accelerometers, as the "nominal" values for the scales were not very accurate).  This proved tricky, so I came to the LabVIEW Forums where I was advised to "do it all in DAQmx" -- I believe this approach is also explained in the article I cited.

 

Bob Schor

Message 8 of 20

Thanks for letting me know, Bob. I have worked in MAX and it worked well, but my LabVIEW reading doesn't give me 0 when there is no pressure; it shows about 1.2-1.3 bar. I am seeing some voltage drop between AI+ and AI ground, which is why it shows a nonzero value. I need your help to resolve this issue or to zero the value when there is no pressure.

Let me know.
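One common software-side workaround (after checking wiring and grounding) is to record the residual reading at zero pressure and subtract it as a tare, averaging a burst of samples to tame the fluctuation. A sketch in plain Python; the sample values are hypothetical stand-ins for readings from the DAQmx channel:

```python
# Software tare: subtract the reading observed at zero pressure, and
# average a burst of samples to suppress noise.

def tare_and_average(samples_bar, zero_offset_bar):
    """Average a burst of scaled readings and remove the zero-pressure offset."""
    mean = sum(samples_bar) / len(samples_bar)
    return mean - zero_offset_bar

zero_offset = 1.25                       # e.g. mean channel reading with no pressure applied
burst = [51.1, 51.4, 51.0, 51.3, 51.2]   # hypothetical scaled readings near 50 bar
print(tare_and_average(burst, zero_offset))  # roughly 49.95 bar
```

Note this only hides the offset; it does not explain it. A genuine voltage drop between AI+ and AI GND is worth diagnosing separately.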

Message 9 of 20

Have you considered the accuracy of the DAQ device you are using? Also, did you do the calibration steps Bob suggested? If so, what values did you get for the pressures you input? Are you consistently getting the same voltage when 0 bar is applied?

Message 10 of 20