02-21-2010 05:18 PM
Hi all,
Just a quick question - I have been taking strain readings using an NI 9237 and am consistently getting values about 50% lower than calculated, over a wide range of readings. I have also used an old strain gauge machine, which gives strain values consistent with the calculated values, so I believe it is an issue on the module/software side of things. Is it worth trying to calibrate the NI 9237 module via the calibrator/DMM method, or can this just be done within LabVIEW with a simpler shunt calibration? Note that the module is new and this is the first time it has been used. I am also using a two-gauge (equal and opposite strains) half-bridge set-up.
If you have any ideas as to what I should do, it would be a great help!
Thanks in advance,
Andy S
02-22-2010 07:00 AM
Dear Andy
This is expected behavior. The NI 9237 returns its readings in volts per volt of excitation (V/V) rather than in volts. For example, suppose it is returning 0.004 V for every volt of excitation. When using test panels, the NI 9237 defaults to an excitation value of 2.5 V, so the actual voltage at the input to the module will be:
2.5 V × 0.004 V/V = 0.01 V
This behavior is preferable for bridge measurements, since the voltage across the bridge is directly proportional to the excitation voltage. If the returned measurement were in terms of volts, variations and errors in the excitation voltage would be reflected in the measurement; by returning a ratio, these variations can be factored out. This allows a fixed strain to remain fixed even if the excitation voltage varies.
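To put numbers on that, here is a quick sketch in Python (just for illustration, outside LabVIEW) of how a ratiometric V/V reading is normally converted to strain for a half bridge with two gauges seeing equal and opposite strain (half-bridge type II). The gage factor, resistances, and example reading below are assumptions for the example, not values from your rig.

```python
# Converting an NI 9237 ratiometric reading (V/V) to strain for a
# half bridge with two active gauges seeing equal and opposite strain
# (half-bridge type II).  Illustrative values only.

GAGE_FACTOR = 2.1        # assumed; use the value from your gauge datasheet
LEAD_RESISTANCE = 0.0    # ohms; assumed negligible lead wires
GAGE_RESISTANCE = 350.0  # ohms; assumed 350-ohm gauges

def half_bridge_ii_strain(v_per_v, gf=GAGE_FACTOR,
                          r_lead=LEAD_RESISTANCE, r_gage=GAGE_RESISTANCE):
    """Strain from a ratiometric bridge reading (half-bridge type II).

    v_per_v is the change in bridge output per volt of excitation,
    i.e. the quantity the NI 9237 reports directly.
    """
    return (-2.0 * v_per_v / gf) * (1.0 + r_lead / r_gage)

# Example: the 0.004 V/V reading from above
reading = 0.004
print(f"{half_bridge_ii_strain(reading) * 1e6:.0f} microstrain")
```

When the strain task is configured with the matching bridge type and gage factor, DAQmx applies this conversion for you, so the values you log should already be in strain rather than raw V/V.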
The 9237 is designed specifically for measuring bridge sensors, and the hardware is built to measure V/V directly rather than making a conversion. It is not ideal for taking measurements of sources that are not directly proportional to the excitation voltage. Modules such as the NI 9215 or NI 9219 are better suited for taking independent voltage measurements.
02-22-2010 03:01 PM
Thanks David, that was a useful bit of theory and a good insight into the NI 9237. I'm a bit unsure how this causes my strain readings to be 50% out, though. Is it just a matter of scaling my results by a factor to give the expected strain, or do I need to change/adjust something within the software/hardware? I have found that multiplying all the readings by 2.13 gives strain values within 1% of the calculated values, so obviously this method works, but ideally I would like the raw data to be accurate without this step.
Thanks again!
Andy
02-23-2010 05:23 AM
Dear Andy
Many thanks for your reply. Apologies, as I wasn't clear in my previous post. What you need to do is perform a shunt calibration and adjust the gain factor. The knowledge base article below is very useful on how to perform this:
Shunt calibration Example
http://zone.ni.com/devzone/cda/epd/p/id/5816
What you need to do is open MAX (Measurement & Automation Explorer), right-click on your device (NI 9237), and select Create Task.
From here, select Acquire Signals >> Analog Input >> Strain.
Under the strain setup, select the Device tab and then click on Strain Calibration. This is where you can perform the shunt calibration.
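If you ever want to set the same thing up programmatically instead of through MAX, the sketch below shows roughly how the strain task would look using the nidaqmx Python package. The channel name, gage factor, sample rate, and excitation value are assumptions for illustration, and the parameter names come from that package rather than from the KBs below.

```python
# Rough sketch: configuring a half-bridge strain channel on an NI 9237
# with the nidaqmx Python package, mirroring the settings the MAX
# strain task asks for.  Channel name and gauge values are assumed.
import nidaqmx
from nidaqmx.constants import (ExcitationSource, StrainGageBridgeType,
                               StrainUnits)

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_strain_gage_chan(
        "cDAQ1Mod1/ai0",                        # assumed physical channel
        min_val=-0.005, max_val=0.005,          # expected strain range
        units=StrainUnits.STRAIN,
        strain_config=StrainGageBridgeType.HALF_BRIDGE_II,  # equal/opposite gauges
        voltage_excit_source=ExcitationSource.INTERNAL,
        voltage_excit_val=2.5,                  # matches the 2.5 V default excitation
        gage_factor=2.1,                        # assumed; use your gauge's value
        nominal_gage_resistance=350.0)          # assumed 350-ohm gauges
    # The 9237 needs a hardware-timed acquisition; 2 kS/s is an example rate.
    task.timing.cfg_samp_clk_timing(2000, samps_per_chan=10)
    print(task.read(number_of_samples_per_channel=10))
```

Shunt calibration itself is still easiest to run from the Strain Calibration dialog in MAX as described above; the sketch just shows where the bridge type, gage factor, and excitation settings live.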
I have also attached a couple of KBs on how to accomplish this, with some screenshots in case my steps aren't clear. Hope this helps!
http://digital.ni.com/public.nsf/allkb/F47EFDB8B1992282862572AD007CE9C3?OpenDocument
http://digital.ni.com/public.nsf/allkb/892C84122A6501AE86257547007E5C53?OpenDocument