02-13-2017 08:50 AM
Looking for some help with setting up a pressure transducer. This is my first time using LabVIEW or doing anything like this, so I suspect I'm missing something simple, but I'm having trouble figuring out what. Here is what I am using:
Endevco pressure transducer 8530C-100 (0-100psia, spec sheet and calibration report attached)
LabVIEW 2012
cDAQ-9171 chassis
NI 9237 module
NI 9949 RJ-50 to screw terminal adapter
Excitation source - 10 V internal. Eventually I'll have other sensors involved and will be using an external voltage supply, but for now I'm just using the internal source.
My sensor is connected as follows:
Red wire (+IN) to terminal 6 (EX+)
Black wire (-IN) to terminal 7 (EX-)
Green wire (+OUT) to terminal 3 (AI-)
White wire (-OUT) to terminal 2 (AI+)
Cable sheathing connected to ground
To set this up I used DAQ Assistant > Acquire Signals > Analog Input > Pressure (Bridge) and chose the ai0 channel. (The 9237 has four RJ-50 connection ports labeled 1-4; my cable is connected to position 1, which I'm assuming is the ai0 channel.)
In the DAQ Assistant window that comes up I changed the minimum signal range from -100 to 0 psi, then changed the Vex value to 10 V and the acquisition mode from N Samples to Continuous. Then I configured the scale with -0.71 mV/V for 0 psi and 18.98 mV/V for 100 psi (my understanding of the sensor calibration report). All other values were left at the default. A screenshot of the setup window is attached.
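For reference, here is a rough sketch of the same settings in text-based DAQmx. This assumes the nidaqmx Python package; the device name "cDAQ1Mod1", the sample rate, and reliance on the package defaults for electrical (mV/V) and physical (psi) units are assumptions to check against your own system and nidaqmx version.

# Rough programmatic equivalent of the DAQ Assistant pressure (bridge) task.
# Assumes the nidaqmx Python package; "cDAQ1Mod1" is a placeholder device name.
import nidaqmx
from nidaqmx.constants import (AcquisitionType, BridgeConfiguration,
                               ExcitationSource)

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_pressure_bridge_two_point_lin_chan(
        "cDAQ1Mod1/ai0",               # port 1 on the 9237 = ai0
        min_val=0.0, max_val=100.0,    # 0-100 psia range
        bridge_config=BridgeConfiguration.FULL_BRIDGE,
        voltage_excit_source=ExcitationSource.INTERNAL,
        voltage_excit_val=10.0,        # 10 V internal excitation
        first_electrical_val=-0.71,    # mV/V at 0 psi (cal report)
        second_electrical_val=18.98,   # mV/V at 100 psi (cal report)
        first_physical_val=0.0,
        second_physical_val=100.0)
    # Bridge resistance and electrical/physical units are left at the package
    # defaults here, matching the dialog settings described above.
    task.timing.cfg_samp_clk_timing(rate=2000,   # assumed rate; 9237 has a minimum rate
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    print(task.read(number_of_samples_per_channel=100))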
DAQ Assistant then builds the VI and I add a waveform graph to show the output. When I hit Run I get a signal at 3.619, where I would expect something around atmospheric pressure of 14.7 psia (a screenshot of this is also attached). It also appears that the signal does not change with changes in pressure. I have a simple hand bulb that I'm using to apply some pressure to the sensor. Reading with a multimeter where my green and white signal wires are connected, I can see the output go from around 18 mV at rest to about 25 mV when I squeeze the bulb, although I see no change in the 3.619 signal being displayed when this happens.
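As a quick back-of-the-envelope check using the calibration numbers quoted above (roughly 0.197 mV/V per psi between the two cal points), those multimeter readings do translate to something near atmospheric pressure, which is what the scaled channel should be showing:

# Back-of-the-envelope check with the cal-report numbers quoted above.
sens = (18.98 - (-0.71)) / 100.0      # ~0.1969 mV/V per psi
v_ex = 10.0                           # V excitation

def psi_from_mv(measured_mv):
    """Convert a raw bridge output in mV (at 10 V excitation) to psia."""
    mv_per_v = measured_mv / v_ex
    return (mv_per_v - (-0.71)) / sens

print(psi_from_mv(18.0))   # ~12.7 psia at rest, roughly atmospheric
print(psi_from_mv(25.0))   # ~16.3 psia with the hand bulb squeezed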
Like I said, I'm totally new to this and assume I'm missing something simple, but I'm not sure what to try next. I tried changing the bridge resistance and sample rate values just to see what happens, but I don't see any real difference in the signal level. I also tried using the 'Bridge - Continuous Input' example as the starting point instead of DAQ Assistant, but I get essentially the same results. Any help is appreciated - thanks!
02-13-2017 02:58 PM
When you start working with a new device, always start with MAX (or NI MAX, the Measurement and Automation eXplorer, whose icon should be on your desktop). Open MAX, find your device, configure it, and use MAX to take a reading. There are several possibilities.
Bob Schor
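(For anyone who prefers scripting the same check, the device list and self-test that MAX exposes can also be reached programmatically. This is a sketch assuming the nidaqmx Python package; the device name is whatever MAX assigned, so "cDAQ1Mod1" below is a placeholder.)

# List the DAQmx devices and run the same self-test MAX offers.
# "cDAQ1Mod1" is a placeholder for whatever name MAX assigned to the 9237.
import nidaqmx.system

system = nidaqmx.system.System.local()
for dev in system.devices:
    print(dev.name, dev.product_type)

mod = nidaqmx.system.Device("cDAQ1Mod1")
mod.self_test_device()                                # raises a DaqError on failure
print([chan.name for chan in mod.ai_physical_chans])  # should list ai0..ai3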
02-14-2017 09:24 AM
Bob, thank you for the suggestions and advice.
Both the chassis and the module show up in MAX, and if I run a 'Self-Test' for either one I get the message 'The self-test completed successfully'. If I run the test panels for the 9237 I get signals that look different on each channel when autoscaled (image attached), but if I uncheck 'autoscale' they all show a flat zero signal. This is with my sensor connected at port 1, though, which I'm assuming is ai0 (the ports on the 9237 are labeled 1-4).
Does this sound like my hardware and basic software is OK? Should I be seeing something different at the port where the sensor is connected? If I create a Pressure task in MAX and set it up the same way I did with the DAQ Assistant, I see the same thing at ai0 - a flat signal at an amplitude of ~3.6.
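(A task saved in MAX can also be loaded and read programmatically to cross-check what the test panel shows. This is a sketch assuming the nidaqmx Python package; "MyPressureTask" is a placeholder for whatever the MAX task is actually named.)

# Load a task that was created and saved in MAX, then grab a few samples.
# "MyPressureTask" is a placeholder for the actual MAX task name.
from nidaqmx.system.storage.persisted_task import PersistedTask

task = PersistedTask("MyPressureTask").load()
try:
    task.start()
    print(task.read(number_of_samples_per_channel=10))
finally:
    task.close()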
02-14-2017 11:44 AM
I'm not familiar with all of the hardware you are using, but I wonder if you have a "single-ended vs. differential" hookup problem. Do you know these terms? Most A/D systems allow you to use two inputs for Analog In Hi and Analog In Lo and sample the voltage difference between them. There should also be an Analog Ground somewhere to provide a definition of 0 V. On some devices, the analog inputs can be used in both differential and single-ended mode (single-ended gives you twice the number of channels, but you need to reference every reading to a single Analog Ground). When this is done, you'll often see inputs labelled AI0, AI4, AI1, AI5, AI2, AI6, AI3, AI7. If you connect to the pair AI0/AI4, you are connecting to Channel 0 differentially, and you need to tell MAX (and your software) to use differential mode.
Bob Schor
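(As a generic illustration of the terminology only - the 9237 is a dedicated bridge module, so this does not apply to it directly - in text-based DAQmx the single-ended/differential choice appears as the terminal configuration of a plain voltage channel. A sketch assuming the nidaqmx Python package and a placeholder device "Dev1":)

# Generic example of selecting differential wiring on a plain voltage channel;
# "Dev1" is a placeholder device name, not the 9237.
import nidaqmx
from nidaqmx.constants import TerminalConfiguration

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",
        terminal_config=TerminalConfiguration.DIFFERENTIAL)
    task.start()
    print(task.read())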
02-14-2017 12:14 PM
No, I'm not familiar with this terminology, but I will look into it. I think I must be missing something simple. The sensor has 4 wires that connect to an adapter, and that connects to the module with an RJ-50 cable. I can measure directly at the adapter with a multimeter and can verify that I have 10 V going to the sensor and 18.5 mV coming from the sensor at rest. When I squeeze a hand bulb to apply a few pounds to the sensor I see the voltage go up to around 25 mV and then back to 18.5 mV consistently with each squeeze, so I believe the sensor has to be working. Regardless of scaling or noise or anything else that might have to be corrected down the line, shouldn't I see some change in the signal displayed in MAX or LabVIEW when I squeeze the bulb? I've tried all 4 ports, tried a new RJ-50 cable, and tried changing everything else I can think of, but still nothing. I know it's not supposed to be this difficult; if I can measure the change with the multimeter, I'm sure the module should be seeing it too. I'm uninstalling everything at the moment and going to try to start fresh - not sure what else to try.
02-14-2017 01:27 PM
boletus,
I took a gander at your attachments and dug deep. The DAQmx task is off by about 10:1 on bridge impedance (you are using the default 350 Ohms). Use the input impedance from the cal report and you should start seeing better values. To wit: one atmosphere against the -0.71 mV/V offset is really close to your reading; if you were off by exactly 10 dB it would be exact, and you are off by about 9.6 dB. Use 3212 Ohms and see how that works.
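(For reference, the ~9.6 dB figure appears to be the ratio of the cal-report input impedance to the 350 Ohm default, expressed in dB:)

# Ratio of the cal-report input impedance to the 350 Ohm default, in dB.
import math

ratio = 3212.0 / 350.0            # ~9.18
print(10 * math.log10(ratio))     # ~9.6 dB (exactly 10 dB would be 3500 Ohm)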
02-14-2017 05:04 PM
Thanks, Jeff. I may be under a lot of pressure, but I know very little about measuring it (although I am a fan of Pascal ...).
Bob Schor
02-15-2017 08:39 AM
So I reinstalled everything - LabVIEW, MAX, device drivers - and tried again, but still no response in the signal when I squeeze the bulb to apply pressure. This was at 10 V internal excitation, though. I then tried it at 2.5 V and 5 V and it does seem to work! When I squeeze the bulb, the signal goes up and then back down to baseline. I don't remember for sure if I tried the lower voltages before (pretty sure I did), so I can't really say whether reinstalling changed anything or not. Regardless, I'm still confused why it doesn't work at 10 V, which is the recommended excitation.
I can verify with the multimeter that I do indeed have 10 V going to the sensor. I also tried with an external power supply but got the same result - a nice response at 5 V and 2.5 V, no response at 10 V. Would this maybe be normal behavior for a sensor like this - different sensitivity at different voltages? It's a 0-100 psi sensor and I'm probably only giving it a few psi, so it is at the lower end of the range.
Jeff - I tried the higher resistance value and did not see any noticeable change anywhere, but thank you for the clarification there. I noticed 'Input resistance' 3212 ohms and 'Output resistance' 1847 ohms on the calibration sheet and assumed one of these should probably be entered into the software for 'bridge resistance'.
02-15-2017 05:43 PM
That is "Unexpected"
So, what is the device? Make, model, and a link to the user manual would help here.
@Bob - the last time I used Pascal was on an Apple IIe, or maybe I converted that from Torr at really low pressures measured by a vacuum ion pump after the cyclotron evacuated the chamber.
02-16-2017 07:16 AM
I am using an NI 9237 module
http://www.ni.com/pdf/manuals/372306b.pdf
with a cDAQ-9171 chassis
http://www.ni.com/pdf/manuals/372838e.pdf
The information for my sensor is in the first post.
Thanks!