07-22-2005 01:29 PM
I'm new to LabVIEW and am using version 7.0, an SCB-100 connector block, and a PCI-6031E card to read a signal from a pressure transducer.
I have configured a virtual channel to read the voltage as follows:
Input terminal configuration: differential
Minimum value: -5V
Maximum value: +5V
The units come from a custom scale that converts the voltage reading to a pressure in bar.
The signal at the board is 126 mV. The custom linear scale (272 bar/V) was created in MAX and added to the virtual channel. Without the custom scaling applied, the signal is displayed as expected. However, when the scale is added, the pressure signal flatlines at around 27.3 bar. When the Maximum value field of the virtual channel is set above this value (e.g. to 28), the signal reappears, but at a higher value than expected and much noisier.
There are several other pressure signals being measured that use the +/- 5 V setting with similar (515 bar/V) custom scales, and they are working fine. It seems very odd that the range must be set to 28 to get a signal. I have read that the range is applied after custom scaling, but the other pressure signals are also scaled and work fine with +/- 5 V. Any suggestions on what might be causing this and how to fix it would be much appreciated.
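For reference, the numbers in the post are consistent with the driver interpreting Minimum/Maximum in *scaled* units (bar), converting them back to volts, and then coercing to the nearest hardware input range. A minimal sketch of that arithmetic, assuming the 6031E offers bipolar ranges of roughly +/-0.1, 0.2, 0.5, 1, 2, 5, and 10 V (the exact range list is an assumption from the E-series family, not stated in the post):

```python
# Hypothetical sketch of how scaled min/max could be coerced to a device range.
# RANGES_V is an assumed list of the board's bipolar input ranges in volts.
RANGES_V = [0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0]

def device_range(scaled_max, slope):
    """Convert a scaled maximum (bar) back to volts using the custom-scale
    slope (bar/V), then pick the smallest range that covers it."""
    needed_v = scaled_max / slope
    for r in RANGES_V:
        if r >= needed_v:
            return r
    return RANGES_V[-1]

slope = 272.0      # custom scale, bar per volt
signal_v = 0.126   # 126 mV transducer signal

# With Maximum = 5, now read as 5 bar: 5/272 ~ 18.4 mV -> +/-0.1 V range.
r = device_range(5.0, slope)
clipped_bar = min(signal_v, r) * slope   # signal clips at ~0.1 V -> ~27.2 bar
print(r, clipped_bar)                    # close to the observed 27.3 bar flatline

# With Maximum = 28 bar: 28/272 ~ 103 mV -> next range up, so no clipping,
# but a coarser (noisier) range than the signal really needs.
r2 = device_range(28.0, slope)
print(r2, signal_v * slope)              # expected reading ~34.3 bar
```

If this is what is happening, the other channels with 515 bar/V scales would simply have signals small enough to fit inside the coerced range, which would explain why they appear unaffected.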
07-25-2005 01:16 PM