Multifunction DAQ


"Signal Input Range" parameters in MAX causing data errors when reading from a task?

I have created a task in Measurement & Automation Explorer to measure temperature with an RTD.  The RTD is on a 4-20 mA current loop, and I have set the hardware so that 4 mA corresponds to 14 deg F and 20 mA corresponds to 104 deg F.  The current loop is powered and measured by an NI 9203 module in a cDAQ-9172 chassis.  Data from the task has a custom scale applied to it and is read in a VI.

 

The custom scale is a linear scale with Temp = 5625*Amps - 8.5 deg F.  I have set the "max" and "min" values in the "signal input range" portion of the task to 104 deg F and 14 deg F, respectively (corresponding to 20 mA and 4 mA).  When I observe measurement data in the VI with these settings, I get a scaled temperature reading of about 16 deg F, which is incorrect (the correct temperature is about 77 deg F).  If I run the task with all of these same settings in MAX, I get the correct value.
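For reference, the endpoints of that scale work out as follows (a quick check in plain Python of the arithmetic in this post; not NI code):

```python
# Custom linear scale from the post: Temp(degF) = 5625 * Amps - 8.5
def scale_to_temp(amps):
    return 5625 * amps - 8.5

print(scale_to_temp(0.004))       # 4 mA  -> 14.0 degF (scaled minimum)
print(scale_to_temp(0.020))       # 20 mA -> 104.0 degF (scaled maximum)

# A room temperature of ~77 degF corresponds to roughly 15.2 mA on the loop:
print((77 + 8.5) / 5625 * 1000)   # approx. 15.2 mA
```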

 

I've noticed that if I set the "max" and "min" values to 104 and -112.5 deg F, respectively (this corresponds to having my custom scale applied at +20 mA and -20 mA, the full range of the NI 9203 module), then I get the correct value both in MAX and when reading the task in my VI.

 

Similarly, if I remove the scaling and set the input range from 4 mA (min) to 20 mA (max), I read about 4.4 mA in the VI but 15.3 mA when running the task in MAX.  If I change the input range to -20 mA (min) and 20 mA (max), then I get 15.3 mA in both cases.

 

Do I always need to set the max/min values to the scaled limits of the module (e.g. the scaled values corresponding to ±20 mA for the 9203 or ±60 V for a 9229)?  This does not jibe with the instructions, which say to specify the maximum and minimum expected values (scaled).  Why is there a difference between the outputs when I run the task in MAX and when I call the task in a VI?

 

I also wonder why MAX even asks for a scaled input range.  Isn't that information redundant once I have module limits and a custom scale defined?  In some cases (e.g. with a pressure transmitter) I've accidentally left the default signal input range of -20 mA to +20 mA even after applying a scale to convert to pressure, and I have seen accurate outputs well above the specified "max" signal input range (e.g. 150 psi).

 

Please let me know if you have any insights.

 

Regards,

Tim


Hi SwRI Tim,

 

Here is a KnowledgeBase document that discusses how to configure the signal input range for your task.

 

You are correct that this gets to be a bit confusing when deciding what to input for your signal input minimum and maximum values for the task.  A good rule to follow is that if you apply a scale to the input, you should configure the input range with scaled minimum and maximum values.  The DAQmx driver will do all of the math behind the scenes to bring in the actual signal value (whether it be current or voltage) and convert the data to the scaled values.
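As an illustration of that behind-the-scenes math (plain Python showing the principle for a linear scale y = m*x + b; not the actual driver code): the driver can invert the scale to relate your scaled task limits back to raw current limits, and applies the forward scale to each sample it reads.

```python
# Illustration only: how a linear custom scale relates scaled task
# limits to raw current limits. Coefficients are from this thread.
m, b = 5625.0, -8.5            # Temp(degF) = 5625 * Amps - 8.5

def to_scaled(amps):           # forward scale, applied to each raw sample
    return m * amps + b

def to_raw(temp_f):            # inverse scale, used for the range limits
    return (temp_f - b) / m

# Scaled task limits of 14 degF and 104 degF map back onto the 4-20 mA loop:
print(to_raw(14.0) * 1000, to_raw(104.0) * 1000)   # -> 4 mA and 20 mA

# A raw sample of 15.3 mA comes back as the expected room temperature:
print(to_scaled(0.0153))                           # approx. 77.6 degF
```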

 

If you are configuring and saving a task in MAX, you can place a DAQmx Task constant on your block diagram, choose your task, and then right-click it and select Generate Code»Example.  If you are instead configuring the task at a low level with the DAQmx API, make sure you apply the custom scale when you create the channel; the DAQ Assistant also has an option for a custom scale.  If this does not resolve the issue, it would be helpful to see screenshots of your LabVIEW block diagram with the DAQmx code and your MAX setup.

 

Best,

Adam
Academic Product Manager
National Instruments