I have an AT-MIO-16DE-10 card configured to read 16 RSE inputs, eleven of which are temperature probes of the same make and model. All the probes are part of the same system, which is currently inactive, so they should all read virtually the same value. However, one of the eleven reads 80 mV higher than the rest. This is a problem, as it equates to an error of about 4 degrees C.
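To put numbers on that: an 80 mV offset corresponding to roughly 4 degrees C implies a probe sensitivity of about 20 mV/°C. A minimal sketch of that arithmetic (the function name and the exact sensitivity are my own, inferred from the two figures above, not a spec):

```python
def offset_in_deg_c(offset_mv, sensitivity_mv_per_c=20.0):
    """Convert a voltage offset (mV) to a temperature error (deg C),
    assuming a linear probe sensitivity of ~20 mV/degC inferred from
    the 80 mV / ~4 degC figures in the post."""
    return offset_mv / sensitivity_mv_per_c

print(offset_in_deg_c(80.0))  # the 80 mV offset seen on the bad channel
```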
Here are the steps I have taken. I have measured the incoming signals for these probes three ways: (1) with a DMM at the input, (2) with the NI-DAQ Config test program, and (3) with my final VI, which is nothing more than the basic AI Read VI that ships with LabVIEW. The channel string is 0:15, so all 16 channels are read at once by the same VI and fed into a 2D array. The three sets of measurements are normal and agree with one another for every channel EXCEPT the VI measurement for the channel in question, which reads 80 mV higher than the rest. Note that the DMM and NI-DAQ Config readings are fine even for this channel.
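The comparison I am doing by eye on that 2D array could be sketched in Python as follows. This is purely hypothetical post-processing (my actual acquisition is the LabVIEW VI); the function name and the 40 mV threshold are my own choices for illustration:

```python
from statistics import mean, median

def flag_outlier_channels(scans, threshold_v=0.040):
    """Given scans as a list of rows (one row per AI Read, one column per
    channel 0..15), return the channels whose mean deviates from the
    median of all channel means by more than threshold_v (volts).
    A 40 mV threshold would catch the 80 mV offset described above."""
    channel_means = [mean(col) for col in zip(*scans)]
    med = median(channel_means)
    return [ch for ch, m in enumerate(channel_means)
            if abs(m - med) > threshold_v]
```

With sixteen channels sitting near the same voltage and one offset by 80 mV, only that one channel would be flagged.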
The first thing I did was connect a DC power supply to the channel in question. With that source attached, the channel read perfectly in the VI, so I assumed the problem was a defective probe.
Next, I swapped two of the probes: I connected one that had been working on another channel to the suspect channel, and vice versa. Unfortunately, the same channel (now with the known-good probe) still read 80 mV above what it should.
It is my understanding from the manual that the AT-MIO-16DE-10 does not have any internal jumpers for the channels, so I am at a total loss as to what is happening. Does anyone have any other suggestions?