12-15-2008 02:41 PM
Hi,
I currently have my LabVIEW code reading 3 analog voltage inputs and writing 1 analog voltage output. I have set the minimum and maximum values in LabVIEW to 0 and 10 volts for all channels, but I am still getting incorrect values.
With the 3 inputs grounded and the output supplying 4 volts, the software shows the inputs reading nonzero values (1.87-1.96 V). Even when I wire my output directly into one of the inputs, the display shows values that do not match the voltage the output should be supplying.
I am using a PCI-MIO-16E-1 card.
Thank you,
Jasen Stephany
12-16-2008 06:30 AM
Hi Jasen,
Thanks for your post and I hope you're well today.
I would suggest we take a look at your setup.
In the user manual,
http://www.ni.com/pdf/manuals/370503k.pdf
page 2-20, it discusses AI configuration.
There are three types of input configuration: DIFF, RSE and NRSE. I would check which configuration you require - this depends on the type of source you're trying to measure (floating or ground-referenced) and whether you need to introduce any bias resistors.
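To make the choice concrete, here is a minimal sketch of the manual's guidance as a lookup helper. This is a hypothetical illustration, not an NI API - the function name and return values are my own, and the recommendations are summarised from the connection tables in the E Series user manual:

```python
# Hypothetical helper (not part of any NI driver API): summarises which
# analog input configurations the E Series manual recommends for a
# given signal source type.
def recommended_configs(source_is_floating: bool) -> list:
    if source_is_floating:
        # A floating source (battery, isolated output) has no ground
        # reference of its own: RSE provides one directly; DIFF and
        # NRSE need bias resistors to AIGND to avoid drifting readings.
        return ["RSE",
                "DIFF (with bias resistors)",
                "NRSE (with bias resistor)"]
    # A ground-referenced source: avoid RSE, which can create ground
    # loops; DIFF gives the best common-mode noise rejection.
    return ["DIFF", "NRSE"]

print(recommended_configs(source_is_floating=True))
```

Readings like the 1.87-1.96 V you see on grounded channels are typical of a channel left in a configuration that expects a reference the wiring does not provide, which is why checking this table against your wiring is the first step.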
I would also advise using Measurement & Automation Explorer (MAX) while resolving your issue, as it lets you quickly test different input configurations.
Please let me know what you think,
Hope this helps.