04-22-2009 10:57 AM
I am testing a USB-6009 and checking the calibration of the analog input against a DVM. I have set up a task to measure a voltage from an isolated power source (using RSE and averaging 100 samples). When I run the test panel in MAX, the voltage value agrees with the DVM within 0.2% across the range of 0 to 5 volts, but when I run the same test reading the 6009 from a VI, I get an error ranging from +0.4% at 0 V to -1.2% at full scale (5.0 V). The VI uses a mean function and a median filter; I have tried it without the median filter but get the same results. The error is linear and I can easily correct for it, but I don't understand why there is a difference between the MAX and VI outputs.
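For reference, the averaging stage described above can be sketched in plain Python (outside LabVIEW). The window size, the filter-then-average ordering, and the synthetic sample values are illustrative assumptions, not the actual VI:

```python
import random
import statistics

def median_filter(samples, window=5):
    """Sliding-window median filter; windows shrink at the edges."""
    half = window // 2
    return [
        statistics.median(samples[max(0, i - half):i + half + 1])
        for i in range(len(samples))
    ]

def averaged_reading(samples, use_median_filter=True):
    """Optionally median-filter the samples, then take the mean."""
    if use_median_filter:
        samples = median_filter(samples)
    return statistics.mean(samples)

# Synthetic stand-in for 100 samples read around a 2.5 V source.
random.seed(0)
samples = [2.5 + random.gauss(0, 0.005) for _ in range(100)]
print(averaged_reading(samples))
print(averaged_reading(samples, use_median_filter=False))
```

With or without the median filter the mean lands very close to the source voltage for well-behaved noise, which matches the observation that removing the filter did not change the result.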
Any suggestions?
Thanks,
Dave
04-23-2009 03:22 PM
Hi Dave,
I would recommend using the DAQ Assistant to do your acquisition and then simply displaying that data on a graph. The DAQ Assistant most closely mimics the test panel in MAX. That way, we can ensure that no further processing is taking place on the signal other than reading it in and displaying it. Let me know how this works. Have a great day!
04-30-2009 10:35 AM
Margaret,
Thanks for the suggestion. Among the many variations I tried while troubleshooting, it turned out I was looking at two different channels (although all three channels I was working with had the same voltage source connected). Coincidentally, one of those channels on the 6009 is not working correctly, and that was the source of the error I was seeing.
The error on the bad channel seems to be repeatable and linear, so it was easy to apply a custom scale to correct the output.
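A two-point linear correction like the custom scale mentioned above could be sketched as follows. The calibration numbers are hypothetical, chosen only to match the +0.4% / -1.2% (of 5 V full scale) errors described earlier in the thread:

```python
def linear_correction(raw_lo, true_lo, raw_hi, true_hi):
    """Build a raw -> corrected mapping from two calibration points
    (raw device reading vs. DVM reference value)."""
    slope = (true_hi - true_lo) / (raw_hi - raw_lo)
    offset = true_lo - slope * raw_lo
    return lambda raw: slope * raw + offset

# Hypothetical calibration points: +0.4% of FS error at 0 V (reads 0.020 V)
# and -1.2% of FS error at 5 V full scale (reads 4.940 V).
correct = linear_correction(raw_lo=0.020, true_lo=0.0,
                            raw_hi=4.940, true_hi=5.0)
print(round(correct(2.480), 3))
```

Because the error is linear, two points pin down the slope and offset, which is exactly what a linear custom scale applies to every reading.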
Dave D