10-23-2006 03:26 PM
Hi,
I have been debugging a program that was written using Traditional DAQ functions (AI Config, AI Read, AI Clear).
The first AI Read performed seems to return erroneous data on the first 20 scans or so. The buffer size is set to 100, and subsequent scans do not return erroneous data. It looks as if the channels are discharging toward their correct values. The AI Reads are performed in a state machine. I was hoping I could add a single AI Read prior to entering the state machine, but both that extra reading and the first AI Read inside the state machine exhibit the effect.
Any ideas what this might be from or what I can do to rectify the situation?
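One workaround I have been considering is simply discarding the first 20 or so scans and keeping only the settled data. Here is a minimal sketch of that idea in Python pseudocode; `read_scans` is a made-up placeholder standing in for the Traditional DAQ AI Read call, not a real NI function:

```python
DISCARD = 20  # number of initial scans observed to be erroneous

def settled_scans(read_scans, total_scans, discard=DISCARD):
    """Acquire total_scans good scans by reading extra scans up front
    and dropping the first `discard`, which have not yet settled."""
    data = read_scans(total_scans + discard)  # placeholder for AI Read
    return data[discard:]

# Demo with a fake reader that just returns scan indices:
fake_read = lambda n: list(range(n))
print(settled_scans(fake_read, 100, discard=20))
```

This only hides the symptom, of course; I would still like to understand why the channels need time to settle in the first place.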
Thanks,
David
10-24-2006 03:00 PM