
Erroneous data on first scan LV 7.1/PCI-MIO-16E-1

Hi,

 

I have been debugging a program that was written using Traditional DAQ functions (AI Config, AI Read, AI Clear).

 

The first AI Read seems to return erroneous data for the first 20 scans or so.  The buffer size is set to 100, and subsequent reads do not show the problem; it looks as if the channels are discharging toward their correct values.  The AI Reads are performed in a state machine.  I was hoping I could add a single AI Read before entering the state machine, but those readings and the first AI Read inside the state machine both exhibit the effect.
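In rough text form, the structure is something like the sketch below.  This is only an approximation written with the modern nidaqmx Python API (Traditional NI-DAQ VIs have no text equivalent I can paste here), and the device name, channel, rate, and buffer size are placeholders rather than my actual settings.

import nidaqmx
from nidaqmx.constants import AcquisitionType

# Placeholder device/channel, rate, and buffer size -- not the real configuration.
with nidaqmx.Task() as task:                              # stands in for AI Config ... AI Clear
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    task.timing.cfg_samp_clk_timing(rate=1000.0,
                                    sample_mode=AcquisitionType.CONTINUOUS,
                                    samps_per_chan=100)
    task.start()

    # Extra "priming" read before the state machine -- its first ~20 samples
    # come back wrong, and so does the first read inside the loop below.
    _ = task.read(number_of_samples_per_channel=100)

    for _ in range(10):                                   # stands in for the state machine
        data = task.read(number_of_samples_per_channel=100)   # AI Read
        # ... act on data in the current state ...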

 

Any ideas what this might be from or what I can do to rectify the situation?

 

Thanks,

 

David

Hello David,

It's difficult to tell exactly what could be going on here since this is a pretty broad question.  Can you describe the application a bit more?  Please address the following questions:

1.  Are you reading from a single channel, or multiple?
2.  Are you using hardware or software timing?
3.  How are you sure that the values you are reading are incorrect?
4.  Are the first 20 readings trending towards the final, "good" readings?

If you are measuring a dynamic signal, there may be some settling involved.  Can you try adding a software time delay before the first AI Read to see whether it improves the readings?  I would also suggest running some of the Traditional NI-DAQ LabVIEW examples that install with the driver.  These examples are programmed correctly, so if you see the same behavior there, we can assume it is an inherent limitation of the system design.
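For example, one common workaround for settling is to wait briefly after starting the acquisition and to discard the first buffer before trusting the data.  Here is a minimal sketch using the modern nidaqmx Python API purely for illustration; the device name, rate, delay, and buffer size are guesses, and on the LabVIEW diagram the equivalent would simply be a Wait (ms) before the first AI Read.

import time
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")      # placeholder channel
    task.timing.cfg_samp_clk_timing(rate=1000.0,          # placeholder rate
                                    sample_mode=AcquisitionType.CONTINUOUS,
                                    samps_per_chan=100)
    task.start()

    time.sleep(0.5)                                       # software settling delay (value is a guess)
    _ = task.read(number_of_samples_per_channel=100)      # throw away the first buffer

    data = task.read(number_of_samples_per_channel=100)   # first buffer to trust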

Post back with the information I've asked about, and we'll see what we can figure out from there. 

Best regards,