Configuration: DAQ board PCI-6071E, Windows 2000, LabVIEW 6.0.2.
I took the simplest "Acquire N Samples" VI and set it to acquire data from one analog channel for 5 s at 1000 samples/s. The input signal was a square wave toggling between 0 V and 5 V, with a 100 ms period and a 50% duty cycle. I expected the value to toggle every 50 samples, but roughly once per second I get a packet of 49 values instead of 50.

It looks like there is a problem with the DAQ sample clock. Can it be calibrated? How can I check (say, with a scope) the board's actual scan rate? Is there a scan-clock output I can probe? Any other ideas?
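One way to quantify the discrepancy from the acquired data itself, before touching the hardware: count the samples between transitions and back out the clock rate from the known 100 ms signal period. The sketch below is a hypothetical analysis in Python (not LabVIEW) on synthetic data that mimics the symptom described (49 samples per half-period instead of 50); the 2.5 V threshold and the `run_lengths` helper are my own illustration, not part of any NI API.

```python
import numpy as np

# Synthetic stand-in for the acquired waveform: a 0/5 V square wave that
# shows 49 samples per 50 ms half-period, i.e. a clock running ~2% slow.
samples = np.tile(np.r_[np.full(49, 5.0), np.full(49, 0.0)], 51)

def run_lengths(data, threshold=2.5):
    """Lengths of consecutive runs above/below the threshold voltage."""
    digital = (data > threshold).astype(np.int8)
    edges = np.flatnonzero(np.diff(digital)) + 1   # indices where the level flips
    bounds = np.r_[0, edges, len(digital)]
    return np.diff(bounds)

runs = run_lengths(samples)
inner = runs[1:-1]                    # drop the possibly partial first/last runs
print("mean samples per half-period:", inner.mean())

# Known signal: 100 ms period -> 50 ms per half-period.
# samples-per-half-period / 0.050 s = the scan rate the board actually ran at.
actual_rate = inner.mean() / 0.050
print("estimated actual scan rate: %.1f S/s" % actual_rate)
```

With the symptom described (49 samples per half-period), this estimate comes out near 980 S/s rather than the requested 1000 S/s, which would point at the sample clock rather than at the signal source. If instead the mean is very close to 50 with an occasional short run, the cause is more likely a missed or duplicated sample than a miscalibrated clock.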
Thank you,
Sergey.