Hi,
I'm working with an M-Series DAQ card (PCI-6221) and during the first experiments I stumbled across a strange phenomenon. First I tested the speed of the system (in combination with the PC). I connected a frequency generator (10 kHz) to a counter input and made a timed loop with an interval of 10 ms. The actual time and the counter value are stored on each loop iteration. When the stop button is pressed I calculate the frequency and save it to a text file. As expected there is some deviation caused by the resolution of the timer readout (1 ms) and the phase shift, but another strange effect also occurred: there is a time jump / counter jump every 65150/65160 ms (very close to the 16-bit value 65536). Suddenly the readout of the counter value is about 80% higher. At 10 kHz a 10 ms interval should give about 100 counts, so an 80% higher readout corresponds to roughly 8 ms of extra counting time; I think it is a time correction (about 7 to 8 ms).
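For reference, here is roughly what the test does, sketched in Python with the nidaqmx package instead of my LabVIEW VI (the device name "Dev1", the 2-minute duration, and the exact loop timing are placeholder assumptions):

    import time
    import nidaqmx

    samples = []  # (timestamp, count) pairs

    with nidaqmx.Task() as task:
        # Count rising edges of the external 10 kHz signal on counter 0.
        task.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
        task.start()

        t0 = time.perf_counter()
        next_tick = t0
        while time.perf_counter() - t0 < 120.0:   # measure for ~2 minutes
            next_tick += 0.010                    # 10 ms software-timed loop
            time.sleep(max(0.0, next_tick - time.perf_counter()))
            samples.append((time.perf_counter() - t0, task.read()))

    # Frequency per interval: delta(count) / delta(time)
    for (t1, c1), (t2, c2) in zip(samples, samples[1:]):
        print(f"{t2*1000:8.1f} ms  {(c2 - c1) / (t2 - t1):10.1f} Hz")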
Additional things I tested/observed:
- It is related to time, not to samples (with a 5 ms loop interval it still occurs every 65150/65155 ms)
- It is not related to the start of the measurement (sometimes the first jump occurs within the first 10 seconds)
- It is not related to a counter value
- It can't be the frequency generator because it is an analog one
Does anyone know anything about this phenomenon?
In the project I'm planning to work around this by using buffered readouts with a second counter/timer to control the measurement frequency; roughly what I have in mind is sketched below. I have to synchronize this with an analog readout, but I think I saw an example of that. This solves the major part of the problem, but not all of my signals can be synchronized.
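Again as a Python/nidaqmx sketch ("Dev1", the 100 Hz clock rate, and the channel choices are assumptions): the second counter generates the measurement clock, and both the buffered edge-count task and the analog input task sample on that clock, so they line up in hardware instead of in software time.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE = 100.0   # measurement clock in Hz (assumed value)
    N = 500        # samples per read

    with nidaqmx.Task() as co, nidaqmx.Task() as ci, nidaqmx.Task() as ai:
        # Second counter generates the measurement clock.
        co.co_channels.add_co_pulse_chan_freq("Dev1/ctr1", freq=RATE)
        co.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)

        # Buffered edge counting on ctr0, sampled by ctr1's output. The M-Series
        # counters have no internal sample clock, so an external source is needed.
        ci.ci_channels.add_ci_count_edges_chan("Dev1/ctr0")
        ci.timing.cfg_samp_clk_timing(RATE, source="/Dev1/Ctr1InternalOutput",
                                      sample_mode=AcquisitionType.CONTINUOUS)

        # Analog input driven by the same clock.
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        ai.timing.cfg_samp_clk_timing(RATE, source="/Dev1/Ctr1InternalOutput",
                                      sample_mode=AcquisitionType.CONTINUOUS)

        ci.start()
        ai.start()
        co.start()   # start the clock last so no edges are missed

        counts = ci.read(number_of_samples_per_channel=N)
        volts = ai.read(number_of_samples_per_channel=N)
        # Frequency per clock period: (counts[i] - counts[i-1]) * RATE

Starting the clock task last ensures the counter and analog tasks are already armed on the first clock edge.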
Hopefully somebody can shed some light on this problem.
Thanks,
Mark Otto