I am experiencing strange behavior with my LabWindows program (using the most recent versions of LabWindows/CVI and NI-DAQmx). I have two separate threads: one for reading 16-bit input from a PCI-6120 board, and a second for writing 16-bit output to the same PCI-6120 board. For test purposes I read one second's worth of samples at a time and then write those samples, without any processing, to the output channel. Depending on the sample rate I pass to DAQmxCfgSampClkTiming(), I get a proportional shift in frequency in the sampled data. If I input a sine wave with frequency F0, I get out either a sine wave with the same frequency *or* a sine wave with frequency 0.985 x F0. In other words, the frequency shifts down by 1.5%.
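To make the setup concrete, here is a stripped-down sketch of the pattern I'm describing (not my exact code: the channel names, voltage range, and requested rate are placeholders, and error checking and thread synchronization are omitted):

#include <NIDAQmx.h>

#define SAMPLE_RATE  300000.0   /* requested rate in Hz (placeholder value) */
#define NUM_SAMPLES  300000     /* one second's worth at the requested rate */

static TaskHandle aiTask, aoTask;
static int16 buffer[NUM_SAMPLES];

/* Reader thread: configure the AI task and read one second of raw samples. */
void ConfigureAndRead(void)
{
    int32 read = 0;

    DAQmxCreateTask("", &aiTask);
    DAQmxCreateAIVoltageChan(aiTask, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(aiTask, "", SAMPLE_RATE, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, NUM_SAMPLES);
    DAQmxStartTask(aiTask);
    DAQmxReadBinaryI16(aiTask, NUM_SAMPLES, 10.0, DAQmx_Val_GroupByChannel,
                       buffer, NUM_SAMPLES, &read, NULL);
}

/* Writer thread: configure the AO task and write the same samples back out,
   unprocessed, at the same requested rate. */
void ConfigureAndWrite(void)
{
    int32 written = 0;

    DAQmxCreateTask("", &aoTask);
    DAQmxCreateAOVoltageChan(aoTask, "Dev1/ao0", "", -10.0, 10.0,
                             DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(aoTask, "", SAMPLE_RATE, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, NUM_SAMPLES);
    DAQmxWriteBinaryI16(aoTask, NUM_SAMPLES, 0, 10.0,
                        DAQmx_Val_GroupByChannel, buffer, &written, NULL);
    DAQmxStartTask(aoTask);
}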
I am using a function generator to supply the sine wave and a Stanford Research Systems SR785 spectrum analyzer to measure the output. The first table below shows the relationship between the sample rate and the frequency of the sine wave that is output when I input a 10 kHz wave. The second table shows the same, but for a 1 kHz input wave.
Input: 10 kHz sine wave

Fs (kHz) | F0 (kHz)
---------|---------
  297    |  10.00
  298    |  10.00
  299    |   9.85
  300    |   9.85
  301    |  10.00
  302    |  10.00

Input: 1 kHz sine wave

Fs (kHz) | F0 (kHz)
---------|---------
  297    |  1.000
  298    |  1.000
  299    |  0.985
  300    |  0.985
  301    |  1.000
  302    |  1.000
I see this 1.5% shift whenever I use a sample rate that is near one of the exact sample rates (20 MHz / x, for integer x) that the PCI-6120 actually uses. For the data in the tables above, the nearest exact sample rates are 20 MHz / 67 = 298.507 kHz and 20 MHz / 66 = 303.030 kHz. But even though requesting a sample rate of, for example, 299 kHz actually gives me 298.507 kHz, that coercion by itself shouldn't shift the frequency of my data, because the same sample rate should be used for both input and output.
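To check that assumption, one thing I can do (sketch only, using the placeholder task handles from the code above) is ask DAQmx for the rate it actually coerced each task to, after DAQmxCfgSampClkTiming() has been called on both tasks:

#include <stdio.h>
#include <NIDAQmx.h>

/* Query the sample clock rate DAQmx actually coerced each task to;
   aiTask and aoTask are the placeholder task handles from the sketch above. */
void CheckCoercedRates(TaskHandle aiTask, TaskHandle aoTask)
{
    float64 aiRate = 0.0, aoRate = 0.0;

    DAQmxGetSampClkRate(aiTask, &aiRate);
    DAQmxGetSampClkRate(aoTask, &aoRate);
    printf("AI coerced rate: %.3f Hz, AO coerced rate: %.3f Hz\n",
           aiRate, aoRate);
}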
Does anyone have any ideas on what could be producing this frequency shift?