I am using LabVIEW 6.1 / Windows 2000 and a PCI-6110 card.
My system uses an external clock (connected to PFI7) to sample a signal on channel 0. The program controlling the DAQ is a modified version of the "Acquire N Scans ExtScanClk D-Trig.vi" example. I removed the two "AI Clock Config" sub-VIs from this example, since the PCI-6110 samples each channel simultaneously, and I also used "Route Signal.vi" to route CONVERT* to PFI2.
To test my system I used a signal generator at 1 MHz as the EXTERNAL clock and a 50 kHz sine wave as the input on channel 0. If I take a Fourier transform of 1 megasample of the acquired signal, I obtain a peak centred at 50 kHz plus several minor sidebands (as you would get with frequency or phase modulation). With an 8 GS/s digital oscilloscope I found that my generator signal is stable, but the CONVERT* signal has a variable width in time: it rises a fixed time after the falling edge of the clock, but falls between 80 and 100 ns after it.
If I use the INTERNAL clock of the board (set to 1 MHz), I obtain the same peak at 50 kHz but without the sidebands. The CONVERT* signal is stable in time (always ~80 ns).
According to the PCI-6110 documentation, a high-to-low edge on CONVERT* indicates that an A/D conversion is occurring. So when I use the EXTERNAL clock, there is an uncertainty of about 20 ns in the time at which an A/D conversion happens.
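For what it's worth, the phase-modulation interpretation can be checked numerically. The short Python sketch below samples a 50 kHz sine at a nominal 1 MHz, once with ideal timing and once with a periodic timing error of my assumed shape (the sinusoidal jitter waveform and the 10 kHz modulation frequency are purely illustrative assumptions; only the 1 MHz clock, 50 kHz input, and ~20 ns spread come from my measurements), and compares the FFT bins:

```python
import numpy as np

fs = 1e6    # nominal external sample clock (1 MHz, as in my test)
f0 = 50e3   # input sine frequency
fj = 10e3   # jitter modulation frequency -- an assumption for illustration
A  = 10e-9  # 10 ns peak timing error, i.e. half the observed 20 ns spread

N = 1000                      # chosen so f0 and fj fall on exact FFT bins
t_ideal = np.arange(N) / fs   # jitter-free sampling instants
# Sampling instants with a periodic timing error on each conversion
t_jitter = t_ideal + A * np.sin(2 * np.pi * fj * t_ideal)

spec_ideal  = np.abs(np.fft.rfft(np.sin(2 * np.pi * f0 * t_ideal)))  / N
spec_jitter = np.abs(np.fft.rfft(np.sin(2 * np.pi * f0 * t_jitter))) / N

# Bin spacing is fs/N = 1 kHz: carrier at bin 50, first sidebands at 40 and 60
print("sideband/carrier, ideal clock  :", spec_ideal[60]  / spec_ideal[50])
print("sideband/carrier, jittery clock:", spec_jitter[60] / spec_jitter[50])
```

With the periodic timing error, the f0 ± fj bins rise several orders of magnitude above the numerical noise floor, which matches the discrete sidebands I see; purely random jitter would instead raise the broadband noise floor rather than produce sidebands.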
My questions:
1) As I am fairly sure that the sidebands come from this variable width of CONVERT*, is there a way to make sure that the A/D conversion always occurs at a fixed delay after the falling edge of my external clock?
2) Is it normal for the board to behave like this?
Thank you for your answer.