LabVIEW


Does NI-DAQmx 8.5 choose the Convert Clock rate incorrectly?

Hello,

For an M-series PCI-62xx card, scanning 8 channels at 20kHz each ("AI Sample Clock") results in an ADC rate ("AI Convert Clock") of 160kHz. This can be verified with this VI, which uses the AIConv.Rate property node to display the actual convert clock rate.
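(For anyone without the original VI attachment: a rough text equivalent of that check, sketched with the Python nidaqmx wrapper rather than a LabVIEW VI. The device name "Dev1" and the continuous sample mode are assumptions, not part of the original setup.)

```python
# Sketch, not the poster's VI: ask NI-DAQmx which convert clock rate it chose
# for 8 channels sampled at 20 kHz. "Dev1" is an assumed device name.
import nidaqmx
from nidaqmx.constants import AcquisitionType, TaskMode

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")         # 8 channels
    task.timing.cfg_samp_clk_timing(20_000.0,                  # AI Sample Clock rate
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.control(TaskMode.TASK_COMMIT)                         # let DAQmx coerce the timing
    print("AI Convert Clock rate:", task.timing.ai_conv_rate, "Hz")   # AIConv.Rate
```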

However, various NI documents (such as this one) state that after a sample clock pulse there are two or three "delay" ticks at the convert clock rate, which set a delay before the first actual conversion is performed. The number of delay ticks can be read from the DelayFromSampleClk.Delay property node; it is typically 3 ticks for an M-series card and 2 ticks for an E-series card, according to this figure from the NI website:

[Figure from ni.com: AI Sample Clock / AI Convert Clock timing diagram showing the delay ticks between a sample clock pulse and the first conversion]

So my question is: If I scan 8 channels, and there are three additional "delay" pulses, shouldn't the appropriate Convert Clock rate be SampleRate x (NumberOfChannels + NumberOfDelayTicks)? More precisely, in my sample code below, I scan 8 channels at 20kHz each. I see from the indicator that my card uses 3 delay ticks. However, the delay ticks are apparently not taken into account, because in this setup NI-DAQmx sets the Convert Clock to 8 x 20kHz = 160kHz, whereas I think it should choose (8+3) x 20kHz = 220kHz as the appropriate Convert Clock rate.

Note that data acquisition may still work in this setup even at 160kHz, because if the next sample clock tick arrives too early, i.e. before all 8 + 3 Convert Clock ticks are done, the early sample clock tick is ignored, according to NI documentation. In practice, this would mean that only every other sample clock tick actually triggers a scan.

Once again: given that there are three empty "delay" ticks for each sample clock tick, shouldn't NI-DAQmx choose a Convert Clock rate of 220kHz instead of 160kHz in my example?
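The same comparison can be sketched outside LabVIEW with the Python nidaqmx wrapper (again assuming a device named "Dev1"; the last lines encode my assumption that the delay ticks are counted at the convert clock rate, which is exactly the point in question):

```python
# Sketch (assumes device "Dev1"): read back the chosen convert rate and the
# DelayFromSampleClk.Delay value, then compute the rate I would expect *if*
# the delay ticks were counted at the convert clock rate.
import nidaqmx
from nidaqmx.constants import AcquisitionType, TaskMode

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")
    task.timing.cfg_samp_clk_timing(20_000.0, sample_mode=AcquisitionType.CONTINUOUS)
    task.control(TaskMode.TASK_COMMIT)

    n_channels = task.number_of_channels                  # 8
    sample_rate = task.timing.samp_clk_rate               # 20 kHz
    delay = task.timing.delay_from_samp_clk_delay         # 3 on my card (in delay units)
    chosen = task.timing.ai_conv_rate                     # 160 kHz here

    expected = sample_rate * (n_channels + delay)         # 220 kHz if delay ticks counted
    print(f"DAQmx chose {chosen} Hz, I expected {expected} Hz")
```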


Message 1 of 3

Hi Gustep,

Just to clear things up on how the convert clock rate is chosen, please refer to the KnowledgeBase article found here. The convert clock rate is based on the device's maximum conversion rate with a 10 µs delay added between channels to allow for settling. Above certain sample rates there is no longer room for this extra delay, and the convert clock is then calculated as the sample rate times the total number of channels being scanned.

I think the biggest point of confusion about the delay between the sample clock and the convert clock is that the delay is 3 ticks of the master timebase, not of the convert clock. That timebase is much faster than the convert clock, so skipping 3 ticks of it does not affect the convert clock rate much.
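Roughly speaking, the rule from that article looks like this (a sketch only, not the actual driver logic; the 1 MS/s maximum conversion rate is an assumed value for a PCI-6250, so check your device's specifications):

```python
# Rough sketch of the default convert-clock selection rule described above,
# not the actual NI-DAQmx implementation. max_conv_rate=1 MS/s is an assumed
# figure for a PCI-6250; other 62xx boards differ.
def default_ai_conv_rate(sample_rate, n_channels, max_conv_rate=1_000_000.0):
    settling = 10e-6                                   # extra 10 us between channels
    relaxed = 1.0 / (1.0 / max_conv_rate + settling)   # ~90.9 kHz with settling delay
    required = sample_rate * n_channels                # minimum to fit one full scan
    # Use the relaxed rate while it still fits every channel into one sample
    # period; above that, fall back to sample_rate * n_channels.
    return max(relaxed, required)

print(default_ai_conv_rate(20_000.0, 8))               # 160000.0, as observed in the post
```

Hope this helps,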

Daniel S.
National Instruments
Message 2 of 3


Thank you for your reply. Indeed, if the delay is based on ticks of the much faster master timebase (which one, by the way? 80 MHz, 20 MHz, 100 kHz?), then I can see the delay and the convert clock rate working out much better.

The confusion on my part probably stems from misreading the DelayFromSampleClk.Delay property node Help, where I read "Convert Clock" instead of "Convert Clock Timebase".
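In case it helps anyone else, the relevant timebase properties can also be read back directly. Here is a rough sketch using the Python nidaqmx wrapper (device "Dev1" assumed; the property names follow the LabVIEW property nodes and availability may vary between device families and driver versions):

```python
# Sketch (assumes device "Dev1"): read back the timebase-related properties to
# see what the delay ticks actually refer to on a given board.
import nidaqmx
from nidaqmx.constants import AcquisitionType, TaskMode

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:7")
    task.timing.cfg_samp_clk_timing(20_000.0, sample_mode=AcquisitionType.CONTINUOUS)
    task.control(TaskMode.TASK_COMMIT)

    print("MasterTimebase.Rate:        ", task.timing.master_timebase_rate)
    print("AIConv.TimebaseSrc:         ", task.timing.ai_conv_timebase_src)
    print("AIConv.TimebaseDiv:         ", task.timing.ai_conv_timebase_div)
    print("DelayFromSampClk.DelayUnits:", task.timing.delay_from_samp_clk_delay_units)
```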

Thanks!




Message 3 of 3