Driver Development Kit (DDK)


timing

I am having trouble understanding how to set the ADC conversion times for the M Series 6259 using the DDK. (I have the sample code up and running.) For example, suppose I want to read n channels as fast as possible. How would I set up the board? (I understand that this may not be the lowest-noise approach, but a concrete example makes things easier to understand.)

I think the conversion timing is set by the function "aiConvert." For example, in "aiex1.cpp" the conversion appears to be set by:

aiConvert (board,
           280,      // convert period divisor
           280,      // convert delay divisor
           kFalse);  // external sample clock?

What is the meaning of the "convert period divisor" and the "convert delay divisor"? These parameters do not seem to be defined in the documentation. I think the divisor is defined with respect to a timebase, but I don't know what that timebase is (80 MHz?).

Many thanks,

Melissa

Hi Melissa-

We are indeed setting the interchannel conversion rate in the aiConvert function. The actual parameter that determines the convert clock rate (i.e., the rate at which the channels in a scan list are scanned) is the convert period divisor.

For your board the convert timebase is 20 MHz and the maximum multi-channel scan rate is 1 MS/s. This means that the fastest convert period you can use with your board requires dividing the timebase by 20. Since we're loading a counter, we actually want to load [desired divisor] - 1, so to get a convert rate of 1 MHz you should load a convert period divisor of 19.
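For concreteness, here is a minimal C++ sketch of that arithmetic, assuming the 20 MHz convert timebase described above and the aiConvert() call from aiex1.cpp. The helper convertDivisorForRate() is only illustrative and is not part of the DDK; only aiConvert() itself comes from the example code.

#include <cstdio>

static const double kConvertTimebaseHz = 20e6;  // M Series convert timebase (20 MHz)

// Divisor to load for a desired convert (interchannel) rate.
// The counter loads N - 1, as noted above.
static unsigned int convertDivisorForRate (double convertRateHz)
{
    unsigned int ticks = static_cast<unsigned int>(kConvertTimebaseHz / convertRateHz);
    return ticks - 1;  // e.g. 20 MHz / 1 MHz = 20 ticks, so load 19
}

int main ()
{
    unsigned int divisor = convertDivisorForRate (1e6);  // 1 MS/s aggregate convert rate
    printf ("convert period divisor to load: %u\n", divisor);

    // In aiex1.cpp this value would then be passed to aiConvert, roughly:
    //   aiConvert (board,
    //              divisor,   // convert period divisor
    //              divisor,   // convert delay divisor
    //              kFalse);   // kFalse: no external sample clock
    return 0;
}

Note that the 1 MS/s figure is the aggregate convert rate across the scan list, so with n channels in the list each individual channel ends up sampled at roughly 1 MS/s divided by n.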

Alan's discussion in this thread mentions other divisor settings for multi-channel, multi-point acquisitions (as shown in aiex2.cpp).

Hopefully this helps-

Tom W
National Instruments
Tom,

This is exactly the information I need!

Many thanks,

Melissa