Multifunction DAQ

How to set optimal scan and channel sampling rates with NI-DAQ and E-series?

I am working on a C program to synchronously output two analog signals while reading from 2-8 analog input channels. The DAQ hardware is a PCI-6052E board, and the host machine is running Windows 2000. The analog output signals are audio-frequency signals and must be updated at a minimum rate of 40 kHz. The input signals have a bandwidth of 10 Hz to 15 kHz, so I can acquire them at the same rate as the outputs if necessary. I am using the analog input clock to synchronize the analog output (see the attached sample program), and the AI start trigger also triggers the analog output. If there is a way to acquire the analog input at a lower rate than the analog output, please let me know!

My question involves setting the analog input timebase and the sample and scan intervals. My understanding is that the scan rate is what is normally called the sample rate. That is, if I need to acquire data from the 2-8 analog input channels at 40 kHz, then I should call DAQ_Rate() as follows:

DAQ_Rate(40000.0, 0, &scanTimebase, &scanInterval);

Since I want to acquire the data from the multiple channels as close to simultaneously as possible, I assume I should set the sample rate (i.e., the rate of the individual A/D conversions, whose reciprocal is the interval between conversions) to be higher than the scan rate. If I have 2 channels of input, this implies that the sample rate should be greater than 2X the scan rate. I would call DAQ_Rate as follows to get the sample timebase and sample interval:

DAQ_Rate(100000.0, 0, &sampleTimebase, &sampleInterval);

These values are then used in the call to SCAN_Start (which takes the sample timebase/interval pair before the scan timebase/interval pair):

SCAN_Start(DEVICE, ai_buffer, N_OUTPUT_POINTS, sampleTimebase, sampleInterval, scanTimebase, scanInterval);
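
For context, the relevant fragment of my program looks roughly like this (the channel list, gains, and buffer size here are simplified stand-ins for what the attached sample program actually uses):

#include "nidaq.h"

#define DEVICE           1
#define N_CHANNELS       2
#define N_OUTPUT_POINTS  8192   /* must be a multiple of N_CHANNELS */

static i16 chanVector[N_CHANNELS] = {0, 1};  /* AI channels 0 and 1 */
static i16 gainVector[N_CHANNELS] = {1, 1};
static i16 ai_buffer[N_OUTPUT_POINTS];

i16 scanTimebase, sampleTimebase;
u16 scanInterval, sampleInterval;

/* 40 kHz scan rate; units = 0 means the rate is in points per second */
DAQ_Rate(40000.0, 0, &scanTimebase, &scanInterval);

/* 100 kHz channel clock (individual A/D conversions) */
DAQ_Rate(100000.0, 0, &sampleTimebase, &sampleInterval);

SCAN_Setup(DEVICE, N_CHANNELS, chanVector, gainVector);
SCAN_Start(DEVICE, ai_buffer, N_OUTPUT_POINTS,
           sampleTimebase, sampleInterval,
           scanTimebase, scanInterval);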

Given the constraints on my signals, is my thinking correct on this issue? Thanks for any help/suggestions!
Message 1 of 3
I don't think this is the way you want to do it.

The analog input and analog output can have different scan/update rates, and they are totally different functions, independent of each other (aside from triggering one from another).

The NI-DAQ software actually calculates the interchannel delay for you, based on the number of channels and the scan rate you select for your analog input. The interchannel delay is usually quite small compared to the scan interval - i.e., a 40 kHz scan rate with 4 channels would go something like this:

time = 0: sample the 1st channel, wait the interchannel delay; sample the 2nd channel, wait the interchannel delay; sample the 3rd channel, wait the interchannel delay; sample the 4th channel. Then it waits until time = 1/40 kHz to do the whole thing again.

That would be considered ONE scan, and the interchannel delay might be 2 microseconds or so.
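
To put numbers on it: a 40 kHz scan rate means a scan interval of 1/40000 s = 25 microseconds. With 4 channels and a 2 microsecond interchannel delay, all four conversions are finished about 6 microseconds into each scan, and the board simply waits out the remaining ~19 microseconds until the next scan starts.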

So I think you just set your analog input scan rate to whatever you want, and NI-DAQ will take care of the rest.

Mark
Message 2 of 3
The behavior you describe seems to be how things work in LabVIEW. However, I need to know what values to pass for the sample interval and timebase in the SCAN_Start() function. The documentation for SCAN_Start() makes no mention of automatically setting the interchannel delay.

There is this:
"If scanInterval equals 0, the time that elapses between A/D conversions and the time that elapses between scan sequences are both equal to the sample interval. As soon as the scan sequence has completed, NI-DAQ restarts one sample interval later. Another advantage of setting scanInterval to 0 is that this frees the scan-interval counter, counter 2, for other operations, such as waveform generation or general-purpose counting. This applies to non-E Series devices only."

Of course, this is NOT what I want, since I need all channels sampled close together within each scan (provided I am parsing the documentation correctly, and what is going on in this situation is that the interchannel delay and the scan interval both collapse to the sample interval... correct me if I'm wrong on this). Anyway, the point is moot, since it does not apply to my E-series board.

I've looked over my code a bit, and I see your point about the AI and AO clocks. I can set the AO clock to whatever I want and then just trigger the output using the AI_START_TRIGGER signal. For my purposes, the AI and AO processes do not have to share a clock.
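
So the plan is roughly this (written from memory, so the exact constants and the WFM_ClockRate arguments deserve a double check against the function reference):

i16 aoTimebase;
u32 aoUpdateInterval;

/* Route the AI start trigger to start the analog output group */
Select_Signal(DEVICE, ND_OUT_START_TRIGGER, ND_IN_START_TRIGGER, ND_LOW_TO_HIGH);

/* Run the AO update clock at its own 40 kHz rate, independent of the AI scan clock */
WFM_Rate(40000.0, 0, &aoTimebase, &aoUpdateInterval);
WFM_ClockRate(DEVICE, 1, 0, aoTimebase, aoUpdateInterval, 0);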

Thanks,
Sharad
Message 3 of 3