I am working on a C program to synchronously output two analog signals while reading from 2-8 analog input channels. The DAQ hardware is a PCI-6052E board, and the host machine is running Windows 2000. The analog output signals are audio-frequency signals and must be updated at a minimum rate of 40 kHz. The input signals have a bandwidth of 10 Hz to 15 kHz, so I can acquire them at the same rate as the output signals if necessary. I am using the analog input clock to synchronize the analog output (see the attached sample program), and the AI start trigger also triggers the analog output. If there is a way to acquire the analog input at a lower rate than the analog output, please let me know!
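For reference, the routing I have in mind looks roughly like the sketch below. This is not the full attached program, just the synchronization part; the helper function name is mine, and I am assuming the ND_* signal constants from nidaqcns.h and the WFM_Load/WFM_Group_Control arguments are the right ones for this board, so they would need to be checked against the documentation.

#include "nidaq.h"
#include "nidaqcns.h"

/* Sketch: route the AO timing off the AI timing so both run in lockstep.
   The ND_* names are the nidaqcns.h constants I believe apply here.     */
i16 setup_ao_synced_to_ai(i16 device, i16 *ao_buffer, u32 points_per_chan)
{
    i16 aoChans[2] = {0, 1};   /* DAC0 and DAC1 */
    i16 status;

    /* One AO update per AI scan: drive the AO update clock from the AI scan clock. */
    status = Select_Signal(device, ND_OUT_UPDATE, ND_IN_SCAN_START, ND_LOW_TO_HIGH);
    if (status != 0) return status;

    /* Start AO from the AI start trigger so both operations begin together. */
    status = Select_Signal(device, ND_OUT_START_TRIGGER, ND_IN_START_TRIGGER, ND_LOW_TO_HIGH);
    if (status != 0) return status;

    /* Load the two-channel waveform; iterations = 0 means generate continuously.
       (Total point count and buffer ordering per the WFM_Load documentation.)   */
    status = WFM_Load(device, 2, aoChans, ao_buffer, 2 * points_per_chan, 0, 0);
    if (status != 0) return status;

    /* Arm/start AO group 1; generation then waits on the routed start trigger. */
    return WFM_Group_Control(device, 1, 1);
}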
My question involves setting the analog input timebase and the sample and scan intervals. My understanding is that the scan rate is what is normally called the sample rate. That is, if I need to acquire the data from the 2-8 analog input channels at 40 kHz, then I should call DAQ_Rate() as follows:
DAQ_Rate(40000, 0, &scanTimebase, &scanInterval)
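That is, with the timebase/interval variables declared using the NI-DAQ integer types, something like (error handling omitted):

i16 status;
i16 scanTimebase;    /* returned timebase code                    */
u16 scanInterval;    /* returned ticks of that timebase per scan  */

/* units = 0 means the first argument is a rate in points per second */
status = DAQ_Rate(40000.0, 0, &scanTimebase, &scanInterval);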
Since I want to acquire the data from the multiple channels as close to simultaneously as possible, I assume I should set the sample rate (i.e., the rate of the individual A/D conversions within a scan) higher than the scan rate. With 2 channels of input, this implies that the sample rate should be at least 2X the scan rate. I would call DAQ_Rate as follows to get the sample timebase and sample interval:
DAQ_Rate(100000, 0, &sampleTimebase, &sampleInterval)
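Put another way, all of the conversions for one scan have to fit inside the scan interval, so the sample (conversion) rate must be at least numChannels times 40 kHz; 100 kS/s is just my choice to leave some margin. Roughly:

i16 status;
i16 sampleTimebase;
u16 sampleInterval;
i16 nChannels = 2;          /* using 2 of the 2-8 input channels */
f64 scanRate  = 40000.0;    /* scans per second (per channel)    */
f64 sampleRate;

/* All nChannels conversions must complete within one 25 us scan, so the
   channel-to-channel rate must be at least nChannels * scanRate (80 kS/s
   here); the extra 25% is headroom for settling, giving 100 kS/s.        */
sampleRate = 1.25 * nChannels * scanRate;
status = DAQ_Rate(sampleRate, 0, &sampleTimebase, &sampleInterval);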
These values are then used in the call to SCAN_Start:
SCAN_Start(DEVICE, ai_buffer, N_OUTPUT_POINTS, sampleTimebase, sampleInterval, scanTimebase, scanInterval);
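For completeness, the scan is configured and started roughly like this; the channel and gain vectors shown are placeholders, and my actual values are in the attached program:

i16 aiChans[2] = {0, 1};               /* AI channels being scanned       */
i16 aiGains[2] = {1, 1};               /* gain of 1 on each channel       */
i16 ai_buffer[N_OUTPUT_POINTS];        /* one i16 reading per conversion  */
i16 status;

status = SCAN_Setup(DEVICE, 2, aiChans, aiGains);
if (status == 0)
    status = SCAN_Start(DEVICE, ai_buffer, N_OUTPUT_POINTS,
                        sampleTimebase, sampleInterval,
                        scanTimebase, scanInterval);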
Given the constraints on my signals, is my thinking correct on this issue? Thanks for any help/suggestions!