01-31-2007 05:38 PM
I am using hardware timing and Windows XP. The only knowledge I have of the timing is from the file. I realize that there is only one ADC, so nothing is truly being measured at the same time, but I thought it could do better. I currently have the ADC convert clock running at 20 kS/s (AI.Convert.Rate), which means successive channel conversions should be only 50 µs apart. That leaves room for 200 channel conversions before the next sample at the 100 S/s scan rate, so in my mind there should be plenty of time for it all to fit.
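In case it helps, here is roughly what my configuration looks like in text form. I've sketched it with the nidaqmx Python API only because I can't paste the LabVIEW block diagram; the device name, channel range, and channel count are just placeholders.

```python
# Rough text-form sketch of my task configuration (nidaqmx Python API used
# as a stand-in for the LabVIEW diagram; device/channel names are placeholders).
import nidaqmx
from nidaqmx.constants import AcquisitionType

SCAN_RATE = 100.0        # samples per second, per channel
CONVERT_RATE = 20000.0   # ADC convert clock (AI.Convert.Rate), 50 us per conversion

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")   # placeholder channels
    task.timing.cfg_samp_clk_timing(SCAN_RATE,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.timing.ai_conv_rate = CONVERT_RATE

    # 20,000 conversions/s divided by 100 scans/s = 200 conversion slots per
    # 10 ms scan interval, so a handful of channels should fit comfortably.
    data = task.read(number_of_samples_per_channel=100)
```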
Have you seen DAQmx Read effectively chop the samples per second in half, or could this be happening just in the file write?
I know some people on the forums suggest using Write To Spreadsheet File instead of LVM, but now I am wary of the timing. If I have to assume a time stamp and delta t for a spreadsheet file, I could be drastically wrong.
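To show what I mean, with a spreadsheet file the time column has to be reconstructed from an assumed start time and delta t, something like the sketch below (plain Python, placeholder values), so if samples are actually being dropped, every timestamp after that point is wrong.

```python
# Sketch of reconstructing a time column from an assumed delta t
# (placeholder values, not my actual acquisition code).
ASSUMED_DT = 1.0 / 100.0   # assumes a true 100 S/s sample rate

def time_column(num_samples, t0=0.0, dt=ASSUMED_DT):
    # Timestamps are purely computed; dropped samples or a wrong rate
    # silently shift every later timestamp.
    return [t0 + i * dt for i in range(num_samples)]

print(time_column(5))  # [0.0, 0.01, 0.02, 0.03, 0.04]
```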
Your thoughts?
02-02-2007 03:39 PM
I've solved the problem. After posting my example code, I realized it doesn't fully reflect what I am actually doing: I have two DAQmx Read VIs on the same task, one reading a waveform and the other a 2D DBL array. I believe each Read pulls samples out of the task's shared buffer, so having two of them screws up the timing for both. By disabling the second Read VI I was able to record data as expected.
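For anyone who hits the same thing, here is the gist of the change, sketched with the nidaqmx Python API since I can't post the diagram (device/channel names are placeholders):

```python
# Sketch of the fix: one DAQmx Read per task (nidaqmx Python API as a
# stand-in for the LabVIEW diagram; names are placeholders).
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:3")
    task.timing.cfg_samp_clk_timing(100.0,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.start()

    # BAD: a second read on the same task consumes samples the first read
    # never sees, which is what was wrecking my timing.
    # extra = task.read(number_of_samples_per_channel=100)

    # GOOD: a single read per task; reshape or split the data afterwards.
    chunk = task.read(number_of_samples_per_channel=100)
    # ... write 'chunk' to the file here ...
```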
Thanks for your help as I learn what to do, and what not to do, in LabVIEW.
HP