06-30-2009 11:49 AM
Hello,
I am taking data points from a USB-6009 card using Single Sample (On Demand) in a for loop. I am doing this because I only want to record every 100th point, and additionally I want the program to stop if a data point falls outside the range where it should be, meaning the testing device is breaking down. The single sample by itself seems to take approximately 10 ms per point. I made a separate VI to check this: just a DAQ Assistant that acquires one analog value, inside a for loop set to run 1000 times. To my knowledge I did not use any Wait (ms) or other delay, yet the program took about 10 seconds to execute.
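For clarity, here is a minimal text-language sketch of the loop logic described above (keep every 100th point, stop on an out-of-range reading). This is not LabVIEW and not the NI-DAQmx API; `read_sample`, `EXPECTED_RANGE`, and `DECIMATION` are placeholder names, and the hardware read is simulated with a plain function so only the loop structure is shown:

```python
EXPECTED_RANGE = (0.0, 5.0)   # assumed healthy-device limits in volts (placeholder)
DECIMATION = 100              # record every 100th point

def read_sample(i):
    # Stand-in for one on-demand analog read; returns a fake voltage
    # that drifts out of range at i = 950 to demonstrate the stop condition.
    return 2.5 if i < 950 else 7.0

def acquire(n_samples):
    recorded = []
    for i in range(n_samples):
        value = read_sample(i)
        if not (EXPECTED_RANGE[0] <= value <= EXPECTED_RANGE[1]):
            break                   # out of range: device breaking down, stop
        if i % DECIMATION == 0:
            recorded.append(value)  # keep only every 100th point
    return recorded

print(len(acquire(1000)))  # 10 points kept before the simulated fault at i = 950
```

In a real acquisition, each call to `read_sample` would be one software-timed read, which is where the per-call overhead in question shows up.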
My understanding was that the sample rate is 40 kS/s, which should be much faster than 10 ms per sample. When I run N Samples with N = 1000, I can set the sample rate faster than one sample per ms (10 kHz). Is there some default timing applied to a Single Sample read, or is there just a 10 ms delay that comes with calling the DAQ Assistant?
Thank you very much for any help.
06-30-2009 01:10 PM