09-24-2008 01:19 PM
Hi,
I have a USB-6008 device.
I need to acquire a digital signal with an acquisition time of one microsecond (and this device should be able to do it...).
Anyway, I created a LabVIEW program with a While Loop that records the signal and the acquisition time, and I save the results to a .txt file: when I open the file, the time difference between two acquisitions is at least a millisecond!
Is it due to my software? Is LabVIEW unable to run an iteration of the While Loop in less than a millisecond? Or should I use the analog input instead?
Many thanks,
Giovanna
PS: is it possible to use an external clock with this device? I tried, but I got an error!
09-24-2008 02:19 PM
Giovanna:
Some bad news on both fronts: the 6008's digital inputs are not hardware-timed, so they cannot capture 1 µs intervals, and the ~1 ms time difference you are seeing is a limitation of software timing under Windows. The analog input's maximum sampling rate is also too slow to capture that signal. I'm not familiar with NI's offerings for high-speed digital inputs; hopefully someone else can suggest some options.
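For what it's worth, here is a minimal sketch of what a hardware-timed analog acquisition looks like, written with the nidaqmx Python package rather than as a LabVIEW block diagram (which can't be shown here); the device name "Dev1", the channel "ai0", and the 10 kS/s rate are assumptions for illustration, not Giovanna's actual setup. The point is that the sample clock is configured on the task, so the spacing between samples comes from the device hardware, not from how fast a While Loop spins under Windows.

import nidaqmx
from nidaqmx.constants import AcquisitionType

SAMPLE_RATE_HZ = 10_000   # 6008 analog input tops out around 10 kS/s, i.e. 100 us per sample
NUM_SAMPLES = 1_000       # 0.1 s worth of data

with nidaqmx.Task() as task:
    # One analog input channel; the 6008's digital lines are software-timed
    # only, so the hardware sample clock is available on the analog side.
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")

    # The sample clock rate sets the time between samples; it is adjustable
    # up to the device maximum, not fixed.
    task.timing.cfg_samp_clk_timing(
        rate=SAMPLE_RATE_HZ,
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=NUM_SAMPLES,
    )

    # Timing is handled on the device, so the ~1 ms jitter of a software
    # loop under Windows does not affect the spacing of these samples.
    data = task.read(number_of_samples_per_channel=NUM_SAMPLES)
    print(f"Read {len(data)} samples at {SAMPLE_RATE_HZ} S/s")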
09-25-2008 04:49 AM
Hi...
so just to be clear...
I need to sample at a rate of about one sample every 100 microseconds.
And it looks like I cannot do it.
Now I have one more question: if I use the analog input, can I tune the sampling rate, and how? Or is it fixed?
Many thanks
Giovanna
09-25-2008 06:45 AM