Multifunction DAQ


Looking into replacing an Arduino Uno unit with a USB-6009 for timing delay solution

Hello, I am relatively new to DAQs and I am looking for a more permanent solution for implementing a variable timing delay in my electromechanical system. I am currently using an Arduino Uno to control certain electrical triggers within the experiment, which only needs an accuracy of 1 ms for 3 outputs. I am pretty sure an NI USB-6009 is suitable for my needs, but I would like to confirm that I do not need a more robust system. Thanks!

Message 1 of 5

No.

 

The USB-6009 has software timed outputs.  There is no way you can achieve 1 ms accuracy.

 

Lynn

Message 2 of 5

Can you elaborate on software-timed outputs? My assumption with these systems is that as long as the proper coding is done on the software side through LabVIEW, it could be done. Is the limitation in the LabVIEW software itself, or is it that the latency from the program to the DAQ to the system would prevent 1 ms timing? Is there another product that would be able to do this? Theoretically speaking, what is the best accuracy I can expect?

Message 3 of 5

It's all about the software here.  Most of the time you can time things to within a few ms, but the potential exists for many seconds of delay.

 

It all depends on what the computer's doing... operating system, USB system, drivers, etc.
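The point above is easy to demonstrate on any desktop OS. Here is a minimal sketch in Python (the same effect applies to a LabVIEW loop or any other software-timed program); the function name and iteration count are my own choices, and the numbers you see will vary with OS and system load:

```python
import time

def measure_sleep_jitter(target_s=0.001, n=200):
    """Request a 1 ms sleep repeatedly and record how far off each wakeup is.

    The OS scheduler decides when the program actually resumes, so the
    overshoot is unbounded in principle -- exactly the problem with
    software-timed outputs.
    """
    errors = []
    for _ in range(n):
        t0 = time.perf_counter()
        time.sleep(target_s)
        elapsed = time.perf_counter() - t0
        errors.append(elapsed - target_s)
    return min(errors), max(errors)

best, worst = measure_sleep_jitter()
print(f"best overshoot:  {best * 1e3:.3f} ms")
print(f"worst overshoot: {worst * 1e3:.3f} ms")
```

On an idle machine the worst case may look acceptable; under load (antivirus scan, USB activity, another program) it can jump to tens or hundreds of milliseconds, which is why 1 ms guarantees require hardware timing.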

Message 4 of 5

Think about how your project would work with the USB-6009.

 

You need to respond to a trigger signal.  You did not specify the voltage, rise time, or duration of the trigger signal. For this example let's assume that you have a logic-level signal which goes from 0 V to +5 V in a few ns and stays high for 1 second.  The response will be a digital output.

 

With the 6009 you can use an analog input with a digital trigger or software timed digital inputs.  Since the digital output is software timed, let's consider the digitally triggered analog input.

 

Your program will set up the analog input to acquire one sample when the digital trigger occurs. Ignoring the setup process, which can take place long before the trigger and has no effect on the response timing, this is the way things will happen.  At each point I show in parentheses which software or hardware process is involved.

 

1. DAQ Read request (VI).
2. Pass request to DAQ Driver (OS).
3. Pass request to hardware (DAQ Driver).
4. Wait for trigger and acquire 1 sample (HW).
5. Return sample message (DAQ Driver).
6. Pass message to DAQ Read (OS).
7. Receive and process sample (VI).
8. DO Write request (VI).
9. Pass request to DAQ Driver (OS).
10. Pass request to hardware (DAQ Driver).
11. Write DO (HW).

 

Things in the VI category can happen in nanoseconds to very long times, depending on how you write your program. For this example a few microseconds is probably a reasonable estimate.  Similarly, for single-point data the DAQ Driver parts are probably on the microsecond scale. The OS can take as much time as it wants.  You have no control unless you are using a real-time, deterministic OS.  Even Linux can have unpredictable latencies. On any desktop OS the latencies may often be in the few-milliseconds range and can on rare occasions be much longer. The hardware latency is probably in the microsecond range.
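Those estimates can be added up into a rough latency budget for the whole trigger-to-output chain. The specific numbers below are illustrative order-of-magnitude assumptions of mine, not measurements; the point is that the OS term dominates both the typical and the worst case:

```python
# Rough latency budget for the trigger-to-output chain, in seconds.
# Each entry is (typical, pessimistic); all values are illustrative
# assumptions, not measured figures for any particular system.
stages = {
    "VI (LabVIEW code)":        (2e-6, 10e-6),
    "DAQ Driver (three passes)": (3e-6, 30e-6),
    "OS scheduling (two passes)": (1e-3, 100e-3),  # unbounded in principle
    "Hardware":                 (1e-6, 10e-6),
}

typical = sum(lo for lo, hi in stages.values())
worst = sum(hi for lo, hi in stages.values())
print(f"typical round trip: ~{typical * 1e3:.2f} ms")
print(f"pessimistic:        ~{worst * 1e3:.1f} ms")
```

Even with generous microsecond estimates for the VI, driver, and hardware stages, the budget is already at about 1 ms in the typical case, and the OS alone can blow it out by two orders of magnitude.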

 

Lynn

Message 5 of 5