Fastest software response from DAQ TTL trigger

We want a USB camera to start acquisition when a TTL edge is detected by our NI PCI-6013 card (Windows XP), with minimal time delay (< 0.1 ms).

 

The acquisition itself works, but the response is too slow and unpredictable. CCD acquisition runs at a 1.5 kHz rate by calling a DLL inside LabVIEW, and that part works fine: once we start calling the DLL in the for-loop we do not miss a single pulse, so USB + camera + LabVIEW are fast enough. The problem seems to be the unpredictable time delay between detecting the edge and calling the DLL. The current program is attached.

 

Right now, a dummy voltage measurement is performed, triggered on the falling TTL edge. Unfortunately, the voltage measurement itself costs time. I am looking for a smarter way of programming, so that execution proceeds directly after the TTL edge is recognized. Any ideas?
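To put a number on the software-timing floor we are fighting, here is a rough, language-agnostic sketch (plain Python, purely illustrative, no DAQ involved): even a bare busy-wait loop on a desktop OS shows per-iteration lateness, and this jitter is a lower bound for any purely software-timed response to the trigger.

```python
import statistics
import time

def measure_loop_jitter(n=500, rate_hz=1500.0):
    """Busy-wait at the laser repetition rate and record how late each
    iteration fires relative to its scheduled time, in milliseconds.
    On a general-purpose OS this lateness is the floor for any purely
    software-timed reaction to a trigger edge."""
    period = 1.0 / rate_hz
    late_ms = []
    start = time.perf_counter()
    for i in range(1, n + 1):
        target = start + i * period
        while time.perf_counter() < target:
            pass  # busy-wait; sleeping instead adds scheduler latency on top
        late_ms.append((time.perf_counter() - target) * 1000.0)
    return late_ms

jitter = measure_loop_jitter()
print(f"mean lateness {statistics.mean(jitter):.4f} ms, "
      f"worst {max(jitter):.4f} ms over {len(jitter)} iterations")
```

On a loaded Windows XP machine the worst-case lateness can easily exceed 0.1 ms, which is why a hardware-level trigger path is usually the only way to meet that bound.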

 

 

Background:

A 1.5 kHz laser system sends electrical triggers to the CCD controller and the chopper. Part of the laser light is modulated by an optical chopper, which outputs a TTL signal to communicate its current state: blocking or open. We want to start acquisition in a known chopper state. (This topic was originally posted in the hardware board, but I realised this is the proper board.)

[Attachment: program zoom.PNG]

 

Message 1 of 3

I am not sure, but I would guess the "problem" is the USB bus: it is non-deterministic. With a low-performance PC, and since it runs on Windows, it may become even more unpredictable.

 

If this is the case, you should choose a camera that accepts a hardware trigger instead.

 

from:

 

https://www.ni.com/en/shop/data-acquisition/how-to-choose-the-right-bus-for-your-measurement-system....

 

" Another important factor in single-point I/O applications is determinism, which is a measure of how consistently I/O can execute on time. Buses that always have the same latency when communicating with I/O are more deterministic than buses that can vary their responsiveness. Determinism is important for control applications because it directly impacts the reliability of the control loop, and many control algorithms are designed with the expectation that the control loop always executes at a constant rate. Any deviation from the expected rate makes the overall control system less effective and less reliable. Therefore, when implementing closed-loop control applications, you should avoid buses such as wireless, Ethernet, or USB that are high in latency with poor determinism."

 

Message 2 of 3

Hello LucasKun,

 

If you want a guaranteed maximum delay/latency of 0.1 ms, then I would not advise using a USB device (unless the camera itself has a trigger input).

 

Requiring a guaranteed maximum delay/latency of 0.1 ms for your measurement system effectively comes down to requiring that your measurement system be deterministic.

 

Depending on what device you're using and what you're trying to do, USB can easily have a delay larger than 0.1 ms (and a variable one at that).

 

What kind of camera are you using?

Does it have a trigger input?

 

Some things that are unclear and can also have an effect:

- What does the VI "read ccd" look like?

- How do you measure the "slowness" and "unpredictability" of the system?

  Is this seen as a varying and "too big" latency?
  If yes: can you provide values (like 0.3 seconds) for this latency?
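To make that last question concrete, a small helper (plain Python here, names illustrative) can summarize a set of measured trigger-to-acquisition latencies against the 0.1 ms requirement; the latency values themselves would come from timestamping the trigger edge and the moment the DLL call actually starts:

```python
import statistics

def deadline_summary(latencies_ms, deadline_ms=0.1):
    """Summarize measured trigger-to-action latencies (milliseconds)
    against a deadline, reporting the fraction of missed deadlines."""
    misses = sum(1 for x in latencies_ms if x > deadline_ms)
    return {
        "mean_ms": statistics.mean(latencies_ms),
        "max_ms": max(latencies_ms),
        "miss_fraction": misses / len(latencies_ms),
    }

# Example with made-up numbers: mostly fast, one 0.5 ms outlier.
print(deadline_summary([0.05, 0.07, 0.06, 0.5, 0.08]))
```

Numbers like these (especially the worst case, not just the mean) would make it much easier to judge whether the bottleneck is USB, Windows scheduling, or the VI structure.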
 

 

Kind Regards,
Thierry C - CLA, CTA - Senior R&D Engineer (Former Support Engineer) - National Instruments
If someone helped you, let them know. Mark as solved and/or give a kudo. 😉
Message 3 of 3