07-05-2013 08:54 AM
It works, but it is too slow and unpredictable. CCD acquisition runs at a 1.5 kHz rate by calling a DLL inside LabVIEW, and that part works fine: once we start calling the DLL in the for loop we do not miss a single pulse, so the USB link, camera and LabVIEW are fast enough. The problem seems to be the unpredictable time delay between detecting the edge and the first call to the DLL. The current program is attached.
Right now, a dummy voltage measurement is performed, triggered on the falling TTL edge. Unfortunately, the voltage measurement itself costs time. I am looking for a smarter way of programming, so that execution proceeds directly after the TTL edge is recognised. Any ideas?
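For anyone who cannot open the VI, here is a rough C-style sketch of the pattern I described, assuming an NI-DAQmx analog-input task is used purely as an edge-triggered "wait" before the camera DLL is called. The device name, PFI line and the ReadCCD() entry point are placeholders for illustration, not the actual names in the attached program:

```c
/* Rough sketch of the current approach: a dummy analog read that is
 * started by the falling chopper edge, followed by the camera DLL call.
 * "Dev1", "PFI0" and ReadCCD() are placeholders, not the real names. */
#include <stdio.h>
#include <NIDAQmx.h>

extern int ReadCCD(void);   /* hypothetical entry point of the camera DLL */

int main(void)
{
    TaskHandle task = 0;
    float64 dummy;

    DAQmxCreateTask("", &task);
    /* One dummy voltage sample, only used to block until the trigger fires */
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, 1);
    /* Start on the falling edge of the chopper TTL on PFI0 */
    DAQmxCfgDigEdgeStartTrig(task, "/Dev1/PFI0", DAQmx_Val_Falling);

    DAQmxStartTask(task);
    /* Blocks until the edge has arrived and the dummy sample has travelled
     * back over USB; the variable time spent in here, plus the gap before
     * ReadCCD() starts, is the unpredictable part. */
    DAQmxReadAnalogScalarF64(task, 10.0, &dummy, NULL);

    ReadCCD();              /* begin the 1.5 kHz acquisition loop */

    DAQmxClearTask(task);
    return 0;
}
```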
Background:
A 1.5 kHz laser system provides electrical triggers to the CCD controller and the chopper. Part of the laser light is modulated by an optical chopper, which outputs a TTL signal to communicate its current state: blocking or open. We want to start acquisition in a known chopper state. (This topic was originally posted in the hardware board, but I realised this is the proper board.)
07-05-2013 01:37 PM
I am not sure, but I would guess the "problem" is the USB bus: it is non-deterministic. With a low-performance PC, and since it runs on Windows, it may become even more "unpredictable".
If that is the case, you should choose a camera that accepts a hardware trigger instead.
From:
" Another important factor in single-point I/O applications is determinism, which is a measure of how consistently I/O can execute on time. Buses that always have the same latency when communicating with I/O are more deterministic than buses that can vary their responsiveness. Determinism is important for control applications because it directly impacts the reliability of the control loop, and many control algorithms are designed with the expectation that the control loop always executes at a constant rate. Any deviation from the expected rate makes the overall control system less effective and less reliable. Therefore, when implementing closed-loop control applications, you should avoid buses such as wireless, Ethernet, or USB that are high in latency with poor determinism."
07-08-2013 02:49 AM
Hello LucasKun,
If you want a guaranteed maximum delay/latency of 0.1 ms, then I would not advise using a USB device (unless it has a trigger input).
Requiring a guaranteed maximum delay/latency of 0.1 ms for your measurement system actually comes down to requiring that the measurement system be deterministic.
Depending on which device you're using and what you're trying to do, USB can easily introduce a delay larger than 0.1 ms, and that delay is variable.
What kind of camera are you using?
Does it have a trigger input?
Some things that are unclear and can also have an effect:
- What does the VI "read ccd" look like?
- How do you measure the "slowness" and "unpredictability" of the system?
Do you see it as a varying and "too large" latency?
If yes: can you provide values (like 0.3 seconds) for this latency? One possible way to log it is sketched below.
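To put a number on the "unpredictability", one option is to timestamp every software recognition of the falling chopper edge and look at the spread of the intervals between them: with a perfectly deterministic path the interval would always equal the chopper period, so any spread is software/USB latency jitter. This is only a sketch under assumptions: WaitForChopperEdge() stands in for whatever currently detects the edge in your program, and the nominal period is a placeholder value.

```c
/* Sketch of one way to quantify the edge-recognition jitter on Windows.
 * WaitForChopperEdge() and NOMINAL_PERIOD_MS are placeholders. */
#include <stdio.h>
#include <windows.h>

#define NOMINAL_PERIOD_MS 1.333     /* placeholder: actual chopper period */

extern void WaitForChopperEdge(void);  /* hypothetical: blocks until falling edge */

int main(void)
{
    LARGE_INTEGER freq, prev, now;
    double min_dev = 1e9, max_dev = -1e9;

    QueryPerformanceFrequency(&freq);
    WaitForChopperEdge();
    QueryPerformanceCounter(&prev);

    for (int i = 0; i < 1000; ++i) {
        WaitForChopperEdge();
        QueryPerformanceCounter(&now);

        /* Interval between consecutive recognitions, in milliseconds */
        double interval_ms = 1000.0 * (double)(now.QuadPart - prev.QuadPart)
                             / (double)freq.QuadPart;
        double dev = interval_ms - NOMINAL_PERIOD_MS;
        if (dev < min_dev) min_dev = dev;
        if (dev > max_dev) max_dev = dev;
        prev = now;
    }
    printf("deviation from nominal period: %.3f ms .. %.3f ms\n",
           min_dev, max_dev);
    return 0;
}
```

The printed minimum/maximum deviation would give concrete latency-jitter values of the kind asked for above.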