02-16-2015 11:45 AM
I am using a camera to start acquiring at 300 fps when IR sensors are activated. The program is also saving the frame and the correspondent time stamp in a txt file.
When I analyze this file with MATLAB's 'diff' function (which gives me the time difference, in ms, between consecutive frames), I see something weird happening.
Most of the time the camera is acquiring correctly (an image every 3-4 ms), but every 1000 ms there is a weird peak where the data rate slows down. Also, at the beginning of the movie (the first ~180 ms) the camera is acquiring 2 images in the same millisecond (see camera_frames.txt).
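For readers without MATLAB, the analysis described above can be sketched in Python; the timestamp values here are made up for illustration, but the computation is the same element-wise difference that MATLAB's diff performs.

```python
# Hypothetical timestamp values (ms), standing in for the saved txt file:
# frames ~3-4 ms apart, with one large stall like the periodic peaks described.
timestamps_ms = [0.0, 3.0, 7.0, 10.0, 14.0, 1014.0]

# Equivalent of MATLAB's diff: time between consecutive frames.
intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

print(intervals)       # normal frames show ~3-4 ms spacing
print(max(intervals))  # the largest interval exposes the stall
```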
My question is: why are these two phenomena happening, and how can I fix them?
The labview program, the txt file and the diff figure are attached. Thanks in advance.
02-16-2015 05:58 PM
Try decoupling the image acquisition from saving to disk by placing the corresponding code in separate loops. Also make sure you have enough cores available, say an i5 or similar. The thing to remember is that your code is not running on an RT target, so perfectly consistent timing is not something you can expect.
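The "separate loops" suggestion above is the classic producer/consumer pattern. A minimal Python sketch of the idea follows; grab_frame and the saved list are hypothetical stand-ins for the real camera-grab and disk-write calls (in LabVIEW this would be two while loops connected by a queue), so the point is the structure, not the API.

```python
import queue
import threading

frame_queue = queue.Queue()   # unbounded buffer between the two loops
STOP = object()               # sentinel telling the consumer to finish

def grab_frame(i):
    # Placeholder for the actual camera grab call.
    return f"frame-{i}"

def acquisition_loop(n_frames):
    # Producer: only grabs frames and enqueues them; never touches the disk,
    # so slow file writes cannot stall the acquisition rate.
    for i in range(n_frames):
        frame_queue.put(grab_frame(i))
    frame_queue.put(STOP)

saved = []

def saving_loop():
    # Consumer: dequeues frames and "writes" them at its own pace.
    while True:
        frame = frame_queue.get()
        if frame is STOP:
            break
        saved.append(frame)  # placeholder for the actual file write

producer = threading.Thread(target=acquisition_loop, args=(5,))
consumer = threading.Thread(target=saving_loop)
producer.start(); consumer.start()
producer.join(); consumer.join()
print(len(saved))  # → 5
```

The queue absorbs timing jitter: the producer keeps running at camera speed even when an individual write is slow.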
02-17-2015 05:57 AM
The image acquisition and the image saving are already in two different loops using the master/slave architecture.
I checked CPU usage and it is using a full core, with the high-performance power plan (i7 3.40 GHz processor).
02-19-2015 03:50 AM
Hi,
It seems that you have, in the same while loop, functions that acquire both DAQ data and image data...
I also don't know where you get this information about the time between frames. Is it from a text file that you build, or is it extracted from the video file?
Paolo_P
Certified TestStand Architect
Certified LabVIEW Architect
National Instruments France
02-20-2015 11:32 AM
I solved half of the problem. I removed the IR sensor acquisition (the trigger that finalizes the camera recording) from the camera acquisition loop, and the slow data-rate peaks that were appearing every 1000 ms disappeared.
The phenomenon at the beginning of the acquisition (the first ~200 ms) is still occurring. I think this problem is coming from the camera itself. Does anyone have a clue how to fix it?
02-20-2015 12:45 PM
You need to configure your image buffers in advance of the acquisition, because it takes time to reserve that memory.
The overall structure of your code is problematic in that everything happens linearly, along the wire paths.
What kind of camera are you using? I assume it is either Camera Link or FireWire. If the camera has trigger lines, you should use them instead of relying on software triggering.
I would strongly recommend that you look at the state machine architecture; the JKI State Machine toolkit is particularly good. Use at least two state machines: one for camera management and another for the DAQ, if you still need it.
Here is what I have in mind for the camera state machine:
ON_IDLE_UPDATE_STATUS
OPEN_CAMERA (called when initializing)
RESERVE_BUFFERS (called when initializing)
WAIT_FOR_TRIGGER (called during IDLE; or, if you want to use it as an arming state, you can think of it as a secondary IDLE state)
CAPTURE (called when the trigger asserts)
CLOSE_CAMERA (called when the program quits)
SAVE_IMAGES (can be called from IDLE, or from a GUI event)
Again, don't write to disk during acquisition. Store as much as you can in memory, and dump it to disk at IDLE.
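As a text sketch of the camera state machine proposed above (the real thing would be a LabVIEW VI): the state names follow the list in the post, while the transition table is an assumption added purely for illustration. Note that buffers are reserved once, right after opening the camera, so the memory-allocation cost is paid before the first trigger rather than during capture.

```python
from enum import Enum, auto

class CamState(Enum):
    IDLE = auto()
    OPEN_CAMERA = auto()
    RESERVE_BUFFERS = auto()
    WAIT_FOR_TRIGGER = auto()
    CAPTURE = auto()
    SAVE_IMAGES = auto()
    CLOSE_CAMERA = auto()

# Illustrative transition table (an assumption, not from the original post):
# initialization reserves buffers up front; saving happens only after capture.
TRANSITIONS = {
    CamState.OPEN_CAMERA: CamState.RESERVE_BUFFERS,
    CamState.RESERVE_BUFFERS: CamState.IDLE,
    CamState.IDLE: CamState.WAIT_FOR_TRIGGER,
    CamState.WAIT_FOR_TRIGGER: CamState.CAPTURE,
    CamState.CAPTURE: CamState.SAVE_IMAGES,   # dump to disk only after capture
    CamState.SAVE_IMAGES: CamState.IDLE,
}

def run(start, steps):
    # Walk the machine a fixed number of steps, recording the visited states;
    # any state without an outgoing transition falls through to CLOSE_CAMERA.
    state = start
    path = [state]
    for _ in range(steps):
        state = TRANSITIONS.get(state, CamState.CLOSE_CAMERA)
        path.append(state)
    return path
```

For example, starting at OPEN_CAMERA, the machine passes through RESERVE_BUFFERS and IDLE before it ever waits for a trigger, which is exactly the "configure buffers in advance" advice.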