Machine Vision


image acquisition timeloop whileloop

Hi,

Can you please help me to understand why the results are different for the two attached block diagrams?

I have a linescan camera with an operating frequency of 1500 Hz. In order not to run into speed problems, I have configured MAX to send packages of 200 lines.

That means each package takes 133 ms, so the rate is about 7.5 frames per second.
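As a sanity check on the numbers above, the arithmetic can be written out in a few lines (plain Python for illustration only; LabVIEW itself is graphical, so this just mirrors the calculation, not the block diagram):

```python
# Frame-interval arithmetic for the setup described above.
line_rate_hz = 1500        # linescan line frequency
lines_per_package = 200    # package size configured in MAX

package_interval_s = lines_per_package / line_rate_hz   # ~0.133 s per package
frame_rate_fps = 1.0 / package_interval_s               # packages per second

print(f"{package_interval_s * 1000:.1f} ms per package, {frame_rate_fps:.1f} fps")
```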

For the acquisition I have created the same program twice: once using a while loop with a Wait function, and once using a timed loop.

The block diagrams of the two programs are illustrated in the attached files.

Both programs have an initial part that creates and configures the buffers, which is the same as the ring acquisition example shipped with LabVIEW.

So what you see in the pictures is the basic part: extracting the buffer and just displaying it.

In the while loop, as you can see in the picture, the value wired to the Wait function is 130 ms. I just want to wait 130 ms between consecutive runs without doing anything, and then execute. The result is that the time needed to extract the buffer is 3 or 4 ms. That makes sense to me: 130 ms of waiting plus another 4 ms for the execution gives the same frame rate as stated above. I get the same result whether the Wait function is at the beginning or at the end, as in the attached picture.

Now I have tried to reproduce the same thing with a timed loop. It is the same block diagram as before, but using a timed loop.

I have entered dt = 135 ms and an initial offset of 135 ms, so there is enough time for the first image to arrive in the buffer. Then every 135 ms it should just extract the image. The problem is that the measured time to extract the buffer varies around 130 ms. Since the image is already in the buffer, why is it not 4 or 5 ms as in the while loop, but around 130 ms instead? I can't understand this. How can I get the same behaviour with the timed loop as with the while loop, which looks more normal to me?

I have tried various settings for dt and offset but never managed to reproduce the same result as with the while loop.

Any explanation?

Thank you in advance.

 

Achille

 

P.S. Why does just rolling the mouse over the image display on the front panel, without doing anything, change the calculated time so much? Is there any way to deactivate this effect?

Attachments: imaq_whileloop.PNG, imaq_timeloop.PNG

 

Message 1 of 8

Achille,

 

The timed loop will (by default) try to maintain the phase of each loop iteration's timing. Thus, if you start off out-of-phase with the camera's frame interval then you will remain forever out-of-phase. The normal loop that simply waits _after_ getting an image will thus self-correct automatically to remain in-phase with your image acquisition.

 

I think the problem stems from the fact that the t0 offset into the timed loop is just relative to other timed loops by default. It is not necessarily a fixed offset from when the loop starts executing. I think in LV 2009 you can have it start at an absolute time.

 

You could try fixing it by making the timed loop adjust its next iteration offset within each loop iteration so it would compensate the same as the non-timed loop. You likely would want to do this anyways since the camera's frame rate is likely driven by its own clock which will drift in relation to the timed loop's clock and will eventually become out-of-phase regardless of how close they start off at.
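The phase effect described above can be mimicked with a toy simulation (plain Python, not LabVIEW; the function names, dt = 135 ms, and the 133.3 ms frame interval are taken from this thread, everything else is illustrative). A fixed-phase timer wakes at multiples of dt regardless of when frames arrive, so it blocks almost a whole frame period inside the extract call; a loop that waits and then blocks on the extract call re-synchronizes itself to the camera every iteration:

```python
import math

FRAME_INTERVAL = 400.0 / 3.0   # ms per frame: 200 lines at 1500 Hz = 133.33 ms

def next_frame_time(t):
    """Arrival time of the first frame at or after t (frames at 0, T, 2T, ...)."""
    return math.ceil(t / FRAME_INTERVAL) * FRAME_INTERVAL

def timed_loop_block_times(dt, offset, n):
    """Timed loop: wakes at fixed multiples of dt (phase is maintained),
    then blocks inside the extract call until the next frame arrives."""
    return [next_frame_time(offset + i * dt) - (offset + i * dt)
            for i in range(n)]

def while_loop_block_times(wait_ms, n):
    """While loop: Wait (ms), then block on the extract call; the blocking
    call re-synchronizes the loop to the camera on every iteration."""
    t, blocks = 0.0, []
    for _ in range(n):
        t += wait_ms                 # the Wait function
        arrival = next_frame_time(t)
        blocks.append(arrival - t)   # time spent inside the extract call
        t = arrival                  # loop resumes when the frame arrives
    return blocks
```

With dt = 135 ms and offset = 135 ms the simulated timed loop blocks for well over 100 ms every iteration, while the 130 ms wait-then-extract loop settles at a few milliseconds per extract, matching the two observations in the original post.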

 

I would be curious to know why you are using the timed loop anyways. They have their uses, but you have to be careful where and how you use them or else they can hurt you more than they help.

 

Eric 

Message 2 of 8

Thank you for the answer. The reason I am using a timed loop is that I need two of them: one with high priority that does the image acquisition, and a second one with lower priority that processes the captured frames.

The image acquisition is crucial since I don't want to miss any frame.

As I understand from your answer, it is difficult to synchronize the timed loop execution with the card sending frames.

If someone wanted to run this application on a real-time OS (like on a PXI), how could they synchronize the timed loop with the frame acquisition? I mean, on an RT OS, as far as I know, it is a must to use timed loops.

Thank you once more for trying to help.

 

Any idea about the issue written in the P.S of the original message? 

 

Message 3 of 8

Hello,

 

I am not exactly clear on the behavior you are trying to describe in the P.S. of your original post. Can you provide a more detailed explanation? If it simply takes additional time when the mouse enters the image display, this could be because the display has to redraw with the mouse over it.

 

-Zach

Message 4 of 8

Thank you for the reply Zach.

As you can see in the while loop block diagram of my original post, I am calculating the time needed to copy the buffer and display it. Normally, when the mouse is anywhere on the front panel other than the image display area, this process takes about 2 ms.

If I move the mouse into the image display area while images are being displayed, the same indicator shows about 100 ms.

I don't know why this happens, but I want to display the acquired image on the front panel without the calculated time being affected by the mouse position.

Is it possible ?

Thank you once more.

 

Achilles 

Message 5 of 8

Hello,

 

It sounds like the front panel is having to redraw the image display when your mouse is over it. I would recommend using the property Defer Panel Updates to control when the front panel is updated. You would set the property to true in a sequence frame before you get your first tick count and then set it back to false immediately after you get the second tick count. 

http://zone.ni.com/reference/en-XX/help/371361F-01/lvprop/pnl_defer_pnl_updts/

Let me know if this has any effect.

 

-Zach

Message 6 of 8

Hello Zach,

I have tried your proposed solution without much success. The reason is that when the image display is updated, the rest of the front panel controls are also updated if there are pending changes. That means the calculated time varies and depends on the number of pending front panel updates.

What I need is to display the image without the calculated time being affected by the mouse position.

As I have written, I will have two loops in my VI: one for the acquisition and one for the processing. If we assume the calculated time in the acquisition loop remains constant, then you know how much time is available for processing without losing frames or overwriting the buffer. But if the time in the acquisition loop varies, then after some time you will definitely lose frames.

Why is it necessary to redraw the image when someone places the mouse over it? I have seen many applications developed on other platforms where the timing does not vary or depend on the mouse position. Isn't this possible in LabVIEW?

If not, please take it into consideration, because to my eyes at least it is a serious bug that when someone creates an image acquisition application, the timing varies according to where the mouse is placed on the front panel.

Message 7 of 8

Hello,

 

If you are placing the mouse over the image display, it has to redraw the display in order to account for the mouse pointer on the screen and any movement from the mouse. If you need to process the image in a certain amount of time, separate the display and processing functions. You could create a copy of the image for a display loop and then use that loop to display the image with the current copy. This would separate the processing and display functions so that they would happen in parallel and you shouldn't lose frames in processing.
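The suggestion above amounts to a producer/consumer pattern. A rough sketch of the idea (plain Python with threads and a queue, not LabVIEW; the frame contents and "processing" are placeholders): the time-critical loop processes every frame and only *offers* a copy to the display loop, never blocking on the UI.

```python
import queue
import threading

display_q = queue.Queue(maxsize=1)   # at most one pending display copy

def acquire(frames, processed):
    """Time-critical loop: process every frame, offer a copy to the display."""
    for frame in frames:                      # stands in for the ring buffer
        processed.append(sum(frame))          # placeholder 'processing'
        try:
            display_q.put_nowait(list(frame))  # copy handed to the display loop
        except queue.Full:
            pass                              # display busy: drop the copy,
                                              # never block the acquisition

def display(shown):
    """Display loop: draws whatever copy is current; None is a stop sentinel."""
    while True:
        frame = display_q.get()
        if frame is None:
            break
        shown.append(frame)                   # stands in for drawing the image
```

The key design point is `put_nowait` with a bounded queue: a slow redraw (e.g. the mouse hovering over the display) costs at most a dropped display copy, while the acquisition/processing loop keeps its constant timing.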

 

-Zach

Message 8 of 8