LabVIEW


How does the Elapsed Time function work, and what accuracy should I expect?

Solved!

I am trying to time an operation to millisecond accuracy and have been using the Elapsed Time function found in LabVIEW. Because I am getting so much lag in my timing cycle, I had to go back and review the Elapsed Time function. I set up a test VI (attached) which allows me to set an elapsed-time target as an input and collect the actual time. In my full application I have been using the output as the timing parameter, which results in a large lag in my process. Why doesn't the input target match the output result, and is there a way around this?

 

I attached a test VI (version 8.5) that allows me to set an input target and view an output value. I also tried to add an offset correction value, hoping to establish an offset that accommodates the error. However, I find that the output does not match my target, and the error is random and not predictable.

 

This random inaccuracy makes it impossible to use this as a timing device. Note that I need accuracy at the millisecond level, and the research I have done in the help files suggests that LabVIEW should have no problem with millisecond accuracy.

 

Why does this happen, and is there any way I can get around it to get the level of accuracy I need?

 

Thanks to anyone who can help.

Message 1 of 13

First, the Elapsed Time function uses the Get Date/Time function at its core, which is based on the Windows clock. The accuracy of the Windows clock is only on the order of about 16 ms.
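You can see that granularity directly with a small probe. Here is an illustrative sketch in Win32 C rather than LabVIEW (the API calls are an assumption about what sits underneath, not a statement of LabVIEW internals): it spins until the system clock changes and prints each step.

```c
/* Illustrative Win32 C sketch, not LabVIEW: spin until the system clock
   changes and print each step. Steps of roughly 15-16 ms are common at
   the default system timer resolution. */
#include <windows.h>
#include <stdio.h>

static ULONGLONG system_time_100ns(void)
{
    FILETIME ft;
    GetSystemTimeAsFileTime(&ft);      /* system clock, in 100 ns units */
    return ((ULONGLONG)ft.dwHighDateTime << 32) | ft.dwLowDateTime;
}

int main(void)
{
    ULONGLONG prev = system_time_100ns();
    for (int i = 0; i < 5; i++) {
        ULONGLONG now;
        do { now = system_time_100ns(); } while (now == prev);
        printf("clock stepped by %.3f ms\n", (now - prev) / 10000.0);
        prev = now;
    }
    return 0;
}
```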

 

You may want to consider using the Tick Count function on the Timing palette. It isn't based on the Windows clock but uses a clock at a lower hardware level.

 

Also, why are you reading and writing to the Value property nodes of your front panel controls? Property nodes are slower than reading or writing directly to the terminals, or even to local variables, by one or two orders of magnitude.

 

Also, you have several while loops in there that are running as fast as possible, consuming 100% of the CPU. You may want to put at least a 0 ms Wait function in those loops to give the CPU time to handle other tasks (such as updating the Windows clock). A text-based sketch of that pattern follows.
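A minimal sketch of the idea in Win32 C (poll_wait is a hypothetical helper name; Sleep(0) plays the role of the 0 ms Wait):

```c
#include <windows.h>
#include <stdio.h>

/* Hypothetical helper: poll until interval_ms has elapsed, yielding on
   every pass. Sleep(0) gives up the rest of the time slice so other
   tasks (including timer updates) get CPU time; the unsigned
   subtraction also survives GetTickCount() rollover. */
static void poll_wait(DWORD interval_ms)
{
    DWORD start = GetTickCount();
    while ((DWORD)(GetTickCount() - start) < interval_ms)
        Sleep(0);                  /* the "0 ms Wait" of this sketch */
}

int main(void)
{
    DWORD t0 = GetTickCount();
    poll_wait(50);                 /* wait ~50 ms without starving other tasks */
    printf("waited about %lu ms\n", (unsigned long)(GetTickCount() - t0));
    return 0;
}
```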

Message 2 of 13

As Ravens already said, your code is highly flawed. Here are a few more points:

 

If something needs to happen after an exact time interval, you would set up an appropriate delay so it happens automatically at the right time. You don't want to poll a timer like crazy. You could use an event timeout or occurrence timeout, etc.
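A minimal sketch of the timeout idea, in Win32 C rather than LabVIEW (the auto-reset event stands in for a LabVIEW occurrence; the names and the 250 ms value are illustrative): the code wakes when signaled or when the timeout expires, consuming no CPU while waiting.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Auto-reset event standing in for a LabVIEW occurrence. */
    HANDLE evt = CreateEvent(NULL, FALSE, FALSE, NULL);
    DWORD rc = WaitForSingleObject(evt, 250);   /* wake within 250 ms */
    if (rc == WAIT_TIMEOUT)
        printf("timeout elapsed; do the periodic work here\n");
    CloseHandle(evt);
    return 0;
}
```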

 

If you need reproducible accuracy, you need a LabVIEW RT or FPGA system, or some other hardware-timed solution that is independent of the Windows clock.

 

Yes, the Elapsed Time Express VI carries a lot of baggage to support a lot of functionality. All you need is to compare ticks, as in the sketch below.
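Here is what "compare ticks" means, sketched in Win32 C for illustration (Sleep merely stands in for the operation being timed):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD start = GetTickCount();             /* tick snapshot before the work */
    Sleep(100);                               /* stand-in for the timed operation */
    DWORD elapsed = GetTickCount() - start;   /* "compare ticks" */
    printf("elapsed: %lu ms\n", (unsigned long)elapsed);
    return 0;
}
```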

 

Don't architect a diagram like an onion, with layers and layers of stacked structures. A single outer while loop is enough; the rest could be a single case structure with one case for each "state" of the program.

 

None of your local variables and Value properties are needed. Use latch-action booleans.

 

There are race conditions! For example, your code does not guarantee that the elapsed time is set via the Value property before the associated control is read inside the inner while loop. The loop and the two Value properties don't have any data dependency, so the execution order is not defined. (LabVIEW does not execute left to right!)

 

Message 3 of 13

Using the Tick Count function actually solves a few problems. I updated a VI to implement this and attached it. Do you feel the tick counter is sub-millisecond accurate? Note that I am a brute-force programmer whose expertise lies in the application rather than the programming, so many of the suggestions I received are a bit over my head. You mentioned putting a 0 ms Wait in the loops: can you point out where it should go in the code and what I gain by doing this?

Message 4 of 13

I am viewing a video signal while I am watching the timed cycle, and after the time target is hit (I hope) I am saving an image. Does your suggestion that not polling the timer would be better still apply if I want to see the video signal running in real time? Can you provide an example of an event or occurrence timeout?

 

Note that my expertise is in the application of the program and not the programming, so some of your suggestions are over my head. Any examples of your suggestions would be appreciated.

 

I updated my sample VI to implement the new method per the suggestions I received. Does this better address the race conditions you mentioned?

Message 5 of 13

Hi irfocus,

 

- Don't use property nodes where a local is sufficient.

- Don't use locals where a shift register or a terminal is sufficient!

- Don't apply text-based programming style to LabVIEW!

 

You don't need 5 significant digits for your timer - you only get 3 decimal digits...

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 6 of 13

The 0 msec wait should just be placed inside your while loop.

Message 7 of 13

 


@irfocus wrote:

Do you feel the tick counter is sub-millisecond accurate?   


How can you even ask this question, when the value is an integer in units of milliseconds? 😮

(Actually, on some systems it is quantized to two milliseconds.) The answer is definitely no!
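You can check the quantization on your own machine with a quick probe. A Win32 C sketch for illustration (it assumes GetTickCount behaves like the millisecond tick counter LabVIEW reads, which may not hold exactly): spin until the counter steps and print each increment.

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Probe the step size of the millisecond tick counter. Depending on
       the system you may see 1 ms, 2 ms, or coarser steps. */
    DWORD prev = GetTickCount();
    for (int i = 0; i < 10; i++) {
        DWORD now;
        while ((now = GetTickCount()) == prev)
            ;                          /* spin until the counter steps */
        printf("step %d: +%lu ms\n", i, (unsigned long)(now - prev));
        prev = now;
    }
    return 0;
}
```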

 

See also this answer for more details.

 

At least on Windows, you can create a microsecond tick count analogue, but it is pretty meaningless.
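A sketch of how such an analogue can be built, assuming the Win32 high-resolution counter is what backs it (microsecond_ticks is a made-up helper name): scale QueryPerformanceCounter by its frequency. This buys precision, not accuracy; a non-real-time OS can still preempt you for many milliseconds.

```c
#include <windows.h>
#include <stdio.h>

/* Hypothetical microsecond "tick count": the high-resolution counter
   scaled to microseconds. Fine-grained, but not hardware-timed. */
static double microsecond_ticks(void)
{
    LARGE_INTEGER freq, count;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&count);
    return count.QuadPart * 1000000.0 / freq.QuadPart;
}

int main(void)
{
    double t0 = microsecond_ticks();
    Sleep(10);                               /* nominally 10 ms */
    printf("elapsed: %.1f us\n", microsecond_ticks() - t0);
    return 0;
}
```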

 

The problem with your polling approach is that it consumes so much CPU that it will actually delay other processes.

 

Message 8 of 13

Sorry that you misunderstood the question; in fact, this is a legitimate question when trying to determine accuracy. It doesn't matter that the value we get is an integer: that says nothing about its origin or its accuracy. If the tick counter provides a millisecond value as an integer, it must be monitoring something with more precision than that integer for the value to be trustworthy. Case in point: the Elapsed Time function that I started with provides double-precision input and output, but for a precision timer both seem to be useless numbers.

 

So my question remains: how accurate is this function?

 

Replacing the Elapsed Time function with the Tick Count function appears to have improved my application, but I would like to know where I stand, since I have no way to benchmark this and have to go on faith.

 

Message 9 of 13
Solution
Accepted by topic author irfocus

Did you read the link I posted?

 

You are not on a real-time OS, so no matter what you do, you are digging yourself into a hole here. 😉

Message 10 of 13