LabVIEW


Measuring time for a while loop to execute once

Solved!
Go to solution

Hello guys,

 

I want to measure the time it takes for a while loop to execute once. There is a sizable chunk of code involved, so I created a simple VI to try it out first. Please let me know if this is the right way to do it.

 

I was also wondering why, in this case, when I run the VI without Highlight Execution, it doesn't give me a number. Maybe that's because the code is so simple that it runs really fast? Do I have to use Highlight Execution all the time?

 

Thanks

0 Kudos
Message 1 of 8
(6,063 Views)
Solution
Accepted by topic author huskerli

Not quite right. Both get-time primitives will execute at about the same time. Use a sequence frame to force the execution order, as shown. I also brought in the precision relative seconds VI from VI.LIB\utilities, as it depends on the system's precision timer rather than the mSec timer.

0.png
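
If it helps to see the same idea outside of a diagram, here is a minimal Python sketch of what the sequence frame enforces: the first timer read happens strictly before the loop and the second strictly after, with a high-resolution clock (`time.perf_counter()`) standing in for the precision relative seconds VI. The loop body is just a placeholder:

```python
import time

start = time.perf_counter()        # first read: forced to occur before the loop

total = 0.0
for _ in range(1_000_000):         # placeholder for the while loop under test
    total += 1.0

elapsed = time.perf_counter() - start   # second read: forced to occur after the loop
print(f"elapsed: {elapsed:.6f} s")
```

In a text language, statement order supplies the sequencing for free; in a dataflow language, that is exactly what the sequence frame (or a wire dependency) has to provide.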


"Should be" isn't "Is" -Jay
Message 2 of 8
(6,054 Views)

OK, there are two issues with your code.

One is that you have both of the ms Tick Counts outside the loop, so you cannot control whether the second Tick Count executes before or after the while loop completes.

If you are simply trying to test the speed of a loop, you can use a Sequence Structure and put the Tick Count that marks the end of the loop in the final frame. See the attached VI.

 

Your second issue is that you are trying to time a loop that most likely takes a fraction of a ms to complete, using a ms Tick Count. Unless there is something seriously wrong with your PC, you will always get zero as an answer, since the loop finishes in less than a ms. One way to work around this is to increase the number of iterations of the loop and then divide the time it took to complete by the number of iterations, as sketched below.
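
In Python terms, the work-around looks like this (the iteration count and the loop body are placeholders; the point is only that the total must rise well above the timer's resolution before the division means anything):

```python
import time

N = 1_000_000                     # enough iterations that the total is well above 1 ms

start = time.perf_counter()
acc = 0.0
for _ in range(N):
    acc += 1.0                    # stand-in for the loop body being timed
elapsed = time.perf_counter() - start

# Amortize: total time / iterations = average cost of one iteration
print(f"total: {elapsed * 1e3:.3f} ms, per iteration: {elapsed / N * 1e9:.1f} ns")
```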

 

Good luck 

Message 3 of 8
(6,050 Views)

Nah, don't use a sequence frame to force sequential execution; just put your second timer in a second while loop, like this:

 

time.png

 

Of course, this will only give an approximate time, because it will be the total time for 999,999 loops and one timer read.

 

BTW, if you leave your waveform chart inside the first loop, you're talking about 540 msec/1,000,000 loops (on my computer, YMMV). If you take it out, you have cut that time to about 150 msec/1,000,000 loops (150 nsec/loop). Screen refresh is expensive.

 

 

But if you just want to see how long it takes to execute a generic while loop, take the random number function out (and the graph, just to make sure it doesn't interfere); then you get about 9 nsec/loop!
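
The same experiment, with-work versus empty-loop, is easy to reproduce in any language. A sketch (`per_iter_ns` is a made-up helper, and the absolute numbers will differ wildly from LabVIEW's, since a Python function call has its own overhead):

```python
import random
import time

def per_iter_ns(body, n=1_000_000):
    """Return the mean cost of one call to body(), in nanoseconds."""
    start = time.perf_counter()
    for _ in range(n):
        body()
    return (time.perf_counter() - start) / n * 1e9

print(f"empty body      : {per_iter_ns(lambda: None):7.1f} ns/iter")
print(f"random() in body: {per_iter_ns(random.random):7.1f} ns/iter")
```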

 

 

Cameron

 

To err is human, but to really foul it up requires a computer.
The optimist believes we are in the best of all possible worlds - the pessimist fears this is true.
Profanity is the one language all programmers know best.
An expert is someone who has made all the possible mistakes.

To learn something about LabVIEW at no extra cost, work the online LabVIEW tutorial(s):

LabVIEW Unit 1 - Getting Started
Learn to Use LabVIEW with MyDAQ
Message 4 of 8
(6,045 Views)

Thank you for the reply.

 

I just want to ask: when I increase the number of iterations in the loop and then divide the time it took to complete by the number of iterations, what I get is not quite linear. Is that because each iteration takes a fraction of a ms?

 

Thanks

 

0 Kudos
Message 5 of 8
(6,003 Views)

Not Quite Linear.

 

Yup, that's one way to put it. The optimizer can do a number of things under the hood and will rework the code to improve VI performance. It's pretty smart! If the performance were linear, I'd be really surprised, or I'd suspect you had a really old processor. (Not too likely, with you running a modern version of LabVIEW.)

 

The granularity of the mSec timer adds some small error as well, but if you aren't using the UI thread in the while loop, a fair amount of the processor time goes into setting up the loop boundaries, with much less needed to actually execute each iteration.
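
To make the granularity point concrete: a timer that only ticks once per millisecond reads zero for anything faster than one tick. A sketch, emulating a coarse ms timer by truncating `time.monotonic()` (an illustration only, not how LabVIEW's timer is actually implemented):

```python
import time

def ms_ticks() -> int:
    """A millisecond tick count, emulating a coarse mSec timer."""
    return int(time.monotonic() * 1000)

# Something that finishes in well under a millisecond...
t0 = ms_ticks()
_ = sum(range(1000))
print("mSec timer reports:", ms_ticks() - t0, "ms")        # almost always 0

# ...is still resolvable with a high-resolution timer.
t0 = time.perf_counter()
_ = sum(range(1000))
print(f"perf_counter reports: {(time.perf_counter() - t0) * 1e6:.2f} µs")
```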


"Should be" isn't "Is" -Jay
Message 6 of 8
(5,992 Views)

Last question

 

Since the mSec timer is not the precision relative seconds timer, and its value cannot be converted to a real-world time, does that mean I cannot use that value as a reference to switch to the AC module?

 

The reason I am doing this is that I want to add functionality to AC.

 

Do I need to measure the real-world time in order to switch to AC?

 

Thanks

0 Kudos
Message 7 of 8
(5,972 Views)

As for the non-linear time periods: the majority of the variation is caused by the fact that you are using a Windows-based PC, not a real-time system.

You have to understand that sometimes, while you are running software on the PC, Windows will decide to perform some low-level tasks (check the network connection, check for USB connections, start running antivirus software). All of these tasks affect the determinism of your code, meaning there is no guarantee that a given task will be performed within a given time period.

 

There are also more major interferences. For example, if you run your while loop with the display control outside the loop instead of inside, you should get a more stable timing plot (most likely, depending on your system). This is because you are running the while loop without a delay, and the UI queue only uses one thread to execute, so you are most likely building up the UI queue.

 

The moral of the above comment is: "You can't have deterministic Windows-based software; that's why we have real-time systems."
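
One quick way to see that non-determinism for yourself: repeat an identical timed loop many times and look at the spread between runs. A rough sketch (the numbers are illustrative; the spread is the OS scheduling noise described above, not variation in the code itself):

```python
import time

def timed_run(n=100_000):
    """Time n iterations of a trivial loop body."""
    start = time.perf_counter()
    acc = 0.0
    for _ in range(n):
        acc += 1.0
    return time.perf_counter() - start

# Identical work each run; on a desktop OS the results still scatter.
samples = [timed_run() for _ in range(50)]
lo, hi = min(samples), max(samples)
print(f"min {lo * 1e3:.2f} ms, max {hi * 1e3:.2f} ms, spread {(hi - lo) / lo:.0%}")
```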

 

It would help if you could be a little more specific about what exactly you are trying to do. The ms counter is "fairly" accurate, depending on what you are trying to perform.

 

Dan,

 

Message 8 of 8
(5,966 Views)