LabVIEW


How to measure the execution time of a specific process in a VI?

Solved!

Hello! My VI has two processes, image compression and edge detection, and I'm trying to measure the execution time of the edge detection process, but I don't know how to do it. Please give me ideas on how to do it. Thank you!

Message 1 of 9
Solution
Accepted by topic author ariahs

There are multiple ways you can do this.

-One is by using Tick Count before and after the code you want to time, as described here: http://digital.ni.com/public.nsf/allkb/6F6B9F4E149C80578625652800784764 (see the sketch after this list for the general pattern).

-Or use VI profiling: https://zone.ni.com/reference/en-XX/help/371361H-01/lvhowto/profiling_vis/
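Since a LabVIEW block diagram can't be pasted as text, here is a rough text-language sketch (Python) of the bracketing pattern, assuming a hypothetical edge_detection placeholder for the step under test; time.monotonic_ns() stands in for Tick Count (ms):

```python
import time

def edge_detection(image):
    # Hypothetical stand-in for the real IMAQ edge-detection step in the VI.
    return [min(255, p + 1) for p in image]

image = [0] * 100_000

# Analog of wiring Tick Count (ms) before and after the code under test.
start_ms = time.monotonic_ns() // 1_000_000   # "Tick Count (ms)" stand-in
edges = edge_detection(image)
stop_ms = time.monotonic_ns() // 1_000_000

print(f"Edge detection took {stop_ms - start_ms} ms")
```

In the VI itself you would enforce the same ordering with the error wire or a flat sequence structure, so the two Tick Count calls really execute immediately before and after the step.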

 

Edit: If you are open to suggestions:

-You don't have to load the image inside the loop; load it once outside the loop instead.

-I also see that many IMAQ buffers are not disposed of properly. You can dispose of all of them at once by wiring a Boolean TRUE to the All Images (No) input of IMAQ Dispose: http://zone.ni.com/reference/en-XX/help/370281P-01/imaqvision/imaq_dispose/

Thanks
uday
Message 2 of 9

Another option (maybe not the best, but...): if you have the Desktop Execution Trace Toolkit, you can insert "Generate User-Defined Trace Event" calls before and after the edge detection step. Then, while the VI is running, filter the trace for your events and see the elapsed time between them.

 

[Attachment: Edge Detect Benchmarking.png]

Quentin "Q" Alldredge

Chief LabVIEW Architect, Testeract | Owner, Q Software Innovations, LLC (QSI)
Director, GCentral | Admin, LabVIEW Wiki | Creator, The QControl Toolkit
Certified LabVIEW Architect | LabVIEW Champion | NI Alliance Partner



Message 3 of 9

Maybe the best method is to get the time just before the edge detection step and the time just after it, then subtract the two values; the difference is the time taken to execute that specific task.

----------------------------------------------------------------------------------------------------------------
Palanivel Thiruvenkadam | பழனிவேல் திருவெங்கடம்
LabVIEW™ Champion |Certified LabVIEW™ Architect |Certified TestStand Developer

Kidlin's Law -If you can write the problem down clearly then the matter is half solved.
-----------------------------------------------------------------------------------------------------------------
Message 4 of 9

Adding a few more, higher-resolution methods:

-You can use the High Resolution Relative Seconds VI instead of Tick Count (ms); it is more accurate: https://zone.ni.com/reference/en-XX/help/371361L-01/glang/high_res_rel_sec/

-Or you can use the Tick Count (µs) VI from this community example: https://decibel.ni.com/content/blogs/EvanP/2010/10/04/tick-count-us--microsecond-timing-granularity-... (a rough sketch contrasting millisecond and high-resolution timers follows below).
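As a rough, hypothetical illustration of why the millisecond timer can be too coarse for short steps, here is a Python sketch in which time.perf_counter() plays the role of a high-resolution timer and a millisecond-quantized clock plays the role of Tick Count (ms); the work() function is made up:

```python
import time

def work():
    # Hypothetical short task standing in for the edge detection step.
    return sum(i * i for i in range(100_000))

# Millisecond tick count analog: quantized to 1 ms, so short steps lose precision.
t0_ms = time.monotonic_ns() // 1_000_000
work()
t1_ms = time.monotonic_ns() // 1_000_000
print(f"ms tick count : {t1_ms - t0_ms} ms")

# High-resolution analog of High Resolution Relative Seconds / Tick Count (us).
t0 = time.perf_counter()
work()
t1 = time.perf_counter()
print(f"high-res timer: {(t1 - t0) * 1e6:.1f} us")
```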

@Palanivel: Are you talking about Get Date/Time In Seconds? https://zone.ni.com/reference/en-XX/help/371361H-01/glang/get_date_time_in_seconds/ If so, I don't think it is accurate below the one-second level.

Thanks
uday
Message 5 of 9


@Palanivel: Are you talking about Get Date/Time In Seconds? https://zone.ni.com/reference/en-XX/help/371361H-01/glang/get_date_time_in_seconds/ If so, I don't think it is accurate below the one-second level.


You are right, Get Date/Time In Seconds will give the difference in the range of seconds, which will miss differences at the millisecond level.

I was just mentioning the method, without explicitly naming the palette; maybe my explanation came across as referring to Get Date/Time In Seconds. Sorry about that. :)

----------------------------------------------------------------------------------------------------------------
Palanivel Thiruvenkadam | பழனிவேல் திருவெங்கடம்
LabVIEW™ Champion |Certified LabVIEW™ Architect |Certified TestStand Developer

Kidlin's Law -If you can write the problem down clearly then the matter is half solved.
-----------------------------------------------------------------------------------------------------------------
Message 6 of 9

Thank you so much!

Message 7 of 9

Also have a look at the "benchmarking" part and the optimization examples of the recent (2016) NI Week presentation. We compare some of the clock sources, their accuracy, and their long-term behavior.

We also demonstrate some of the typical benchmarking harnesses. Also be aware of factors that can distort the results.

 

For typical benchmarking, I prefer the High Resolution Relative Seconds VI already mentioned earlier.

Message 8 of 9

@PalanivelThiruvenkadam wrote:


@Palanivel: Are you talking about Get Date/Time In Seconds? https://zone.ni.com/reference/en-XX/help/371361H-01/glang/get_date_time_in_seconds/ If so, I don't think it is accurate below the one-second level.


You are right, Get Date/Time In Seconds will give the difference in the range of seconds, which will miss differences at the millisecond level. (No, it will not!)

I was just mentioning the method, without explicitly naming the palette; maybe my explanation came across as referring to Get Date/Time In Seconds. Sorry about that. :)


Let's clear up some misconceptions. Please only write what you know, not what you think or don't think! Before making such statements, actually try it! (You can format a timestamp with fractional seconds, and you can convert it to DBL for subtraction and display fractional seconds.) Get Date/Time In Seconds returns a 128-bit(!) timestamp, which has all the resolution you could possibly need. Returning seconds does not mean the value is quantized to one second; it can theoretically represent fractional seconds with near-infinite resolution (though there are typically some OS limitations).
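As a loose analogy (Python's time.time() returns a double rather than LabVIEW's 128-bit timestamp, and the sleep is just a hypothetical stand-in for the timed step), this sketch shows that a wall-clock timestamp is not quantized to whole seconds once you subtract or format it:

```python
import time
from datetime import datetime

# Wall-clock timestamp analog of Get Date/Time In Seconds.
t0 = time.time()
time.sleep(0.123)          # stand-in for the step being timed
t1 = time.time()

print(f"elapsed: {t1 - t0:.6f} s")                           # fractional seconds survive the subtraction
print(datetime.fromtimestamp(t1).strftime("%H:%M:%S.%f"))    # timestamp formatted with fractional seconds
```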

 

From the presentation mentioned above, here are the three main ways to get an elapsed time by taking the difference between two successive calls to the same timing function (a rough comparison sketch follows after the list):

 

Get date/time in seconds

Shows some nonlinearity due to resynchronization with the atomic clock: linear segments separated by larger correction jumps. Very good fractional-second resolution.

 

Tick count

Quantized to 1 ms and thus not suitable for fast processes. Rolls over after about 49.71 days (2^32 milliseconds) and thus not suitable for anything that takes longer than that; a rollover during the measurement does not matter as long as the elapsed time stays below ~50 days, thanks to unsigned integer math. Very low and very linear drift! Units are milliseconds.

 

High Resolution Relative Seconds

Very high resolution. Very low and very linear drift, slightly higher than for Tick Count. This is typically the recommended function for benchmarking. Units are seconds.
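For illustration only, here is a small, hypothetical benchmarking harness in Python comparing rough analogs of the three clock sources above; the work() function and run count are made up, and the minimum over several runs is used to suppress scheduling noise:

```python
import time

RUNS = 5

def work():
    # Short task to benchmark; replace with your own code under test.
    sum(i * i for i in range(200_000))

clocks = [
    ("wall clock (Get Date/Time In Seconds analog)", time.time),
    ("1 ms tick count analog", lambda: (time.monotonic_ns() // 1_000_000) / 1000.0),
    ("high-resolution timer analog", time.perf_counter),
]

for name, clock in clocks:
    samples = []
    for _ in range(RUNS):
        t0 = clock()
        work()
        samples.append(clock() - t0)
    print(f"{name}: min {min(samples) * 1000:.3f} ms over {RUNS} runs")
```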

 

Message 9 of 9