LabVIEW


How to measure the time taken by a VI to run

Hello,
I am using a PCI-DAS card for analog input and digital output.
For the analog input voltage measurement, LabVIEW takes about 1 second, but I want to reduce that time: it should take one reading in just a few milliseconds. There are no loops or anything complicated, just simple functions, and when I run the VI it shows one voltage reading.
I don't know how to reduce the time in this case.
Another thing: I am estimating this 1 second with my watch, but I want LabVIEW itself to show the total time taken for the execution.

If anybody knows about this, please post the answer or mail me.

Thanking You

Safdar...
Message 1 of 3
If you want to know the time, an easy way is to use the Profiler (Tools>Advanced>Profile VIs). There's also the shipping example called Timing Template (data dep) that you can drop your VI into to get a time result back.

As for why it's taking 1 second, you first need to determine where the time is being spent, and the Profiler should help with that. You might be doing some setup of the DAQ board while the actual acquisition takes only a small amount of time; if that's the case, putting the acquisition in a loop should help. You might also have configured the acquisition so that it takes that long. For example, if you set it for 1000 scans per second and return 1000 scans, each measurement will take 1 second. Thirdly, the board or driver might be slow; the Profiler should also help here. Lastly, can you post your program? Seeing what you're actually doing will help isolate the problem.
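For measuring elapsed time from within the program itself (rather than with the Profiler), the usual LabVIEW pattern is to wire a Tick Count (ms) before and after the code and subtract the two values. That pattern, and the scans-per-second arithmetic above, can be sketched in Python; `read_voltage` and the numbers are stand-ins for illustration, not the actual DAQ driver call:

```python
import time

# Why a single read can take a full second: a buffered acquisition blocks
# until all requested scans arrive. (Assumed numbers from the example above.)
scans_requested = 1000
scan_rate = 1000.0                                  # scans per second
acquisition_time_s = scans_requested / scan_rate    # 1000 / 1000 = 1.0 s

def read_voltage():
    """Hypothetical stand-in for the actual DAQ read."""
    time.sleep(0.01)            # simulate a ~10 ms acquisition
    return 1.23                 # dummy voltage value

start = time.perf_counter()     # like a Tick Count (ms) wired before the read
voltage = read_voltage()
elapsed_ms = (time.perf_counter() - start) * 1000.0  # subtract the second tick
print(f"reading = {voltage} V, elapsed = {elapsed_ms:.1f} ms")
```

In LabVIEW the same idea is two Tick Count (ms) nodes, sequenced so one fires before the acquisition and one after, with a Subtract node giving the elapsed milliseconds.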
Message 2 of 3
Hello Mr. Dennis Knutson,
Thank you very much for the answer.
With the help of the Timing Template (data dep) I can measure the time taken for my VI to run.
Once again Thanks a lot.
Bye
Safdar...
Message 3 of 3