LabVIEW

Measure time elapsed from reaching threshold to execution

I have code written in LabVIEW as shown below.  If I remove the clock (my feeble attempt to measure the execution time of the Boolean logic), the code works fine.

 

It is a block of code I wrote such that when the measured voltage exceeds the threshold voltage, set at 1 in the image below, the cutoff case gets executed and the power supply gets turned off.  

 

I need to know how long it takes for LabVIEW to cut off the power output once it reaches the threshold voltage. 

 

I suppose, to be exact, I need to include the time it takes for LabVIEW to execute the 'greater than 1' comparison to measure the time accurately, but I wasn't sure how I would include that either.

 

For the first iteration, I would be happy just to be able to measure the time it takes for the AND gate (1) to be processed and the voltage to be cut off (2).

 

Any help would be much appreciated. 

 

 

[Attachment: time measured.JPG]

Message 1 of 60

Suggestion

Use dataflow from the error wire to enforce execution order, then use Tick Count (ms) and compare the value before and after.
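In textual form, the before-and-after idea looks roughly like this (a minimal Python sketch rather than LabVIEW G; cutoff_power is a hypothetical stand-in for the supply-off command):

    import time

    def cutoff_power():
        # Hypothetical stand-in for the instrument command that turns the supply off.
        pass

    t0 = time.perf_counter()   # "tick count" before
    cutoff_power()             # the operation being timed
    t1 = time.perf_counter()   # "tick count" after
    print(f"elapsed: {(t1 - t0) * 1e6:.1f} us")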

Benoit

Message 2 of 60

What timing precision are you looking for?  How fast do you need this to execute?  How long is too long before you become concerned about the execution time?

 

Are you saying you want to know how long it takes for LabVIEW to go from executing the AND function to executing the case structure?  That is going to be virtually instantaneous.  You won't be able to put any timing measures in there that will show you how long it takes.
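One way to see this is to measure the overhead of the timestamp call itself; any event shorter than that overhead cannot be resolved by wrapping it in timestamps (a minimal Python sketch of the idea):

    import time

    # Time back-to-back timestamp calls; the smallest measurable interval
    # is bounded below by this overhead.
    samples = []
    for _ in range(10000):
        t0 = time.perf_counter()
        t1 = time.perf_counter()
        samples.append(t1 - t0)
    samples.sort()
    print(f"median timer overhead: {samples[len(samples) // 2] * 1e9:.0f} ns")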

Message 3 of 60

I would like it to be less than 1 microsecond.  I don't know if it's possible, though.

Message 4 of 60

Timing to 1 microsecond is not feasible on Windows, since it is not a real-time OS.

Two choices: use a CompactRIO (cRIO) product, or use a microcontroller or even an FPGA.
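To illustrate the point about a non-real-time OS, one can request a short sleep and watch how late the OS actually wakes the program (a minimal Python sketch; the observed jitter on Windows is typically on the order of milliseconds):

    import time

    # Request a 1 ms sleep and record the worst oversleep; on a desktop OS
    # the error alone dwarfs the 1-microsecond target discussed above.
    worst = 0.0
    for _ in range(100):
        t0 = time.perf_counter()
        time.sleep(0.001)
        oversleep = (time.perf_counter() - t0) - 0.001
        worst = max(worst, oversleep)
    print(f"worst oversleep: {worst * 1e3:.2f} ms")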

 

Benoit

Message 5 of 60

Why such a tiny number as a microsecond?  As nighthawk pointed out, Windows can't guarantee anything like that.  And once you throw in GPIB or anything serial, the delays there pretty much make microsecond timing resolution meaningless "noise".

Message 6 of 60

You have better performance improvements to make than an 'X AND TRUE', which is simply equal to X.  (With debugging disabled, you would really need to trick the compiler into not throwing the AND out of the compiled code.)

 

You are suffering from a case of DDTs (dynamic data types).  The Express VIs that cast real data types to and from the dynamic data type are designed for ease of use, not performance.  If a node has a blue border, replace it whenever performance has even a remote possibility of becoming a concern.


"Should be" isn't "Is" -Jay
Message 7 of 60

Hello,

It seems you are suggesting I not use the comparison (circled in the image below).

I apologize for not understanding your suggestion very well.  I have little experience with LabVIEW and with this whole area in general.

 

If you could elaborate a bit more, I would appreciate it.

 

[Attachment: circled.JPG]

Message 8 of 60

I have written a program as shown in the image below.

 

It's a program that cuts off the power supply when the measured voltage exceeds a threshold value set by a numeric input.

 

I wanted to know how long it takes for LabVIEW to execute that cutoff command from the moment the measured value exceeds the threshold value.

I have artificially added a comparison (marked as 2) to include the elapsed time of the comparison (marked as 1) in the total time.

 

I would appreciate it if you guys could point out any obvious ways I can speed up the program.

 

I think that the only possible source of improvement is the comparison (1).   

[Attachment: comparison.JPG]
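In textual pseudocode, the structure described above is roughly the following (a minimal Python sketch; read_voltage and power_supply_off are hypothetical stand-ins for the DAQ read and the supply command):

    import random
    import time

    THRESHOLD = 1.0  # volts, matching the threshold input described above

    def read_voltage():
        # Hypothetical stand-in for the DAQ read.
        return random.uniform(0.0, 2.0)

    def power_supply_off():
        # Hypothetical stand-in for the command that cuts the output.
        pass

    while True:
        v = read_voltage()
        t0 = time.perf_counter()      # timestamp taken before the threshold test,
        if v > THRESHOLD:             # so the comparison (1) is included in the total
            power_supply_off()        # the cutoff (2)
            t1 = time.perf_counter()
            print(f"detection-to-cutoff: {(t1 - t0) * 1e6:.1f} us")
            break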

Message 9 of 60

I assume that you asked me.  Sometimes it helps to mention to whom you are replying.

 

The comparison is not the problem; that you are comparing a dynamic data type is the problem.  Those values need to be converted to a real discrete type, compared, and then converted back to dynamic data.  That takes more time and memory than you would expect.  The DAQ Assistant is also a poorly performing node.  It's much faster (like 100x faster) to use the DAQmx API calls, which are optimized for performance.
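For reference, the same acquisition through the DAQmx API looks roughly like this in NI's Python wrapper, nidaqmx (a sketch, not a drop-in replacement; the channel name Dev1/ai0 is an assumption):

    import nidaqmx

    # Configure the task once, then read plain floats in a loop; this avoids
    # both the DAQ Assistant's per-iteration setup cost and the conversions
    # to and from the dynamic data type.
    with nidaqmx.Task() as task:
        task.ai_channels.add_ai_voltage_chan("Dev1/ai0")  # channel name is an assumption
        task.start()
        while True:
            voltage = task.read()   # a plain float, no dynamic data involved
            if voltage > 1.0:       # the threshold comparison on a discrete type
                break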

 

There is a DAQmx course available in the DAQmx badge learning path.  And the LabVIEW and DAQmx help files are, well... I don't want to sound condescending, but they are "Helpful!"

 

 


"Should be" isn't "Is" -Jay
Message 10 of 60