
While loop deterministic timing

 

I have a program that runs a series of tests.  The main while loop period is set to 30 ms.  The picture shows the configuration; the wire to dt carries the value 30.

 

Timing 

 

The loop contains a call to a DAQ running at 20 kHz.  During the 30 ms loop time, the program acquires waveform data, condenses it, and gets the timestamps.  During tests, the condensed data and timestamps are saved in a buffer.  When the test completes, the buffered data is formatted and written to a file.

 

When the file is written, t0 is subtracted from the timestamps and the differences are saved to the file.  A sample of the file output is below, with some data removed to get under the 10,000 character limit.

Notice that for most lines the time difference is 31 ms, but every 11th or 12th line it appears that the loop time was 15 or 16 ms.  Why would this be happening?

 

Sec Pos (mm) Amps Brake V Time Cycle
0 -0.105 0.014 0.676    
0.031 -0.105 0.014 19.083 0.031  
0.062       0.031  
0.094 -0.105 0.542 26.713 0.032  
0.125       0.031  
0.156 -0.101 1.015 26.751 0.031  
0.187       0.031  
0.219 -0.008 1.462 24.335 0.032  
0.234       0.015 0
0.266 0.24 1.859 13.519 0.032 1
0.297       0.031 2
0.328 0.503 2.253 13.526 0.031 3
0.359       0.031 4
0.391 0.769 2.113 13.529 0.032 5
0.422       0.031 6
0.453 1.048 2.022 13.531 0.031 7
0.484       0.031 8
0.516 1.323 1.962 13.534 0.032 9
0.547       0.031 10
0.578 1.604 1.947 13.536 0.031 11
0.609       0.031 12
0.625 1.875 1.94 13.536 0.016 0
0.656       0.031 1
0.687 2.165 1.93 13.537 0.031 2
0.719       0.032 3
0.75 2.444 1.934 13.539 0.031 4
0.781       0.031 5
0.812 2.73 1.936 13.54 0.031 6
0.844       0.032 7
0.875 3.001 1.937 13.541 0.031 8
0.906       0.031 9
0.937 3.288 1.937 13.541 0.031 10
0.969       0.032 11
0.984 3.569 1.939 13.542 0.015 0
1.016       0.032 1
1.047 3.856 1.931 13.542 0.031 2
1.078       0.031 3
1.109 4.13 1.918 13.542 0.031 4
1.141       0.032 5
1.172 4.418 1.909 13.542 0.031 6
1.203       0.031 7
1.234 4.698 1.897 13.543 0.031 8
1.266       0.032 9
1.297 4.978 1.89 13.544 0.031 10
1.328       0.031 11
1.359 5.264 1.89 13.545 0.031 12
1.375       0.016 0
1.406 5.543 1.89 13.545 0.031 1
Message 1 of 7

I don't understand the fine details of the Timed Loop, but I'm willing to bet that your problem here has something to do with one of these settings:



Message 2 of 7

It probably also has something to do with the resolution of the timestamp that the OS can supply.

Windows XP's timestamp has a resolution of only around 15 ms.

 

You should use the Tick Count (ms) VI, or some of the timing info from the loop's Left Data node, for this type of timing measurement.
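To see why a coarse timestamp produces exactly the pattern in the posted file, here is a small sketch in plain Python (not LabVIEW). It assumes the loop really fires every 30 ms but is stamped by a clock that only ticks every ~15.6 ms, a commonly cited Windows timer resolution; both numbers are illustrative assumptions.

```python
# Sketch (plain Python, not LabVIEW): simulate a loop that truly fires
# every 30 ms but is timestamped by a clock that only advances in
# ~15.6 ms ticks (an assumed Windows timer resolution).
TICK = 0.015625          # assumed OS timestamp resolution, in seconds
PERIOD = 0.030           # true loop period, in seconds

def quantize(t, tick=TICK):
    """Snap a true time down to the clock's most recent tick."""
    return int(t / tick) * tick

stamps = [quantize(i * PERIOD) for i in range(40)]
diffs = [round(b - a, 3) for a, b in zip(stamps, stamps[1:])]

# Mostly 0.031 s, but roughly every 12th difference collapses to
# 0.016 s -- the same pattern as the posted file.
print(diffs)
```

The loop isn't actually running fast on those lines; the quantized timestamp just periodically "catches up" by one tick, so one interval looks half-length.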

Message 3 of 7

@dkfire wrote:

It probably also has something to do with the resolution of the timestamp that the OS can supply.

Windows XP's timestamp has a resolution of only around 15 ms.

 

You should use the Tick Count (ms) VI, or some of the timing info from the loop's Left Data node, for this type of timing measurement.


Not to mention that if you are running under Windows, the OS itself is not a real-time OS. That is, Windows is nondeterministic.
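That nondeterminism is easy to demonstrate in any language with a high-resolution clock. A quick sketch (Python here, purely illustrative): `time.perf_counter()` has sub-microsecond resolution, so it exposes the scheduling jitter that a ~16 ms timestamp would hide.

```python
# Sketch: measure the actual period of a software-timed loop on a
# desktop OS.  sleep() only promises "at least this long", so the
# measured periods will vary from iteration to iteration.
import time

TARGET = 0.010                       # 10 ms target period (illustrative)
periods = []
last = time.perf_counter()
for _ in range(20):
    time.sleep(TARGET)
    now = time.perf_counter()
    periods.append(now - last)
    last = now

avg = sum(periods) / len(periods)
print(f"average period {avg * 1e3:.2f} ms, "
      f"jitter {(max(periods) - min(periods)) * 1e3:.2f} ms")
```

On a non-real-time OS the average comes out above the target and the jitter is nonzero, which is exactly why software-timed loops can't be trusted for deterministic timing.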



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

Message 4 of 7

My bet's that dkfire nailed it.  If so, add your kudos to mine.  The settings crossrulz mentioned may also prove relevant once you handle the more fundamental issue of limited resolution.  I'm not actually sure whether the resolution problem is native to the timestamp or whether it comes from the subtraction, which converts to floating-point "seconds since 1904".  Hang on a sec (no pun intended).

 

Hmm, it looks like it's inherent to the timestamp.  I tried converting to a time-record cluster and saw the same quantization.  Go with dkfire's suggestions.

 

-Kevin P

Message 5 of 7

This does not answer your question, but just so you know: you can expand the node where the "error" input is and select the previous iteration time, so you don't have to calculate it yourself.

Message 6 of 7

Personally, I'm not sure of the usefulness of a Timed Loop on Windows.  If you have an NI DAQ card, you can get much better timing just by using hardware-timed input and the timestamps you get from DAQmx.  You really should post your VI.
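The arithmetic behind that suggestion: with hardware-timed acquisition the board's sample clock defines time, so timestamps follow from sample counts rather than the OS clock. A sketch in plain Python (not the DAQmx API; the rate and chunk size are taken from the numbers in the original post):

```python
# Sketch (plain arithmetic, not DAQmx): with hardware-timed input the
# sample clock defines time.  Using the rates quoted in the post:
RATE = 20_000                  # samples/s, the DAQ rate mentioned above
CHUNK = 600                    # samples per 30 ms read: 20_000 * 0.030

def chunk_start(n, t0=0.0):
    """Start time (s) of the n-th chunk, derived from the sample count."""
    return t0 + n * CHUNK / RATE

print([chunk_start(n) for n in range(5)])
# -> [0.0, 0.03, 0.06, 0.09, 0.12]: exact 30 ms spacing, no OS quantization
```

Because every timestamp is computed from the hardware sample count, the ~15.6 ms OS quantization never enters the data at all.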

Message 7 of 7