LabVIEW


Wait Until Next ms Multiple adding milliseconds

Hi all. 

 

I couldn't find a similar issue via the search, sorry if this has been solved already.

 

I've found this error in my 2009 version as well as in the older 8.2.1 that's still on my laptop.  When doing a basic while loop, I grab the time/date stamp off of my laptop.  The code is included as an image below. 

 

 

[Attachment: block.JPG]

 

 

Essentially, I'm getting intermittent extra ms added to my loop (the first data file shows the incrementing ms column; the second shows a constant ms column, which is how it should be).  The first data file snapshot shows my data collection from ~1pm today, the second from an hour ago.  I ran the same code, so something is wonky somewhere (maybe in my laptop setup?).  Has anyone ever encountered this before?  Essentially, I'm just trying to make sure I don't get this "spill over" of a couple extra ms every loop, because I'm taking a large dataset and they'll add up over time. 

 

Extra bonus question:  I'm trying to run this as a distributed application, and the error propagates on the PC where I run the .exe as well (the PC does not have LabVIEW, just the run-time engine).  Do you think the problem is in the distributed application or something to do with each PC that's running it?

 

[Attachment: 1pm_data.JPG]

[Attachment: data.JPG]

 

Any help is greatly appreciated!

Message 1 of 17
(4,617 Views)
One possible issue is the write to file, which I suspect won't take a consistent time; it will depend a lot on what else is going on with the computer. Which version of the LabVIEW development environment do you have? If you have timed loops available in your palette, I would suggest looking into how they work: they have a mechanism that detects when the loop timing runs long. Also, the Wait Until Next ms Multiple is known to run short on its first execution; this is spelled out in the detailed help (go to Help in the toolbar and select Show Context Help).
Putnam
Certified LabVIEW Developer

Senior Test Engineer North Shore Technology, Inc.
Currently using LV 2012-LabVIEW 2018, RT8.5


LabVIEW Champion



Message 2 of 17
(4,600 Views)

Hi there

 

The two clocks, system date/time and Tick Count, do NOT share the same timing source; they are actually two different mechanisms. The system date/time is just a simple clock, while the tick count is derived from the number of processor cycles since application start. These two time sources are not synchronized, so it is impossible for the date/time to return the same number of ms as the tick count. Additionally, the accuracy of the date/time clock is roughly 10-100 ms, depending on your motherboard. On RT targets the date/time clock even stands still while time-critical code is running!

 

If you need date/time AND ms, then get the date/time at the start of the loop (in seconds, as a double), truncate it to whole seconds, add the number of tick-count ms elapsed since the start of the loop, and convert the result to a date/time string.
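Chris's recipe can be sketched in Python as an analogue (this is not a LabVIEW API; `time.monotonic()` stands in for Tick Count, `time.time()` for the system date/time, and the function names are mine):

```python
import time

def make_timestamper():
    # Latch the wall-clock time once, truncated to whole seconds,
    # then drive the ms column from a single monotonic "tick count"
    # so the two time sources can never disagree.
    start_s = int(time.time())       # system date/time, whole seconds
    start_tick = time.monotonic()    # stand-in for Tick Count (ms)

    def timestamp():
        elapsed_ms = int((time.monotonic() - start_tick) * 1000)
        return start_s + elapsed_ms / 1000.0

    return timestamp

stamp = make_timestamper()
t0 = stamp()
time.sleep(0.05)
t1 = stamp()
```

Every timestamp after the first is "start second + elapsed ms", so the ms column increments consistently instead of mixing two unsynchronized clocks.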

 

 

 

Best regards
chris

CL(A)Dly bending G-Force with LabVIEW

famous last words: "oh my god, it is full of stars!"
Message 3 of 17
(4,589 Views)

Thanks for the replies Putnam and Chris.

 

Putnam,

The first data point can be ignored in my application, so that limitation of Wait Until Next ms Multiple is fine.  I'll look into the timed loop function and post results.

 

Chris,

I'll look deeper into the "tick count" option.  Perhaps I can use that to drive the timing in my loop. 

 

Unfortunately, I'm still stuck as of right now.  Logging the ms from the date/time isn't important to my VI; I was just using it to diagnose the problem.  Rather, looping each second, on the second, is what's critical.  I need a data point every second.  If my loop executes every 1.001 seconds, then I'm in trouble:

1 [sample/sec] * 3600 [sec/hour] * 72 [hours] = 259,200 samples

(1/1.001) [samples/sec] * 3600 [sec/hour] * 72 [hours] ≈ 258,941 samples
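Spelled out as a quick check, that arithmetic shows a 1 ms overrun per iteration costs roughly 259 samples over the 72-hour run:

```python
# Drift from a 1 ms per-iteration overrun over a 72-hour acquisition.
seconds = 3600 * 72
ideal = 1.0 * seconds        # exactly one sample per second
actual = seconds / 1.001     # one sample every 1.001 seconds
shortfall = ideal - actual   # samples lost to the accumulated drift
```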

 

Isn't there a way to force the loop to hit the exact number of ms each time?  Is this what Chris was recommending with the "tick count" option?

 

 

Message 4 of 17
(4,561 Views)
I'm not sure you will be able to achieve your desired accuracy on a Windows PC. Windows is not deterministic. To achieve your desired accuracy, you will need a system that is deterministic.


Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 5 of 17
(4,557 Views)

Plus... you cannot rely on Wait Until Next ms Multiple to synchronize time. Open Context Help (under the Help menu, or the big question mark at top right) and read the description. 

 

If you want precise time triggers, you may need a different approach.  If you want to trigger every 1 second, you may want to try a timed loop.  There is another approach, but putting it into words so that it makes sense is not easy...  If you still need help, I can create a small example.  Actually, there are a couple of examples in this forum; I recall a long thread discussing this a couple of years ago.

 

R

Message 6 of 17
(4,550 Views)

Ray.R wrote:

Plus... you cannot rely on Wait Until Next ms Multiple to synchronize time. Open Context Help (under the Help menu, or the big question mark at top right) and read the description. 

 

If you want precise time triggers, you may need a different approach.  If you want to trigger every 1 second, you may want to try a timed loop.  There is another approach, but putting it into words so that it makes sense is not easy...  If you still need help, I can create a small example.  Actually, there are a couple of examples in this forum; I recall a long thread discussing this a couple of years ago.

 

R


Would a timed loop even give the level of accuracy he is looking for on a Windows platform? I don't think NI would guarantee that.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 7 of 17
(4,547 Views)
The fact is, Windows does not have any traceability to the NIST "second", so 'time' is a moot point unless you find a standard to compare against.

"Should be" isn't "Is" -Jay
Message 8 of 17
(4,528 Views)

Mark Yedinak wrote:
...

Would a timed loop even give the level of accuracy he is looking for on a Windows platform? I don't think NI would guarantee that.


I have run multiple timed loops (using a hardware timing source) and was able to maintain determinism at 1 kHz, and a short test worked fine at 2 kHz.

 

That test was done on LV 7.1.

 

No, NI will not guarantee it, because of people suing over injury, etc.

 

Back to the original question:

 

The Wait Until Next ms Multiple looks at what the time is now, finds the next even multiple of the value you specified, and waits until that time. If your file I/O takes a little extra time (allocating more space, etc., as Putnam mentioned), the wait will get pushed back to the NEXT even multiple.
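That behavior can be modeled with a few lines of Python (a rough sketch of the semantics described above, not NI's implementation; the function name is mine):

```python
import time

def wait_until_next_ms_multiple(multiple_ms):
    # Find the next even multiple of multiple_ms on the millisecond
    # clock and sleep until it. If the loop body ran past one
    # multiple, the wait lands on the multiple after that, so one
    # slow iteration costs a whole extra period.
    now_ms = time.monotonic() * 1000.0
    next_ms = (now_ms // multiple_ms + 1) * multiple_ms
    time.sleep((next_ms - now_ms) / 1000.0)
    return next_ms

wake_ms = wait_until_next_ms_multiple(50)
```

This is exactly why a file write that occasionally runs long turns a 1-second loop into an occasional 2-second iteration: the wait has no memory of the multiple it missed.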

 

I never use Wait Until Next ms Multiple unless the customer explicitly wants that type of functionality (no one has asked for it yet).

 

So the timed loop is the best way to keep a loop running at a steady rate even when it occasionally takes extra time.

 

Now, since writing to file is occasionally interrupted (due to OS nonsense), the file I/O is probably the root cause of your timing going off. To avoid that, you have to decouple the timing of the file writes from the hardware I/O....

 

So find the "Producer Consumer Design pattern" and learn to use it.
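The idea behind that pattern can be sketched in generic Python (not LabVIEW; queues and threads here play the role of LabVIEW queues and parallel loops):

```python
import queue
import threading
import time

# Producer/consumer sketch: the timed producer loop only enqueues
# samples and never blocks on file I/O; the consumer thread drains
# the queue and absorbs the write-time jitter.
data_q = queue.Queue()
written = []                         # stand-in for the data file

def consumer():
    while True:
        item = data_q.get()
        if item is None:             # sentinel: shut down cleanly
            break
        written.append(item)         # stand-in for Write To File

worker = threading.Thread(target=consumer)
worker.start()

for i in range(5):                   # the "timed loop" producer
    data_q.put((i, time.monotonic()))

data_q.put(None)                     # tell the consumer to stop
worker.join()
```

Because the producer only does a queue insert (effectively instantaneous), a slow disk write can no longer push the acquisition loop past its next timing multiple.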

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation
LabVIEW Champion, Knight of NI, and Prepper
Message 9 of 17
(4,521 Views)

Ben wrote:

I never use Wait Until Next ms Multiple unless the customer explicitly wants that type of functionality (no one has asked for it yet).

 


I remember when I took LabVIEW Basics I & II, the exercises in the book always instructed us to use the Wait Until Next ms Multiple function rather than the Wait (ms) function.  I believe the instructor even put emphasis on it as well.  It wasn't until months later, when I really got into writing my own code and started browsing the forums, that I started using the Wait (ms) function.

 

Perhaps anyone who has taken the Basics course more recently (mine was Sept. 2005) can tell us where NI places the emphasis now.

 

The Wait Until function gives you the ability to have the while loop basically synchronized, so that iterations run on particular multiples.  But I don't know when that is a particular advantage; if I need that kind of functionality, I am going to use the Timed Loop structure.  The disadvantage of the Wait Until function is that it can give a much wider swing in how long a loop iteration lasts: it could take 1 second (if the other code inside takes 0.999 seconds or less to run) or 2 seconds (if the other code inside takes 1.001 seconds or more to run).

 

That kind of variation could easily show up as a change in responsiveness to the user if that function is being used in polling a user interface.

 

I can't remember a case since I took the Basics course where Wait Until was the more appropriate function for me.  So does anyone know why NI placed (and whether they still place) an emphasis on the Wait Until function?

 

 

Message 10 of 17
(4,515 Views)