What is going on "under the hood" of the 'Wait Until Next ms Multiple' VI?

Hi,

 

This question is really for the developers of LabVIEW, and I don't know if they are willing to divulge what might be considered "trade secrets". Anyway, I just thought there's no harm in asking.

 

I use the Wait Until Next ms Multiple VI and it works great. For example, if I set my program to output a character on the serial port every 20 milliseconds (ms) and check the output on an oscilloscope, I see the output at 20 ±2 ms (that is, 18 to 22 ms). That's quite good, and is all that can be expected on a PC running a non-real-time OS such as Windows.

 

My question is this: Since the default Windows timer has a tick rate of 15.625 ms (1/64 of a second; some PCs may default to 10 ms), how does LabVIEW attain roughly 1 ms accuracy? If the same thing is coded in any .NET Framework program, such as C# or Visual Basic, using any of the available timers (System.Windows.Forms.Timer or System.Timers.Timer), the timer interval may be set in 1 ms increments, but if you time it you will see it actually has a granularity of 15.625 ms. So instead of getting 20 ms I get 2 ticks, or about 31 ms. So I can get an output every 15.625, 31.25, 46.875, 62.5 ms, etc., but nothing in between.
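(For concreteness, here is a minimal C# sketch along the lines of how I timed it, with timestamps taken via Stopwatch; the 20 ms request matches my example above. On a default 15.625 ms system clock it prints roughly 31 ms between ticks.)

```
using System;
using System.Diagnostics;

class TimerGranularityDemo
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();
        double last = 0;
        var timer = new System.Timers.Timer(20);   // request a 20 ms interval
        timer.Elapsed += (s, e) =>
        {
            double now = sw.Elapsed.TotalMilliseconds;
            Console.WriteLine("{0,8:F3} ms since last tick", now - last);
            last = now;
        };
        timer.Start();
        Console.ReadLine();   // on a default 15.625 ms clock, expect ~31 ms per tick
    }
}
```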

 

The System.Diagnostics.Stopwatch timer can be used to time code with very high resolution, or to create a high-CPU-usage delay loop, but I haven't been able to find an easy way of generating an interrupt (timer tick) at, say, 1 ms. There are Win API functions that can set the timer to 1 ms on XP and 0.5 ms on Windows 7 and 8, but I have not tried them. (They come with the caveat that another program or process can come along and reset your timer!) There is the "multi-media" timer (now the HPET, or High Performance Event Timer), but it is meant for sound and video at the driver/kernel level, not for the UI level where LabVIEW or C# live.
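(For reference, the Win API functions I mean are timeBeginPeriod/timeEndPeriod in winmm.dll. A rough P/Invoke sketch of the approach, still untested on my end:)

```
using System.Runtime.InteropServices;
using System.Threading;

class TimerPeriodDemo
{
    // winmm.dll: raise the global timer interrupt rate (affects the whole OS)
    [DllImport("winmm.dll")] static extern uint timeBeginPeriod(uint uPeriod);
    [DllImport("winmm.dll")] static extern uint timeEndPeriod(uint uPeriod);

    static void Main()
    {
        timeBeginPeriod(1);        // request 1 ms ticks
        try
        {
            Thread.Sleep(20);      // should now sleep close to 20 ms, not ~31 ms
        }
        finally
        {
            timeEndPeriod(1);      // every Begin must be matched by an End
        }
    }
}
```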

 

So how does the 'Wait Until Next ms Multiple' work? It is obviously timer-interrupt-driven, because it yields to other processes (which is a primary reason for putting it in your overall While loop). And is it guaranteed on all recent OS versions (mine was timed on a dual-core XP machine)?

 

Thanks for any insight.

 

Your typical curious engineer,

Ed

Message 1 of 17

Hi fellow curious engineer,

 

In my understanding, LabVIEW uses the default Windows timer to control timing. With the Wait (ms) function, a resolution of 1 ms is not guaranteed. You are correct that the default tick interval is approximately 15.6 ms. Were you able to see a resolution of 1 ms?

 

Regards,

Ian K.
Software Developer
Data Ahead AG
Message 2 of 17

I'll start off by saying I don't EXACTLY know how the "Wait Until ms Multiple" is implemented under the hood (and I likely couldn't divulge that information even if I had the source in front of me) - but that's probably best because I bet my answer is going to be pretty close to how it's implemented.

 

On Windows, you're absolutely correct that the basic timer is very imprecise.  That's why many drivers implement their own thread schedulers - these thread schedulers "game the system" for their threads.  One of the "tools of the trade" on Windows is the QueryPerformanceCounter for effectively measuring in sub-millisecond resolutions, and another "tool" is the timeSetEvent() and CreateTimerQueueTimer() facilities for scheduling thread callbacks (see this handy page for a good description of precision timing resources).  
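(To illustrate what those facilities look like from managed code, here's a rough C# sketch: a periodic 1 ms callback from winmm's timeSetEvent(), measured with Stopwatch, which wraps QueryPerformanceCounter. The P/Invoke signatures follow the Win32 docs; the bookkeeping is illustrative, not production code.)

```
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class MultimediaTimerDemo
{
    // Native callback shape: TimeProc(uTimerID, uMsg, dwUser, dw1, dw2)
    delegate void TimeProc(uint id, uint msg, UIntPtr user, UIntPtr dw1, UIntPtr dw2);

    [DllImport("winmm.dll")]
    static extern uint timeSetEvent(uint delay, uint resolution,
                                    TimeProc proc, UIntPtr user, uint eventType);
    [DllImport("winmm.dll")]
    static extern uint timeKillEvent(uint id);

    const uint TIME_PERIODIC = 1;

    static void Main()
    {
        var sw = Stopwatch.StartNew();
        long last = 0;
        TimeProc proc = (id, msg, user, dw1, dw2) =>
        {
            long now = sw.ElapsedTicks;
            Console.WriteLine("{0:F3} ms", (now - last) * 1000.0 / Stopwatch.Frequency);
            last = now;
        };
        uint timerId = timeSetEvent(1, 1, proc, UIntPtr.Zero, TIME_PERIODIC);
        Console.ReadLine();      // callbacks should arrive roughly every 1 ms
        timeKillEvent(timerId);
        GC.KeepAlive(proc);      // keep the delegate from being collected
    }
}
```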

 

-Danny

Message 3 of 17

Ian,

 

Thanks for your reply. Yes, I'm sure LabVIEW uses the (default) Windows timer. And yes, 1 ms is not guaranteed due to the preemptive nature of Windows (and even of "RTOSs", to varying degrees), which is why I see about plus or minus 2 ms.

 

Apparently the Windows timer can be set by API calls. See: http://www.lucashale.com/timer-resolution/. Here's a screen shot of his TimerResolution.exe on a Windows 7 PC:

[Screenshot: TimerResolution.exe on Windows 7 (Win7 Set Timer Resolution.png)]

Here it is on my Windows XP PC after I set it to "Maximum" (initially it was 15.625 ms):

[Screenshot: Win XP Timer Resolution.PNG]

 

Notice that "Maximum" sets it to less than 1 ms, which is supposed to be the maximum, so there are some bugs. Plus, the Default button does not reset it in XP, though it does work on Windows 7 or 8. (I know this is not the place to "debug" non-LabVIEW applications!)

 

I'll bet LabVIEW sets it, too. The only caveat, as I said, is it looks like another application can change it, since the hardware timer is a "global" timer. I have not seen this issue in my LabVIEW applications, have you?

 

I guess I need to do some more digging to find the code to set the timer, but it looks like the developers of LabVIEW have it figured out.

 

(FYI, I did notice that running my LabVIEW app (which gives about 2 ms resolution) or a C# app (which gives 15.625 ms resolution) does not affect what TimerResolution.exe reports, so I'm not sure it's really working correctly. If I figure it out I'll post the results.)

 

Ed

 

 

Message 4 of 17

Texas_Diaz,

 

Thanks for your reply, too. Your "handy page" link did not work when I just tried it. I'll try again later.

 

Yes, QueryPerformanceCounter is available as an API call in kernel32.dll. There are multi-media timer (HPET) functions in winmm.dll. But as the Lucas Hale page I linked to said, "So now I had a solution but it required linking to the winmm.lib which as a programmer didn’t sit well with me since my application had nothing to do with multimedia."

 

I'm guessing that LabVIEW uses that "undocumented native API" call he mentions. That's what I'm trying to confirm.
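(If it helps anyone else dig: the undocumented native call that page refers to is generally reported to be NtSetTimerResolution in ntdll.dll, which takes the desired resolution in units of 100 ns. The signature below comes from public reverse-engineering notes, not official documentation, so treat it as a sketch only:)

```
using System;
using System.Runtime.InteropServices;

class NativeTimerResolutionDemo
{
    // Undocumented API: resolution in 100 ns units; returns an NTSTATUS code.
    [DllImport("ntdll.dll")]
    static extern int NtSetTimerResolution(uint desiredResolution,
                                           [MarshalAs(UnmanagedType.U1)] bool setResolution,
                                           out uint currentResolution);

    static void Main()
    {
        uint current;
        NtSetTimerResolution(10000, true, out current);   // 10,000 * 100 ns = 1 ms
        Console.WriteLine("System timer now ticks every {0} ms", current / 10000.0);
    }
}
```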

 

Ed

Message 5 of 17

Hi Ian,

 

It's been a few weeks and I haven't gotten any more email notifications of replies, but I thought I'd check anyway. I re-read your post and wanted to answer your question more directly than I did in my first reply to you.

 

You asked, "Were you able to see a resolution of 1 ms?" Not quite. But I did see ±2 ms. That's a whole lot better than can be attained with C# or VB.NET, which give 15.625 ms of granularity.

 

In other words, if the timer tick interval is set to any value less than or equal to 15 (the Interval property is an integer), you will get an interrupt pretty close to every 15.625 ms. If you count a large number of ticks, it will average out to 15.625 ms. The same is true of the sleep function, Thread.Sleep(interval).

 

If you set the interval anywhere from 16 to 31, it will average out to 2 × 15.625 = 31.25 ms.

 

If you set the interval from 32 to 46, it will average out to 3 × 15.625 = 46.875 ms, and so on.
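(A quick way to see this rounding for yourself; a minimal C# sketch that averages Thread.Sleep over a number of calls:)

```
using System;
using System.Diagnostics;
using System.Threading;

class SleepGranularityDemo
{
    static void Main()
    {
        // On a default 15.625 ms clock: 1 and 10 land near 15.625,
        // 20 near 31.25, 35 near 46.875.
        foreach (int request in new[] { 1, 10, 20, 35 })
        {
            const int reps = 50;
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < reps; i++)
                Thread.Sleep(request);
            Console.WriteLine("Sleep({0}) averaged {1:F3} ms",
                              request, sw.Elapsed.TotalMilliseconds / reps);
        }
    }
}
```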

 

So the bottom line is that LabVIEW's Wait Until Next ms Multiple VI (and the Wait (ms) VI) perform better than the equivalent .NET timer tick (or sleep function). Therefore I concluded that LabVIEW does something more with the system timer than C# or VB do. This is what I am trying to confirm (as well as exactly what is done).

 

But I guess that whoever knows at NI is either not seeing this post or is not telling! Can you inquire with the developers?

 

In any event, thanks for your reply. Hopefully someone who knows will reply.

 

Ed

Message 6 of 17

Edjsch - I think it is simpler to ask what 'wait (ms)' does rather than what 'wait until next ms multiple' does.

 

All 'wait ms multiple' does is figure out the next time after the current time that is a multiple of the value you asked for, then call wait (ms) with the difference between 'next multiple' and 'current'. As such, it can give an average over many loops that is accurate to what you requested to many decimal places of a millisecond. Obviously there is much more jitter on individual iterations, for reasons of timer resolution and latency.
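(In plain C# terms, that description amounts to something like the sketch below; this is a functional equivalent of the behavior, not NI's actual source:)

```
using System;
using System.Threading;

class WaitMsMultipleDemo
{
    // Wait until the ms tick counter next crosses a multiple of 'multiple'.
    static void WaitUntilNextMsMultiple(int multiple)
    {
        int now = Environment.TickCount;              // ms since boot (wraps ~25 days)
        int next = (now / multiple + 1) * multiple;   // next multiple of the interval
        Thread.Sleep(next - now);                     // wait only the difference
    }

    static void Main()
    {
        while (true)
        {
            Thread.Sleep(5);                 // stand-in for the loop's real work
            WaitUntilNextMsMultiple(20);     // loop iterates on each 20 ms boundary
        }
    }
}
```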

 

Now technically 'wait (ms) multiple' may not actually call 'wait(ms)', but I can replicate the one with the other in the manner described above and it performs identically.

 

As to the accuracy of plain old wait, the previous poster's guess sounds like a good one.

 

Message 7 of 17

I'm certainly no expert (and have no idea how it actually works), but these KnowledgeBase articles may be relevant:

 

Why are the Time Resolutions for Get Time/Date In Seconds and Tick Count Different?

http://digital.ni.com/public.nsf/allkb/4E12F6841016929D86257126007A9D94?OpenDocument

 

How Do the Timing VIs Function On My Real-Time FieldPoint Controller?

http://digital.ni.com/public.nsf/allkb/37AD9D1202068F1B86256EE60078B3B0?OpenDocument

 

Obviously the FieldPoint reference may or may not be applicable to Windows, but I'd expect something similar would be possible, as it's still x86. The more guesses, the better the thread, right?

 

Regards

 

Tom L.
Message 8 of 17

ToeCutter,

 

I appreciate your reply. You are correct about what the Wait Until Next ms Multiple VI "does", but this has significant implications.

 

This is from the LabVIEW help for the Wait Until Next ms Multiple VI: "Waits until the value of the millisecond timer becomes a multiple of the specified millisecond multiple." (You can look up the rest of the description, but it is irrelevant for this discussion.) You can also check this link: https://decibel.ni.com/content/docs/DOC-14149

 

What this means is that it provides a "heartbeat" for your loop. If you set it to, say, 20 ms, your loop will execute on every 20 ms multiple, regardless of how long the code in your loop takes. (If your code takes longer than 20 ms, it will execute on the next 20 ms multiple.)

 

Here's an example: Say your code in the loop takes 5 ms. If you put a Wait (20) in the loop, the loop will actually execute every 25 ms. In other words, the Wait (ms) is in ADDITION to (in series with) your code execution time. However, the Wait Until Next ms Multiple executes in parallel with the code in your loop, hence providing the "heartbeat".

 

When I was learning LabVIEW, I researched and tested this and verified the operation of both the Wait Until Next ms Multiple VI and the Wait (ms) VI, and it is as I described.

 

Now, to get back to your suggestion "to ask what 'wait(ms)' does rather than what 'wait ms multiple'" does: They both use the same timer, so it really doesn't matter. They both have a resolution much closer to 1 ms than .NET software. This is the reason for my post: LabVIEW does it better, but how?

 

Here's a good link from one of Tom L's links: http://www.ni.com/white-paper/4120/en

 

Ed

Message 9 of 17

@Edjsch wrote:

Here's an example: Say your code in the loop takes 5 ms. If you put a Wait (20) in the loop, the loop will actually execute every 25 ms. In other words, the Wait (ms) is in ADDITION to (in series with) your code execution time. However, the Wait Until Next ms Multiple executes in parallel with the code in your loop, hence providing the "heartbeat".


This is incorrect!

 

Both wait primitives will execute in parallel to any other code (unless you serialize things using e.g. sequence frames). If you place three wait primitives in the same loop, the loop time will not be the sum of the three values, but the largest of the three.

 

The loop rate will be very close to 20 ms in both cases, but the code with the "wait" might have some slight long-term drift, while the code with the "wait next" will have no drift unless the code takes longer than the wait, in which case the loop will take an integer multiple of the expected loop time.

 

If the code takes more than 20 ms in the "wait" case, the code will determine the loop rate and the wait value is irrelevant.
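(A sequential C# sketch of the drift point, under the simplifying assumption that the loop body finishes well inside the period: a fixed sleep schedules each pass relative to the previous wakeup, so every bit of wakeup latency accumulates, while sleeping to the next absolute 20 ms boundary re-anchors each pass:)

```
using System;
using System.Diagnostics;
using System.Threading;

class DriftDemo
{
    static void Main()
    {
        var sw = Stopwatch.StartNew();

        for (int i = 0; i < 250; i++)      // relative timing: latency accumulates
            Thread.Sleep(20);
        Console.WriteLine("fixed wait:    {0:F1} ms (ideal 5000)",
                          sw.Elapsed.TotalMilliseconds);

        sw.Restart();
        for (int i = 0; i < 250; i++)      // absolute timing: no long-term drift
        {
            int now = (int)sw.Elapsed.TotalMilliseconds;
            Thread.Sleep((now / 20 + 1) * 20 - now);   // sleep to next 20 ms boundary
        }
        Console.WriteLine("next multiple: {0:F1} ms (ideal 5000)",
                          sw.Elapsed.TotalMilliseconds);
    }
}
```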

Message 10 of 17