10-10-2008 12:48 PM
Stephan A. wrote: "Tick Count Even though the RT Engine loses time, the timer tick count does not lose ticks. Each tick is not exactly a millisecond, but actually 1.000686 milliseconds."
Hi Stephan, this is the first time I have seen the accuracy of the Tick Count (ms).vi defined. I have two questions: Where does LabVIEW get this tick count? My guess would be that it counts the motherboard's PIT channel 0, which is driven by the 1.19318166... MHz clock. Is this true?
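To see what Stephan's calibration figure implies in practice: if each "millisecond" tick on that RT target actually spans 1.000686 ms, the raw tick count drifts relative to wall time, and a raw delta can be corrected with a single multiply. A minimal Python sketch (the 1.000686 ms/tick factor is taken from Stephan's post and applies only to that hardware):

```python
# Assumed calibration from Stephan's post: one nominal millisecond
# tick on that RT target actually lasts 1.000686 ms of wall time.
MS_PER_TICK = 1.000686

def ticks_to_wall_ms(ticks):
    """Convert a raw tick-count delta to true elapsed milliseconds."""
    return ticks * MS_PER_TICK

# After one nominal hour of ticks (3,600,000), the uncorrected count
# lags wall time by roughly 2.5 seconds:
drift_ms = ticks_to_wall_ms(3_600_000) - 3_600_000
```

The point is that the tick count is monotonic and dense (no ticks are lost), so a constant scale factor recovers wall time; only the per-tick duration differs from the nominal 1 ms.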
I am also wondering whether there is something similar in the Windows OS, or in C++ running under Windows. I found that if I call the GetTickCount function in kernel32.dll on Windows XP, it does not update every millisecond; instead it updates every 15 or 16 milliseconds.
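That 15–16 ms step is consistent with the default Windows timer interrupt of 64 Hz (15.625 ms), which is what GetTickCount is advanced by. You can estimate any clock's update granularity yourself by spinning until its returned value changes; a portable Python sketch of that technique (on an XP-era Windows box, time.time() would report roughly 0.0156 s here, while most Unix systems report microseconds or less):

```python
import time

def measure_granularity(clock, samples=20):
    """Estimate a clock's update granularity by busy-waiting until
    the returned value changes and taking the smallest observed step."""
    steps = []
    for _ in range(samples):
        t0 = clock()
        t1 = clock()
        while t1 == t0:          # spin until the clock ticks over
            t1 = clock()
        steps.append(t1 - t0)
    return min(steps)

res = measure_granularity(time.time)  # seconds per update, roughly
```

The same spin-until-change idea works against GetTickCount itself from C++ if you want to confirm the 15.625 ms figure directly.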
Thanks in advance,
Allan1
10-11-2008 01:27 PM
Allan1 wrote:
Hi Stephan, this is the first time I have seen the accuracy of the Tick Count (ms).vi defined.
You should note that this probably only applies to the PXI device mentioned earlier. I assume the numbers change depending on the machine.
As for the Windows question: as you noted, its timer resolution is about 16 ms (you can see this in LabVIEW by getting a timestamp), but as Stephan mentioned, you can compensate for this somewhat by using the tick counter and adding its delta to a base time. I posted a simplistic example of this here. That thread also refers to using the Windows performance counter to get microsecond resolution, but in any case, you should note that chasing high timing accuracy on a desktop OS is a game doomed to failure.
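The compensation described above — sample the coarse wall clock once as a base, then add deltas from a finer free-running counter — can be sketched in Python, using time.perf_counter() as a stand-in for the millisecond tick count or Windows performance counter (this is an illustration of the technique, not tst's posted VI):

```python
import time

class CompensatedClock:
    """Wall-clock timestamps with sub-millisecond resolution, built by
    anchoring a coarse clock once and adding high-resolution deltas."""

    def __init__(self):
        self._base_wall = time.time()          # coarse base timestamp
        self._base_perf = time.perf_counter()  # fine free-running counter

    def now(self):
        # Elapsed time per the fine counter, added onto the base.
        return self._base_wall + (time.perf_counter() - self._base_perf)

clock = CompensatedClock()
a = clock.now()
b = clock.now()  # strictly ordered, fine-grained timestamps
```

The caveat is exactly the one raised in the thread: the fine counter and the wall clock drift apart over time, so the compensated timestamps are precise relative to each other but not accurate against the real wall clock over long runs.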