Is Get Date/Time in Seconds showing a cpu time?

Hi all,

 

I have a data acquisition application where I compute velocity from my position values, so the actual data acquisition time has a crucial influence on the accuracy of my results. To verify that, I need to check the accuracy of my time stamps against the real CPU time, so I am trying to write the CPU time into the same .txt file as my position and velocity data.

So I was wondering how I can access the CPU time stamps in LabVIEW. Does "Get Date/Time in Seconds.vi" get its values from the CPU, or is it a derived measure calculated by LabVIEW?

 

According to the help files it is calculated by LabVIEW, which I think makes it dependent on LabVIEW and hence not something I should rely on.

But I also found the following example on NI website that says it is reading a CPU time:

 

"How Do I Include a Time Stamp in a File Using the Write To Measurement File VI? (http://digital.ni.com/public.nsf/allkb/68806B93A21355E98625726F0064822B)"

 

So is it a CPU time or a LabVIEW-calculated time? Can someone help me with this issue?

 

I just need to access the CPU time stamps in my block diagram.

 

Thanks.

Amir

Message 1 of 3

Get Date/Time in Seconds.vi calls the system clock. (Simply change your computer's clock time and test it.) It does convert the time from the OS representation to a LabVIEW (OS-independent) format; however, it is the OS's time.
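The conversion Jeff mentions is just an epoch shift: LabVIEW timestamps count seconds from 1 Jan 1904 UTC, while most OS clocks report seconds from the Unix epoch of 1 Jan 1970 UTC. As a quick Python illustration (not LabVIEW code), the fixed offset between the two epochs works out to:

```python
import datetime

# LabVIEW's timestamp epoch is 1 Jan 1904 UTC; the Unix epoch is
# 1 Jan 1970 UTC. The difference is a constant offset, so converting
# between them changes the representation but not the underlying
# OS clock reading.
labview_epoch = datetime.datetime(1904, 1, 1, tzinfo=datetime.timezone.utc)
unix_epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
offset = int((unix_epoch - labview_epoch).total_seconds())
print(offset)  # 2082844800
```

So the value in your file is still the OS's real-time clock reading, just expressed against a different zero point.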


"Should be" isn't "Is" -Jay
Message 2 of 3

Jeff already answered your question, but your use of "CPU time" is rather ambiguous. I assume you mean the time of the real-time clock (RTC), but you should be aware that it has only a relatively low resolution of milliseconds, and the real resolution may in fact be even lower, since the RTC hardware does not necessarily go that fine and Windows may be deriving the higher resolution from its timer tick.

The timer tick is another sort of clock: it is simply a software counter that is incremented continuously in the background. Its starting point is during initialization of the operating system when the computer boots, so its absolute value is meaningless, and it simply rolls over when the maximum value of 2^32 ms is reached. Its nominal resolution is 1 ms, but the update rate in today's Windows is usually 10 ms. In earlier days (pre-Win32) it used to be 55 ms, though you could reduce it to 1 ms with an ini-file hack.
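To put a number on that rollover, here is the arithmetic as a quick Python calculation (not LabVIEW code): a 32-bit millisecond counter wraps after about 49.7 days, which is why the tick count is only useful for measuring intervals, never absolute time.

```python
# A 32-bit millisecond tick counter rolls over after 2**32 ms.
# Convert that to days to see the rollover period.
rollover_ms = 2 ** 32
rollover_days = rollover_ms / 1000.0 / 86400.0
print(round(rollover_days, 1))  # 49.7
```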

 

Last but not least, there are several counters directly in the CPU, usually intended for performance measurements. Their resolution ranges from below a microsecond down to nanoseconds, but you need to call into Windows APIs such as QueryPerformanceCounter to get at their values (and accept that the overhead of the Windows API call adds a relatively large uncertainty to them).
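From LabVIEW you would reach those APIs through a Call Library Function Node, but the distinction between the two clock families is easy to see in a short Python sketch: `time.perf_counter()` wraps QueryPerformanceCounter on Windows, while `time.time()` reads the real-time clock, and Python's clock metadata shows the trade-off Rolf describes.

```python
import time

# The performance counter is monotonic (it never jumps backward, so it
# is safe for interval timing), whereas the wall clock is adjustable
# behind your back (e.g. by NTP or a user changing the system time),
# which makes it unsuitable for precise velocity calculations.
rtc = time.get_clock_info('time')
perf = time.get_clock_info('perf_counter')
print(rtc.adjustable, perf.monotonic)  # True True
```

For computing velocity from position samples, this is the practical takeaway: use a monotonic high-resolution counter for the time differences, and the real-time clock only for labeling when the run happened.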

Rolf Kalbermatter  My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 3 of 3