precision of timestamp


When running LabVIEW on a PC (not a real-time system like a cRIO), what is the precision of the output of the timestamp function in LabVIEW? I want to make sure I understand this when interpreting data for a project, because I'm getting time outputs from the timestamp with 6 digits of precision for fractions of a second (e.g. 2.0592060 seconds is one output). I'm trying to acquire data at a rate of 300 Hz, so it would be good to know whether the timestamp data is accurate to at least 3 decimal places of a second (2.059 seconds).

Solution
Accepted by topic author WyoEng

Hi WyoEng,

 

If you store the values that the timestamp function (Get Date/Time In Seconds) generates in an array (discarding redundant readings when sampling too fast) and then check the differences between consecutive values, you will see that the difference between consecutive distinct values is 1 ms.

 

If you try to save timestamps faster than 1000 Hz, you will see that you get the same value repeated.
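Here is a rough analogue of that experiment in Python rather than LabVIEW (a sketch only): poll the operating system clock in a tight loop and look at the differences between consecutive readings. The exact granularity you observe depends on the OS clock, but the pattern is the same: readings repeat until the clock ticks.

```python
import time

# Poll the system clock as fast as possible and record the readings,
# mimicking a loop that calls Get Date/Time In Seconds on every iteration.
readings = [time.time() for _ in range(10_000)]

# Differences between consecutive readings; many are 0 because the loop
# runs faster than the clock's resolution, and the smallest nonzero
# difference reveals the effective tick size of the software clock.
diffs = [b - a for a, b in zip(readings, readings[1:])]
nonzero = [d for d in diffs if d > 0]
print("smallest nonzero step: %.6f s" % min(nonzero))
print("fraction of repeated readings: %.2f" % (diffs.count(0.0) / len(diffs)))
```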

What hardware are you using for the acquisition? Is it hardware timed or software timed?

 

I believe the number of digits of precision that you see is due to the way LabVIEW stores the timestamp data type.

 

You can see this in the LabVIEW help (see the following link, in the section Time Stamp):

How LabVIEW Stores Data in Memory

https://www.ni.com/docs/en-US/bundle/labview/page/how-labview-stores-data-in-memory.html

 

Time Stamp

LabVIEW stores a time stamp as a cluster of four integers, where the first two signed integers (64 bits) represent the time-zone-independent number of complete seconds that have elapsed since 12:00 a.m., Friday, January 1, 1904, Universal Time [01-01-1904 00:00:00]. The next two unsigned integers (64 bits) represent the fractions of seconds.
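As a rough sketch of what that layout means in practice (Python for illustration only; decode_lv_timestamp is a hypothetical helper, not an NI API): the unsigned 64-bit word is a binary fraction of a second, i.e. fraction / 2**64 seconds, which is why the displayed value can carry far more digits than the roughly 1 ms granularity of the software clock that produced it.

```python
from datetime import datetime, timedelta, timezone

# LabVIEW epoch: 1904-01-01 00:00:00 UTC, per the help text quoted above.
LV_EPOCH = datetime(1904, 1, 1, tzinfo=timezone.utc)

def decode_lv_timestamp(whole_seconds: int, fraction_u64: int) -> datetime:
    """Hypothetical helper: convert the (i64 seconds, u64 fraction) pair to a datetime."""
    fraction = fraction_u64 / 2**64      # the u64 word is a binary fraction of one second
    return LV_EPOCH + timedelta(seconds=whole_seconds + fraction)

# Example: a fractional part of 0.0592060 s would be stored roughly as
# round(0.0592060 * 2**64); the 64-bit fraction can represent much finer
# steps than the clock that generated the timestamp actually resolves.
print(decode_lv_timestamp(3_700_000_000, round(0.0592060 * 2**64)))
```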

"

 

 

 

 

 
