LabVIEW

VI to convert Unix time to standard time?

hi,

 

I am in need of a VI to convert Unix time to standard time.

 

e.g., Unix time: 1268845177 (in seconds)

Standard time (GMT): Wed, 17 Mar 2010 16:59:37 GMT
Your time zone (U.S.): Wednesday, March 17, 2010 10:59:37 AM

 

I have part of the calculation, but not the full formula.

Unix time is the number of seconds elapsed since Jan 1, 1970, 00:00:00 UTC, starting at 0 and counting forward.

For example, 1268845177 / 86400 ≈ 14685 days.

(86400 = 60 * 60 * 24 seconds in a day)

 

But how do I take those 14685 days and turn them into the actual calendar date? How should that be done?
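A minimal sketch of that conversion in Python (used here only to illustrate the arithmetic; the datetime module performs the days-to-calendar-date step that the division by 86400 starts):

```python
from datetime import datetime, timezone

unix_seconds = 1268845177  # seconds since 1 Jan 1970 00:00:00 UTC

# GMT / UTC representation
gmt = datetime.fromtimestamp(unix_seconds, tz=timezone.utc)

# Same instant converted to whatever local time zone the PC is set to
local = datetime.fromtimestamp(unix_seconds)

print(gmt)    # 2010-03-17 16:59:37+00:00
print(local)  # e.g. 2010-03-17 10:59:37 on a machine 6 hours behind UTC
```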

 

Either the formula or a VI to convert Unix time to standard time would be a help.

Regards,

Arvinth

Message 1 of 6
See thread Unix to labVIEW Time
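For reference, the usual trick discussed on these boards (and, if memory serves, in that thread) is to offset the Unix value by the difference between the LabVIEW epoch (1 Jan 1904 UTC) and the Unix epoch (1 Jan 1970 UTC) and then convert the result to a LabVIEW timestamp. A rough sketch of the arithmetic, written in Python only for compactness:

```python
# 1904-to-1970 offset: 66 years, 17 of them leap years
LABVIEW_UNIX_OFFSET = (66 * 365 + 17) * 86400   # 2082844800 seconds

unix_seconds = 1268845177
labview_seconds = unix_seconds + LABVIEW_UNIX_OFFSET  # 3351689977

# In LabVIEW, wiring labview_seconds into a to-timestamp conversion gives
# the usual date/time display; here we just print the raw number.
print(labview_seconds)
```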
Message 2 of 6

I am trying to do the same thing. Keep in mind that there have been leap seconds between 1970 and today, and Unix time does not count them. See http://en.wikipedia.org/wiki/Unix_time

Message 3 of 6

I don't think any time-based format, including LabVIEW's timestamp, accounts for leap seconds.

 

Are leap seconds really a concern for you?

Message 4 of 6

Leap seconds are of concern to me (unfortunately) because I am in the timing business. I assume that the timestamp function of LabVIEW gets its time from the computer's clock. If the PC is on the net and gets its time from the internet, then the leap seconds are included.

Message 5 of 6

Timestamps, the PC clock, and time from the net are different and, I'd say, somewhat unrelated concepts.

 

You are worried about the accuracy of the current time. Timestamps have nothing to do with the current time; they are a way of identifying a particular moment in time.

 

Yes, the PC clock gets its time from the net, assuming the time server settings are all set properly. Even then, the correction of the current PC time may not happen exactly when the leap second gets inserted; it may be some time before the PC resynchronizes with the net time. The PC doesn't know when, or even whether, a leap second will occur, and neither does the LabVIEW timestamp.

 

Here is an example. According to http://en.wikipedia.org/wiki/Leap_second, a leap second will occur at the end of June 30 of this year, UTC. Since my clock is on Eastern Daylight Time at that date, I put together an example for 8pm on June 30 (UTC-4, i.e., midnight UTC on July 1). You'll see that the 3 values (a second before, the exact time, and a second after) are all shown 1 second apart; there is no extra second accounted for. So in real life the time a second after vs. a second before are actually 3 seconds apart, but LabVIEW only shows 2 seconds. Feel free to play with the attached snippet for different time zones, daylight saving time, and all of that.
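The same behaviour is easy to reproduce outside LabVIEW. A small sketch, assuming the leap second inserted at the end of 30 June 2012 and using Python's POSIX-style timestamps (which, like LabVIEW's, simply pretend the extra second never happened):

```python
from datetime import datetime, timezone

# Wall-clock instants a second before, at, and a second after the
# leap-second boundary (midnight UTC, 1 July 2012).
stamps = [
    datetime(2012, 6, 30, 23, 59, 59, tzinfo=timezone.utc),
    datetime(2012, 7, 1, 0, 0, 0, tzinfo=timezone.utc),
    datetime(2012, 7, 1, 0, 0, 1, tzinfo=timezone.utc),
]

for t in stamps:
    print(t, int(t.timestamp()))

# The printed POSIX values differ by exactly 1, even though a leap second
# (23:59:60 UTC) was really inserted between the first two instants.
```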

 

I don't think you really have as big of a problem as you are trying to investigate here. If your timing needs are so critical that you need the current time to better than a second, then you shouldn't be looking at LabVIEW timestamps, the PC clock, or internet time at all, but at a high-accuracy timing source, perhaps one based on a GPS clock.

 

Unless you have a particular test planned to run over the point in time where a leap second is inserted, you shouldn't have to worry. Even if you do have a test running over the leap second, you probably don't need to worry either: if the test is based on the waveform datatype, then you have a t0 timestamp and a dt that together keep track of all the individual data points accurately.
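A rough illustration of that last point, with plain Python standing in for the waveform's t0/dt bookkeeping (the start time, interval, and sample values below are made up):

```python
from datetime import datetime, timedelta, timezone

t0 = datetime(2012, 6, 30, 23, 59, 58, tzinfo=timezone.utc)  # start of the record
dt = timedelta(seconds=0.5)                                   # sample interval
samples = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]                      # hypothetical data

# Each point's time is reconstructed as t0 + i*dt, so the record stays
# internally consistent regardless of what the wall clock does.
for i, y in enumerate(samples):
    print(t0 + i * dt, y)
```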

Message 6 of 6