shift in elapsed time measurement (tick count.VI)

Hello:
I know that my question is a basic one, but I would still like to ask it. I will write the questions first, and then the explanation:
1) Are "tick count.VI" and the other timing functions hardware dependent? If so, how dependent are they?
2) I have noticed that the "tick count.VI" function has a shift error of 2 seconds per hour. Can I fix that?
Explanation:
I'm using LabVIEW 5 to acquire temperatures. Afterwards I have to make a graph of the temperature as a function of time, so I need to know at which moments I made the measurements. I'm not interested in absolute time (date, hour, etc.), but in elapsed time since the beginning of the measurement. To get the elapsed time I have used the "tick count.VI" function twice: one call inside a loop and the other outside the loop. The loop is activated when I start the measurement. Then I subtract the two timer values, so I have the elapsed time in ms.
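In case it helps to see the idea in text form, here is roughly what my block diagram does, sketched in Python only because I cannot paste the diagram here (tick_count_ms is just an illustrative stand-in for the Tick Count (ms) function, simulated with the system clock):

```python
import time

def tick_count_ms():
    # Stand-in for LabVIEW's Tick Count (ms): an unsigned 32-bit millisecond
    # counter, simulated here with the system's monotonic clock.
    return int(time.monotonic() * 1000) & 0xFFFFFFFF

def elapsed_ms(start_ticks):
    # Subtracting modulo 2**32 keeps the result correct even if the 32-bit
    # counter rolls over between the two readings (about every 49.7 days).
    return (tick_count_ms() - start_ticks) & 0xFFFFFFFF

start = tick_count_ms()              # one reading outside the loop
for _ in range(3):                   # the acquisition loop
    time.sleep(0.5)                  # stand-in for one temperature reading
    print(f"elapsed: {elapsed_ms(start)} ms")
```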
Two seconds per hour is a very small error, but I don't know whether it could be greater if I run the software on a different PC, or whether it could grow if the software becomes more complex in the future.
Thank you very much for your time, and sorry for the length of the message and for my English.
Goodbye
Message 1 of 2
The two timers are obtained from different sources. Both are driven by oscillator circuits in the computer. The time-of-day clock (day, hour, minute, ...) is the same clock the computer uses to date-stamp files and to display a clock. The Tick Count uses a separate circuit which counts milliseconds since the last time the computer booted up. The two timekeepers are not synchronized by hardware or software, so it is expected that they will exhibit slight differences in timekeeping. Considering that neither clock was designed to be a highly accurate timekeeper, 2 seconds per hour (roughly 0.06%) is not too bad. You can expect any other computer to show a similar magnitude of variation.
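If you want to quantify the difference on a particular PC, capture both clocks at the start and at the end of a long run and compare the two elapsed times. Here is a rough sketch of the idea in Python rather than LabVIEW (time.monotonic stands in for the tick count and time.time for the time-of-day clock; on some systems the two may be kept in step, so your numbers will vary):

```python
import time

# Read both timekeepers at (as nearly as possible) the same instant.
tick_start = time.monotonic()   # tick-count style: seconds since an arbitrary origin
tod_start = time.time()         # time-of-day clock: seconds since the epoch

time.sleep(3600)                # or run the acquisition loop for an hour

tick_elapsed = time.monotonic() - tick_start
tod_elapsed = time.time() - tod_start

# If the two clocks kept identical time, the drift would be zero.
print(f"tick-count elapsed : {tick_elapsed:.3f} s")
print(f"time-of-day elapsed: {tod_elapsed:.3f} s")
print(f"relative drift     : {tod_elapsed - tick_elapsed:+.3f} s")
```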

So you need to decide what timing is most important to your application: absolute time or time increments. Your Tick Count method will give good increment measurements. The time-of-day clock can be better if the data needs to be synchronized with other devices, especially if it is kept updated through a time server connection (although that can have its own problems).

A third approach is to use an external oscillator or clock in which you have high confidence and record the value of that clock along with your other data. If you need precision better than a few milliseconds, or accuracy better than the computer's clock, this may be your best option.

Lynn
Message 2 of 2