Timestamping Precision

My issue involves manually timestamping the data (voltages only) received from a flow meter connected over serial communication. After a "go" command is sent with VISA Write, the flow meter streams data at 200 Hz until an "S" command terminates the streaming. I have developed a VI that successfully reads and parses the streaming data into numerical values with VISA Read, using a Timed Loop at 200 Hz to read one voltage per iteration, and I receive every data point from the flow meter without loss.

I attempted to timestamp each value received in each iteration of the Timed Loop by subtracting the initial time from the current time with "Get Date/Time in Seconds", similar to what Elapsed Time.vi does. The timestamps I received are odd in that they arrive in identical pairs (for example, one run gave 0.401 0.401 0.411 0.411 0.421 0.421 0.431 0.431 0.441 0.441, and so on), while the corresponding flow meter data vary and do not show the same pattern. The expected timestamps are 0.401 0.406 0.411 0.416, i.e. increments of 0.005 s. It looks as if LabVIEW cannot update the third decimal place, since every other timestamp makes sense and increments by 0.01 s; yet LabVIEW notes that, used as a timing source, the fastest rate is 1 kHz, which should be fast enough.

I have spent a long time trying to get accurate timestamps for data acquired at 200 Hz over serial communication without having to use DAQ devices. Let me know if you need more information; I have attached the VIs in a zip file below. Any input will be most appreciated.
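For reference, the same streaming-and-timestamping pattern looks roughly like this in text form, as a minimal Python/pySerial sketch. The port name, baud rate, and one-voltage-per-line format are assumptions; only the "go" and "S" commands come from the post above.

import time
import serial  # pySerial

# Assumed port settings; only "go"/"S" come from the post.
ser = serial.Serial("COM3", 115200, timeout=1.0)
t0 = time.perf_counter()          # high-resolution reference time
ser.write(b"go")                  # start the 200 Hz stream

samples = []
try:
    for _ in range(1000):
        line = ser.readline()     # one voltage per line (assumed format)
        t = time.perf_counter() - t0
        if line:                  # skip empty reads on timeout
            samples.append((t, float(line)))
finally:
    ser.write(b"S")               # terminate the stream
    ser.close()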
Message 1 of 5
I noticed one more thing. When I change the sampling period of the Timed Loop to 1 ms rather than 5 ms, I get runs of ten identical timestamps
(e.g. 0.01 0.01 0.01 0.01 0.01 0.01 0.01 0.01 0.01 0.01 0.02 0.02 0.02 0.02 0.02 0.02 0.02 0.02 0.02 0.02 0.03 0.03, etc.).

In addition, if I change the sampling period to 10 ms, I do not get any repeated timestamps, and the same holds for any sampling period greater than 10 ms.

So it appears that LabVIEW cannot resolve time below the second decimal place, which does not make sense to me. Can anyone explain why I cannot read time in steps smaller than 0.01 s? I would like timestamps down to the third decimal place, since the sampling rate is 200 Hz.
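The pattern described is exactly what a clock with a 10 ms tick would produce when read from a faster loop. A small sketch of the arithmetic in Python (the loop is idealized, times are in milliseconds; the 10 ms tick value is explained in the reply below):

# Toy model of the symptom: a clock that only advances every 10 ms,
# read inside a loop running with a 5 ms or 1 ms period.
def quantized_clock_ms(t_ms, tick_ms=10):
    return t_ms // tick_ms * tick_ms   # truncate to the clock tick

for period_ms in (5, 1):
    stamps = [quantized_clock_ms(i * period_ms) for i in range(12)]
    print(f"{period_ms} ms loop:", stamps)

# 5 ms loop: [0, 0, 10, 10, 20, 20, 30, 30, 40, 40, 50, 50]  <- identical pairs
# 1 ms loop: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 10, 10]          <- runs of ten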
Message 2 of 5
Unless you receive the timestamp data from your device, your timestamp values are going to be software timed instead of hardware timed.  If you are ok with this, I would recommend using the "actual end [i-1]" or "actual start [i]" property of the timed loop to get your timestamp instead of calculating it yourself.
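In text form the idea is roughly the following; this is a Python sketch of the concept, not the LabVIEW node itself. Log the time at which the scheduler actually started each iteration, instead of calling a coarse wall-clock function inside the loop.

import time

PERIOD = 0.005                    # 200 Hz loop
t0 = time.perf_counter()

for i in range(5):
    target = t0 + i * PERIOD
    while time.perf_counter() < target:       # wait for the scheduled start
        pass
    actual_start = time.perf_counter() - t0   # analogue of "actual start [i]"
    print(f"iteration {i}: {actual_start:.4f} s")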
Robert Mortensen
Software Engineer
National Instruments
Message 3 of 5


@pulmat wrote:
I attempted to timestamp each value received in each iteration of the Timed Loop by subtracting the initial time from the current time with "Get Date/Time in Seconds" [...] It looks as if LabVIEW cannot update the third decimal place [...]

Under all Windows NT based platforms, Get Date/Time in Seconds has a resolution of 10 ms. This is a limitation of the underlying Windows API functions.
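One way to see a clock's tick size for yourself is to poll it in a tight loop and record the smallest step it ever takes. A Python sketch of that measurement (on older Windows versions time.time() steps in roughly 10-16 ms increments, much as described above, while time.perf_counter() resolves microseconds):

import time

def smallest_step(clock, polls=200_000):
    # Poll the clock repeatedly and return the smallest nonzero
    # increment observed, i.e. an estimate of its tick size.
    last, step = clock(), None
    for _ in range(polls):
        now = clock()
        if now != last:
            d = now - last
            step = d if step is None else min(step, d)
            last = now
    return step

print("time.time() tick         ~", smallest_step(time.time))
print("time.perf_counter() tick ~", smallest_step(time.perf_counter))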

Rolf Kalbermatter
My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 4 of 5
Thanks Robert! Using "actual start [i]", I was able to get timestamps at 5 ms intervals corresponding to the data from my device. I had no choice but to use software-based timestamps, since I cannot read time from the hardware. Rolf, thanks for explaining why "Get Date/Time in Seconds" only has a resolution of 10 ms.


Message 5 of 5