DAQ Timing Puzzler

Software: LabVIEW 7.1 (not 7.1.1) and DAQmx 8.0.0f0
 
I have enclosed a simple VI that does hardware-timed, buffered acquisition. Each iteration, I take the timestamp and the tick count, then subtract the previous values from them to get the elapsed time since the last acquisition.
 
Since the DAQ operation is set up for 1000 Hz, 1000 samples per read, I expect to see 1000 ticks per iteration and 1.000 sec per iteration.
 
ALAS! I get 1000 ticks per iteration and 0.997 or 0.998 seconds per iteration. Which is right? Isn't a tick one ms?
 
The timestamp number just won't keep to a one second pace - it "loses" time.
 
Can anyone explain what is going on and how I might fix it?
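In text form, the loop looks roughly like this sketch, using the nidaqmx Python API purely as a stand-in for the LabVIEW diagram (the device name "Dev1/ai0", and the use of Python itself, are assumptions for illustration; the actual VI is LabVIEW 7.1 with DAQmx):

```python
import time
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    # 1000 Hz sample clock, continuous acquisition; each read below
    # pulls 1000 samples, i.e. exactly 1.000 s worth of data.
    task.timing.cfg_samp_clk_timing(
        1000, sample_mode=AcquisitionType.CONTINUOUS, samps_per_chan=1000
    )
    prev = time.time()
    for _ in range(10):
        task.read(number_of_samples_per_channel=1000)  # blocks ~1 s
        now = time.time()
        # Elapsed wall-clock time since the previous read completed.
        print(f"iteration time: {now - prev:.3f} s")   # expect ~1.000
        prev = now
```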


Message 1 of 6
Hello,
 
It sounds like you're measuring timing with pure LabVIEW functions, perhaps the millisecond timer. When the millisecond timer is read relative to the rest of your code depends on precisely how the code is wired: remember that the left-to-right layout of the diagram doesn't determine execution order unless that order is enforced by dataflow. Performing timing operations in parallel with the operations you are actually timing can yield subtle deviations from the actual time those operations take. I think you intended to attach your VI here, but I don't see the attachment. If you can attach it, we can take a look and see whether you're running into such a case, or at least try to better explain the behavior you are seeing.
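To make the dataflow point concrete, here is a minimal text-based analogue (plain Python; read_samples() is a hypothetical stand-in for the blocking DAQmx Read): the clock must be read in a defined order around the operation being timed, which on a LabVIEW diagram means enforcing that order with wires, not with position.

```python
import time

def read_samples():
    """Hypothetical stand-in for a blocking 1000-sample DAQmx read."""
    time.sleep(1.0)

# Sequenced correctly: the second clock read cannot happen until the
# operation completes, so the difference is the true elapsed time.
t0 = time.monotonic()
read_samples()
t1 = time.monotonic()
print(f"elapsed: {t1 - t0:.3f} s")
```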
 
Thank you,
 
JLS
Sixclear
Message 2 of 6

Here's the stuff - I don't know why it didn't attach.

When you look at the code, you'll see that dataflow is not the problem.

Message 3 of 6

One possibility would be to put your data acquisition inside a timed loop, with the DAQmx Read function configured to return waveform data, which carries timestamps derived from the DAQ hardware. The timed loop will impose somewhat more accurate timing on your acquisition loop, and the waveform timestamps will more accurately represent when the data was actually captured. The timed loop also has terminals on its left-hand side that indicate whether the previous iteration finished late, and terminals on its right-hand side that let you vary the loop timing dynamically.
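To sketch why hardware-derived timestamps behave better (plain Python, using the rate and read size from the original post): each read's start time follows from the sample clock alone, so consecutive reads are spaced by exactly samples/rate no matter when Windows happens to schedule the loop.

```python
sample_rate = 1000.0      # Hz, as configured on the DAQ card
samples_per_read = 1000

# Each read's start time follows from the hardware sample clock alone:
# t_n = t0 + n * samples_per_read / sample_rate.
t0 = 0.0
for n in range(5):
    t_n = t0 + n * samples_per_read / sample_rate
    print(f"read {n} starts at t = {t_n:.3f} s")   # 0.000, 1.000, 2.000, ...
```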

Putnam
Certified LabVIEW Developer
Senior Test Engineer, North Shore Technology, Inc.
Currently using LV 2012 - LabVIEW 2018, RT 8.5
LabVIEW Champion
Message 4 of 6
I understand your suggested workaround, appreciate it, and will look into implementing it (difficult, given the other things going on in the actual application), but what I'm really trying to get at is: why is there a difference?
 
If the two clocks say that I am looping at two different rates, how can I know which one is right?
 
If the timestamp clock is right, then doesn't that mean that the DAQ card is not really looping at the rate that I specified?
 
If the DAQ card is really looping at the correct rate, then doesn't that mean that the timestamp isn't accurate?
 
It seems like a bug must exist in one of these clocks.
Message 5 of 6
10Things wrote:
 
"
but what I'm really trying to get at is why is there a difference?
 
If the two clocks say that I am looping at two different rates, how can I know which one is right?
 
If the timestamp clock is right, then doesn't that mean that the DAQ card is not really looping at the rate that I specified?
 
If the DAQ card is really looping at the correct rate, then doesn't that mean that the timestamp isn't accurate?
 
It seems like a bug must exist in one of these clocks.
"
 
The DAQ card is correct, plus or minus the tolerance spec of its onboard crystal oscillator.
 
The system time is what's goofy. You simply cannot trust Windows to give you timestamps with resolution on the order of 1 ms.
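A quick back-of-the-envelope check makes the point (the ±50 ppm figure is a typical crystal tolerance assumed for illustration, not the spec of any particular card):

```python
crystal_ppm = 50                          # assumed typical oscillator tolerance
crystal_error = 1.0 * crystal_ppm / 1e6   # worst-case error per second, in s
observed_error = 1.000 - 0.997            # discrepancy reported above

print(f"crystal worst case: {crystal_error * 1e3:.3f} ms per second")   # 0.050
print(f"observed:           {observed_error * 1e3:.3f} ms per second")  # 3.000
# The observed ~3 ms/s discrepancy is roughly 60x worse than the crystal
# spec, so the PC's system clock, not the DAQ hardware, is the likely culprit.
```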
 
Suggestion:
 
If you configure your DAQ read to return the "waveform" data type, its "t0" will be much more accurate than what you get from "Get system time".
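As a minimal illustration of how shaky an OS clock can be (plain Python, standard library only; not the LabVIEW functions themselves): the system clock can be stepped or slewed at any moment, while a monotonic counter cannot, which is exactly why a hardware-derived t0 is preferable.

```python
import time

# time.time() follows the OS system clock, which NTP may step or slew;
# time.perf_counter() is a monotonic, high-resolution counter.
sys0, mono0 = time.time(), time.perf_counter()
time.sleep(5)
drift = (time.time() - sys0) - (time.perf_counter() - mono0)
print(f"system-clock drift over 5 s: {drift * 1e3:+.2f} ms")
```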
 
Ben
Retired Senior Automation Systems Architect with Data Science Automation
LabVIEW Champion, Knight of NI and Prepper
Message 6 of 6