Counter/Timer

Difference between counter and system time

I used the 24-bit counter of my PCI-6023 card for the following experiment:
I started the counter (timebase = 100 kHz) and incremented a variable by 1 each time the counter was reset to 0. At constant time intervals I multiplied the variable by 2^24 (24-bit counter), added the current value of the counter and the start time (the time at which the counter was started), and compared the result with the current system time of my PC. Strangely (to me), the time difference grew by up to 40 ms per hour.
Since the system time of my PC is continuously disciplined by a DCF77 receiver, I suspect that the cause of the growing time difference is the inaccuracy of my DAQ card.
Could that be true?
What can I do if I want to measure longer time intervals (more than 1 hour) with a precision of at most 10 ms?
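
(For clarity, here is the reconstruction I describe above in rough Python form; the names are mine, not from the actual LabVIEW diagram:)

    # Sketch of the timestamp reconstruction described above; names are
    # illustrative, the real implementation is a LabVIEW diagram.
    TIMEBASE_HZ = 100_000       # 100 kHz counter timebase
    ROLLOVER = 2 ** 24          # 24-bit counter wraps after 2^24 ticks

    def reconstructed_time(start_time_s, rollovers, counter_value):
        # Total ticks = completed wraps plus the current counter reading.
        ticks = rollovers * ROLLOVER + counter_value
        return start_time_s + ticks / TIMEBASE_HZ

    # This value is compared against the DCF77-disciplined PC clock; the
    # difference grew by up to ~40 ms per hour.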

hans
Message 1 of 4
Hans,

I am having a bit of difficulty understanding your exact algorithm. I assume you are performing a buffered event-counting operation with your PCI-6023, but I don't see what you are using as the source signal for that operation. Can you clarify your setup for me?

According to the specifications for the PCI-6023, the base clock for the counter/timer functions has an accuracy of +/- 0.01%. If the 6023 is not accurate enough for you, the 6601 and 6602 boards have a base clock accuracy of +/-0.005%.
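
To put those numbers in perspective (my own arithmetic, not a datasheet figure): a +/-0.01% timebase error allows up to 0.0001 x 3600 s = 360 ms of drift per hour, so the ~40 ms per hour you observed is actually well within spec. Even +/-0.005% still allows up to 180 ms per hour, which is worth weighing against a 10 ms accuracy target.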
Message 2 of 4
hello,

I didn't use a source signal. I used the elapsed-time output of the "Count Event or Time.vi" in order to check the accuracy of the counters of my PCI-6023 card.
To do this, I incremented a variable each time the 24-bit counter reset to zero (after about 167 s) and compared the resulting time with my DCF77-controlled system time.

I tried this experiment because I have to write a program that acquires data from 3 sensors at a scan rate of up to 100 scans/s and writes the scans to a file with a precise timestamp (accuracy = 10 ms). The measuring period should be up to 24 hours, and the timestamp accuracy should hold over this whole period.
That is why I wanted to test the accuracy of the counter...
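
(As a sanity check on the 167 s figure: 2^24 ticks / 100,000 ticks per second ≈ 167.77 s per rollover, which matches what I observe, so the counter and timebase at least behave as configured.)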

hans
Message 3 of 4
I believe the problem here is that you are mixing hardware timing with software timing. If you increment your variable with software calls, even ones based on the system time, the result carries software latency. Likewise, if you set up your counter for a hardware-timed count-down and then restart the operation when it finishes, there is software latency while you reconfigure the counter and before you arm it for the next operation.
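
As a rough model of that effect (my own sketch; the ~2 ms per re-arm is an assumed figure, not something I have measured):

    # Rough model: if re-arming the counter after each 24-bit rollover costs
    # some software latency, that lost time accumulates over the hour.
    rollover_period_s = 2 ** 24 / 100_000          # ~167.77 s at 100 kHz
    rollovers_per_hour = 3600 / rollover_period_s  # ~21.5 rollovers/hour
    latency_per_rearm_s = 0.002                    # assumed ~2 ms per re-arm
    print(rollovers_per_hour * latency_per_rearm_s * 1000, "ms lost per hour")
    # ~43 ms/hour -- a ~2 ms re-arm latency alone could explain the drift.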

The best way to test the accuracy of your counter is with a strictly hardware-timed experiment. For example, you can run some of the Measure Period examples that ship with LabVIEW to test the counter's results when it measures the pulse train of a highly accurate hardware signal. You will want to provide the pulse train from an external source (or you can use one of the board's counters to generate a pulse train and the other counter to measure it). To find the counter examples, go to Help>>Examples>>I/O Interfaces>>Data Acquisition>>Counters>>Period Measurement.
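
Once you have a set of measured periods, turning them into an accuracy estimate is simple post-processing; a minimal sketch (plain Python with made-up numbers, since the measurement itself happens in LabVIEW):

    # Estimate timebase error from periods measured against a known-good
    # reference pulse train (the sample values below are illustrative only).
    nominal_period_s = 1e-3                     # e.g., a 1 kHz reference signal
    measured_periods_s = [1.0001e-3, 0.9999e-3, 1.0002e-3]

    mean_measured = sum(measured_periods_s) / len(measured_periods_s)
    error_fraction = (mean_measured - nominal_period_s) / nominal_period_s
    print(f"timebase error: {error_fraction * 100:+.4f}%"
          f"  (PCI-6023 spec: +/-0.01%)")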
Message 4 of 4