Ed,
You're correct - the first count value to be buffered will be the "time" (literally the number of source edges) from when the counter is armed until the first Gate (sampling) edge. In a triggered operation it will be the time from the trigger edge until the first Gate edge.
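(Just to put a rough number on it: say the Source is the device's internal timebase -- I'll assume 80 MHz, which depends on your board -- and the counter gets armed 5 ms before the first Gate edge arrives. The first buffered value would then be around 400,000 counts, i.e. 5 ms worth of timebase edges, and has nothing to do with your signal's actual period.)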
There's really no way to eliminate this artifact except to simply ignore the first value when evaluating the period/frequency of the measured intervals. I don't think you can do any better than that. The only hardware solution that *might* produce a buffer whose first value is "correct" would be to trigger off the same signal that is set up as the Gate. However, that would set up a race condition between the trigger and gate edge detection circuitry, and you may wind up with a first interval time of 0.
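If it helps, here's a rough sketch of the "ignore the first value" approach using the nidaqmx Python package (I'm assuming Python rather than LabVIEW here; the device/counter name "Dev1/ctr0", the buffer size, and the min/max period values are placeholders you'd adjust for your setup):

```python
# Sketch only (not run on hardware): buffered period measurement,
# discarding the first buffered value as discussed above.
import nidaqmx
from nidaqmx.constants import AcquisitionType, Edge

NUM_SAMPLES = 1000  # assumed buffer size

with nidaqmx.Task() as task:
    task.ci_channels.add_ci_period_chan(
        "Dev1/ctr0",      # assumed counter -- change for your device
        min_val=1e-6,     # shortest expected period in seconds (assumption)
        max_val=1.0,      # longest expected period in seconds (assumption)
        edge=Edge.RISING,
    )
    # Implicit timing: each Gate (signal) edge latches one period value.
    task.timing.cfg_implicit_timing(
        sample_mode=AcquisitionType.FINITE,
        samps_per_chan=NUM_SAMPLES,
    )
    # May need a longer timeout for slow signals; default is 10 s.
    periods = task.read(number_of_samples_per_channel=NUM_SAMPLES)

# Throw away the first value: it spans from arm (or trigger) to the
# first Gate edge, not from one Gate edge to the next.
valid_periods = periods[1:]
avg_freq = len(valid_periods) / sum(valid_periods)
print(f"Average frequency: {avg_freq:.3f} Hz")
```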
Can you describe why you can't just ignore the first value? Do you need to do a buffered measurement or could you simply take a single sample once in a while? With a single sample, the counter doesn't start counting time (source edges) until after it sees the first Gate edge, so your measurement would be correct.
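Along the same lines, a one-shot on-demand read (same assumed device/channel placeholders as above) sidesteps the artifact entirely, since nothing gets buffered before the first complete interval:

```python
# Sketch only: single on-demand period read -- the count doesn't start
# until the first Gate edge, so the one value you get back is a real interval.
import nidaqmx
from nidaqmx.constants import Edge

with nidaqmx.Task() as task:
    task.ci_channels.add_ci_period_chan(
        "Dev1/ctr0", min_val=1e-6, max_val=1.0, edge=Edge.RISING
    )
    period = task.read()  # blocks until one full period has been measured
    print(f"Period: {period:.6e} s  ->  {1.0 / period:.3f} Hz")
```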
-Kevin P.
ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.