Hello -
We've got a 6601 PCI card that we're using to characterize the stability of some crystal oscillators. We're essentially feeding a 10 MHz TTL square wave into the board. The problem is that once the counter rolls past 16 million or so, the least significant bits start getting padded with zeros; it appears that only the 24 MSBs are read out. We want to resolve variations of 1 part in 100 million, so 24 bits isn't enough.
Various VIs in LabVIEW (7.0 Express) all show this behavior. The first VI used the DAQ Assistant but was scrapped due to flaky software timing. The latest version adapts one of the DAQmx examples, "Count Digital Events-Buffered-Continuous-Ext Clk": the signal comes in through ctr0, and ctr4 generates the sample clock. For the purpose of characterizing the LSB dropping, a 10 Hz sample clock is used.
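In case a text description of the setup is useful alongside the VI, here is a rough sketch of the equivalent configuration using the DAQmx C API. This is only an approximation of what the VI does; the device and terminal names ("Dev1/ctr0", "/Dev1/Ctr4InternalOutput") and the buffer/read sizes are placeholders, not taken from our actual program.

/* Approximate DAQmx C equivalent of the buffered edge-counting VI.
   Device and terminal names below are placeholders.                */
#include <stdio.h>
#include <NIDAQmx.h>

#define CHECK(call) do { if ((call) < 0) {                          \
        char msg[2048];                                             \
        DAQmxGetExtendedErrorInfo(msg, sizeof msg);                 \
        fprintf(stderr, "DAQmx error: %s\n", msg);                  \
        return 1; } } while (0)

int main(void)
{
    TaskHandle task = 0;
    uInt32     counts[100];
    int32      numRead = 0;

    CHECK(DAQmxCreateTask("countEdges", &task));

    /* Count rising edges of the 10 MHz signal applied to ctr0's source. */
    CHECK(DAQmxCreateCICountEdgesChan(task, "Dev1/ctr0", "",
                                      DAQmx_Val_Rising, 0, DAQmx_Val_CountUp));

    /* Latch the count on each rising edge of the 10 Hz sample clock,
       assumed here to come from ctr4's internal output.               */
    CHECK(DAQmxCfgSampClkTiming(task, "/Dev1/Ctr4InternalOutput", 10.0,
                                DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000));

    CHECK(DAQmxStartTask(task));

    /* Read the latched counts as raw unsigned 32-bit integers. */
    CHECK(DAQmxReadCounterU32(task, 100, 10.0, counts, 100, &numRead, NULL));

    for (int32 i = 1; i < numRead; i++)
        printf("%u  diff = %u\n", (unsigned)counts[i],
               (unsigned)(counts[i] - counts[i - 1]));

    DAQmxClearTask(task);
    return 0;
}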
Here's a data excerpt for illustration. The first column is the counter value, the second is the difference between successive counter samples, and the third is the counter value in binary. This is the transition from 24-bit to 25-bit counter values.
14823821 1000018 111000100011000110001101
15823839 1000019 111100010111001111011111
16823858 1000018 1000000001011011000110010
17823876 1000020 1000011111111100010000100
18823896 1000016 1000111110011101011011000
19823912 1000020 1001011100111110100101000
When the counter value is 25 bits wide, the LSB is always 0; when it is 26 bits wide, the two LSBs are always 0, and so on.
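A quick standalone check along these lines makes the pattern easy to see (plain C, nothing device-specific; it just compares each value's bit width with its number of trailing zero bits, using the counts from the excerpt above):

/* Compare each count's bit width with its number of trailing zero bits. */
#include <stdio.h>

static int bit_width(unsigned v)      { int n = 0; while (v) { n++; v >>= 1; } return n; }
static int trailing_zeros(unsigned v) { int n = 0; while (v && !(v & 1u)) { n++; v >>= 1; } return n; }

int main(void)
{
    /* Counter values from the excerpt above. */
    unsigned counts[] = { 14823821, 15823839, 16823858, 17823876, 18823896, 19823912 };

    for (int i = 0; i < 6; i++)
        printf("%u: %d bits wide, %d trailing zero bits\n",
               counts[i], bit_width(counts[i]), trailing_zeros(counts[i]));
    return 0;
}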
This appears to be a hardware limitation, but perhaps I've missed something. Advice would be greatly appreciated.

Thanks,
Mike