Doug,
Win 98 is not a real-time (deterministic) operating system... you're always
going to lose samples, and the interval will never be precise.
I haven't used it, but I believe NI makes a LabVIEW Real-Time version that
does the periodic sampling on a separate interrupt-driven microprocessor on
one of their DAQ boards... at least, that's how I think it works.
Are you trying to trap really narrow spikes, or could an averaging
technique be used to smooth out the varying period?
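If averaging would do the job, the loop structure I have in mind is roughly
the sketch below. It's written as Python pseudocode since I can't paste a
LabVIEW diagram into a post, and the two read_* calls are made-up stand-ins
(not real NI-DAQ functions): read the buffered period first, grab the
digitals immediately after so the two stay correlated, then average a small
window of periods to smooth out the Win 98 jitter.

import random
import time

def read_period_buffered():
    # Made-up stand-in: pretend the counter returns one buffered period,
    # with the kind of jitter Win 98 piles on top of the nominal 1 ms.
    time.sleep(0.001)
    return 0.001 + random.uniform(-0.0003, 0.0003)

def read_digital_lines():
    # Made-up stand-in: pretend we read an 8-bit digital input port.
    return random.getrandbits(8)

N_AVG = 10      # periods to average per record; pick to suit your signal
records = []
window = []

while len(records) < 5:
    period = read_period_buffered()   # wait until the period value is ready...
    digital = read_digital_lines()    # ...then read the digitals right away,
                                      # so both belong to the same interval
    window.append(period)
    if len(window) == N_AVG:
        avg = sum(window) / N_AVG     # the jitter largely averages out here
        records.append((avg, digital, time.time()))
        window = []

for avg, digital, t in records:
    print("avg period %.3f ms, digitals 0x%02X" % (avg * 1000.0, digital))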
Bill
D. Berry wrote in message
news:3D0BB082.FFDE327F@houston.rr.com...
> I'm trying to sample digital levels + buffered period measurement at 1 ms
> or better. Setting the tick counter for 1 ms intervals apparently causes
> too much overhead on my old laptop, and the samples come at 3-4 ms intervals
> at best. Running the acquisition loop flat out with no waiting will
> sample the digitals at barely 1 ms or faster, but some buffered period
> measurement samples are missing, even though it's contained within the
> same acquire loop. D'oh! Since the period measurement is apparently the
> slowest, I'm thinking there should be a way to hold off the digital
> sampling until the period measurement is done, so they all have the same
> time correlation, but I dunno how, seein' as I ain't never tried this
> here LabVIEW stuff before.
>
> Doug
>