1 ms digital acquisition realistic for AI-16XE-50/Pentium/Win98?

I'm trying to sample digital levels plus a buffered period measurement at 1 ms or better. Setting the tick counter for 1 ms intervals apparently causes too much overhead on my old laptop, and the samples come out at 3-4 ms intervals at best. Running the acquisition loop flat out with no waiting will sample the digitals at roughly 1 ms or faster, but some buffered period measurement samples go missing, even though the measurement is contained within the same acquisition loop. D'oh! Since the period measurement is apparently the slowest part, I'm thinking there should be a way to hold off the digital sampling until the period measurement is done, so all the readings have the same time correlation, but I don't know how, since I've never used LabVIEW before.
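
Roughly what I have in mind, sketched in Python-style pseudocode rather than LabVIEW (the two read functions are hypothetical stand-ins for the real counter and digital-port calls):

import time

def read_buffered_periods():
    """Stand-in for the buffered period (counter) read -- the slow call."""
    return [0.001]              # dummy value; the real driver call goes here

def read_digital_lines():
    """Stand-in for the digital port read -- the fast call."""
    return 0b0000               # dummy value; the real driver call goes here

samples = []
for _ in range(1000):
    periods = read_buffered_periods()   # wait for the slow read to finish first...
    digital = read_digital_lines()      # ...then grab the digital lines,
    stamp = time.perf_counter()         # so both belong to the same iteration
    samples.append((stamp, periods, digital))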

Doug
Message 1 of 3
Doug,

Win 98 is not a real-time (deterministic) operating system... you're always
going to lose samples, and the interval will never be precise.

I haven't used it, but I believe NI makes a LabVIEW Real-Time version that
uses a separate interrupt-driven processor on one of their DAQ boards to do
the periodic sampling... at least that's how I think it works?

Are you trying to trap really narrow spikes, or could an averaging technique
be used to smooth out the varying period?
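
Something as simple as a running mean over the last few period readings might do it; a quick Python sketch (the 10-sample window is arbitrary):

from collections import deque

def moving_average(periods, window=10):
    """Running mean over the last `window` period measurements."""
    buf = deque(maxlen=window)
    out = []
    for p in periods:
        buf.append(p)
        out.append(sum(buf) / len(buf))
    return out

# e.g. jittery ~1 ms periods smoothed over the trailing window
print(moving_average([0.0009, 0.0012, 0.0010, 0.0011, 0.0008]))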

Bill


Message 2 of 3
I haven't been able to get a loop time under 1 ms using the E Series digital port. If you look at the VI profiler, you can see how long each subVI takes to execute. When I did that, the read was returning in less than 1 ms, but the loop processing pushed it over.
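
You can make the same check outside the profiler by timestamping each iteration yourself; a rough Python-style analogue of the idea (read_port is a placeholder for the actual port read):

import time

def read_port():
    """Placeholder for the E Series digital port read (itself well under 1 ms)."""
    return 0

iteration_times = []
prev = time.perf_counter()
for _ in range(10000):
    value = read_port()                  # the read itself is quick...
    now = time.perf_counter()            # ...but per-iteration bookkeeping,
    iteration_times.append(now - prev)   # UI updates, etc. add the overhead
    prev = now

print("mean iteration:  %.3f ms" % (1e3 * sum(iteration_times) / len(iteration_times)))
print("worst iteration: %.3f ms" % (1e3 * max(iteration_times)))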
If you need deterministic timing, you should use a device with timed (strobed) digital I/O. Then you can clock the data in at whatever rate you want and read it from a buffer.
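
Sketched against the modern Python NI-DAQmx API (which of course wasn't around for this setup), the buffered, hardware-clocked approach looks roughly like this; "Dev1", the 1 kHz rate, and a board that supports hardware-timed digital input are all assumptions here, and E Series boards like the AI-16XE-50 don't qualify:

import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.di_channels.add_di_chan("Dev1/port0")
    # The board's sample clock latches the lines every 1 ms into a buffer...
    task.timing.cfg_samp_clk_timing(
        rate=1000,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=10000,
    )
    task.start()
    # ...and the loop just drains the buffer whenever the OS gets around to it;
    # the 1 ms spacing comes from the hardware clock, not from loop timing.
    data = task.read(number_of_samples_per_channel=1000)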
Message 3 of 3