> I have a LabView program that takes data from a laser doppler
> velocimeter counter. The program waits for a rising edge, then sets an
> output low, then reads sixteen bits of data, then sets the same output pin
> high, and then waits for more data. According to the oscilloscope, this
> is taking about twenty milliseconds. Does LabView automatically use DMA,
> or will it try to temporarily store it on the hard drive? The data isn't
> processed until all of it has been taken, so all the pertinent part of the
> program is doing is reading sixteen bits of data and hanging onto them
> until all the data has been taken.
> Frankly, 20 ms read time (50 Hz!) is just WAAAAAAAY too slow for
> the flows being analyzed. I am still new to LabVIEW, and I don't know if
> my program is storing things in RAM or writing to the HD. Whatever it's
> doing, it's too slow. The computer is an older P120, but it seems like it
> should be faster than this. Any input?
>
LabVIEW does what the diagram tells it to. If you are calling a single-point
Analog read function in a loop, the acquisition is software timed, and the
overhead for that type of acquisition is pretty high. If you configure the
card to trigger when it sees the edge and tell it how many post-trigger
points to return, then the DAQ will be hardware timed and is limited by the
clock on the board and the configurability of its various counters and ADCs.
Look at some examples that do triggering and HW timing.
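For what it's worth, here is a rough C sketch of what a triggered,
hardware-timed finite acquisition looks like with the current NI-DAQmx C
API (the DAQ VIs you have in LabVIEW expose the same choices graphically,
so take this only as an illustration of the idea). The device name "Dev1",
the trigger line "/Dev1/PFI0", the 100 kHz rate, and the 1000-sample count
are placeholders; substitute whatever matches your board. Error checking
is omitted to keep it short.

    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64    data[1000];
        int32      read = 0;

        DAQmxCreateTask("", &task);

        /* One analog input channel, +/-10 V range (placeholder names). */
        DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "",
                                 DAQmx_Val_Cfg_Default, -10.0, 10.0,
                                 DAQmx_Val_Volts, NULL);

        /* Hardware timing: the board's sample clock paces the
           conversions, not the software loop. */
        DAQmxCfgSampClkTiming(task, "", 100000.0, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, 1000);

        /* Arm on the rising edge from the counter; the board then
           returns the post-trigger points on its own. */
        DAQmxCfgDigEdgeStartTrig(task, "/Dev1/PFI0", DAQmx_Val_Rising);

        DAQmxStartTask(task);

        /* A single read pulls back the whole hardware-timed block. */
        DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                           data, 1000, &read, NULL);
        printf("Read %d samples\n", (int)read);

        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }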
LV doesn't write anything to disk unless you tell it to with a write icon.
The values you acquire can be collected in an array and written to disk when
there is computer time to do so. If you don't have much memory in your
computer, the OS may be using virtual memory without LV even knowing about
it.
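
To make the "hold it in RAM, write it out later" idea concrete, here is a
minimal C sketch under assumed values (buffer size, fake readings, and the
file name are arbitrary placeholders). In LabVIEW the equivalent is letting
the loop autoindex the readings into an array and wiring that array to a
file write function after the loop finishes:

    #include <stdio.h>

    #define N_SAMPLES 4096

    int main(void)
    {
        /* The 16-bit readings stay in RAM until the run is over. */
        static unsigned short buffer[N_SAMPLES];
        size_t count = 0;

        /* Stand-in for the acquisition loop: each pass stores one
           reading (fabricated here, since there is no real hardware). */
        while (count < N_SAMPLES) {
            buffer[count] = (unsigned short)(count & 0xFFFF);
            count++;
        }

        /* One disk write after all the data has been taken, when there
           is computer time to spare. */
        FILE *fp = fopen("ldv_run.dat", "wb");
        if (fp == NULL) {
            perror("fopen");
            return 1;
        }
        fwrite(buffer, sizeof buffer[0], count, fp);
        fclose(fp);
        return 0;
    }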
Greg McKaskle