Hi,
I am using a DAQCard 6036E (and in the past a DAQCard 6062E) to do simultaneous AI data acquisition and buffered period measurement.
The buffered period measurement fails at very low pulse frequencies (~68 Hz on a 700 MHz PIII) when the AI acquisition is also running. However, when I change the interrupt generation from ND_AUTO (which, at the AI sample rate in use, defaults to ND_INTERRUPT_HALF_FIFO) to ND_INTERRUPT_EVERY_SAMPLE, the buffered period measurement copes with much higher frequencies (at least ~2-3 kHz).
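For reference, this is roughly how I make that change in Traditional NI-DAQ (a minimal sketch rather than my exact code, assuming ND_DATA_XFER_MODE_AI is the infoType that controls this; the device number and error handling are just placeholders):

```c
#include <stdio.h>
#include "nidaq.h"
#include "nidaqcns.h"

int main(void)
{
    i16 deviceNumber = 1;   /* example device number (assumption) */
    i16 status;

    /* Ask NI-DAQ to generate an AI interrupt for every sample instead of
       the default per-half-FIFO interrupt used on PCMCIA E-series cards. */
    status = Set_DAQ_Device_Info(deviceNumber,
                                 ND_DATA_XFER_MODE_AI,
                                 ND_INTERRUPT_EVERY_SAMPLE);
    if (status != 0)
        printf("Set_DAQ_Device_Info failed: %d\n", status);

    /* ...then configure and start the AI scan and the counter as usual... */
    return 0;
}
```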
I understand that PCMCIA devices use interrupts rather than DMA, and also that the DAQ-STC has only one hardware save register per counter, which I assume must be read by NI-DAQ before the next gate signal causes the counter value to be written into the save register.
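For completeness, the counter side is configured along these lines (again a sketch, not my exact code; the counter number, timebase, gate PFI line and buffer size are illustrative). Each gate edge latches the count into the DAQ-STC save register, which NI-DAQ's interrupt routine then has to copy into the buffer before the next edge overwrites it:

```c
#include "nidaq.h"
#include "nidaqcns.h"

#define BUF_SIZE 1000

/* Buffer that NI-DAQ fills with one count per gate edge. */
static u32 periodBuffer[BUF_SIZE];

int main(void)
{
    i16 dev = 1;                 /* example device number (assumption) */
    u32 ctr = ND_COUNTER_0;      /* illustrative counter choice */

    GPCTR_Control(dev, ctr, ND_RESET);
    GPCTR_Set_Application(dev, ctr, ND_BUFFERED_PERIOD_MSR);
    GPCTR_Change_Parameter(dev, ctr, ND_SOURCE, ND_INTERNAL_20_MHZ);
    GPCTR_Change_Parameter(dev, ctr, ND_GATE, ND_PFI_9);
    GPCTR_Config_Buffer(dev, ctr, 0, BUF_SIZE, periodBuffer);
    GPCTR_Control(dev, ctr, ND_PROGRAM);   /* arm and start the measurement */

    /* ...periods are then read back from the buffer... */
    return 0;
}
```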
My question, though, is why changing the AI interrupt generation method as described above should help, as it seems counter-intuitive. I would expect better counter performance when fewer AI DAQ interrupts are being generated (i.e. only one interrupt per half FIFO), not when more are being generated.
Any enlightenment would be appreciated.
Jamie Fraser