Does a DAQ input or output buffer affect real-time performance?

I think there are a lot of people out there who have already used a DAQ to collect a signal from an external circuit, analyse it, and finally output the analysed data to another (or possibly the same) circuit. I am attempting to achieve this. However, I am concerned that the process of collecting the data, processing it, and finally converting the digitised result back to an analog output may not be fast enough.

I am using a PCI-MIO-16E-4 DAQ and sampling the analog input at a scan rate of 10,000 scans/s across 2 channels. I am aware that to achieve a reliable sampling rate at such frequencies I need to use hardware timing rather than software timing. However, the former requires an input buffer, and I wonder whether such a buffer would slow down the whole process, since the input signal has to pass through the buffer and "wait" to be processed before being fed back to the external circuit. I am also using a buffer at the output. The signal I am concerned with is around 3 kHz.
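
For reference, the structure I have in mind looks roughly like the sketch below. It uses NI's nidaqmx Python API purely to make the idea concrete; the device and channel names ("Dev1/ai0:1", "Dev1/ao0") are placeholders, the chunk size is an arbitrary assumption, and analyse() is just a stand-in for the processing step (my real program is in LabVIEW).

import nidaqmx
from nidaqmx.constants import AcquisitionType, RegenerationMode

SAMPLE_RATE = 10_000   # scans/s, matching the rate described above
CHUNK = 500            # samples handled per loop iteration (assumed)

def analyse(two_channel_data):
    # Placeholder analysis: pass channel 0 through unchanged.
    return two_channel_data[0]

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0:1")        # 2 input channels (placeholder names)
    ai.timing.cfg_samp_clk_timing(SAMPLE_RATE,
                                  sample_mode=AcquisitionType.CONTINUOUS)

    ao.ao_channels.add_ao_voltage_chan("Dev1/ao0")           # placeholder output channel
    ao.timing.cfg_samp_clk_timing(SAMPLE_RATE,
                                  sample_mode=AcquisitionType.CONTINUOUS)
    ao.out_stream.regen_mode = RegenerationMode.DONT_ALLOW_REGENERATION

    ai.start()
    while True:
        scans = ai.read(number_of_samples_per_channel=CHUNK)   # waits until CHUNK scans arrive
        result = analyse(scans)                                 # the processing step in question
        ao.write(result, auto_start=True)                       # queue results into the output buffer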

Can someone with such experience advise me whether such a feedback process is real-time enough to control the external circuit?

0 Kudos
Message 1 of 2
(2,522 Views)
I'm going to go with the odds and assume you're running LabVIEW under some flavor of Windows.
 
Unfortunately, you may not be able to accomplish what you need without a real-time OS (such as LabVIEW RT).  My statement is driven pretty much entirely by your need for updating outputs in support of a 3 kHz signal.  If only your inputs are 3 kHz and your needs for updating the output(s) are much slower (and absolutely regular timing isn't critical), there may be more hope.
 
Here's the problem: suppose you perform buffered outputs.  On the first loop iteration, you perform some calcs on the input data, then write some data to the output buffer.  Now comes the next loop iteration.  You perform your calcs again, and now you write some different data to the output buffer.  These new outputs won't be generated as physical real-world signals until after ALL the data from the first loop's buffer write has been generated.
 
So there's going to be some lag time between the calc & buffer-write and the actual signal generation.  If you loop around too fast, or write too much data at a time, the lag will keep growing until you produce a DAQ error from overwriting a portion of the buffer that hasn't yet been generated.
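 
To put rough numbers on that, here's a toy simulation (plain Python, every figure made up purely for illustration) of how the backlog grows when each loop iteration writes more samples to the buffer than the board generates between iterations:

# Toy model: how far the physical output falls behind the newest calculated data.
GEN_RATE = 10_000      # output sample clock, samples/s (assumed, matches the AI scan rate)
CHUNK = 1_000          # samples written to the output buffer per loop iteration (assumed)
LOOP_PERIOD = 0.08     # seconds per software loop iteration (assumed)

backlog = 0.0          # samples queued in the buffer but not yet generated
for i in range(10):
    backlog += CHUNK                      # this iteration's write lands behind everything queued
    backlog -= GEN_RATE * LOOP_PERIOD     # samples the board generated while we looped
    backlog = max(backlog, 0.0)
    lag_ms = 1000 * backlog / GEN_RATE    # time until the newest write reaches the connector
    print(f"iteration {i}: backlog {backlog:6.0f} samples, lag ~{lag_ms:5.1f} ms")

With these made-up numbers the backlog grows by 200 samples (~20 ms of extra lag) every iteration -- that's the runaway that eventually ends in the buffer-overwrite error.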
 
When you mention 3 kHz, it sounds like you might want that to be your loop rate, right?  So that about every 333 microseconds, you examine the most recent input data, perform calculations, then immediately update the output signal(s)?
 
The next question is, how critical is the 333-microsecond update interval?  Under Windows, you won't be able to count on meeting that kind of timing goal accurately or regularly.  It wouldn't be surprising if you hit it 95%+ of the time, but some intervals between output updates are liable to last for several, tens, or even hundreds of milliseconds.
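 
If you want to see that jitter for yourself before committing to a design, a quick test along these lines (plain Python, with a busy-wait standing in for your actual read/calc/write work) will show how often a 333-microsecond loop target gets blown on your machine:

import time

TARGET = 1 / 3000            # ~333 microseconds per loop
N = 30_000                   # roughly 10 seconds of iterations

worst = 0.0
late = 0
last = time.perf_counter()
for _ in range(N):
    # Busy-wait stands in for the work done in one loop iteration.
    while time.perf_counter() - last < TARGET:
        pass
    now = time.perf_counter()
    period = now - last
    last = now
    worst = max(worst, period)
    if period > 1.5 * TARGET:
        late += 1

print(f"worst interval: {worst * 1e6:.0f} us; "
      f"{late} of {N} iterations ran more than 50% late")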
 
Can you live with the irregular timing?  If so, there's likely a decent Windows-based solution.  If not, you should look into LabVIEW RT.
 
-Kevin P.
ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
0 Kudos
Message 2 of 2
(2,518 Views)