07-12-2010 09:31 AM
Hi all,
I want to acquire data from an instrument as fast as possible. An external trigger is used for synchronization. To avoid the delays involved in starting and stopping the AI task at each iteration, I used a retriggerable finite-samples counter output as the sample clock for a continuous AI task (see http://decibel.ni.com/content/docs/DOC-6801).
Now I'm testing the VI on an older computer, and the data is generated faster than it is read. Even after separating reading from processing, I get synchronization errors (see the simplified VI attached).
How can I handle this issue?
If the "Overwrite mode" property of DAQmx Read is set "overwrite unread samples" I have no error but the read position is random.
thanks,
Bernard
08-05-2010 03:07 AM
Hi Bernard,
I just had a look at your VI and I have a few questions:
You say you want to acquire data as fast as possible, and that's why you use an external trigger. Okay. As far as I understand, you are configuring a counter output (clock) as your "external" trigger. Okay. But why are you configuring the Counter Output as "Finite Samples"? Why not "Continuous"?
Another thing I see, regarding your problem with losing data when processing is much slower than acquisition: your maximum queue size is set to "2". Why don't you set it to "-1" (unlimited)? Then you would not lose data.
One more piece of advice: try to avoid local variables when they are not necessary. The queue reference "queue out" can be wired directly.
Kind Regards,
Matteo
NI Germany
08-05-2010 03:32 PM
Hi Bernard,
I understand what you are trying to do: using the finite counter output as a sample clock allows retriggerable operations for subsystems that do not natively support retriggering. It is a very common workaround on older hardware (I actually wrote the example that you linked to). Out of curiosity, what hardware are you using? Our newer X Series boards support retriggerable tasks on all subsystems, so the counter workaround is not necessary on that hardware.
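For comparison, on an X Series device the AI task itself can simply be made retriggerable. Sketched in the text-based DAQmx API (nidaqmx for Python, with placeholder device names and example values), it would look roughly like this:

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    ai = nidaqmx.Task()
    ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai.timing.cfg_samp_clk_timing(18000.0,
                                  sample_mode=AcquisitionType.FINITE,
                                  samps_per_chan=333)
    ai.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/PFI0")
    # Supported directly on X Series AI; on older boards this is exactly
    # what the counter workaround described above emulates.
    ai.triggers.start_trigger.retriggerable = True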
A few questions that are relevant to troubleshooting your issue:
1. What hardware are you using?
2. What are the contents of fitting.vi?
3. What is your sample rate?
4. How many samples do you read per trigger?
5. How often do triggers occur?
6. What error message (if any) do you receive?
I notice you are using the "Lossy Enqueue Element" VI. If no space is available in the queue, this function removes an element from the front of the queue and discards the element to make space. Unlike the Enqueue Element function, this function does not wait for room in the queue to become available.
So, given that your queue size is set to 2 and you are using Lossy Enqueue Element, it is very possible that you are throwing away data whenever processing cannot keep up with the acquisition.
The best solution is going to depend on the answers to the above questions. For instance, you may need to read more than a single trigger's worth of data per loop iteration if the triggers are coming in too quickly. Increasing the size of the DAQmx buffer (currently Samples to Read + 10000 in your program) as well as the maximum size of your queue might also be desirable. If you want to guarantee that you keep all of your data, you should use Enqueue Element instead of Lossy Enqueue Element.
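To make the difference concrete, here is a rough text-language analogue of the two queue behaviours (illustrative only, not LabVIEW code):

    from collections import deque
    from queue import Queue

    # Lossy: a bounded deque discards the oldest element when full,
    # like Lossy Enqueue Element with a maximum queue size of 2.
    lossy = deque(maxlen=2)
    for block in range(5):
        lossy.append(block)      # blocks 0, 1, 2 are silently dropped
    print(list(lossy))           # -> [3, 4]

    # Lossless: put() blocks until the consumer makes room, so the
    # producer loop is throttled instead of data being thrown away.
    lossless = Queue(maxsize=2)
    lossless.put(0)
    lossless.put(1)
    # lossless.put(2) would now wait until a consumer calls get()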
Best Regards,
John
08-31-2010 08:36 AM
Dear John,
Thanks a lot for your answer. I've been away for some time, but now I'm facing the problem again.
Here are some answers to your questions:
1) I'm using a USB-6251 board
2) The fitting VI performs a nonlinear Levenberg-Marquardt (LM) curve fit on 8 channels with a selectable function (Gaussian or Lorentzian), called through a static VI reference. It also includes a guess VI that estimates the starting parameters for the fit.
3) Sample rate: 18 kHz
4) Samples per trigger: 333
5) Trigger frequency: 53.6 Hz
6) Error message
In my main VI, I accumulate as many peak positions (obtained by fitting the data that is read) as possible and then, at a given time interval (usually 1 s), compute an average. The example VI posted above includes only one channel; in reality there are 8, which are all fitted, and the results are stored in a table that is averaged each second. Depending on the computer and the accuracy of the guess parameters, between 3 and 14 acquisitions are averaged. In this context, I found it more important to be able to do the averaging at a fixed time than to keep all the data, and the Lossy Enqueue Element VI seemed appropriate for that. Do you think this should be changed?
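As a rough text-language sketch of what the fitting and averaging do (this is only an illustration with synthetic data; gaussian and fit_peak are hypothetical stand-ins for fitting.vi and the guess VI, not their actual contents):

    import numpy as np
    from scipy.optimize import curve_fit   # LM is the default for unbounded fits

    def gaussian(x, amp, center, width, offset):
        return amp * np.exp(-((x - center) / width) ** 2) + offset

    def fit_peak(x, y):
        # crude starting values, in the spirit of the guess VI
        p0 = [y.max() - y.min(), x[np.argmax(y)], (x[-1] - x[0]) / 10.0, y.min()]
        popt, _ = curve_fit(gaussian, x, y, p0=p0)
        return popt[1]                      # fitted peak position

    # One second's worth of records (3 to 14 fits, per the text), averaged:
    x = np.arange(333, dtype=float)         # one trigger's worth of samples
    records = [gaussian(x, 1.0, 166.0 + np.random.randn(), 20.0, 0.1)
               for _ in range(8)]           # synthetic stand-in data
    positions = [fit_peak(x, y) for y in records]
    print(np.mean(positions))               # the 1 s averaged peak position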
Kind regards,
Bernard
08-31-2010 12:09 PM
Hi Bernard,
Sorry, I can't quite tell what you mean from your answer to number 6. Do you receive an error message? If so, what is the number associated with it?
From your first post I'm guessing that you are running into error -200279: Attempted to Read Samples that are No Longer Available. If this is the issue, you'll probably need to increase the number of samples to read so that your loop does not have to iterate so quickly. To keep the data aligned with the trigger, I would read Samples Per Channel x N samples at a time, where N is an integer.
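Continuing the earlier sketch (same placeholder ai task and constants), the idea would look roughly like this:

    SAMPLES_PER_TRIGGER = 333
    N = 4                                  # triggers' worth of data per read

    # One DAQmx Read per N triggers keeps the loop rate N times lower;
    # the block is then split back into per-trigger records for fitting.
    data = ai.read(number_of_samples_per_channel=SAMPLES_PER_TRIGGER * N)
    records = [data[i * SAMPLES_PER_TRIGGER:(i + 1) * SAMPLES_PER_TRIGGER]
               for i in range(N)]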
Best Regards,
John
09-01-2010 01:56 AM
Hi John,
Sorry for the mistake. It was indeed error -200279.
If I understand it correctly, you suggest that I should read the data not after each trigger but after every N triggers, and then split the data up for the fitting? What if there is a small change in the trigger frequency?
Thanks for everything,
Bernard
09-01-2010 12:48 PM
Hi Bernard,
If you're getting error -200279, it means that the software loop is not keeping up with the data (i.e. the DAQmx input buffer is overflowing). The solution is to keep the loop rate at a manageable level by reading multiple triggers' worth of data per loop iteration. You could also increase the buffer size, but this only gives you a larger cushion; you will still get the error if the loop can't keep up with the data in the long term.
A small change in the trigger frequency shouldn't make a difference to the data, although a slower trigger frequency would slow down your loop rate. You can adjust the number of samples to read on the fly, so if the trigger rate starts to slow down, you can decrease the number of samples per read.
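Continuing the same placeholder sketch, the two adjustments would look roughly like this:

    # A larger DAQmx input buffer gives more cushion (not a long-term fix):
    ai.in_stream.input_buf_size = SAMPLES_PER_TRIGGER * 100

    n = 4                                  # triggers' worth of data per read
    data = ai.read(number_of_samples_per_channel=SAMPLES_PER_TRIGGER * n)
    # If the trigger rate slows, n can simply be reduced before the next
    # read; the read size does not have to stay constant.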
Best Regards,
John