11-03-2011 11:00 AM
Hi,
I have a feeling that my mental model of the operation of the USB-6221 A/D converter is not entirely correct.
Any hints that could clear up my view are most welcome.
Here is what I think today.
I make the basic setup with DAQmx Create Channel.
Then I use DAQmx Timing to specify:
-Sample Rate (less than the 250 kS/s specified in the docs)
-Continuous Sampling (or ask for a specified number of samples)
-Samples per channel (to allocate room for, in this case)
After starting the task, I loop over DAQmx Read to get some values each time, having specified (see the sketch after this list):
-Samples per channel (to read on each pass through the loop)
-Timeout (after which the 6221 is considered faulty)
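In C-API terms (the DAQmx C calls map one-to-one onto the LabVIEW VIs above), here is a minimal sketch of what I mean. "Dev1/ai0", the rate, and the read size are placeholder values from my setup, and error checking is omitted:

#include <NIDAQmx.h>

#define SAMPLE_RATE    25000.0   /* S/s, well below the 6221's 250 kS/s */
#define SAMPS_PER_READ 25000     /* samples fetched per DAQmx Read call */

int main(void)
{
    TaskHandle     task = 0;
    static float64 data[SAMPS_PER_READ];
    int32          got = 0;

    /* DAQmx Create Channel */
    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);

    /* DAQmx Timing: sample rate, continuous sampling, and the
       "samples per channel" used to size the host buffer */
    DAQmxCfgSampClkTiming(task, "", SAMPLE_RATE, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 10 * SAMPS_PER_READ);

    DAQmxStartTask(task);

    /* DAQmx Read loop: samples per channel to read, plus a timeout */
    for (int i = 0; i < 100; i++) {
        DAQmxReadAnalogF64(task, SAMPS_PER_READ, 10.0 /* timeout, s */,
                           DAQmx_Val_GroupByChannel, data, SAMPS_PER_READ,
                           &got, NULL);
        /* ... average/plot the "got" samples in data[] here ... */
    }

    DAQmxStopTask(task);
    DAQmxClearTask(task);
    return 0;
}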
Now, the 6221 is said to have a 4 k-sample internal FIFO. I guess this number is not important for me in LabView. My understanding is that "some driver" makes sure this FIFO is never overrun, as long as the "Samples per channel" buffer I allocated with DAQmx Timing can accept new data. Is that correct?
I have been looking for a way to know in advance whether my buffer is slowly filling up (because the sample rate is too high, or the read-out rate is too slow), but have not found one (something like a %-full indicator). Is there a way somewhere, or do I not need it for some reason?
I am somewhat uncertain about how to dimension the sampling.
The data I read with DAQmx Read is averaged over the read-out interval that I specify, anywhere from milliseconds to tens of seconds, over a run of an hour or so.
If I then specify, say, 25 kS/s, I get an average over 25 samples every millisecond. Should I worry about my buffer overflowing because other tasks, like averaging and plotting the data, keep me from getting back for another Read in time? If so, what would be "best practice" guidelines for the minimum recommended read-out interval?
At the other end, using the same 25 kS/s, I get 250,000 samples per channel every 10 seconds. Would that represent a problem? (I am thinking about the buffer size, and perhaps the averaging.) Is there any "best practice" advice regarding the buffer size? The arithmetic I am working from is sketched below.
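For concreteness, this is the back-of-envelope sizing for the two extremes (my own numbers, assuming 8-byte DBL samples, not an NI recommendation):

#include <stdio.h>

int main(void)
{
    const double rate        = 25000.0;          /* S/s per channel      */
    const double intervals[] = { 0.001, 10.0 };  /* read-out interval, s */

    for (int i = 0; i < 2; i++) {
        double samps = rate * intervals[i];      /* samples per Read     */
        double bytes = samps * sizeof(double);   /* DBL samples held     */
        printf("%g s reads: %.0f samples/channel, %.4f MB/channel\n",
               intervals[i], samps, bytes / 1e6);
    }
    /* prints: 0.001 s reads: 25 samples/channel, 0.0002 MB/channel
               10 s reads: 250000 samples/channel, 2.0000 MB/channel    */
    return 0;
}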
I realize that I can monitor the time that DAQmx Read has to wait before the requested data is delivered; a sketch of what I mean follows. In this application I am free to vary the sample rate and the read-out rate within quite generous limits. So, if there are any rules of thumb that could narrow my search for the optimal sampling parameters for the task, I would welcome your ideas.
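What I have in mind for that monitoring is roughly this (a C sketch; in LabView I would wrap the Read in tick counts the same way):

#include <time.h>
#include <NIDAQmx.h>

/* C11 wall clock in seconds */
static double now_s(void)
{
    struct timespec ts;
    timespec_get(&ts, TIME_UTC);
    return (double)ts.tv_sec + ts.tv_nsec / 1e9;
}

/* Returns how long the Read blocked. A wait close to the nominal read
   interval means we are keeping up (the call spends its time waiting
   for data); a wait near zero means samples were already queued, i.e.
   the host buffer is filling faster than it is being drained. */
double timed_read(TaskHandle task, float64 *data, int32 n)
{
    int32  got = 0;
    double t0  = now_s();
    DAQmxReadAnalogF64(task, n, 10.0, DAQmx_Val_GroupByChannel,
                       data, (uInt32)n, &got, NULL);
    return now_s() - t0;
}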
Kindest regards / Ake
11-03-2011 11:29 AM
Do not worry about the onboard FIFO size. A buffer allocated on the PC side will be the main buffer. DAQmx sets the size of that buffer automatically, but you may also override it by setting the buffer size yourself. How often you do a read-out depends on many factors, such as what the software is meant to do. If it is only storing and showing data, one read-out every second may be adequate; if the end user wants more of a real-time feel to the updates, perhaps 10 times per second. Use the human eye, and how fast it is, as guidance: there is no point in doing display updates on a millisecond basis.
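In C-API terms the override looks roughly like this (a sketch; the same settings are reachable in LabView through DAQmx property nodes, and the driver also exposes the current backlog, which is effectively the %-full indicator asked about above):

#include <NIDAQmx.h>

void configure_and_watch(TaskHandle task)
{
    uInt32 backlog = 0;

    /* Before DAQmxStartTask: replace the automatically sized host
       buffer, here with room for 10 s of data at 25 kS/s per channel. */
    DAQmxCfgInputBuffer(task, 250000);

    /* Inside the read loop: samples acquired but not yet read. Divide
       by the buffer size for a "%-full" figure; steady growth means
       the reads are falling behind. */
    DAQmxGetReadAvailSampPerChan(task, &backlog);
}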
11-03-2011 11:59 AM
Thanks for your insight into the buffer.
However, I am reading an "infinite" sheet at "extreme" spatial resolution.
My total measurement time runs to hours. So, to get maximum resolution in the shortest possible total time, I want to run the sheet at "maximum speed", with the AI readings to match.
If I hit the limits of LabView (or of the USB-6221) I can of course run the sheet more slowly, but at the expense of an extended reading time.
(The data is later saved to disk for offline evaluation.)
This is the reason for wanting to push the limits down towards milliseconds.
Thanks again / Ake