LabVIEW


Problems with buffered reads, buffers

I have a few questions and issues that I have not been able to resolve with forum searches; I will try to keep them concise here.  #1 is the most important one to me.

 

LabVIEW 7.1, Windows XP.

 

Buffered Reads / Lost Data

 

When error 10846 (AI Buffered Read, app unable to retrieve data from background acquisition buffer fast enough) occurs, I receive no data from the AI Buffered Read function for a period of time.  The error states data may be lost but it seems like I get NO data.  See the attached VI and JPG for examples.  The bottom graph shows a normal read while the top graph shows one with this problem.  I realize the AI Read error cluster is not handled in the acquisition loop.  From watching the backlog while the program runs, it seems the “dead spots” occur after the backlog reaches the buffer size.  When stopping acquisition, sometimes the error handler outside the loop reports 10846, sometimes it doesn’t (when the dead spots are occurring).  I can exaggerate dead spots with setups as follows:

 

  Buffer size 5, Scan rate 1000, # Scans to read 100

  Buffer size 10000, Scan rate 5500, # Scans to read 100

 

The size of the dead spots and continuous data are proportional to the buffer size.
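To illustrate why the dead spots would scale with the buffer size, here is a toy Python model of a circular acquisition buffer. This is purely illustrative; the function name and the simplified "cap the backlog and count the rest as lost" behavior are my assumptions, not how the NI driver actually works internally.

```python
# Toy model of a circular acquisition buffer (illustrative only --
# NOT the actual NI-DAQ driver behavior).  The hardware keeps
# producing samples each tick; once the backlog reaches the buffer
# size, the oldest samples are overwritten and counted as lost,
# which is what shows up as a "dead spot" in the chart.

def simulate(buffer_size, scan_rate, scans_to_read, read_period, ticks):
    backlog = 0
    lost = 0
    reads = []                                    # samples returned per read
    for _ in range(ticks):
        backlog += int(scan_rate * read_period)   # acquisition never stops
        if backlog > buffer_size:                 # circular buffer wrapped:
            lost += backlog - buffer_size         # oldest data overwritten
            backlog = buffer_size
        got = min(scans_to_read, backlog)         # what a read can return
        backlog -= got
        reads.append(got)
    return reads, lost

# The first exaggerated setup from above: buffer 5, scan rate 1000,
# 100 scans to read, assuming one read every 0.1 s.
reads, lost = simulate(buffer_size=5, scan_rate=1000,
                       scans_to_read=100, read_period=0.1, ticks=10)
```

With a 5-sample buffer, each read can return at most 5 samples and nearly everything else is overwritten, so almost all of the acquired data is lost, which matches the exaggerated dead spots.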

 

Questions....

 

1. Shouldn’t the AI Read function wait until it has # scans to read of valid data before returning?  For instance, scan rate of 100 and # scans to read of 1000 slows loop iteration speed to 0.1 Hz.  It seems like it is returning nothing for a period of time after the backlog reaches the buffer size.

 

2. Error 10846 refers to the “data acquisition buffer.”  Is this the software buffer?  If so, what would LV refer to the hardware buffer as?

 

3. I know the PCI-MIO-16E-1 has a hardware buffer; does this mean I can accurately acquire data over a period of time at a specified frequency (within the limitations of the card, of course) without having to worry about Windows bogging down due to delayed writes and such?

 

4. Will hardware acquisition buffers generate an error if the data in them is lost/overwritten?

 

5. I tried loading the acquisition VI (attached) onto a LV computer that has only a new USB-6008 connected, but the AI read/config/clear functions were not available on that system.  Does this device not have hardware buffers?  (Either I’m looking in the wrong places to find the answer to this question and it uses different VIs, or it doesn’t have them.)

 

6. The buffer/scan configuration listed above was used to exaggerate error 10846 and the accompanying dead spots, but this problem has been intermittently plaguing a system I have been trying to understand and fix.  I have logged the backlog value as the program runs and see that it increases during certain processor-heavy state machine states, and I am hoping upgrading from 256MB to 2GB RAM will improve this.  Is there any reason I should not look at this as a solution?  (I am also planning on adding an error handler for AI Read into each acquisition loop so the problem cannot continue undetected!)

 

 

7. Shouldn’t the AI Clear function report an error from the AI Read error cluster, even if it happened several reads earlier?  It seems like it does not.  If I put the error handler in the acquisition loop I receive an error as soon as the backlog reaches the buffer size, EVERY time.  I thought I had figured this out and changed the acquisition loop tunnels to shift registers; the error is reported each time this way.  If AI Read will not acquire data when an error is fed into it in the first place, then I guess I understand this one, so I’m moving it to the bottom of the list.
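For what it’s worth, the tunnel-vs-shift-register difference I am describing can be sketched in Python (the `ai_read` function here is only a stand-in for the real AI Read, under the assumption that it simply passes through any error wired into its error-in cluster and does no work):

```python
# Sketch of why the loop wiring matters (Python stand-in for the
# LabVIEW error cluster; names and behavior are illustrative
# assumptions).  With a plain tunnel, each iteration starts with a
# "fresh" empty error input; with a shift register, the error from
# one read is fed into the next read.

def ai_read(error_in, overflow):
    """Stand-in for AI Read: passes errors through, reports 10846 on overflow."""
    if error_in:
        return error_in            # refuses to run once an error is wired in
    return 10846 if overflow else 0

def run_loop(use_shift_register, overflow_at=2, iterations=5):
    error = 0
    errors_seen = []
    for i in range(iterations):
        error_in = error if use_shift_register else 0   # tunnel "forgets"
        error = ai_read(error_in, overflow=(i == overflow_at))
        errors_seen.append(error)
    return errors_seen

run_loop(False)  # error appears once at the overflow, then is forgotten
run_loop(True)   # error latches and propagates to every later read
```

With the shift register, the one overflow error poisons every subsequent read, which would explain seeing the error reported on every iteration.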

 

Thank you all for your time; I have been struggling to get up to speed with DAQ and LV and appreciate all the support.  I look forward to giving back to NI’s community in the future.

 

Regards,

 

David.

Message 1 of 6
 

Hello David,

Before we start discussing your questions I wanted to mention that you might consider using DAQmx instead of Traditional DAQ, because it sounds like you are just starting to use LabVIEW and DAQ.  DAQmx is our newest driver and it was designed to improve on the Traditional DAQ driver.  Not only is DAQmx much easier to use, but it is also faster and more powerful.  The DAQmx driver is also smarter and can handle a lot of buffer issues for you.  Traditional DAQ is still supported by LabVIEW and many of our devices, but any new devices that come out will use DAQmx.

Since you are using a PCI-MIO-16E-1 you have the option to use either DAQmx or Traditional DAQ.  Unless you have a reason not to, you might want to use DAQmx, especially if you are just starting to program and are new to the driver.  You can download the newest version of DAQmx (version 8.3) here: https://www.ni.com/en/support/downloads/drivers/download.ni-daq-mx.html

If migrating to DAQmx is not an option then please let me know and I would be happy to help you with your questions.  I was going to try and see what your code looked like but it was not attached like you had mentioned in your post.  So if migrating to DAQmx is not an option then you might want to include it on your next post.  Or if you were modifying an existing example could you let me know what example you were using?

Have a great day,

Brian P.
Applications Engineer

Message 2 of 6
After all that typing I forgot the attachments; they should be on this post.
 
I take it the AI.llb functions are "Traditional DAQ."  I've started to experiment with the Express VIs with the newer hardware I have been working with (USB-6008, USB-6211).  I will have to look at the DAQmx functions; it seems the Express VIs are frowned upon by most.
 
I have been working on modifying an older program that uses the Traditional DAQ functions.  I will have to get more comfortable with DAQmx before attempting to change it over.  The attached example was written as I tried to understand what was going on in the program I was modifying.
 
I believe the missing error handler on the AI Read function was causing unexpected (i.e. nothing returned) results the next time AI Read was called when the backlog exceeded the buffer value.  Subsequent calls to AI Read worked as expected unless the buffer overflowed again.
 
Assuming this is correct the hardware buffer is the only thing I would like to understand better.  If it overflows what error (if any) will be generated in LV? 
 
The other questions can be chalked up to the way traditional DAQ functions work and I think I've chased my tail enough to figure that out.
 
Thanks,
 
David.
 
Message 3 of 6
 
David,
 
LabVIEW should generate an error if data is overwritten before it is read.  I generated this error on my PC and the error that I received is Error -10846.  I attached a screenshot of this error below.  If you still want to use Traditional DAQ then you might want to also look at the example programs that ship with LabVIEW.  Example programs are a great resource to see how to use the Traditional DAQ VIs to create a working VI.
 
I wanted to point out a few documents that can help you better understand the differences between Traditional DAQ and DAQmx.  Within these documents are even more great links that you might want to look at as well.  These links go in depth, explain the reasons to make the switch, and help you understand how to implement the transition.
 
Answers to Frequently Asked Questions about NI-DAQmx and Traditional NI-DAQ (Legacy)
(pay attention to the Upgrading from Traditional NI-DAQ (Legacy) section)
 
Transition from Traditional NI-DAQ to NI-DAQmx
 
I hope this answered the question that you have.  If you have any other questions that I can assist you with please let me know.
 
Have a great weekend,
 
Brian P.
Applications Engineer
Message 4 of 6
That is the error I have been seeing (10846).  I've assumed this is the software buffer, though, as I can change the software buffer size and the error occurs at different times.  Perhaps the "background acquisition buffer" refers to both the hardware AND software buffers?


@wildcat_600 wrote:
 
LabVIEW should generate an error if data is overwritten before it is read.  I generated this error on my PC and the error that I received is Error -10846.  I attached a screenshot of this error below.
Message 5 of 6
There are a lot of good documents on the www.ni.com website that talk more about error 10846. I pasted an excerpt from one of the links below that describes a little more about why the error is being generated because of the software buffer. You can find more documents on the 10846 error by just searching the website for the error number. If you have any other questions please let me know.


How Can I Avoid Error -10846 (Overwrite Error) in LabVIEW?
http://digital.ni.com/public.nsf/3efedde4322fef19862567740067f3cc/75bf35eb4afb834f862566210065fa61?OpenDocument

"LabVIEW uses an internal acquisition buffer to store data as it is acquired. If you acquire more data than will fit in the buffer, then the buffer acts as a circular buffer, and is filled more than once. You will get an overwrite error (-10846) if you do not read data out of the buffer before it is overwritten with new data. There are several factors that affect LabVIEW's ability to keep up with the acquisition: the scan rate, the size of the data buffer, and the number of scans to read at a time. The "Cont Acq&Chart [buffered]" example lets you set all of these parameters on the front panel. If your buffer is too small, or you are not reading data out of it fast enough, then you will get the overwrite error. If your buffer is too big, you may get an out of memory error.

Experiment with different values for scan rate, buffer size and number of scans to read at a time. The best combination will result in little or no scan backlog. A good rule of thumb to start with is: make your buffer size 2 - 4 times as large as the number of scans to read."
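The sizing guidance quoted above can be sanity-checked with a small helper. This is an illustrative Python sketch, not an NI API; the function name `check_config` and its thresholds are my own encoding of the quoted rule of thumb.

```python
# Quick check of the quoted rule of thumb (illustrative helper, not
# an NI function).  The buffer should be 2-4x the scans read per
# call, and the loop must consume samples at least as fast as the
# board produces them, otherwise the backlog grows until overwrite.

def check_config(scan_rate, scans_to_read, buffer_size, loop_period_s):
    consumed_per_s = scans_to_read / loop_period_s
    rate_ok = consumed_per_s >= scan_rate
    buffer_ok = 2 * scans_to_read <= buffer_size <= 4 * scans_to_read
    return rate_ok, buffer_ok

# The exaggerated failure case from the first post: buffer 5,
# scan rate 1000, 100 scans per read (assuming ~0.1 s per loop).
check_config(1000, 100, 5, 0.1)     # buffer far too small
# A configuration that follows the 2-4x rule:
check_config(1000, 100, 300, 0.1)   # both checks pass
```

In the failure case the loop keeps up on average, but the 5-sample buffer is far below the recommended 200-400 samples, so any jitter in the read timing immediately causes an overwrite.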

Brian P.
Applications Engineer
Message 6 of 6