LabVIEW


error 10843 with new PC

Can anyone point me to information on how to prevent a buffer overrun when I run the function generator VI that was attached to my first post?  As I indicated in my last post, the error stopped when I started a separate VI running at the same time.  That separate VI is simply an empty while loop with no wait state; evidently it eats up enough CPU time to prevent the buffer overrun (Task Manager shows 50% CPU usage).  Is there a setting for the analog output that I can use to directly control the write speed?  The waveform is ~10x the size of the card's onboard memory, so I can't run it directly from the buffer.
Jim

LV 2020
Message 11 of 15
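Jim's question about directly controlling the write speed comes down to flow control between the host write loop and the device buffer. A minimal pure-Python sketch of that idea (all names invented, no DAQ hardware involved; a bounded, blocking `queue.Queue` stands in for the card's onboard FIFO):

```python
# Toy flow-control sketch: because put() blocks when the "FIFO" is full,
# the host write loop is throttled automatically -- it can never outrun
# the device draining the buffer, even for a waveform far larger than it.
import queue
import threading

FIFO_DEPTH = 8                                   # device buffer, in chunks
CHUNK = 64                                       # samples per host write
WAVEFORM = [i % 32 for i in range(64 * CHUNK)]   # much larger than the FIFO

fifo = queue.Queue(maxsize=FIFO_DEPTH)
played = []                                      # what the "DAC" output

def device_drain():
    """Stand-in for the DAC pulling chunks at the sample clock."""
    while True:
        chunk = fifo.get()
        if chunk is None:                        # sentinel: end of waveform
            return
        played.extend(chunk)

t = threading.Thread(target=device_drain)
t.start()

# Host write loop: write the big waveform chunk by chunk; put() blocking
# on a full queue is the handshake that prevents overrunning the buffer.
for start in range(0, len(WAVEFORM), CHUNK):
    fifo.put(WAVEFORM[start:start + CHUNK])
fifo.put(None)
t.join()

print(len(played), "samples delivered in order")
```

In DAQ terms, this is what a chunked, non-regenerating write loop gives you: the write call itself waits for buffer space, so the host's write speed is governed by the device rather than by a free-running loop.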

Hi Jim,

It could simply be that this version of LabVIEW and the DAQ driver were not written to run on a Core Duo computer. One way to test this is to open the Windows Task Manager, select the Processes tab, right-click LabVIEW, click Set Affinity, and remove one of the CPUs. You can then see whether the program runs without error.

If you still get errors, I might be able to try and recreate the issue.

Regards, Mallori M.

Mallori M
National Instruments
Sr Group Manager, Education Services

ni.com/training
Message 12 of 15
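The affinity test Mallori describes can also be done programmatically. A minimal sketch using only the Python standard library (assumption: `os.sched_setaffinity` is Linux-only; on Windows you would use the Task Manager dialog exactly as described above):

```python
# Pin the current process to a single CPU, mirroring Task Manager's
# "Set Affinity" dialog, then restore the original mask.
import os

original = os.sched_getaffinity(0)       # CPUs this process may run on
print("current affinity:", sorted(original))

if len(original) > 1:
    one_cpu = {min(original)}
    os.sched_setaffinity(0, one_cpu)     # pin the process to a single CPU
    assert os.sched_getaffinity(0) == one_cpu
    # ...run the program here and watch for the error...
    os.sched_setaffinity(0, original)    # restore the original mask
```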
Hi Mallori,
The whole point of getting the new PC was to increase performance.  With the old PC running 8 channels, I can only process the analog input data sampled at up to ~640 kS/s.  I need to get to at least 1.5 MS/s for the new components we are making, and preferably 2.5 MS/s.  Data collection and processing would probably use enough processing time to help, but data is not collected continuously, while the function generator runs continuously.  Data underflow is easy to understand because there is a finite amount of processing time and a fixed amount of data, but I just don't understand why the AO process doesn't have sufficient handshaking to prevent overflow.
Jim

LV 2020
Message 13 of 15
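The missing-handshake situation Jim describes can be illustrated with a toy model: a writer that pushes into a fixed ring buffer without ever checking the read pointer overruns whenever the reader is starved of CPU time. Sizes and names here are illustrative, not the driver's internals:

```python
# Toy model of an un-handshaked buffer write. run() returns how many
# times the writer lapped the reader (i.e., overwrote unread data).
BUF = 8                        # device buffer slots
samples = list(range(100))     # data to output

def run(reads_per_write):
    ring = [None] * BUF
    head = tail = overruns = 0
    for s in samples:
        ring[head % BUF] = s               # un-handshaked write
        head += 1
        if head - tail > BUF:              # writer lapped the reader
            overruns += 1
            tail = head - BUF              # oldest unread data was lost
        for _ in range(reads_per_write):   # reader (the device) consumes
            if tail < head:
                tail += 1
    return overruns

print(run(reads_per_write=2))   # reader keeps up: prints 0
print(run(reads_per_write=0))   # reader starved:  prints 92
```

This is consistent with Jim's observation that burning CPU in an empty loop changed the symptom: the error depends on how often the two sides get scheduled, not on the data itself.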

Hi lmtis,

Did you try Mallori's suggestion?  If so, how did it work?

It sounds like you are just running this program in the background while you acquire data with another application.  If that is correct, and you are not using the on-the-fly updating functionality of this VI, then you may want to consider creating another output application that is not system dependent.  Have a look at the example programs that are available on our website.

Chris_K

Message 14 of 15

Hi Chris,

As I indicated in my last post, I don't want to limit LabVIEW to running on one CPU because I need to increase data rates.  The 6110 card is capable of 5 MS/s, and with 8 channels of data I need all of the processor I can get.  Actually, I may have solved the problem.  I experimented with the Set DAQ Device Information VI that Vanessa suggested for using DMA.  I finally tried setting the "data transfer mode for analog output group 1" to "interrupts", and I have not had the error for almost 5 days.  I have no idea what the default configuration is or what this new setting does to prevent the overrun (I would have assumed that, with an 8k buffer and a 220k waveform, it had always been using interrupts).  Enlightenment on this would be welcome if you know the answer, but in its absence I will gladly settle for my problem being solved.  Thanks.

Jim

LV 2020
Message 15 of 15
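For readers on modern hardware: the "data transfer mode" setting Jim changed belongs to Traditional NI-DAQ's Set DAQ Device Information VI. In the present-day NI-DAQmx Python API, the rough equivalent is, to the best of my knowledge, the per-channel `ao_data_xfer_mech` property. This is a configuration sketch only; the device name `Dev1` is an assumption, and it needs real NI hardware and the DAQmx driver to run:

```python
# Hedged sketch: switch the analog-output data transfer mechanism from
# the default (typically DMA) to interrupts, as Jim did in Traditional
# NI-DAQ. Requires the nidaqmx package and installed DAQmx driver.
import nidaqmx
from nidaqmx.constants import DataTransferActiveTransferMode

with nidaqmx.Task() as task:
    chan = task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
    chan.ao_data_xfer_mech = DataTransferActiveTransferMode.INTERRUPT
```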