Measurement Studio for .NET Languages

Still getting 200361 buffer overflow error - HELP!

We talked about this in another thread but I still haven't resolved the problem.  I have NI working on it but I want to run it by you guys one more time to see if anyone has any ideas.

 

I'm talking via USB to a 9162 carrier holding a 9239 DAQ module.  I'm performing a finite acquisition, once a second, of 3 channels at 5000 samples per channel, with a sampling rate of 50 kHz.  On most computers here it runs fine, but on some of our laptops I get the dreaded 200361 buffer overflow error after it runs for somewhere between 5 and 60 seconds.  I can definitely affect how long it takes for the error to occur by connecting other USB devices or running other apps at the same time.  In other words, the less the processor/bus has to do, the longer my program will run.  This, however, is hard to understand: with a 1-second cycle, I'm taking data for 100 ms (5000 samples at 50 kHz) and the computer then has 900 ms to transfer the data before the next set is taken.  Even assuming 4 bytes per transferred sample, that's only about 60 KB per cycle across the 3 channels, so with USB 2.0 this should be plenty of time.

 

For testing purposes, I stripped my program down to the bare essentials and the overflow still occurs.  The following code runs in the form's Load event to initialize the DAQ:

 

'Create a new DAQ task
myTask = New Task()

'Create DAQ channels
myTask.AIChannels.CreateVoltageChannel(Device + "/ai0", "Voltage", AITerminalConfiguration.Differential, -10, 10, AIVoltageUnits.Volts)
myTask.AIChannels.CreateCurrentChannel(Device + "/ai1", "Current", AITerminalConfiguration.Differential, -0.01, 0.01, 0.1, AICurrentUnits.Amps)
myTask.AIChannels.CreateCurrentChannel(Device + "/ai2", "Current Probe", AITerminalConfiguration.Differential, -0.01, 0.01, 0.001, AICurrentUnits.Amps)

'Configure DAQ timing: 50 kHz sample clock, finite acquisition of 5000 samples
myTask.Timing.ConfigureSampleClock("", 50000, SampleClockActiveEdge.Rising, SampleQuantityMode.FiniteSamples, 5000)

'Verify the task and create the reader
myTask.Control(TaskAction.Verify)
reader = New AnalogMultiChannelReader(myTask.Stream)

 

This is the timer routine, which runs once every second and simply reads the DAQ and displays the data on some graphs and meters:

 

'Get 100 ms of data
Try
    'Get data from NI-9239
    AcquiredData = reader.ReadWaveform(5000)

    'Parse acquired data into voltage and current arrays
    VoltageArray = AcquiredData(0).GetScaledData()
    CurrentArray = AcquiredData(1).GetScaledData()
    CurrentProbeArray = AcquiredData(2).GetScaledData()

    'Graph voltage and current
    VoltageGraph.PlotY(VoltageArray)
    CurrentGraph.PlotY(CurrentArray)

    'Calculate RMS values
    ...

    'Update voltage and current meters
    VoltageMeter.Value = VoltageRMS
    CurrentMeter.Value = CurrentRMS

Catch ex As NationalInstruments.DAQmx.DaqException

    MessageBox.Show(ex.Message, "DAQ Error", MessageBoxButtons.OK, MessageBoxIcon.Warning)

End Try

 

So, why is this code causing overflow errors?  Is there a buffer that needs to be cleared after each read operation?  Is the acquisition running continuously even though I'm doing a finite read?  I wouldn't mind pausing the timer if I knew of a way to check whether the current read is complete before continuing on to the next one.  Any light that anyone can shed on this would be greatly appreciated.
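For what it's worth, here's a minimal sketch of the kind of check I have in mind, assuming the DAQmx .NET Task class's IsDone property behaves the way I'd expect (the handler name AcquireTimer_Tick is just a placeholder, not from my real code):

    'Sketch only: skip a timer tick while the previous finite acquisition
    'is still in flight, instead of piling a new read on top of it.
    Private Sub AcquireTimer_Tick(ByVal sender As Object, ByVal e As EventArgs) _
            Handles AcquireTimer.Tick
        If Not myTask.IsDone Then
            Return 'previous acquisition hasn't finished; wait for the next tick
        End If
        AcquiredData = reader.ReadWaveform(5000)
    End Sub

I don't know if that's the right property to poll, but something along these lines is what I'm after.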

 

Thanks,

Randy

 

Message 1 of 2

I have a very similar problem, but on the other hand my situation is completely different.  We acquire continuously at 12.5 kHz from a PCI-6259 using our own C# software.  We read from the device using a fixed buffer size, and we use the BeginMemoryOptimizedReadWaveform/EndMemoryOptimizedReadWaveform methods with asynchronous callbacks.

 

But here's where the similarities start: sometimes we get buffer overflow errors!  In particular, we're able to fix these sporadic overflow errors by opening a couple of additional programs (perhaps increasing the latency with which Windows handles the callbacks) or by adding a Thread.Sleep() call before re-registering the callback (100 milliseconds seems to be a magic number for now).
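In VB syntax (our actual code is C#, and names like OnReadReady and bufferSize are stand-ins), the pattern looks roughly like this:

    'Sketch of the async read loop described above. The Thread.Sleep(100)
    'before re-arming the read is the "magic number" delay that makes the
    'sporadic overflow errors go away for us.
    Private Sub OnReadReady(ByVal ar As IAsyncResult)
        Dim data As AnalogWaveform(Of Double)() = _
            reader.EndMemoryOptimizedReadWaveform(ar)
        '... process data ...
        Threading.Thread.Sleep(100)
        reader.BeginMemoryOptimizedReadWaveform(bufferSize, _
            AddressOf OnReadReady, Nothing, data)
    End Sub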

But this is what's strange: the "overflow" error seems to behave like an "under-run" error, given that it can be avoided by adding a sleep command or doing other things that keep the system busy until more samples are ready.

 

It would be really helpful if someone from NI could confirm whether the "overflow" error can also be raised in the case of an empty buffer, as that's what all the diagnostics we can run point to.  Obviously I found this post while searching for overflow errors, and I'm struck by the fact that in both cases, slowing things down makes the supposed "overflow" go away.

Message 2 of 2