LabVIEW


UDP receive buffer default size

Solved!

Hi,

 

I have an issue regarding receiving data over UDP:

 

Description of the problem:

A 3rd-party application retrieves measurement data from an MGC+ at 2400 Hz.

All samples are then streamed to a UDP port locally.

 

I then use a LabVIEW application to read the data and do some processing.

The issue is that at 2400 Hz I lose a lot of packets due to UDP receive buffer overflow, i.e. new data arrives before all of the old data has been read.

This is because the data comes in uneven bursts, 10-20 times each second.

 

I have tried increasing the UDP receive buffer size as described here:

http://digital.ni.com/public.nsf/allkb/D5AC7E8AE545322D8625730100604F2D

And it seems to solve the problem.
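For reference, that per-connection tweak corresponds to enlarging the socket's OS-level receive buffer (SO_RCVBUF). A minimal sketch in plain Python sockets rather than LabVIEW, assuming the linked KB article does the equivalent on the LabVIEW UDP connection:

```python
import socket

# Sketch: enlarge the OS receive buffer for a single UDP socket before any
# data arrives. 32768 is the size discussed in this thread.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 32768)

# Read the value back: the OS may round it up (Linux reports double the request).
actual = sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF)
sock.close()
```

Note that this only affects the one socket that calls it, which is exactly why it does not help when another application opens the port first.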

But here arises another issue:

If I modify the UDP buffer size while the 3rd-party application is running, the 3rd-party application crashes.

 

Therefore, my question is:

Is there any way to modify the default UDP receive buffer size in Windows?

So that when the UDP connection is opened, it will have a buffer size of, for example, 32768 bytes, regardless of which application accesses the UDP connection first?

 

 

 

Message 1 of 6

Are you reading and processing the data in the same section of code? If you are, I would suggest splitting your processing task from the receiving task. The receive task would do nothing more than read from the UDP connection and post that data to a queue; this should be able to keep up with the buffer. Your processing task runs in parallel, and the queue lets you buffer the data while you are processing without affecting the UDP buffer.
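The split described above can be sketched in ordinary Python for illustration (the original design is a LabVIEW VI using queue functions; function names and sizes here are illustrative):

```python
import queue
import socket
import threading

def receiver(sock, q, stop):
    """Producer: read datagrams and enqueue them, nothing else."""
    sock.settimeout(0.1)  # short timeout so the loop can notice the stop flag
    while not stop.is_set():
        try:
            data, _addr = sock.recvfrom(65535)
        except socket.timeout:
            continue
        q.put(data)

def processor(q, stop, results):
    """Consumer: drain the queue and do the (potentially slow) processing."""
    while not stop.is_set() or not q.empty():
        try:
            data = q.get(timeout=0.1)
        except queue.Empty:
            continue
        results.append(len(data))  # stand-in for the real processing
```

The point of the design is that the producer loop contains no processing at all, so the OS receive buffer is emptied as fast as datagrams arrive; the unbounded in-process queue absorbs the bursts instead.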



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 2 of 6

My code is already split up like that:

One task does nothing but receive data and send it to a queue.

Another task is doing all the processing.

 

When I increase the buffer size, the problem disappears and no packets are lost.

The only problem is that I don't know how to override the default buffer size in Windows when the 3rd-party application is the first one to start...

Message 3 of 6
Solution
Accepted by topic author fredlaks

Is there any other code inside your read task such as a Wait ms? Is the read task as lean and mean as possible? When do you increase the buffer size?

 

I found this registry key; it might be worth a try. The post I found it in was old, so it may not apply to Windows 7.

 

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Afd\Parameters\DefaultReceiveWindow (REG_DWORD) = 16384 (decimal)
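Assuming your Windows version still honors this Afd parameter (as noted above, it may not apply to newer releases), the value can be imported as a .reg file; 0x4000 hex is 16384 decimal:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Afd\Parameters]
"DefaultReceiveWindow"=dword:00004000
```

A reboot is typically required before changes to Afd parameters take effect.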



Mark Yedinak
Message 4 of 6

 

The only thing the read task does is put the data into a queue and clear any possible error status. No Wait functions.

 

I increased the buffer size just after opening the connection, in the initialization of the program. Additionally, the connection might get closed and reopened if no data is coming in, or if the data stops coming.

 

I tried adding the registry key you gave me (on Windows XP) and it works. The buffer now has the size I specified (32768 decimal) each time the connection is opened.

This will most likely solve my problem, so I no longer need to increase the buffer programmatically.

 

Thanks a lot for your help, Mark Yedinak! 🙂

Message 5 of 6

You're welcome. I'm glad I could help.



Mark Yedinak
Message 6 of 6