Can I do a UDP peek of the receive buffer in LabVIEW

I want to get the size of the data in the receive buffer before I read it with the UDP Read function.  The size can vary and can be quite large, so I want to set the buffer size programmatically instead of using a fixed length.  I see that the equivalent function in LabWindows lets you peek at the buffer.  Can I do something like that in LabVIEW, or can I call the LabWindows function from within LabVIEW?

 

From the UDPRead function in LabWindows:

 

If you pass 0 for inputBuffer or inputBufferSize, the function peeks at the port and returns the size of the first available datagram.

 

 

 

Message 1 of 6

Hi RocketGirl,

 

The UDP Read functions in LabVIEW and LabWindows come from different libraries, which is why they don't have the same features.  In LabVIEW the read buffer is handled automatically when you use UDP.

 

To get your application running, it would help to know a little more about what you're trying to do.  What are you trying to read in?  Does the default max size input not work for your application?  Also, which version of LabVIEW are you using?

 

 

Regards,

Message 2 of 6

I am using LabVIEW 2009 SP1.  I need to receive UDP packets whose size varies depending on the test being run.  I tried setting the size as large as I thought the biggest packet would be, which was 22448 bytes, but when I run the application I get an error saying I don't have enough LabVIEW memory.  When I set it to 11448 instead, I don't get the memory error.  I would like the read to be large enough to return the whole packet, but I won't always know the packet size in advance, which is why I wanted to "peek" into the buffer first.  The help says the default size on Windows is 548, but I can't believe my system has a memory problem trying to read 22 kbytes!

 

 

Message 3 of 6

Did you read the help? "The default is 548. (Windows) If you wire a value other than 548 to this input, Windows might return an error because the function cannot read fewer bytes than are in a packet." If you need to read more data than that, do several reads in a row until you get the entire buffer, and concatenate the results. If you really want to do a UDP peek, there is probably a way to get the underlying socket from the connection refnum (NI supplied a VI that does this for TCP sockets in an example that disables the Nagle algorithm on Windows; the same technique may work for UDP), which you would then pass to a Windows function via a DLL call, but that's complicated. Why are you concerned about having a preallocated buffer?
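
If you do go the DLL route, the Windows call itself is simple. Here is a minimal C sketch of the kind of function you could compile into a DLL and call through a Call Library Function Node; it assumes you have already extracted the raw Winsock SOCKET behind the UDP connection refnum (that extraction step is not shown, and the function name is my own). It uses recvfrom with MSG_PEEK, which reports the size of the first pending datagram without removing it from the receive queue.

/* Sketch only: peek at the size of the first pending UDP datagram.
   Build into a DLL and call from LabVIEW via a Call Library Function Node.
   Link with ws2_32.lib. */
#include <winsock2.h>

__declspec(dllexport) int PeekUdpDatagramSize(SOCKET s)
{
    /* Buffer larger than the largest possible UDP payload (65507 bytes),
       so the return value of recvfrom is the full datagram size. */
    static char peekBuf[65536];

    /* MSG_PEEK copies the datagram out but leaves it queued, so a later
       UDP Read still returns the same data. */
    int n = recvfrom(s, peekBuf, (int)sizeof peekBuf, MSG_PEEK, NULL, NULL);
    if (n >= 0)
        return n;                  /* size of the first datagram */
    if (WSAGetLastError() == WSAEWOULDBLOCK)
        return 0;                  /* non-blocking socket, nothing queued yet */
    return -1;                     /* some other Winsock error */
}

In the Call Library Function Node you would presumably pass the socket as a pointer-sized integer and read back a signed 32-bit return value; getting the socket out of the refnum in the first place is the awkward part.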

Message 4 of 6

Yes, I did read the help, and the way I interpreted it was that I would receive an error if I did not read all of the bytes in the packet, i.e. if I had not set a buffer size large enough to read the entire packet.  My concern is not with preallocating a buffer; my concern is with setting it large enough.  I set it as large as I thought I would need, and then I got a memory error.  Could that memory error be because the buffer size wasn't large enough?  I had assumed I set it too large and that was what caused the memory error.  If I could "peek" at the packet length first, I could be sure of not creating an error.

 

The same help also says the UDP Read function reads the entire datagram.  Can I really read just the first four bytes of the datagram, which in my case hold the length of the buffer, and then read the rest of the datagram?  I didn't think I could do multiple reads of the same datagram.  When you refer to doing several reads of the buffer, do you mean several reads of a single packet, or reads of multiple packets?  Maybe my terminology is incorrect.

Message 5 of 6

The technique you describe, where you send the size first followed by the actual data, works well for TCP but not for UDP. With UDP a packet can get dropped and you won't know it, so you should not use UDP to send data that is larger than one packet. Say you read the size of the incoming message and then try to read that many bytes: there is no guarantee that you actually received them all, because a packet somewhere in the middle could have been lost. The correct solution here is either to switch to TCP, which is more reliable but slightly slower, or to rework your data format so that no message spans multiple packets.
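
For what it's worth, the length-prefix scheme you were describing works cleanly over TCP precisely because TCP is a byte stream: read the 4-byte length, then loop until you have received exactly that many bytes. Below is a minimal C sketch of the read side, under the assumption that the sender writes a 4-byte big-endian length followed by the payload (the function names are my own, not from any NI library); the same logic maps onto two LabVIEW TCP Reads, one of 4 bytes that you type cast to a U32 and one of that many bytes.

/* Sketch only: read one length-prefixed message from a connected TCP socket.
   Assumes the sender writes a 4-byte big-endian length, then the payload,
   and that messages are well under 2 GB.  Link with ws2_32.lib. */
#include <winsock2.h>
#include <stdlib.h>

/* Read exactly len bytes; loop because a single recv on a TCP stream
   may return fewer bytes than requested.  Returns 0 on success. */
static int RecvAll(SOCKET s, char *buf, int len)
{
    while (len > 0) {
        int n = recv(s, buf, len, 0);
        if (n <= 0)
            return -1;             /* connection closed or error */
        buf += n;
        len -= n;
    }
    return 0;
}

/* Read one message; on success the caller frees *msg. */
int ReadMessage(SOCKET s, char **msg, unsigned long *msgLen)
{
    unsigned long netLen;
    if (RecvAll(s, (char *)&netLen, 4) != 0)
        return -1;
    *msgLen = ntohl(netLen);       /* convert from network byte order */
    *msg = (char *)malloc(*msgLen);
    if (*msg == NULL && *msgLen > 0)
        return -1;
    if (RecvAll(s, *msg, (int)*msgLen) != 0) {
        free(*msg);
        return -1;
    }
    return 0;
}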

Message 6 of 6