09-08-2005 09:05 PM
Hi Steve,
Sorry if this sounds simplistic, but:
When the LabVIEW application starts up, why not just call "UDP Read" (with a 0 ms timeout) until it times out, and ignore the returned data?
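In plain-sockets terms, a minimal Python sketch of that idea (the port number is arbitrary, and the drain loop stands in for repeated "UDP Read" calls with a 0 ms timeout):

```python
import socket

def flush_udp(sock: socket.socket) -> int:
    """Read and discard any datagrams already queued on the socket."""
    sock.setblocking(False)       # the "0 ms timeout": return immediately
    discarded = 0
    try:
        while True:
            sock.recvfrom(65535)  # pull one queued datagram, ignore payload
            discarded += 1
    except BlockingIOError:
        pass                      # queue is empty: nothing left to flush
    finally:
        sock.setblocking(True)    # restore blocking reads for normal use
    return discarded

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 61557))            # hypothetical local port
print(f"discarded {flush_udp(sock)} stale datagram(s)")
```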
09-13-2005 09:17 PM
Hi xseadog,
Steve's request sounded reasonable, but I'm not that familiar with the details of UDP. Why (again) would you not treat a UDP input buffer like a serial (COM port) input buffer, where one might reasonably worry about trash characters accumulating? Re: TCP, same question. Another ("remote") LabVIEW application could be sending unsolicited or incomplete messages to me periodically, and the packet-level detail is transparent to my application. From my point of view, bytes are accumulating in an input buffer, and I might want to flush that buffer before explicitly requesting data from the device (that's the habit in the sketch below). So I don't understand your point about packet-level header sizes!?
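Just to be concrete about the habit I mean, in Python-with-pyserial terms (the port name and the "MEAS?" command are made up for illustration):

```python
import serial  # third-party pyserial package (pip install pyserial)

# Hypothetical device on a hypothetical port ("COM3" on Windows, say).
port = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1)

port.reset_input_buffer()  # discard any unsolicited bytes already queued
port.write(b"MEAS?\n")     # explicitly request data from the device...
reply = port.readline()    # ...and read only the response to that request
```

That flush-then-request pattern is what I'd instinctively want to repeat with UDP.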
Clue me in, man!
D.
09-14-2005 10:42 PM
Good evening,
I think the issue here is that UDP is a connectionless datagram protocol, designed for fast, low-overhead exchange between a server and a client. If the receiver does not have a socket open on the port, an error is generated and the datagram is dropped; nothing is buffered. Therefore there should be no “junk” packets of information sitting in a port buffer to clear when you first open it. In addition, I believe a single UDP Read can never return more than one datagram's worth of data, so you always receive whole messages rather than an accumulating byte stream that would need flushing.
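If it helps to see this outside LabVIEW, here is a plain-sockets sketch in Python (LabVIEW's UDP primitives sit on the same OS socket layer; the payloads here are invented). Each read returns exactly one datagram, regardless of the buffer size you pass:

```python
import socket

# A local "device" sender and a reader, both on loopback.
reader = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
reader.bind(("127.0.0.1", 0))      # let the OS pick a free port
addr = reader.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"first datagram", addr)   # anything sent before reader.bind()
sender.sendto(b"second datagram", addr)  # ...would simply have been dropped

# One recvfrom per datagram, even with a 64 KB buffer:
print(reader.recvfrom(65535))  # (b'first datagram', <sender address>)
print(reader.recvfrom(65535))  # (b'second datagram', <sender address>)
```

So there is no byte stream to trim: you either receive a whole datagram or nothing.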
Thanks for posting -- hopefully this helps. Please let us know if anyone has any other questions!
09-15-2005 04:36 AM
Thanks Travis,
I get the point (I think): there should be no junk when the [UDP] port is first opened, and that seems to be the specific concern of the original post. For that matter, I've never found junk in a serial port when it's first opened, either!
Thanks.