09-14-2009 02:11 PM
As long as the sender and receiver know what the data looks like, it is fairly straightforward to keep them in sync with each other without using any additional protocol. If, on the other hand, your data is variable length, you will need some mechanism for delineating the messages.
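If it helps to see the idea outside of LabVIEW, here is a minimal sketch of length-prefix framing in Python (the function names and the 4-byte header are just assumptions for the example); the LabVIEW equivalent would be a TCP Write of the length followed by the data, and on the receiving end a TCP Read of the length followed by a TCP Read of that many bytes.

```python
import socket
import struct

def send_msg(conn: socket.socket, payload: bytes) -> None:
    # Prefix each message with its length as a 4-byte big-endian integer.
    conn.sendall(struct.pack(">I", len(payload)) + payload)

def recv_exact(conn: socket.socket, n: int) -> bytes:
    # Keep reading until exactly n bytes have arrived (TCP may split them).
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed mid-message")
        buf += chunk
    return buf

def recv_msg(conn: socket.socket) -> bytes:
    # Read the 4-byte length header, then read that many payload bytes.
    (length,) = struct.unpack(">I", recv_exact(conn, 4))
    return recv_exact(conn, length)
```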
Blog for (mostly LabVIEW) programmers: Tips And Tricks
09-14-2009 02:38 PM
In both loops I use a short timeout (10 ms) and ignore error 56.
I would recommend using ZERO timeout, using BUFFERED mode, and ignoring error 56.
In BUFFERED mode, the TCP READ returns either all the bytes you requested or nothing.
You don't need to spend that time waiting - the data will come in when it comes in.
You can service many more connections / UI events / whatever that way.
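For anyone more comfortable reading text code, here is a rough Python analogy of that polling pattern (the function name and details are made up for the sketch; in LabVIEW it is simply TCP Read in buffered mode with a 0 ms timeout and error 56 cleared). The point is that "nothing available yet" is treated as "poll again later," not as a failure.

```python
import socket
from typing import Optional

def try_read(conn: socket.socket, nbytes: int) -> Optional[bytes]:
    conn.setblocking(False)  # 0 ms timeout: never sit and wait for data
    try:
        # Peek first so we only consume the data once a full message is there,
        # mimicking buffered mode's "all requested bytes or nothing" behavior.
        pending = conn.recv(nbytes, socket.MSG_PEEK)
        if len(pending) < nbytes:
            return None      # partial data: leave it buffered, poll again later
        return conn.recv(nbytes)
    except BlockingIOError:
        return None          # nothing buffered yet; analogous to ignoring error 56
```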
Blog for (mostly LabVIEW) programmers: Tips And Tricks
09-14-2009 03:44 PM
Mark,
Thanks for the info. Do you have a reference document or a link that explains the minimum Ethernet frame size (and the rest of the info you shared)?
09-14-2009 04:06 PM
While I'm sure that's a fine book, I would offer the suggestion that, at your stage of the game (you mentioned being new to TCP/IP), you should NOT get bogged down in the fine print details of what's happening under the hood.
Whether the minimum packet size is 1 byte or 6 bytes or whatever is not germane to solving your problem; getting the thing working is.
TCP/IP is organized in layers, and this padding discussion is down at the link layer, whereas the only thing your LabVIEW program has access to is the application layer.
Again, feel free to read all you like, but don't think that you have to know those details to make things work.
Blog for (mostly LabVIEW) programmers: Tips And Tricks
09-14-2009 04:14 PM
I agree with what Steve said. Although I was the one who dove into the nitty-gritty details of TCP/IP packets, this information is not really relevant at the application layer. Just make sure that both your sender and receiver know what the data should look like and what to expect.
I apologize for the diversion. I spend so much time down in the protocol details that I sometimes get caught up in them.
11-04-2009 09:56 PM
Hello, does anyone know how to trap the timeout error (code 56) coming out of the TCP Read VI? I set TCP Read to buffered mode with a 0 ms timeout. When there is no data, it times out and returns 0 bytes, which is what I expected. However, I need a way to recognize this condition and substitute a constant for the 0 bytes for use downstream.
I tried using Find First Error.vi, but it is not what I expected; it always returns TRUE.
Is there another way to detect error code 56 and return a Boolean TRUE? Do I have to parse the error out cluster myself?
Thanks.
11-04-2009 11:52 PM - edited 11-04-2009 11:53 PM
Yes, just unbundle the error code from the error cluster and compare it to 56. If it matches, clear the error and put your constant on the output wire. If not, just wire the error and the data through unchanged.
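Spelled out in text form (the real thing is just an Unbundle By Name feeding a Case structure; the cluster fields and the placeholder value below are assumptions for the sketch), the logic amounts to:

```python
from typing import NamedTuple, Tuple

class ErrorCluster(NamedTuple):
    # Hypothetical stand-in for LabVIEW's status/code/source error cluster.
    status: bool
    code: int
    source: str

NO_ERROR = ErrorCluster(False, 0, "")
TIMEOUT_CODE = 56            # TCP Read timed out / no data available
DEFAULT_DATA = b"\x00"       # placeholder constant for downstream use

def handle_read(data: bytes, err: ErrorCluster) -> Tuple[bytes, ErrorCluster]:
    if err.code == TIMEOUT_CODE:
        # Expected timeout: clear the error and substitute the constant.
        return DEFAULT_DATA, NO_ERROR
    # Anything else (including no error) passes straight through.
    return data, err
```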

11-05-2009 10:17 AM