(Note: I've posted this to the general LabVIEW forum as well)
Hello,
I have been struggling with this problem for quite some time now, and haven't been able to work it out:
I am using the LV TCP functions to communicate between a PC and a Real-Time system (PXI-8186). I am trying to transmit ~4 kB from the RT system and ~200 bytes from the PC every 100 ms (i.e. only ~40 kB/s). Most iterations work fine; however, sometimes there is a delay of approximately 3 seconds (always between about 2800 and 3000 ms). The delay appears to occur while the RT system is writing and the PC is waiting for data to read. I have tried splitting the data up into various sizes, with various wait times between each write, which does seem to improve things, but I still haven't been able to eliminate these delays entirely.
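Since I can't easily paste the VIs here, this is roughly the loop in plain-socket terms (Python, as an illustration only; the sizes match my setup but the function names are mine):

```python
# Sketch of one 100 ms exchange cycle: RT side writes ~4 kB,
# PC side reads it and replies with ~200 bytes. Illustrative only.
import socket
import time

RT_CHUNK = 4096   # ~4 kB from the RT system per iteration
PC_CHUNK = 200    # ~200 bytes back from the PC
PERIOD_S = 0.100  # 100 ms loop period

def recv_exact(sock, n):
    """Read exactly n bytes, or raise if the peer closes early."""
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed connection")
        buf.extend(chunk)
    return bytes(buf)

def one_iteration(rt_sock, pc_sock):
    """One cycle of the exchange; returns elapsed time and both payloads.

    Timing each iteration like this is how I see the stalls: most
    cycles finish quickly, but occasionally one takes ~3 seconds.
    """
    t0 = time.monotonic()
    rt_sock.sendall(b"\x00" * RT_CHUNK)       # RT -> PC data
    payload = recv_exact(pc_sock, RT_CHUNK)   # PC reads the 4 kB
    pc_sock.sendall(b"\x01" * PC_CHUNK)       # PC -> RT reply
    reply = recv_exact(rt_sock, PC_CHUNK)     # RT reads the 200 B
    return time.monotonic() - t0, payload, reply
```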
I have experimented with disabling the Nagle algorithm, and also with appending the size value to the data string so the PC does only one write. Either of these lets me communicate properly (without one of them, writes only went out every 600 ms), but neither changes the delay problem.
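For clarity, these are the two workarounds in plain-socket terms (Python as an illustration; the helper names are mine, not LabVIEW's):

```python
# Sketch of the two workarounds mentioned above: (1) prepend the size
# to the data and write header+payload in a single send, and
# (2) disable the Nagle algorithm. Illustrative only.
import socket
import struct

def _recv_exact(sock, n):
    """Read exactly n bytes, or raise if the peer closes early."""
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed connection")
        buf.extend(chunk)
    return bytes(buf)

def send_framed(sock, data):
    """Append a 4-byte big-endian length prefix and do ONE write,
    instead of a separate write for the size and the payload."""
    sock.sendall(struct.pack(">I", len(data)) + data)

def recv_framed(sock):
    """Read the 4-byte length header, then exactly that many bytes."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

def disable_nagle(sock):
    """TCP_NODELAY makes small writes go out immediately rather than
    being coalesced while waiting for an ACK."""
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
```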
I've also tried the "netbench"/"Maximum TCP Transfer" tool to map the transfer speeds for different packet sizes and wait times, but couldn't find anything conclusive. The delay only happens about once a minute or less often, so it doesn't show up using these VIs.
I've seen this problem on more than one PC and more than one Real-Time system.
Any help would be greatly appreciated.
Jaegen