I have a VI written in LV 6.02 that connects to a socket on a remote server. Once the connection is established, the host streams data until the connection is broken. The server is one or more processors on a Quad PowerPC VMEbus board, connected via Ethernet.
The client is running on a 2GHz Pentium with 512MB of RAM.
Each new connection spawns an instance of a template VI that displays the streamed data on a waveform graph.
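For reference, here is roughly what each client instance does once its connection is up, sketched as plain Winsock C rather than LabVIEW (the 192.168.1.10 address and port 5000 are placeholders, not our actual values):

/* Sketch only: connect once, then read until the server drops the link. */
#include <winsock2.h>
#include <string.h>
#pragma comment(lib, "ws2_32.lib")

int main(void)
{
    WSADATA wsa;
    struct sockaddr_in addr;
    char buf[1024];
    int n;
    SOCKET s;

    WSAStartup(MAKEWORD(2, 2), &wsa);
    s = socket(AF_INET, SOCK_STREAM, 0);

    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_port = htons(5000);                        /* placeholder port */
    addr.sin_addr.s_addr = inet_addr("192.168.1.10");   /* placeholder host */
    connect(s, (struct sockaddr *)&addr, sizeof addr);

    /* The host streams until it breaks the connection; recv() then returns 0. */
    while ((n = recv(s, buf, sizeof buf, 0)) > 0) {
        /* ...hand the bytes to the display (waveform graph in the real VI)... */
    }

    closesocket(s);
    WSACleanup();
    return 0;
}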
If each instance of the client VI connects to a different remote processor, I can successfully stream data from each remote connection.
If, however, multiple sockets are opened on a single host CPU, the second client instance times out on its first read, after which the original instance also times out.
We've made multiple connections to sockets on a single processor of the remote board before, using normal Winsock calls from VC++ or Cygwin, and we have a ton of experience with the OS on the other end of the connection (VxWorks).
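Here is a rough sketch of that working non-LabVIEW case: two independent Winsock connections to the same remote processor, both returning data. Again, the host address and port are placeholders and error handling is trimmed:

/* Sketch only: two sockets to the same host/port, each read succeeds. */
#include <winsock2.h>
#include <string.h>
#include <stdio.h>
#pragma comment(lib, "ws2_32.lib")

static SOCKET open_stream(const char *ip, unsigned short port)
{
    struct sockaddr_in addr;
    SOCKET s = socket(AF_INET, SOCK_STREAM, 0);
    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    addr.sin_addr.s_addr = inet_addr(ip);
    if (connect(s, (struct sockaddr *)&addr, sizeof addr) == SOCKET_ERROR) {
        closesocket(s);
        return INVALID_SOCKET;
    }
    return s;
}

int main(void)
{
    WSADATA wsa;
    char buf[1024];
    SOCKET a, b;

    WSAStartup(MAKEWORD(2, 2), &wsa);

    a = open_stream("192.168.1.10", 5000);   /* first connection */
    b = open_stream("192.168.1.10", 5000);   /* second connection, same host */

    /* Outside LabVIEW, both sockets deliver data; neither read times out. */
    printf("first socket:  %d bytes\n", recv(a, buf, sizeof buf, 0));
    printf("second socket: %d bytes\n", recv(b, buf, sizeof buf, 0));

    closesocket(a);
    closesocket(b);
    WSACleanup();
    return 0;
}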
I'm using the LV 6.02 Professional Development System. I get the same result whether I run two instances of my template VI or two distinct client VIs, and running the built executable as opposed to running under the development environment doesn't seem to make any difference.
Any thoughts?
Charles Krug