LabVIEW


Bytes at port for Telnet sessions

Solved!

This is my first foray into communicating with a Telnet server.

 

I am upgrading my old code, which navigated the menu structure on our previous devices over a serial port, to communicate over Ethernet using Telnet.

 

Since VISA Read needs to know how many bytes to read, my older code depended heavily on the "Bytes at Port" VI.

 

Is there a similar way to do this for a Telnet connection?

 

Right now I have to count the bytes, put in a number, try it out, and play with the values until I find something that works.

 

The trouble is I do not know exactly how many bytes will be returned after entering a command, because the device will "inject" error and system messages directly onto the Telnet port.

 

 

========================
=== Engineer Ambiguously ===
========================
Message 1 of 5
I have faced the same issue. Depending on how quickly your sending device transmits data, you can write code that reads variable-length data without the data itself indicating how many bytes are in the message.

One way to achieve this is if your data is terminated by some specific character or characters. If you have termination characters such as a carriage return/line feed, you can read your data and then parse out individual messages based on those characters.

If your messages are not terminated, or are multiline, you can write a read VI that first reads a single byte from the connection. This read should use a longer timeout so you don't wait forever for data. Once a single byte arrives, read data quickly in larger chunks using a short timeout. When that short timeout occurs, you can assume there is no more data available and complete the read. This approach works fairly well for a command/response type of conversation (sketched below).

Yet another alternative is to spawn a read task that simply reads data from the connection and passes it, via a queue, to a separate task that parses and processes the messages. Obviously the parsing task must know how to interpret all of the messages you expect to receive. Once it identifies a complete message, it can either process it or pass it along to another task for processing. This approach works well for asynchronous data.
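Since LabVIEW block diagrams can't be pasted inline here, the following is a rough text-language sketch (Python, raw sockets) of the second approach: a long timeout for the first byte, then short-timeout chunked reads until the device goes quiet. The function name, timeout values, and chunk size are illustrative, not from any library.

import socket

def read_response(sock, first_timeout=5.0, burst_timeout=0.1, chunk=4096):
    # Wait up to first_timeout seconds for the first byte of the reply.
    sock.settimeout(first_timeout)
    data = sock.recv(1)  # raises socket.timeout if the device never answers
    # Then drain the rest in larger chunks with a short timeout; when that
    # timeout fires, assume the device has finished sending.
    sock.settimeout(burst_timeout)
    try:
        while True:
            more = sock.recv(chunk)
            if not more:          # peer closed the connection
                break
            data += more
    except socket.timeout:
        pass                      # quiet gap -> message complete
    return data

Typical use would be something like sock = socket.create_connection((host, 23)) followed by read_response(sock) after each command you send.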


Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 2 of 5

Another option that I have used in the past is based on the Simple TCP Messaging library. Basically, it uses a fixed message format where the first 4 bytes define the size of the message, the next 2 bytes are the message ID, and the remaining bytes are the message itself.

 

I have found this scheme very useful when the message size is unknown at the time you write the code, or when it changes during operation.
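For anyone who wants to see the framing in a text language, here is a minimal Python sketch of the layout described above (4-byte size, 2-byte ID, then payload). I'm assuming big-endian fields and that the size field counts the ID plus the payload; the actual Simple TCP Messaging library may define these details differently.

import struct

def pack_msg(msg_id, payload):
    # 4-byte size (here: ID + payload length), then 2-byte ID, then payload.
    return struct.pack('>IH', 2 + len(payload), msg_id) + payload

def recv_exact(sock, n):
    # TCP recv may return fewer bytes than requested; loop until we have n.
    buf = b''
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError('connection closed mid-message')
        buf += chunk
    return buf

def recv_msg(sock):
    size = struct.unpack('>I', recv_exact(sock, 4))[0]
    msg_id = struct.unpack('>H', recv_exact(sock, 2))[0]
    payload = recv_exact(sock, size - 2)
    return msg_id, payload

Because the size arrives first, the reader always knows exactly how many bytes to pull, which is what eliminates the "Bytes at Port" guessing game.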

Ryan Podsim, CLA
Message 3 of 5

rpodsim wrote:

Another option that I have used in the past is based on the Simple TCP Messaging library. Basically, it uses a fixed message format where the first 4 bytes define the size of the message, the next 2 bytes are the message ID, and the remaining bytes are the message itself.

 

I have found this scheme very useful when the message size is unknown at the time you write the code, or when it changes during operation.


If I were implementing the complete system, this is the approach I would use, or a more robust definition of the data if necessary. I got the impression from the original post that the OP does not have control over what the sending device transmits, hence my suggestions.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 4 of 5
Solution
Accepted by topic author RTSLVU
Telnet is mostly a line-oriented protocol. So basically it should be enough to enable the termination character mode in VISA and simply read with a byte count large enough to always get at least one full line. VISA will automatically terminate each read when it encounters the termination character in the stream.
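In LabVIEW you would enable the termination character via a VISA property node on the TCP/socket session. For comparison, here is the same idea expressed with PyVISA; the resource address and command string are placeholders for your device.

import pyvisa

rm = pyvisa.ResourceManager()
# Raw TCP socket resource; host and port here are placeholders.
inst = rm.open_resource('TCPIP0::192.168.1.50::23::SOCKET')
inst.read_termination = '\n'     # stop each read at the line terminator
inst.write_termination = '\r\n'
inst.timeout = 5000              # ms

inst.write('SOME_COMMAND')       # hypothetical device command
# Request implies "up to" a count; VISA returns as soon as it sees the
# termination character, so the exact byte count no longer matters.
line = inst.read()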
Rolf Kalbermatter  My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 5 of 5