LabVIEW


Read bytes at serial port direct and in-direct

 

Hello,

 

I have questions regarding write-read operation from a serial port.

 

I have an instrument with message-based communication. I prepared a simple VI to send a command and read the response (I disable the default termination character and append an EOL to the command string). I prepared three versions:

 

In the first, I specify the expected number of bytes to read myself (in the attached image I read the temperature of the device).

Direct Write-Read.jpg

 

This works and I get an answer quickly, but I want the process to be more general so it will suit responses with different numbers of bytes, for example a serial number readout.

 

Byte at Port Write-Read.jpg
Write-Read Connection.jpg

Message 1 of 8

Search the forums for questions about serial and read up on the messages you find.

 

1.  Only disable the termination character if you don't expect to get back a termination character when you read a message.  Usually with text-based messages, you can expect a termination character.  With messages based on binary data, you don't get a termination character back, because any given byte could either signal the end of the message or be a perfectly valid byte in the middle of the message.  The enable/disable termination character input on the serial configure has absolutely nothing to do with the transmission of a termination character when you do a VISA Write.  There is a property node you can use to make a VISA Write send a termination character automatically, but generally there is no good reason to do that.  It is better programming practice to append the termination character yourself to the string you pass to the VISA Write function.
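The "append it yourself" advice can be sketched in text form. This is a minimal Python illustration (not LabVIEW; the helper name and the `\n` terminator are assumptions, check your instrument manual for the real terminator):

```python
# Hypothetical sketch: append the termination character to the command
# yourself, rather than relying on the "send end enable" VISA property.
TERM = "\n"  # assumed EOL terminator; your instrument may use "\r\n" etc.

def build_command(cmd, term=TERM):
    """Return the raw bytes to hand to a VISA Write, terminator included."""
    if not cmd.endswith(term):
        cmd += term
    return cmd.encode("ascii")
```

With this pattern the terminator is always part of the outgoing string, so the serial-configure termination setting only affects reads, as intended.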

 

2.  Reading up on the forums, you'll see this is just wrong.  You write and immediately check the bytes in the receive buffer.  Since you didn't really give the device time to get the message and turn around with a response, your bytes at port will very likely be zero, and almost certainly won't represent a complete response.
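The failure mode can be made concrete with a toy simulation (pure Python, all names hypothetical): the instrument needs time to process the command, so checking the receive buffer immediately after the write reports zero bytes.

```python
import time

class MockSerialDevice:
    """Toy stand-in for an instrument: the response only appears in the
    receive buffer after a processing delay."""
    def __init__(self, response, delay_s=0.05):
        self._response = response
        self._delay_s = delay_s
        self._sent_at = None

    def write(self, cmd):
        self._sent_at = time.monotonic()

    def bytes_at_port(self):
        if self._sent_at is None:
            return 0
        if time.monotonic() - self._sent_at < self._delay_s:
            return 0  # device still busy: nothing has arrived yet
        return len(self._response)

dev = MockSerialDevice(b"23.5C\n")
dev.write(b"TEMP?\n")
early = dev.bytes_at_port()  # checked immediately after the write: 0
time.sleep(0.1)
late = dev.bytes_at_port()   # after a delay: the full response is there
```

Reading `early` bytes would return nothing, which is exactly the bug described above.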

 

3.  This is completely wrong.  The number coming out of the VISA Write is the number of bytes written.  Why do you think the number of bytes coming back in the response would be exactly equal to the number of bytes you wrote?  It is usually different, and even if it is the same, it is usually just a coincidence.

 

The best way to figure out how to make your serial communications work is to start reading the instruction manual that comes with the device and pay attention to serial port settings (baud, parity, stop bits, data bits) and the structure of the commands that you send to it and the responses you should receive from it.  Only when you read and understand that will you figure out the appropriate way to program your LabVIEW application.

Message 2 of 8

In the bottom code, you are requesting to read exactly as many bytes as you write. Is that actually what you want to do? What is your question?

Message 3 of 8

Ravens pretty much covered it quite well.

 

If your instrument is sending the termination character, just use it.  It simplifies the communication so much.

 

If it doesn't, at the very least add a delay between sending the command and checking for data at the port.


Message 4 of 8

Thanks for the detailed answer.

 

I understand point (1) and will change it. In my program I append an EOL to the string as the termination character.

 

I also understand point (3) - this is truly incorrect.

 

I'm not sure I follow you on the main issue, point (2). As you can see from my attached images, I did set the baud rate, parity and the other settings. My first image, where I read expecting a constant byte count (22), is working! And as you can see I don't add any delay between the write and the read in this case, so it doesn't seem to be a timing issue.

 

This will work for one command that returns 22 bytes, and by changing the constant I can read another command that returns a different number of bytes.

 

But how do I set it to read all the bytes at the port? I don't want to use constants.

Message 5 of 8

mrish wrote:

This will work for one command that returns 22 bytes, and by changing the constant I can read another command that returns a different number of bytes.

 

But how do I set it to read all the bytes at the port? I don't want to use constants.


It works because VISA has a built-in timeout, so what's happening here is that you are waiting for either 22 bytes or for the timeout to occur, whichever happens first.

 

If your device does send a termination character, then using it is the easiest solution - the read ends at the termination character, regardless of how many bytes were received (if no termination character arrives, it still waits for the timeout). Or you can do it with the "bytes at port" like in your second image, but in that case you should insert a short wait between the write and checking for bytes at port, to allow time for the device to process the command and send the response.

Message 6 of 8

When you do a VISA Read, it will read the data until one of three things happens:

 

1.  You read the number of bytes you requested.  So if you request 22 bytes, the VISA Read will wait until the 22 bytes have arrived at the serial port.  It will also give you a warning on the error wire that there MAY still be bytes left in the receive buffer.  (My opinion is that this is a completely useless warning.  It doesn't directly help you as the programmer to figure out whether there are more bytes in the port or not.  And as a programmer, you should be smart enough to know that there could always be more bytes in the port to look for, in the event it matters to you.)

2.  You get the termination character if you have enabled the termination character.  So if you request 22 bytes, and the buffer gets 10 "normal" bytes and the 11th byte is the termination character, the VISA read will return the 11 bytes.

3.  The timeout value that you set with the serial configure has been hit.  So if you request 22 bytes and only 5 bytes come in (and none were the termination character, in the event you enabled it), then by the time the timeout value has been reached, you will get the 5 bytes and also a timeout error on the error wire.
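The three stop conditions above can be sketched as a pure function over the bytes that have arrived so far (a Python illustration with assumed names, not the actual VISA implementation; the timeout case shows up here as "fewer bytes than requested ever arrived"):

```python
def read_until(data, requested, term):
    """Sketch of VISA Read's stop logic applied to received data.

    data      -- bytes that arrived before the timeout expired
    requested -- byte count passed to VISA Read
    term      -- termination character (bytes) if enabled, else None
    """
    if term is not None:
        i = data.find(term)
        if i != -1 and i < requested:
            return data[:i + 1]      # condition 2: terminator seen first
    if len(data) >= requested:
        return data[:requested]      # condition 1: requested count reached
    return data                      # condition 3: timeout, partial data
```

For example, requesting 22 bytes when the 11th byte is the terminator returns just those 11 bytes, matching case 2 above.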

 

If your device uses the termination character to mark the end of messages, then enable it and use it.  All you then need to do is request a sufficiently large number of bytes to read, a number equal to or larger than the largest message you ever expect to receive.

 

If you want to read all the bytes at the port, then the proper thing to do depends on your situation.

1.  If you do not use a termination character, then you use a wait function to give your device a sufficient amount of time to return what you expect to be a full message.  Then do the bytes at port method and read that many bytes.

 

2.  If you do use a termination character, then read a sufficiently large number of bytes.  Any properly developed device should follow a single-command, single-response protocol, so when you get the response terminated by that character, you should have a complete message.  If for some reason there is a chance that a response could contain multiple instances of the termination character (I would consider that a poorly written device), then you might have to read continuously in a loop until you detect that the number of bytes at the port is zero.
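That "read in a loop until the port goes quiet" fallback might look like this in Python pseudocode (all names are hypothetical; the mock port only stands in for a real serial session):

```python
import time

def drain_port(port, settle_s=0.05):
    """Read repeatedly until the port reports zero bytes available.
    `port` is assumed to offer bytes_at_port() and read(n)."""
    data = bytearray()
    while True:
        n = port.bytes_at_port()
        if n == 0:
            break                    # port has gone quiet: message complete
        data += port.read(n)
        time.sleep(settle_s)         # give trailing bytes time to arrive

    return bytes(data)

class ChunkedPort:
    """Toy port that delivers a canned response in several chunks."""
    def __init__(self, chunks):
        self._chunks = list(chunks)
        self._buf = b""
    def bytes_at_port(self):
        if not self._buf and self._chunks:
            self._buf = self._chunks.pop(0)
        return len(self._buf)
    def read(self, n):
        out, self._buf = self._buf[:n], self._buf[n:]
        return out

port = ChunkedPort([b"LINE1\n", b"LINE2\n"])
reply = drain_port(port)  # collects both terminated chunks as one message
```

Note the settle delay: without it, a brief gap between chunks would be mistaken for the end of the message, which is exactly why this approach is fragile.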

Message 7 of 8

Bytes at Serial Port is in 99.9% of the cases the absolutely wrong function to use. It makes a serial port driver almost impossible to write in a way that will work across varying conditions like OS, line quality, device state, etc.

 

When possible, always use EOM signaling. If that is not an option because of a binary protocol, it is normally a fixed-size protocol, either with truly fixed-size data blocks or a fixed-size data header that specifies the size of the following variable-sized data block(s).
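The fixed-size-header pattern is easy to sketch. Here is a minimal Python illustration, assuming a hypothetical framing of a 2-byte big-endian length field followed by that many payload bytes (real protocols vary; check the instrument's manual for the actual frame layout):

```python
import io
import struct

def read_framed(read_exact):
    """Read one frame: a 2-byte big-endian length, then the payload.
    `read_exact(n)` is an assumed helper that returns exactly n bytes
    (on a real port you would loop on VISA Read until n bytes arrive)."""
    (length,) = struct.unpack(">H", read_exact(2))
    return read_exact(length)

# Usage with an in-memory stream standing in for the serial port:
stream = io.BytesIO(struct.pack(">H", 5) + b"HELLO" + b"next frame...")
payload = read_framed(stream.read)
```

Because the header states the payload size up front, the reader always knows exactly how many bytes to request, and Bytes at Serial Port is never needed.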

 

Using Bytes at Serial Port is a perfect way to descend into insanity if you want to create reliable communication. By reliable I mean more than just sending a few commands, reading the answer, and restarting when it gets stuck. That operation mode may be OK for some quick throwaway lab apps, but it will truly fail in any industrial-grade application without a lot of extra fail-and-retry logic and intermediate data buffering in your own code. And since not using Bytes at Serial Port costs nothing at all once you start to think in other terms about instrument communication, and rather makes your life a lot easier, there is no reason to still use it even in your throwaway lab app.

 

Rolf Kalbermatter
Message 8 of 8