Instrument Control (GPIB, Serial, VISA, IVI)


Serial bus: Bytes at Port returns 0 with termination character disabled


I am new to LabVIEW...

 

I am trying to communicate from a PC to a custom hardware device with a UART (9600 baud, 8 data bits, no parity, 1 stop bit, HW flow control). The device operates as follows: when a packet of data (<256 bytes) is sent to it via the serial port, each byte is echoed by the UART. After the packet is completely received, the device verifies that the packet was correctly received and, after a delay (~200 ms or more), sends 2 bytes to the host indicating whether the packet was OK or not. This works perfectly when I use HyperTerminal to communicate with the device.
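For anyone following along without the VI: the port setup described above corresponds roughly to this PyVISA sketch (a minimal, assumption-laden text equivalent of the LabVIEW configuration, not the VI itself; "ASRL1::INSTR" is a placeholder resource name for the actual COM port):

import pyvisa
from pyvisa import constants

rm = pyvisa.ResourceManager()
uart = rm.open_resource('ASRL1::INSTR')             # placeholder COM port
uart.baud_rate = 9600
uart.data_bits = 8
uart.parity = constants.Parity.none
uart.stop_bits = constants.StopBits.one
uart.flow_control = constants.ControlFlow.rts_cts   # HW flow control
uart.read_termination = None                        # termination character disabled
uart.timeout = 1000                                 # ms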

I have designed a simple VI to handle this. As each byte is sent, I use the "Bytes at Port" property and read the echo byte. The problem is that after all the bytes are sent, I get 0 instead of 2 for the 2 bytes I am supposed to get. I tried to do this with Write, Read property, and Read VIs in sequence, but that did not work. So, as you can see in the attached VI, I have separated this into 2 cases:

the True case, during the time the packet is being sent, and the False case, where I simply try to read the 2 bytes. I get 0 Bytes at Port, and the VISA Read VI gives me a timeout error! Increasing the timeout to 1000 ms has not helped either.
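In text form, the logic I am attempting looks roughly like this (a PyVISA-style sketch of my VI, continuing the port setup above; the payload is illustrative):

packet = b'EXAMPLE'                    # illustrative packet, < 256 bytes

# True case: send each byte and read back whatever is at the port
for b in packet:
    uart.write_raw(bytes([b]))
    n = uart.bytes_in_buffer           # the "Bytes at Port" property
    echo = uart.read_bytes(n)          # n can already be 0 here

# False case: after the whole packet, try to read the 2-byte status
print(uart.bytes_in_buffer)            # shows 0
status = uart.read_bytes(2)            # times out instead of returning 2 bytes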

 

Can anyone take a look at the VI and suggest what is wrong and what can be done?

It will be much appreciated.

 

Bhal Tulpule

Message 1 of 23

I would start with connecting the error out of the Read to the error out indicator, and at the same time unbundle the error cluster from the Read and stop the while loop in case of an error.

Then look at the error, or ask again.
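In PyVISA terms (a loose text analogy, since the LabVIEW version is an error cluster and an unbundle node), the idea is:

from pyvisa import errors

for b in packet:                       # uart/packet as in the first post's sketch
    uart.write_raw(bytes([b]))
    try:
        echo = uart.read_bytes(1)
    except errors.VisaIOError as e:    # the "error out" of the Read
        print('VISA error:', e)        # look at the error...
        break                          # ...and stop the while loop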

Greetings from the Netherlands
Message 2 of 23

Hi Albert,

I am new to LV, so I did not fully understand your "unbundle the error cluster from the Read and stop the while loop in case of an error. Then look at the error, or ask again" comment.

In any case, I am getting the timeout error, as seen in the attached screenshot.

In any case, connecting the error cluster to something does not help me understand why I get 0 Bytes at Port after the sent bytes are echoed (and this works perfectly with HyperTerminal).

I have checked, and I see that the termination character is disabled.

So what is the reason, and more importantly, how do I fix it?

Any suggestions out there?

 

Thanks.

Message 3 of 23

The easiest thing is to simply wire the error out indicator terminal to the error out of the VISA Read.

 

You do not need the property node in the False case. It does not do anything.

 

In the True case you may be reading before the byte is echoed. Recall that it takes about 1 ms for the byte to be sent by the remote device. If the Bytes at Port property node executes in less than 1 ms (and it probably only takes a few microseconds), then it likely returns zero, so your VISA Read does what it is told and reads zero bytes. Since you are expecting an echo of one byte, remove the Bytes at Port and wire a constant 1 to the VISA Read byte count input. The Read will wait until it receives one byte or until the timeout.
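As a text sketch (PyVISA, reusing the illustrative names from earlier in the thread), the fixed-count read looks like this:

# Read exactly one echoed byte per byte sent. The read blocks until the
# byte arrives or the VISA timeout expires; no Bytes at Port needed.
for b in packet:
    uart.write_raw(bytes([b]))
    echo = uart.read_bytes(1)          # constant 1 wired to the byte count
    if echo != bytes([b]):
        print('echo mismatch for', bytes([b]))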

 

Lynn

Message 4 of 23

Hi Lynn,

I tried to run your suggested VI and see the same problem. I then slightly modified the False case logic so that when the last byte is sent, the read expects 3 bytes (e Y ?) rather than just the echo (see the Pkt to be sent indicator and the logic outside the False case). No change. After sending the last byte (e = 0x65), the Byte Sent indicator goes blank and the program is stuck until the VISA Read times out. I increased the Wait Time to 10000 Sec. and that did not matter. I still get the timeout error (see attached).

I don't think this is being caused by insufficient delay, because when I run this in "Highlight Execution" mode, the echo character is immediately observed.

 

So what else is causing the remaining 2 bytes to not show up? Does this still have something to do with the termination character?

 

I would appreciate it if you can help.

 

Thanks.

Bhal Tulpule

Message 5 of 23

Bhal Tulpule,

 

I have looked back over this thread and have several questions and comments.

 

1. In your first post you mention HW flow control, but your code does not do any flow control. At least the default value on the VI you posted has it set to None. If the external device is trying to pause communications via the flow control and your program is not paying any attention, this could be part of the problem.

2. You describe the communications protocol as echoing back each character sent, then delaying about 200 ms, followed by an indication of valid reception or not.  You do not seem to be taking advantage of this information.  Also, what characters are returned for OK and Not OK? Is the delay of 200 ms always going to occur or can the response be immediate?

3. Your code tests the return string for equality with "eY.?" while your latest post refers to "eY?". One has 4 bytes and the other 3. Which is correct?

4. Do you have any idea how many bytes are received in the True case on the last iteration before switching to the False case? Things happen fast enough that the Bytes Read Back indicator will only show what happens on the last iteration. What if the "e" arrives in the True case and you only get the "Y?" in the last iteration? Then it will time out waiting for the third byte, which was actually transmitted (and received) earlier.

 

I have some ideas but before I present them, I want to see if I have understood what is going on.

 

I added two arrays to the VI which will capture the bytes read and the number of bytes requested for each read. Please run this and let me know what you get.
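In text form, that diagnostic amounts to something like this (a hypothetical PyVISA sketch, reusing the names from earlier in the thread):

bytes_requested = []                   # how many bytes each read asked for
bytes_read = []                        # what each iteration actually read

for b in packet:
    uart.write_raw(bytes([b]))
    n = uart.bytes_in_buffer
    bytes_requested.append(n)
    bytes_read.append(uart.read_bytes(n))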

 

Lynn

Message 6 of 23

Hi Lynn,

Thanks for your prompt reply. I will look at the test case you have sent and also try to carefully answer your questions soon.

 

Bhal

Message 7 of 23

Hi Lynn,

Looks like I figured out what the problem was. Basically, when the packet was correctly received, the external device was taking much longer than 200 ms to respond. I was able to verify this with a scope and a free software tool called PortMon, which just monitors serial port activity. So without a timing loop, the VISA Read function was timing out.

I am now creating a time delay loop that periodically checks whether the # of Bytes at Port is as expected and only then reads them. I am reasonably confident that it will work.

I think the loop is necessary since the time delay is variable and quite long.
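In outline, the delay loop I have in mind looks like this (a hypothetical sketch; the names are illustrative):

import time

def wait_for_bytes(uart, expected, poll_s=0.05):
    # Periodically check Bytes at Port and read only once the full
    # response has arrived. Note: no upper bound on the wait yet.
    while uart.bytes_in_buffer < expected:
        time.sleep(poll_s)
    return uart.read_bytes(expected)

status = wait_for_bytes(uart, 3)       # the device always replies with 3 bytes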

 

In any case, here are some answers (look for >>) to your questions, just for the record.

1. In your first post you mention HW flow control, but your code does not do any flow control. At least the default value on the VI you posted has it set to None. If the external device is trying to pause communications via the flow control and your program is not paying any attention, this could be part of the problem.

>> I needed to connect the Flow Control local variable to the input of the VISA serial configuration. Thanks for pointing it out.

2. You describe the communications protocol as echoing back each character sent, then delaying about 200 ms, followed by an indication of valid reception or not.  You do not seem to be taking advantage of this information.  Also, what characters are returned for OK and Not OK? Is the delay of 200 ms always going to occur or can the response be immediate?

>> You are correct. I am now putting in a timing loop (see above). Also, the time is variable, so I need to use a loop.

3. Your code tests the return string for equality with "eY.?" while your latest post refers to "eY?". One has 4 bytes and the other 3. Which is correct?

>> "Y.?" is correct. "e" was the last byte of the message. My error.

4. Do you have any idea how many bytes are received in the True case on the last iteration before switching to the False case? Things happen fast enough that the Bytes Read Back indicator will only show what happens on the last iteration. What if the "e" arrives in the True case and you only get the "Y?" in the last iteration? Then it will time out waiting for the third byte, which was actually transmitted (and received) earlier.

>> After the last byte (e) is sent, it is simply echoed. Then, after a considerable delay, I am supposed to always get a 3-letter string; one such string is "Y.?", which indicates success and that the device is ready for the next packet. Other strings indicate errors, and so on. Always 3 bytes.

 

I will try to post the complete VI after I have added all these fixes and tested them.

 

In the meantime, thanks to you for helping me solve the problem.

 

Bhal

 

Message 8 of 23

What is the longest time you have seen? I would recommend just setting the VISA timeout to something really long, like 10 seconds (I hope your device doesn't take that long to respond). Then you don't need to keep checking the number of bytes at the port. Furthermore, I'm afraid you are going to get into an infinite loop. What if the device never responds or a byte gets dropped? You will continuously be checking with no way to get out of the loop. So you would want to put in a maximum time to wait. And if you are doing that, you might as well just use the VISA timeout.
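In sketch form (PyVISA, continuing the earlier examples):

from pyvisa import errors

uart.timeout = 10000                   # 10 s, longer than the slowest response seen
try:
    status = uart.read_bytes(3)        # blocks until 3 bytes arrive or the timeout hits
except errors.VisaIOError:
    status = None                      # device never responded or a byte was dropped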


Message 9 of 23
Solution accepted by Bhalt

Bhal,

 

I agree with crossrulz.  You should set the timeout based on the behavior of the device.

 

Consider reading 1 byte at a time, all the time, and comparing the result with what you sent. Since the device echoes what it receives, you can use this to check the communications. After all the sent bytes have been echoed, then start looking for "Y" followed by "." followed by "?" to see if you get the acknowledgment. With the VISA timeout set to some value based on how long you want the program to be unresponsive to user inputs or other activity, you can handle the kinds of errors crossrulz mentioned.
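As a text sketch of that approach (PyVISA again, with illustrative names; the actual implementation would be a LabVIEW diagram):

def send_packet(uart, packet, ack=b'Y.?'):
    # Echo-check every byte as it is sent.
    for b in packet:
        uart.write_raw(bytes([b]))
        if uart.read_bytes(1) != bytes([b]):
            raise IOError('echo mismatch: dropped or corrupted byte')
    # Then collect the acknowledgment one byte at a time ("Y", ".", "?").
    reply = b''.join(uart.read_bytes(1) for _ in range(len(ack)))
    return reply == ack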

 

Lynn

Message 10 of 23