LabVIEW


VISA Read Termination Character or Not

hi,

 

I am using VISA Read to communicate from the PC to a UART interface on an FPGA. It appears that VISA Read needs a termination character. The problem is that I am sending raw binary data back and forth, not ASCII, so the termination character (a U8 value) can be hit at any time by the raw data, causing the returned data not to display correctly in my display window. I could send 0x0A, 0x0D (LF/CR) as a termination sequence, but I am not sure VISA can accept a sequence as the terminator.

 

If I disable the termination character, VISA Read does not return anything until the timeout configured in the VISA setup expires.

 

Any suggestions on reading the VISA port in this case?

 

thanks

 

Message 1 of 4

Hello Roadrunna,

 

How are you specifying the number of bytes to read?  If you have a finite number of bytes to read specified, then the read function should return data when enough has accumulated at the port as long as you have the End Mode configured properly. Seeing your code as it is now would help us offer suggestions, but in the meantime I'd recommend taking a look at this tutorial:

 

Tutorial: Termination Characters in NI-VISA

https://www.ni.com/en/support/documentation/supplemental/06/termination-characters-in-ni-visa.html

 

Specifically, take a look at the "Binary Communication" section- I believe this is what you're trying to do.
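The "Binary Communication" approach boils down to: disable the termination character and read an exact byte count, so a 0x0A in the payload can't cut the read short. Since LabVIEW is graphical, here is a minimal text-form sketch of that logic in Python; `io.BytesIO` stands in for the serial port, and `read_exact` is a hypothetical helper mirroring VISA Read with the End Mode set to none.

```python
import io

def read_exact(port, count):
    """Read exactly `count` bytes, ignoring any 0x0A/0x0D in the payload.

    Mirrors a VISA Read with termination disabled: the read ends only
    when the requested byte count has accumulated (or it would time out).
    """
    data = port.read(count)
    if len(data) < count:
        raise TimeoutError("port returned fewer bytes than requested")
    return data

# Raw binary payload that happens to contain 0x0A (LF) and 0x0D (CR).
port = io.BytesIO(bytes([0x01, 0x0A, 0xFF, 0x0D, 0x42]))
msg = read_exact(port, 5)
assert msg == bytes([0x01, 0x0A, 0xFF, 0x0D, 0x42])
```

The catch, as noted above, is that you must know the byte count in advance; otherwise the read blocks until the timeout.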

 

Regards,

Tom L.
Message 2 of 4

This is one of those edge cases that we all hate. It boils down to "What do I do if the data can contain 0x0A?"

 

It happens, and it happens a lot with "Chatty Cathy" UARTs, and there is not much you can do from this end of things except swear at the developer of the other end for failing to provide a deterministic protocol. So you get to code around the other developer's faults.

 

You do have a few things going for you, though. First, VISA raises a warning when it terminates a read because the termination character was read. It's pretty easy to unbundle the code out of that error cluster and see whether you got warned. Bytes at Port is your next asset: if you got a warning, check whether there are still bytes left in the buffer and read them.
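The warning-plus-Bytes-at-Port trick above can be sketched in Python for illustration (LabVIEW being graphical). The `FakePort` class here is made up purely to simulate VISA's behavior: its `read` stops at the termination character and returns a flag mimicking the "read terminated on character" warning, and `bytes_at_port` mimics the Bytes at Port property. A real implementation would also need a timeout or a known message length to decide when to stop draining.

```python
class FakePort:
    """Hypothetical stand-in for a VISA serial session (demo only)."""

    def __init__(self, data, term=0x0A):
        self._buf = bytearray(data)
        self._term = term

    def read(self):
        """Return bytes up to and including the termination character,
        plus a flag mimicking VISA's 'terminated on character' warning."""
        for i, b in enumerate(self._buf):
            if b == self._term:
                chunk = bytes(self._buf[: i + 1])
                del self._buf[: i + 1]
                return chunk, True          # warning: ended on term char
        chunk, self._buf = bytes(self._buf), bytearray()
        return chunk, False

    @property
    def bytes_at_port(self):
        return len(self._buf)

def read_message(port):
    msg = bytearray()
    chunk, warned = port.read()
    msg += chunk
    # If the read was cut short by a 0x0A inside the binary payload,
    # drain whatever is still sitting at the port and stitch it back on.
    while warned and port.bytes_at_port:
        chunk, warned = port.read()
        msg += chunk
    return bytes(msg)

payload = bytes([0x01, 0x0A, 0xFF, 0x0A, 0x42])   # 0x0A appears mid-data
assert read_message(FakePort(payload)) == payload
```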

 

Better yet, go talk to the other developer (politely this time; you never really meant those things you said about his mother) and see if he can add a packet-length byte to each message.

e.g. STX, Message Length, Data[], EOM
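That STX / length / data / EOM frame can be sketched like so; Python here only for illustration, and the byte values 0x02 (STX) and 0x03 (EOM) are conventional ASCII choices, not something fixed by this thread. The length byte tells the reader exactly how many data bytes follow, so a 0x0A inside the payload is harmless.

```python
STX, EOM = 0x02, 0x03   # assumed frame delimiters (ASCII STX/ETX)

def build_frame(payload: bytes) -> bytes:
    """Wrap a binary payload as STX, length byte, payload, EOM."""
    if len(payload) > 255:
        raise ValueError("length field is one byte")
    return bytes([STX, len(payload)]) + payload + bytes([EOM])

def parse_frame(frame: bytes) -> bytes:
    """Return the payload; the length byte, not a termination character,
    decides where the data ends."""
    if frame[0] != STX:
        raise ValueError("missing STX")
    length = frame[1]
    payload = frame[2 : 2 + length]
    if len(payload) != length or frame[2 + length] != EOM:
        raise ValueError("truncated or unterminated frame")
    return payload

raw = bytes([0x0A, 0x0D, 0x55])        # binary data containing LF/CR
assert parse_frame(build_frame(raw)) == raw
```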


"Should be" isn't "Is" -Jay
Message 3 of 4

Since you are doing binary data communication, you can't use termination characters, so just disable them. But then you need some way of knowing how many bytes to read. Do you have any control over the protocol? Is each message the same? Are the message lengths always the same?


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 4 of 4