04-08-2026 12:04 PM
Hi everyone,
read/write VISA as fast as a terminal?
I have a card to which I send frames, and it replies with response frames. The card accepts and responds instantly when I use a simple terminal, but with LabVIEW it takes too long to respond, which costs me time in my automated tests.
I use For loops with shift registers to send and receive responses from my card.
Do you have a better idea, please?
04-08-2026 12:28 PM
Hi Fredo,
@Fredo123Danzel wrote:
read/write VISA as fast as a terminal?
This card accepts and responds instantly when I use a simple terminal, but when I use LabVIEW, it takes too long before responding. This causes me to lose time in my automated tests.
Usually LabVIEW isn't slower than your terminal software; it uses the same underlying OS routines…
How do you send and read messages?
Do you use proper message handling (termchars) etc.?
04-08-2026 01:18 PM
Thank you for your response. I am using the example provided in the LabVIEW library. I send a frame and wait for a read.
04-08-2026 02:00 PM
You have termination characters turned off on read, you request 100 characters per read, and you clear the "timeout" error you get afterwards.
I think you perhaps misunderstand how serial communication works. Serial communication is a single stream of bytes that arrives at your PC with no structure beyond the byte level that you set up at the start (baud rate, bits, etc.).
That means LabVIEW, or any other software, doesn't get "a message". It gets one byte, then the next, then the next, and so on. It needs to be told, somehow, when to stop looking.
Typical methods are:
1. All communications end with the same character (termination character). This is by far the most common, but you have turned it off.
2. All communications are the same length, known beforehand. If this is the case for you, you need to change the "100" in the VISA Read "byte count" input to match this number.
3. All communications begin with a header that includes the byte count of the message. If this is the case, you need to do one very short read to get the header, parse it to convert those bytes into an integer, then do a second read of exactly that many bytes.
Your terminal program is just taking every byte it gets as a response and putting it on the screen for you to see. It's not using any form of intelligence to know when it's "done"; you're just assuming that it somehow "knows" when it's done, but it doesn't "know" anything.
You need to determine which of the 3 methods listed above the device is using, and start using it. It looks like you're sending raw bytes in hex, exactly 8 of them, so it's possible that you just need to replace that "100" with an "8" and you'll get 8 bytes back, but this is the point where you need to check the communications manual for whatever "card" you are communicating with.
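To make the three framing strategies concrete, here is a minimal Python sketch (not LabVIEW; `io.BytesIO` stands in for the VISA/serial session, and the byte layouts are made up for illustration) showing how each method decides when a message is complete:

```python
import io

def read_terminated(stream, termchar=b"\n"):
    """Method 1: read byte by byte until the termination character arrives."""
    frame = bytearray()
    while True:
        b = stream.read(1)
        if not b or b == termchar:
            return bytes(frame)
        frame += b

def read_fixed(stream, length=8):
    """Method 2: read exactly the known, fixed frame length."""
    return stream.read(length)

def read_header_prefixed(stream):
    """Method 3: a 1-byte header carries the payload byte count."""
    count = stream.read(1)[0]      # short read: parse the header into an integer
    return stream.read(count)      # second read: exactly that many bytes

# io.BytesIO stands in for the serial session here
s = io.BytesIO(b"OK\n" + bytes(8) + b"\x03ABC")
print(read_terminated(s))       # → b'OK'
print(read_fixed(s))            # → 8 bytes (all zeros in this demo)
print(read_header_prefixed(s))  # → b'ABC'
```

The key point is the same in all three: the reader, not the wire, supplies the structure that tells it when one message ends.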
04-08-2026 03:43 PM
Hello Kyle,
Thank you very much for your feedback!
If I understand correctly, it is essential to know the number of bytes expected so that the read can stop and avoid the VISA timeout error (-1073807339).
My card sends me a frame whose last byte is the CRC. So I just need to use that as the end character.
What do you think?
04-08-2026 05:25 PM
Most serial communication issues can be solved by watching this video: VIWeek 2020/Proper way to communicate over serial
04-09-2026 11:20 AM
You are using a binary protocol, so the first thing to do is turn the termination character OFF. Secondly, we need a lot more detail on the messaging protocol and the data frame. Binary protocols typically start with a sync word and a message type. Beyond that, it is pure guessing without the documentation.
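As an illustration of why the documentation matters, here is a hedged Python sketch of parsing one such binary frame. The layout (a 0xAA sync byte, 1-byte message type, 1-byte payload length, payload, then a 1-byte XOR checksum standing in for the real CRC) is entirely invented and will not match your card:

```python
import io

SYNC = 0xAA  # hypothetical sync byte; your card's value will differ

def checksum(data: bytes) -> int:
    """Simple XOR checksum, a stand-in for the card's real CRC."""
    c = 0
    for b in data:
        c ^= b
    return c

def read_frame(stream):
    """Parse one frame: sync, type, length, payload, checksum."""
    # hunt for the sync byte so we can resynchronize after garbage
    while True:
        b = stream.read(1)
        if not b:
            return None                      # stream exhausted
        if b[0] == SYNC:
            break
    msg_type, length = stream.read(2)        # header: type + payload length
    payload = stream.read(length)
    crc = stream.read(1)[0]
    if crc != checksum(bytes([msg_type, length]) + payload):
        raise ValueError("bad checksum")
    return msg_type, payload

# demo frame: type 0x01, 2-byte payload b'HI'
body = bytes([0x01, 0x02]) + b"HI"
frame = bytes([SYNC]) + body + bytes([checksum(body)])
print(read_frame(io.BytesIO(frame)))  # → (1, b'HI')
```

Note that the checksum byte can take any value from 0 to 255, which is also why a CRC at the end of a frame cannot serve as a termination character; the length field, not the CRC, is what tells the reader where the frame ends.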
As a side note, if you have a string control/indicator/constant not in "Normal" display mode, make sure to make the display style visible. This will save you A LOT of pain.