11-29-2025 04:18 AM
Hello, I am currently working on a project with an analog sensor and a stepper motor. I want the sensor to collect data whenever the motor reaches specific angle increments.
Currently I have LabVIEW tell the Arduino, over the serial port, what angle increment the stepper should move in (e.g. I send "5", and the motor moves to 5, 10, 15, 20... degrees). When it reaches each angle, the Arduino prints the angle it is currently at, which LabVIEW should read. I also have the Arduino write "M" while the motor is still moving, so that LabVIEW knows not to take any data from the sensor. I don't have the sensor on me at the moment, so I am using a random number generator in its place.
LabVIEW is able to communicate with the Arduino and tell it what increment the stepper motor should move in. However, it doesn't seem to be reading what the Arduino writes back at all. I'm sure this is a trivial problem, but I am still having a hard time with it even after reading the other threads. Any help would be greatly appreciated!
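For illustration, the Arduino side as described might look roughly like this (a minimal sketch, not the actual posted code; the pins, steps-per-degree math, and settling delay are placeholders):

```cpp
// Minimal sketch of the Arduino side described above.
// Pin numbers, steps-per-revolution, and timing are placeholders.
#include <Stepper.h>

const int stepsPerRevolution = 200;                 // typical 1.8-degree stepper
Stepper stepper(stepsPerRevolution, 8, 9, 10, 11);  // driver pins are an assumption

long currentAngle = 0;   // degrees
long increment    = 0;   // degrees per move, sent by LabVIEW as ASCII (e.g. "5")

void setup() {
  Serial.begin(9600);
  stepper.setSpeed(30);                             // RPM
}

void loop() {
  // Wait for LabVIEW to send the increment as an ASCII number.
  if (increment == 0 && Serial.available() > 0) {
    increment = Serial.parseInt();
  }

  if (increment > 0) {
    Serial.println("M");                            // motor about to move: no sensor data
    stepper.step(increment * stepsPerRevolution / 360L);  // integer math; rounds down
    currentAngle += increment;
    Serial.println(currentAngle);                   // report the angle reached
    delay(500);                                     // placeholder settling time per angle
  }
}
```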
11-29-2025 01:10 PM
DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! (still not enough emphasis)
You are dealing with an ASCII protocol, so let the VISA Read do all of the hard work for you. Just tell the VISA Read to read more bytes than you ever expect in a message and it will read the entire line for you. After that, it is a matter of you making a proper message on the Arduino side and properly parsing it on the LabVIEW side. I would recommend taking all of your data from a single loop in the Arduino and formatting that into a single line, maybe using a comma as a delimiter.
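On the Arduino side, that could look something like this (a rough sketch, not the original code; the field order and the stand-in for the missing sensor are just examples). Serial.println() terminates the line with "\r\n", so the linefeed can double as the VISA termination character:

```cpp
// Hedged example: one comma-delimited, newline-terminated line per measurement.
long currentAngle = 0;   // placeholder; in the real sketch this is updated by the stepper code

void setup() {
  Serial.begin(9600);
}

void loop() {
  currentAngle += 5;                        // stand-in for "motor has reached the next angle"
  int sensorValue = analogRead(A0);         // or random(0, 1024) while no sensor is attached
  Serial.print(currentAngle);
  Serial.print(",");
  Serial.println(sensorValue);              // e.g. "15,732\r\n" on the wire
  delay(1000);
}
```

On the LabVIEW side, a VISA Read with a byte count of, say, 100 and the termination character enabled (the default, '\n') then returns exactly one such line, which you can split with Scan From String using a "%d,%d" format string or with Spreadsheet String To Array.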
11-29-2025 01:26 PM
Thank you so much for replying!
I tried testing LabVIEW with the VISA Read byte count set to 100 and changed the Arduino end so that it only sends a line over serial when the motor reaches the new angle. LabVIEW still doesn't read anything and doesn't seem to be receiving any bytes at all when I checked with a probe. Is there something about reading the serial port in a while loop right after writing that LabVIEW doesn't like?
11-29-2025 04:35 PM
I don't really know how I solved it but everything works now!
- I moved the code from the Arduino's loop() into setup()
- I changed the VISA Read byte count to 100
The code definitely could be cleaner, but if it works, it works.
11-30-2025 12:40 PM - edited 11-30-2025 12:49 PM
@crossrulz wrote:
DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! (still not enough emphasis)
Let me pick up on this.
I banged my head on this long ago and, back then, not even NI support gave me good answers.
Granted, it is of course a better idea to let the VISA VIs do the 'hard work', but just for the sake of my understanding: why does "Bytes at Port" work so badly? I could never understand what conditions make it give inaccurate results. It is simply erratic.
Given that the VISA VIs work, there must be an internal "low level" mechanism that actually works. Why can't we, "mere mortals", use it?
Or is it just a brute-force "keep reading each byte from the UART as fast as possible and process as soon as the terminator is received, N bytes are received, or the timeout is reached"?
12-01-2025 08:43 AM
@Gyc wrote:
Granted, it is of course a better idea to let the VISA VIs do the 'hard work', but just for the sake of my understanding: why does "Bytes at Port" work so badly? I could never understand what conditions make it give inaccurate results. It is simply erratic.
The Bytes At Port is not inaccurate; it just usually isn't what you actually want. It only tells you how many bytes are currently sitting in the RX FIFO, even if the full message has not arrived yet.
For example, you send a request for data to an instrument. It takes time for that request to be put into the UART TX FIFO, transmitted over the bus (baud rate is the biggest factor here), received by the instrument's UART RX FIFO, interpreted by the instrument, and acted on, and then the same again for the response coming back the other way. If you send the command and then immediately check Bytes At Port, you will likely get an answer of 0, because none of the response has come in yet. So you add a delay. How long should you wait? Now you are subject to any part of the communication path being held up, in which case you still don't get the full message, or else you waste time by waiting longer than needed.
This is the point of the termination character (assuming an ASCII protocol here). It tells you when a message is complete. You don't need to worry about the timing nearly as much (you still have the VISA timeout, typically 2 to 10 seconds). And as long as you told VISA Read to read more bytes than the message will ever contain, you get the full message as soon as it comes in.
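For anyone who prefers text to a block diagram, the same idea expressed with the NI-VISA C API might look roughly like this (a sketch only; the resource name, baud rate, and command string are placeholders, and in LabVIEW the equivalent is simply VISA Configure Serial Port followed by VISA Write and VISA Read):

```cpp
// Hedged illustration using the NI-VISA C API: enable the termination character,
// set a timeout, then read more bytes than any message will ever contain.
#include <visa.h>
#include <stdio.h>

int main(void) {
    ViSession rm, port;
    char reply[256];
    ViUInt32 written = 0, got = 0;

    viOpenDefaultRM(&rm);
    viOpen(rm, (ViRsrc)"ASRL3::INSTR", VI_NULL, VI_NULL, &port);   // resource name is a placeholder

    viSetAttribute(port, VI_ATTR_ASRL_BAUD, 9600);
    viSetAttribute(port, VI_ATTR_TERMCHAR, '\n');        // the message ends with a linefeed
    viSetAttribute(port, VI_ATTR_TERMCHAR_EN, VI_TRUE);  // stop the read at that character
    viSetAttribute(port, VI_ATTR_TMO_VALUE, 5000);       // 5 s timeout instead of a fixed delay

    viWrite(port, (ViBuf)"5\n", 2, &written);            // example request to the Arduino

    // The read returns as soon as the termination character arrives,
    // or fails with VI_ERROR_TMO if no complete message shows up in time.
    ViStatus status = viRead(port, (ViBuf)reply, sizeof(reply) - 1, &got);
    if (status == VI_ERROR_TMO) {
        printf("Timeout: no complete message received\n");
    } else {
        reply[got] = '\0';
        printf("Received: %s", reply);
    }

    viClose(port);
    viClose(rm);
    return 0;
}
```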
@Gyc wrote:
Given that the VISA VIs work, there must be an internal "low level" mechanism that actually works. Why can't we, "mere mortals", use it?
Or is it just a brute-force "keep reading each byte from the UART as fast as possible and process as soon as the terminator is received, N bytes are received, or the timeout is reached"?
You can do that if you really want to. It is just a FOR loop (to limit the number of bytes you read), reading 1 byte at a time until you read the termination character. But why would you want to do that when VISA does it all for you?
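For what it's worth, the Arduino end of a setup like this has to do exactly that loop to parse the incoming command; a rough sketch of the pattern (buffer size, terminator, and timeout are assumptions):

```cpp
// The same "read one byte at a time until the terminator" pattern,
// here on the Arduino side for receiving the increment command from LabVIEW.
const char TERMINATOR = '\n';
const size_t MAX_LEN  = 32;

// Returns true when a line has been read into buf (terminator stripped, or byte cap reached).
bool readLine(char *buf, size_t maxLen, unsigned long timeoutMs) {
  size_t i = 0;
  unsigned long start = millis();
  while (i < maxLen - 1) {                            // cap the byte count, like the FOR loop
    if (millis() - start > timeoutMs) return false;   // give up, like the VISA timeout
    if (Serial.available() > 0) {
      char c = Serial.read();
      if (c == TERMINATOR) break;                     // stop as soon as the terminator arrives
      buf[i++] = c;
    }
  }
  buf[i] = '\0';
  return true;
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  char line[MAX_LEN];
  if (readLine(line, MAX_LEN, 5000)) {
    long increment = atol(line);                      // e.g. "5" -> 5
    Serial.println(increment);                        // echo back for debugging
  }
}
```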
12-01-2025 10:45 AM
@Gyc wrote:
@crossrulz wrote:
DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! (still not enough emphasis)
Let me pick up on this.
I banged my head on this long ago and, back then, not even NI support gave me good answers.
Granted, it is of course a better idea to let the VISA VIs do the 'hard work', but just for the sake of my understanding: why does "Bytes at Port" work so badly? I could never understand what conditions make it give inaccurate results. It is simply erratic.
Given that the VISA VIs work, there must be an internal "low level" mechanism that actually works. Why can't we, "mere mortals", use it?
Or is it just a brute-force "keep reading each byte from the UART as fast as possible and process as soon as the terminator is received, N bytes are received, or the timeout is reached"?
I think you might enjoy this article/video: https://hackaday.com/2021/03/20/arduino-serial-vs-serialusb/
Apart from the issue of empty buffers immediately after sending the message that @crossrulz mentioned, there is also the issue that, between checking the buffer and actually reading what is in it, new bytes may arrive, so the count you acted on does not correspond to a complete message. You can then get truncated messages, with the missing half showing up at the front of the next message. While this seldom happens during testing, once you deploy the code at scale it is bound to happen at some point - almost by definition a Heisenbug. This is not limited to LabVIEW; all serial packages I am aware of have some sort of peek mode.
Part of the issue is that (in OSI-model terms) everyone would like to deal only with the session layer: you open the session with the instrument, send commands, receive data. But the nature of serial communication means that you always need to deal with the lower levels at some point; in this case, transport and network: Is the reply complete? How long should we wait?
If the protocol is well structured enough to have a clear end-of-message character (it might not be!), the solution is to use that character and ask your network/transport layer implementation (i.e., VISA) to hand you the next complete message. If no message arrives within the expected time, it will also tell you, by emitting a timeout error, so that you can deal with it (e.g., alert the user to check the cables, ...).
The Bytes at Port function is a good way to ensure that new users get to see the tools working without errors that look scary (oh no, timeouts!).
12-01-2025 05:27 PM
Thank you all for the explanations.
It is already late here today; I'll post one of the latest VIs where I found this error tomorrow.
I am aware of the response delay between sending a command and receiving a reply, so I do take into account that bytes may not be immediately available - the quick-and-dirty trick is just to insert a delay between send and receive and tweak it.
It is a bit of a Heisenbug, yes - but we learn to recognize and deal with these early in our "careers" 😉 😁
12-01-2025 09:07 PM
I just realized I did not post a link to my serial port presentation from 5+ years ago: VIWeek 2020/Proper way to communicate over serial
12-02-2025 04:48 AM
@Gyc wrote:
Thank you all for the explanations.
It is already late here today; I'll post one of the latest VIs where I found this error tomorrow.
I am aware of the response delay between sending a command and receiving a reply, so I do take into account that bytes may not be immediately available - the quick-and-dirty trick is just to insert a delay between send and receive and tweak it.
It is a bit of a Heisenbug, yes - but we learn to recognize and deal with these early in our "careers" 😉 😁
And if you are a little further in your career, you might learn that instead of creating Heisenbugs, there are more reliable ways of doing things. Ways that work in production, day in day out, without introducing long delays that slow your communication down unnecessarily in 99.9% of cases just to try to avoid that Heisenbug. Except you don't really avoid it: even very long delays may sometimes not be enough. You will never know, except that your system spuriously throws errors.
Your serial port or network protocol (really, any protocol based on a byte stream) should either be ASCII and include a well-defined end-of-message character, or, if it is binary, use fixed-size messages or messages with a fixed-size header that lets you determine how much data the remainder contains. Anything else is a hobby project, not a real device.
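For the binary case, the fixed-size-header approach might look roughly like this (a generic sketch; the sync byte and header layout are made up for illustration, not any particular device's format):

```cpp
// Hedged sketch of length-prefixed binary framing: each message is
// [1 sync byte][2-byte little-endian payload length][payload bytes].
#include <cstdint>
#include <cstdio>
#include <vector>

// Tries to extract one complete frame from the front of 'rx'.
// Returns true and fills 'payload' only if a whole frame was available.
bool extractFrame(std::vector<uint8_t> &rx, std::vector<uint8_t> &payload) {
    const uint8_t SYNC = 0xAA;

    // Resynchronize: drop bytes until the sync byte is at the front.
    while (!rx.empty() && rx.front() != SYNC) rx.erase(rx.begin());
    if (rx.size() < 3) return false;                        // header not complete yet

    uint16_t len = rx[1] | (uint16_t(rx[2]) << 8);          // payload length from the header
    if (rx.size() < 3u + len) return false;                 // payload not complete yet

    payload.assign(rx.begin() + 3, rx.begin() + 3 + len);   // copy exactly 'len' payload bytes
    rx.erase(rx.begin(), rx.begin() + 3 + len);             // consume the frame
    return true;
}

int main() {
    // Simulate bytes trickling in from the serial port in arbitrary chunks.
    std::vector<uint8_t> rx = {0xAA, 0x03, 0x00, 'a', 'b'}; // frame announced, payload incomplete
    std::vector<uint8_t> payload;

    printf("complete yet? %d\n", extractFrame(rx, payload)); // 0: only 2 of 3 payload bytes
    rx.push_back('c');                                       // last byte arrives
    printf("complete now? %d\n", extractFrame(rx, payload)); // 1: payload = "abc"
    return 0;
}
```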