LabVIEW


VISA Serial + Arduino Read Buffer not working (Write works)

Solved!
Go to solution

Hello, I am currently working on a project with an analog sensor and a stepper motor. I want the sensor to collect data when the motor turns to angles of specific increments. 

 

Currently I have LabVIEW tell the Arduino over the serial port what angle increment the stepper should move in (e.g. I send "5", and the motor will move to 5, 10, 15, 20... degrees). When it reaches each angle, the Arduino prints the angle it is currently at, which LabVIEW should read. I also have the Arduino write "M" to the serial port while the motor is still moving, so that LabVIEW knows not to take any data from the sensor. I don't currently have the sensor on me, so I am using a random number generator instead.

LabVIEW is able to communicate with the Arduino and tell it what increment the stepper motor should move in. However, it doesn't seem to read anything the Arduino writes back at all. I'm sure this is a trivial problem, but I'm still having a hard time with this even after reading the other threads. Any help would be greatly appreciated!

 

#include <Stepper.h>

 
 
const int stepsPerRevolution = 2048;  // change this to fit the number of steps per revolution
const int rolePerMinute = 15;         // Adjustable range of 28BYJ-48 stepper is 0~17 rpm
const int degree = 100 * 30; // will be geared down
 
int increment = 0;

// initialize the stepper library on pins 8 through 11:
Stepper myStepper(stepsPerRevolution, 8, 10, 9, 11);

void setup() {
  myStepper.setSpeed(rolePerMinute);
  // initialize the serial port:
  Serial.begin(9600);
}


 
void loop() {
  while (increment == 0) {
    if (Serial.available() > 0) {
      int receivedInt = Serial.parseInt(); // read the incoming value
      Serial.print(" Increment Set To:");
      Serial.println(receivedInt);
      increment = receivedInt;
    }
  }
  Serial.println("S");

  int i = 0;
  int end_angle = 180 + 1;

  while (i < end_angle) {
    myStepper.step(100 * 180);
    int angle = i * increment;
    //Serial.println("Degree: ");
    Serial.println(angle);
    delay(1000);
    i++;
    Serial.println("M");
  }
  Serial.println("END");
}
Message 1 of 11
Solution
Accepted by topic author Yuki990

DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! (still not enough emphasis)

 

You are dealing with an ASCII protocol, so let the VISA Read do all of the hard work for you. Just tell the VISA Read to read more bytes than you ever expect in a message and it will read the entire line for you. After that, it is a matter of you making a proper message on the Arduino side and properly parsing it on the LabVIEW side. I would recommend taking all of your data from a single loop in the Arduino and formatting that into a single line, maybe using a comma as a delimiter.



There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 11

Thank you so much for replying!
I tried testing LabVIEW with the bytes to read set to 100, and changed the Arduino end so that it only sends a line over serial when the motor reaches the new angle. The read still doesn't return anything and doesn't seem to be receiving any bytes at all when I checked with the probe. Is there something about reading the serial port in a while loop after writing that LabVIEW doesn't like?

Message 3 of 11
Solution
Accepted by topic author Yuki990

I don't really know how I solved it but everything works now!
- I moved the code in the void loop to setup
- I changed the byte to 100

The code definitely could be cleaner, but if it works, it works.

 

 

#include <Stepper.h>

const int STEP_PIN = 8;
const int DIR_PIN = 7;
const int START_PIN = 2;
//step increment: 1 step = 0.01 degree. Therefore 1 deg = 100 step
//const int stepsPerRevolution = 200;
const int stepsPerRevolution = 2048;  // change this to fit the number of steps per revolution
const int rolePerMinute = 15;         // Adjustable range of 28BYJ-48 stepper is 0~17 rpm
const int degree = 100 * 180;
const int TURN_PIN = 3;
String a;
int increment = 0;
int end_angle = 180 + 1;

// initialize the stepper library on pins 8 through 11:
Stepper myStepper(stepsPerRevolution, 8, 10, 9, 11);

void setup() {
  myStepper.setSpeed(rolePerMinute);
  pinMode(START_PIN, INPUT);
  pinMode(TURN_PIN, OUTPUT);
  /*
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  */
  // initialize the serial port:
  Serial.begin(9600);

  int RESET = 0;


  while (increment == 0) {
    if (Serial.available() > 0) {
      int receivedInt = Serial.parseInt(); // read the incoming value
      Serial.print("Increment Set To: ");
      Serial.println(receivedInt);
      increment = receivedInt;
      Serial.print("Degree: ");
      Serial.println(0);
      delay(100);
    }
  }

  int i = 1;

  while ((i * increment) < end_angle) {
    //myStepper.step(100 * 180);
    myStepper.step(increment * 180);
    int angle = i * increment;
    Serial.print("Degree: ");
    Serial.println(angle);
    delay(100);
    i++;
  }
  Serial.println("END");
  RESET = 1;
  delay(1000);
  Serial.println("RESET STARTED");
  myStepper.step(18 * 18 * -1);
  delay(100);
  Serial.println("RESET ENDED");
}

void loop() {
}
Message 4 of 11

@crossrulz wrote:

DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! (still not enough emphasis)

Let me pick up on this.

 

I banged my head on this long ago and, back then, not even NI support gave me good answers.

 

Granted that it is of course a better idea to let the VISA VIs do the "hard work", and only for the sake of my understanding, why does "Bytes at Serial Port" work so badly? I could never understand what conditions make it give out inaccurate results. It is simply erratic.

Given that the VISA VIs work, there must be an internal "low level" mechanism that actually works. Why can't we, "common mortals", use it?

Or is it just a brute force "keep reading each byte from the UART as fast as possible and process as soon as the terminator is received, 'N' bytes are received, or the timeout is reached"?

 

Message 5 of 11

@Gyc wrote:

Granted that it is of course a better idea to let the VISA VIs do the "hard work", and only for the sake of my understanding, why does "Bytes at Serial Port" work so badly? I could never understand what conditions make it give out inaccurate results. It is simply erratic.


Bytes At Port is not inaccurate; it just usually isn't what you actually want. It only tells you how many bytes are currently in the RX FIFO, even if the full message hasn't arrived yet.

 

For example, you send a request for data to an instrument. It takes time for that data to be put into the UART TX FIFO, transmitted over the bus (baud rate is the biggest factor here), received by the instrument's UART RX FIFO, interpreted by the instrument, and acted upon, and then everything has to come back the other way. If you send the command and then immediately call Bytes At Port, you will likely get an answer of 0 because none of the response data has come in yet. So you add in a delay. How long should you wait? Now you are suddenly subject to any part of the communication path being held up, and you either don't get the full message or you waste a lot of time by waiting longer than needed.

 

This is the point of the termination character (assuming an ASCII protocol here). It tells you when a message is complete. You don't need to worry about the timing as much (you still have the VISA timeout, typically 2 to 10 seconds). And as long as you told VISA Read to read more bytes than the message should ever contain, you will get the full message as soon as it comes in.

 


@Gyc wrote:

Given that VISA vi's work, there must be an internal "low level" mechanism that actually works. Why can't we, "common mortals", use it?

Or is it just a brute force "keep reading each byte from UART as fast as possible and process as soon as terminator is received, 'N' bytes are received or timeout reached"?


You can do that if you really want to. It is just a FOR loop (to limit the number of bytes you read), reading 1 byte at a time until you read the termination character. But why would you want to do that when VISA does it all for you?



Message 6 of 11

@Gyc wrote:

@crossrulz wrote:

DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! (still not enough emphasis)

Let me pick up on this.

 

I banged my head on this long ago and, back then, not even NI support gave me good answers.

 

Granted that it is of course a better idea to let the VISA VIs do the "hard work", and only for the sake of my understanding, why does "Bytes at Serial Port" work so badly? I could never understand what conditions make it give out inaccurate results. It is simply erratic.

Given that the VISA VIs work, there must be an internal "low level" mechanism that actually works. Why can't we, "common mortals", use it?

Or is it just a brute force "keep reading each byte from the UART as fast as possible and process as soon as the terminator is received, 'N' bytes are received, or the timeout is reached"?

 


I think you might enjoy this article/video: https://hackaday.com/2021/03/20/arduino-serial-vs-serialusb/

 

Apart from the issue of empty buffers immediately after sending the message that @crossrulz mentioned, there is also the issue that between checking the buffer and actually reading what's in it, new bytes may have arrived and you are not reading a full message. You might then get truncated messages, with the missing half at the front of the next message. While this seldom happens during testing, once you deploy the code at scale, this is bound to happen at some point - almost by definition a Heisenbug. This is not limited to LabVIEW. All serial packages I am aware of have some sort of peek mode.

 

Part of the issue is that (using the OSI Model), everyone would like to deal with the session layer: You open the session with the instrument, send commands, receive data. But the nature of serial communication means that you always need to deal with the lower levels at some point. In the case here, transport and network: Is the reply complete? How long should we wait?

 

If the protocol is well-structured enough to give clear end-of-message characters (It might not be!), the solution is to just use this character and ask your Network/Transport layer implementation (i.e., VISA) to just give you the next message. If there was no message during the expected time, it will also tell you by emitting a timeout error, so that you can deal with it (e.g., alert the user to check the cables, ...).

 

The BytesAtPort function is a good way to ensure that new users will be able to see the tools working without emitting errors that look dangerous (Oh no, timeouts!).

Message 7 of 11

Thank you all for the explanations.

 

It's already late today; I'll post one of the latest VIs where I found this error tomorrow.

I am aware of the response delay between sending a command and receiving a reply, so I do take into account that bytes may not be immediately available - a quick-and-dirty trick is just to insert a delay between send and receive and tweak it.

It is a bit of a Heisenbug, yes - but we learn to recognize and deal with these early in our "careers" 😉 😁

Message 8 of 11

I just realized I did not post a link to my serial port presentation from 5+ years ago: VIWeek 2020/Proper way to communicate over serial



Message 9 of 11

@Gyc wrote:

Thank you all for the explanations.

 

It's already late today; I'll post one of the latest VIs where I found this error tomorrow.

I am aware of the response delay between sending a command and receiving a reply, so I do take into account that bytes may not be immediately available - a quick-and-dirty trick is just to insert a delay between send and receive and tweak it.

It is a bit of a Heisenbug, yes - but we learn to recognize and deal with these early in our "careers" 😉 😁


And if you are a little further in your career, you might learn that instead of creating Heisenbugs, there are more reliable ways of doing things. Ways that work in production, day in and day out, without introducing long delays that slow down your communication unnecessarily in 99.9% of cases, just to try to avoid that Heisenbug. Except you don't really avoid it: even very extreme delays may sometimes not be enough. You will never know, except that your system spuriously causes errors.

 

Your serial port or network protocol (really, any protocol based on a byte stream) should either be in ASCII and include a well-defined end-of-message character, or, if it is binary, it should use fixed-size messages or messages with a fixed-size header that lets you determine how much data the remainder contains. Anything else is a hobby project, not a real device.

 

 

Rolf Kalbermatter  My Blog
DEMO, Electronic and Mechanical Support department, room 36.LB00.390
Message 10 of 11