LabVIEW

Correct format cRio-9074 and NI 9871 10 byte write

Solved!
Go to solution

Hello,

 

I am communicating with a custom embedded board through an NI 9871 serial module on a cRIO-9074, which is Ethernet-connected to a Windows 10 machine. I have success in NI MAX: I send a command and get the appropriate response, and read and write both work fine.

 

In the attached VI, I can read the appropriate response from the board, no problem. However, I cannot send the command correctly.

 

All communications are 115200 baud, 8 data bits, no parity, and 1 stop bit.
All commands are 10 bytes long, including the start byte, stop byte, and CRC byte.

 

This command works perfectly in NI MAX but not in the VI: {\DE\00\00\14\00\00\03\B0}

 

Thank you in advance

Message 1 of 3
Solution
Accepted by topic author djneff

For your Write string and your Read string, right-click on them and choose Visible Items->Display Style.  You will see a little "n" on the control.  That shows which display style is currently selected for the control/indicator.  You can click it and choose a different style.  I would personally choose "Hex".  It looks like you were trying to use "\ Codes".

 

And after you change the Read indicator to be in hex mode, you can eliminate all of the formatting code for the read output; just wire the indicator straight to the VISA Read's output.

 

Some other notes:

1. No need for the Flush Buffer.  In fact, it could be removing data from your input buffer that you care about.

2. Due to the raw/hex/binary nature of the data protocol, you will get a timeout from your VISA Read.  I recommend basing how you read on the protocol instead of just attempting to read a bunch of bytes.  I would need a lot more details on the protocol to give any real advice, but for general guidance: VIWeek 2020/Proper way to communicate over serial
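To illustrate what "basing the read on the protocol" can look like, here is a hypothetical sketch in Python (not LabVIEW) that assumes the fixed framing described in this thread: 10-byte frames with a 0xFB start byte and a 0xFD stop byte. The helper name and structure are invented for illustration; the real framing rules belong to the poster's board.

```python
def extract_frames(buf, frame_len=10, start=0xFB, stop=0xFD):
    """Scan a raw receive buffer and return all complete frames.

    Assumes fixed-length frames delimited by known start/stop bytes,
    as described in this thread. Partial or misaligned data is skipped
    one byte at a time to resynchronize on the next start byte.
    """
    frames = []
    i = 0
    while i + frame_len <= len(buf):
        if buf[i] == start and buf[i + frame_len - 1] == stop:
            frames.append(buf[i:i + frame_len])
            i += frame_len
        else:
            i += 1  # not a valid frame boundary; slide forward one byte
    return frames

# Example: buffer with two junk bytes followed by one complete frame.
raw = b"\x01\x02" + bytes.fromhex("FBDE000014000003B0FD")
print(extract_frames(raw))  # one 10-byte frame starting with 0xFB
```

In LabVIEW terms, the equivalent idea is to read and discard bytes until you see the start byte, then read the remaining fixed number of bytes of the frame, rather than reading an arbitrary count and relying on the timeout.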



There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 2 of 3

This was exactly what I needed to hear; thank you so much.

 

To summarize the issue for posterity: LabVIEW was auto-formatting my input string because I had not set the correct display style on the input control.  I was trying to send "\ Codes", and by following @crossrulz's advice I saw that I was sending something else entirely.

 

** Update:  My end goal is to be able to enter hex in eight-bit pieces. So, from the command above, I'd like to send

 

FB DE 00 00 14 00 00 03 B0 FD
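For reference, that 10-byte frame can be built as raw bytes in a few lines of Python (a sketch for illustration only; `bytes.fromhex` is standard library, and something like pyserial's `Serial.write` could then transmit the result):

```python
# Build the 10-byte command as raw bytes:
# FB = start byte, 7 command bytes, B0 = CRC byte, FD = stop byte.
frame = bytes.fromhex("FB DE 00 00 14 00 00 03 B0 FD")
print(len(frame))  # 10
```

The point is that the frame is constructed as literal byte values, so no display-style or escape-code interpretation can alter it; the equivalent in LabVIEW is entering the string with the control's display style set to Hex.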

 

This is what my device is looking for.  "\ Codes" works only for bytes that don't correspond to a printable ASCII character; any byte that does gets auto-formatted into that character.  For example:

 

FB 91 00 00 00 00 01 2C 81  becomes  {\91\00\00\00\00\01,\81} 
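The substitution above can be reproduced in Python, whose `bytes` repr behaves the same way as LabVIEW's "\ Codes" display here: any byte that maps to a printable ASCII character (0x2C is the comma) is shown as that character instead of an escape code. This is just an illustrative sketch of the display behavior, not anything LabVIEW-specific:

```python
# The 9 bytes from the example above; 0x2C is ASCII ','.
data = bytes.fromhex("FB 91 00 00 00 00 01 2C 81")
print(data)  # b'\xfb\x91\x00\x00\x00\x00\x01,\x81' -- 0x2C appears as ','
```

The bytes themselves are unchanged; only their on-screen representation differs, which is why switching the control to Hex display solves the problem.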

 

How can I stop it from changing my input at all?

Message 3 of 3