
Ethernet communication with LabVIEW (not familiar with it)

Hello all, 

 

I have been using LabVIEW to send/receive data from instruments using GPIB, VISA, serial, etc.

 

I have never used an Ethernet port before, so I'm stuck on the basic part of opening the connection and sending the commands properly.

 

I did an NI I/O Trace capture.

 

Lines 1 & 2 are from NI MAX (hostname 172.16.0.25, port 10001), with the line feed (\n) termination character.

I am sending a simple *IDN?\n and get the answer shown on line 2 (same in NI-MAX).

Screenshot 2025-05-19 124134.png

When I send the command from LabVIEW with VISA, NI I/O Trace shows:

Screenshot 2025-05-19 125325.png

I set the timeout to 5000 ms, the same as in NI-MAX (on a later iteration), but this run was not giving any errors.

Screenshot 2025-05-19 123831.png

Only when running NI-MAX again does it either error once or answer with something other than the response to *IDN?; after clearing the buffer, NI-MAX works just fine again.

 

What am I missing?

 

I used to use VISA Write and Read without the property node and it would work fine.

 

I am trying to control an EXFO variable optical attenuator from DiCon, with an Ethernet server inside this damn machine.

No drivers available.

 

Thank you,

 

Gaston

Message 1 of 6

I personally use VISA if the instrument supports the LXI protocol, but if it doesn't, I just use the TCP/IP palette instead of the VISA commands.

 

You do lose some VISA properties and such, but you gain the ability to operate at a more native level.  It may help.
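For illustration, the raw TCP approach (LabVIEW's TCP Open Connection → TCP Write → TCP Read → TCP Close) can be sketched in Python. This is a hedged sketch, not the actual instrument session: the identification string and the stand-in local server are invented so the snippet runs without hardware (the real attenuator listened on 172.16.0.25:10001).

```python
import socket
import threading

# Stand-in "instrument" server so the sketch runs without hardware; the
# reply string is invented for illustration only.
server = socket.create_server(("127.0.0.1", 0))
addr = server.getsockname()

def mock_instrument():
    conn, _ = server.accept()
    with conn:
        if conn.recv(64) == b"*IDN?\n":
            conn.sendall(b"FAKE,VOA,0,1.0\n")

threading.Thread(target=mock_instrument, daemon=True).start()

# Rough equivalent of TCP Open Connection -> TCP Write -> TCP Read -> TCP Close:
with socket.create_connection(addr, timeout=5) as s:
    s.sendall(b"*IDN?\n")                 # LF terminator sent as one byte, 0x0A
    reply = s.recv(1024).decode().rstrip("\n")

print(reply)
```

The key point the sketch shows: the query ends in a single line-feed byte, and the connection is opened, used, and closed explicitly, which is all the TCP palette really asks of you.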

 

Still, though, you have some bits in your code example that look odd, so I have to wonder whether your code example is in the exact state that produced the I/O trace you show.  The "Write" command is set to send just a semicolon, and the "Read" command has its bytes-to-read input wired to the Return Count of the Write command, which is not going to give you a full reply.

 

The I/O trace also makes it look like maybe you're sending a literal "\n", i.e. two bytes, a backslash and an "n", instead of the line feed character.   It should be one byte.  "*IDN?\n" should be 6 characters, but shows as 7 in the second entry.  How are you handling adding the termination character?
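The 6-versus-7 character difference is easy to reproduce in any language that distinguishes an escape sequence from its two literal characters, which is essentially what LabVIEW's Normal versus '\' Codes Display modes do:

```python
# In '\' Codes Display, LabVIEW converts the two typed characters "\n" into
# a single line-feed byte; in Normal display they stay as backslash + "n".
normal_display = "*IDN?" + "\\" + "n"   # backslash and "n": two separate bytes
codes_display = "*IDN?\n"               # one LF byte (0x0A)

print(len(normal_display))  # 7, matching the odd count in the trace
print(len(codes_display))   # 6
```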

Message 2 of 6

@Kyle97330 wrote:

 

The I/O trace also makes it look like maybe you're sending a literal "\n", i.e. two bytes, a backslash and an "n", instead of the line feed character.   It should be one byte.  "*IDN?\n" should be 6 characters, but shows as 7 in the second entry.  How are you handling adding the termination character?


By typing \n at the end of the string without having switched the string constant to '\' Codes Display mode.

Rolf Kalbermatter
Message 3 of 6

There are many examples of how to do TCP/IP among the LabVIEW examples. Just look here:

aeastet_1-1747754430236.png



aeastet_0-1747754411607.png

 

Tim
GHSP
Message 4 of 6

Yes, I was sending a literal \n instead of the proper line feed character.

I'm still having some issues with the read function and the number of characters to read, so that it does not error out.
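One common fix for the bytes-to-read problem is to read until the termination character instead of reading a fixed count. A hedged Python sketch of that loop follows; the stand-in server and its reply are invented, and it deliberately answers in two TCP segments, because real instruments can fragment replies the same way:

```python
import socket
import threading

# Mock server whose reply arrives in two TCP segments; the reply text is
# invented for illustration.
server = socket.create_server(("127.0.0.1", 0))
addr = server.getsockname()

def mock_instrument():
    conn, _ = server.accept()
    with conn:
        conn.recv(64)                  # consume the query
        conn.sendall(b"FAKE,VOA,")     # first fragment...
        conn.sendall(b"0,1.0\n")       # ...terminator arrives later

threading.Thread(target=mock_instrument, daemon=True).start()

def read_until_lf(sock):
    """Accumulate bytes until the LF terminator, like a VISA read with the
    termination character enabled, or a TCP Read loop in LabVIEW."""
    buf = bytearray()
    while not buf.endswith(b"\n"):
        chunk = sock.recv(1)           # one byte at a time keeps the sketch simple
        if not chunk:
            raise ConnectionError("peer closed before terminator arrived")
        buf += chunk
    return bytes(buf[:-1])             # strip the terminator

with socket.create_connection(addr, timeout=5) as s:
    s.sendall(b"*IDN?\n")
    reply = read_until_lf(s).decode()

print(reply)
```

The loop never asks for more bytes than have arrived, so it cannot time out waiting for a count the instrument will never send; it simply stops at the terminator.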

Message 5 of 6

I was doing it wrong. Now I'm having issues with the number of characters to read. Thank you!

Message 6 of 6