LabVIEW


Writing Binary in VISA LabVIEW

Solved!

If you are familiar with C#, this might be easier to understand where I'm coming from.  In C#, I would use this function to put together a blob of binary data to write to VISA:

 

IMessageBasedSession.FormattedIO.Write();

 

That particular method can take strings, arrays of bytes, etc.  To my knowledge there is no equivalent among the VISA VIs available in LabVIEW.  I think the provided VISA Write VI automatically writes and flushes the buffer, so I would need to build everything I am sending into one data structure and then send it on its way.  It only accepts a string, so I thought to convert the bytes to a string - that didn't work.

[SOUR1:DATA:ARB1 HALFSINE_POS, ] - then from there append the array of bytes.  In C#, I would write a string to the buffer, then write the bytes to the buffer (as bytes), then flush and everything worked.  I'm seeing no option to do that here - am I missing something?
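For reference, the C# pattern described above looks roughly like this (a sketch only - it assumes the Ivi.Visa VISA.NET API with Write, WriteBinary, and FlushWrite on FormattedIO, and the resource string and waveform values are placeholders; check the exact overloads against your VISA.NET version):

using System;
using Ivi.Visa;

class ArbDownloadSketch
{
    static void Main()
    {
        // Placeholder resource string for the instrument's VISA address.
        using var session = (IMessageBasedSession)GlobalResourceManager.Open("TCPIP0::192.168.0.10::inst0::INSTR");

        float[] samples = { 0.0f, 0.5f, 1.0f, 0.5f, 0.0f };        // placeholder waveform points
        byte[] data = new byte[samples.Length * sizeof(float)];
        Buffer.BlockCopy(samples, 0, data, 0, data.Length);        // float32 samples as raw bytes

        // Queue the SCPI command as text, append the samples as bytes, then flush
        // so the whole thing goes out as one message.
        session.FormattedIO.Write("SOUR1:DATA:ARB1 HALFSINE_POS, ");
        session.FormattedIO.WriteBinary(data);   // as noted later in the thread, WriteBinary adds the IEEE-488.2 block header itself
        session.FormattedIO.FlushWrite(true);    // true = assert END on the last byte
    }
}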

 

Message 1 of 8

Hi P²,

 


@Psquared wrote:

[SOUR1:DATA:ARB1 HALFSINE_POS, ] - then from there append the array of bytes.  In C#, I would write a string to the buffer, then write the bytes to the buffer (as bytes), then flush and everything worked.  I'm seeing no option to do that here - am I missing something?


A string is the same as an array of bytes, and there are two functions to convert between them (String To Byte Array and Byte Array To String).
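Since LabVIEW diagrams can't be pasted as text, here is a rough C# analogue of that round trip; the ISO-8859-1 encoding is my choice for the illustration because it maps byte values 0-255 to character codes 0-255 one-to-one, which is the same lossless reinterpretation the two LabVIEW functions perform:

using System;
using System.Text;

class ByteStringRoundTrip
{
    static void Main()
    {
        byte[] bytes = { 0x23, 0x33, 0x34, 0x35, 0x36, 0xFF };    // arbitrary binary data

        // ISO-8859-1 keeps every byte value intact, so the round trip is lossless.
        var latin1 = Encoding.GetEncoding("ISO-8859-1");
        string asString = latin1.GetString(bytes);
        byte[] back = latin1.GetBytes(asString);

        Console.WriteLine(BitConverter.ToString(back));           // 23-33-34-35-36-FF
    }
}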

What have you tried and where are you stuck?

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 8

You say you "so I thought to convert the bytes to string - didn't work".... how did you do the conversion?  Was it this?:

 

[Screenshot: the Byte Array To String function - Kyle97330_1-1742330546604.png]

 

https://www.ni.com/docs/en-US/bundle/labview-api-ref/page/functions/byte-array-to-string.html

 

That is exactly how I have sent binary data over VISA before and it has worked.

 

Perhaps there's something else involved?  You don't say what device you're writing to so I can't look up the protocol, but perhaps you need to prepend a message length, or add a null termination character, or something else along those lines.

Message 3 of 8

I am talking to a Keysight 33622A Function Generator.  Everything I am trying to do in LabVIEW I have already done in C#, so I know that it is possible.  The unsigned byte array to string conversion was one of the efforts I had made, but since you've said you have made it work I will have to take a look at the string as a whole when I get back to it in the morning.  It sounds like I might be on the right track; I'm hoping it is something simple.

Message 4 of 8

@GerdW

 

Tried:

 

Type cast (Byte Array -> String)

Flatten to String

Byte Array to String

 

What I do is write the command string, concatenate the result of the Byte Array -> String conversion onto it, and then call the VISA Write VI.  The instrument doesn't like any of the above.

Message 5 of 8
Solution
Accepted by topic author Psquared

It's almost certainly in IEEE-488.2 binary block format, then.

So, ahead of your binary data you need to send a # sign, then the number of digits in the length (the "length of the length"), then the actual length, and only then your binary data.

 

Plus, make sure that the byte order is correct...

 

[Screenshot: building the IEEE-488.2 binary block header in LabVIEW - Kyle97330_0-1742336536900.png]
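In other words (sketched here in C# only because LabVIEW code can't be typed out, and the helper names are made up for the illustration): for 456 bytes of payload the header is "#3456" - '#', then "3" because the byte count has three digits, then the count "456" - and the raw data follows immediately after.

using System;
using System.Text;

static class BinBlock
{
    // Build the IEEE-488.2 definite-length block header, e.g. 456 bytes -> "#3456".
    public static byte[] Header(int byteCount)
    {
        string count = byteCount.ToString();            // e.g. "456"
        string header = "#" + count.Length + count;     // e.g. "#3456" (count must be 9 digits or fewer)
        return Encoding.ASCII.GetBytes(header);
    }

    // Concatenate command + header + payload so it can go out in a single VISA write.
    public static byte[] Build(string command, byte[] payload)
    {
        byte[] cmd = Encoding.ASCII.GetBytes(command);  // e.g. "SOUR1:DATA:ARB1 HALFSINE_POS,"
        byte[] hdr = Header(payload.Length);
        byte[] msg = new byte[cmd.Length + hdr.Length + payload.Length];
        Buffer.BlockCopy(cmd, 0, msg, 0, cmd.Length);
        Buffer.BlockCopy(hdr, 0, msg, cmd.Length, hdr.Length);
        Buffer.BlockCopy(payload, 0, msg, cmd.Length + hdr.Length, payload.Length);
        return msg;
    }
}

In LabVIEW the same assembly is just string concatenation: "#", the number of digits, the decimal byte count, and the Byte Array To String of the data, wired into VISA Write.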

 

Message 6 of 8

@Psquared wrote:

I am talking to a Keysight 33622A Function Generator.  Everything I am trying to do in LabVIEW I have already done in C#, so I know that it is possible.  The unsigned byte array to string conversion was one of the efforts I had made, but since you've said you have made it work I will have to take a look at the string as a whole when I get back to it in the morning.  It sounds like I might be on the right track; I'm hoping it is something simple.


It is simple.  Not quite a no-brainer, but the advice given is sound so you should be able to work out the details with no problem.

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 7 of 8

@Kyle97330 wrote:

It's almost certainly in IEEE-488.2 binary block format, then.

So, ahead of your binary data you need to send a # sign, then the number of digits in the length (the "length of the length"), then the actual length, and only then your binary data.

 

Plus, make sure that the byte order is correct...

 

This.

 

I realize now what was going on.  I remember debugging a similar issue in C#, except that back then I had coded the BinBlock format into the string myself, and for "some" reason I saw the #3456 on the write buffer to the device.  It turned out that the WriteBinary method in C# automatically codes the BinBlock header into the write.  That's not the case here in LabVIEW, since I don't have that option - building that BinBlock string and adding it in did the job.  I buried that functionality in the abstraction of my classes once it was working in C#.  Another interesting thing I came across is that I had to swap the "endian" of my float -> bytes conversion in C# and didn't have to do that here.

 

Simple things indeed.  But a good learning experience.  Thanks for the guidance.
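On the byte-order point, a minimal sketch of the float -> bytes step with an explicit endianness check (the helper name is invented for the illustration, and whether a swap is needed at all depends on the byte order the instrument is configured to expect):

using System;

static class FloatBytes
{
    // Convert one sample to 4 bytes in little-endian order, swapping only if the host is big-endian.
    public static byte[] ToLittleEndian(float sample)
    {
        byte[] b = BitConverter.GetBytes(sample);   // host byte order
        if (!BitConverter.IsLittleEndian)
            Array.Reverse(b);                       // swap on big-endian hosts
        return b;
    }
}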

Message 8 of 8