LabVIEW


What is the most efficient way to turn an array of 16 bit unsigned integers into an ASCII string such that...?

What is the most efficient way to turn a one-dimensional array of 16-bit unsigned integers into an ASCII string such that the low byte of each integer comes first, then the high byte, then two bytes of hex "00" (that is to say, two null characters in a row)?

My method seems somewhat ad hoc. I take each number, split it into bytes, then interleave the result with two arrays of 4095 null bytes. Easy enough, but it depends on all of these files being exactly 16380 bytes, which in theory they should be.

The size of the array is known. However, if it were not, what would be the best method?

(And yes, I am trying to read in a file format from another program)
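Since LabVIEW is graphical, here is a textual sketch of the target byte layout in Python. The function name and use of `struct` are illustrative, not part of the original post; the point is that packing each value little-endian followed by a zero U16 gives the required 4 bytes per element for any array length, so nothing depends on the file being exactly 16380 bytes.

```python
import struct

def u16s_to_padded_string(values):
    """Pack each U16 as: low byte, high byte, then two null bytes.

    '<HH' writes the value as a little-endian U16 (low byte first)
    followed by a zero U16, yielding 4 output bytes per input value
    regardless of the array's length.
    """
    return b"".join(struct.pack("<HH", v, 0) for v in values)

# 0xABCD -> bytes CD AB 00 00
assert u16s_to_padded_string([0xABCD]) == b"\xcd\xab\x00\x00"
```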
Message 1 of 4
(2,924 Views)
Your method is not bad. How about using the "Array Size" function to feed your initializer instead of the constant 4095? That way it will create an array of the correct size every time, regardless of the file's length.

Dan Press
www.primetest.com
Message 3 of 4
You may try this:
[U16] -> Swap Bytes -> To U32 -> Swap Words -> Type Cast -> String
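A sketch in Python of what this chain does to each element, since it may not be obvious why it produces the right byte order. The helper names are invented for illustration; "Swap Bytes" exchanges the two bytes of a U16, "Swap Words" exchanges the two 16-bit words of a U32, and Type Cast flattens big-endian (the assumption here), so the low byte lands first, followed by the high byte and two nulls.

```python
def swap_bytes_u16(x):
    # LabVIEW "Swap Bytes": exchange the high and low bytes of a U16
    return ((x & 0xFF) << 8) | (x >> 8)

def swap_words_u32(x):
    # LabVIEW "Swap Words": exchange the high and low 16-bit words of a U32
    return ((x & 0xFFFF) << 16) | (x >> 16)

def chain(u16):
    # [U16] -> Swap Bytes -> To U32 -> Swap Words
    u32 = swap_words_u32(swap_bytes_u16(u16))
    # Type Cast: LabVIEW flattens data big-endian
    return u32.to_bytes(4, "big")

# 0xABCD -> 0xCDAB -> 0x0000CDAB -> 0xCDAB0000 -> bytes CD AB 00 00
assert chain(0xABCD) == b"\xcd\xab\x00\x00"
```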

Jean-Pierre Drolet


LabVIEW, C'est LabVIEW

Message 4 of 4