
Number to hexadecimal string memory allocations

Solved!

I am converting a string to a byte array and then converting that byte array to an array of hexadecimal strings using the built-in Number To Hexadecimal String primitive, like this:

 

hex.PNG

 

Is this the best way for performance? When I run the DETT I see many memory allocations, I think one for each element in the array. I think it has to do with the coercion dot. Also, it only happens if I connect the string indicator. If I run this without the indicator I don't see all of the memory allocations. (Maybe LabVIEW ignores the code since I am not doing anything with the result.)
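In text form, the operation is roughly this (a sketch in Python rather than LabVIEW, just to show the shape of it; the values and names are only illustrative):

# Rough Python sketch of the snippet: String To Byte Array, then a
# two-character hex string per byte (illustrative values only).
data = "Hello"                                   # input string
byte_array = data.encode("latin-1")              # string -> byte array
hex_strings = [f"{b:02X}" for b in byte_array]   # one 2-char hex string per element
print(hex_strings)                               # ['48', '65', '6C', '6C', '6F']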

 

I just got the DETT and I am not entirely sure how LabVIEW allocates memory, but I did run a test with this code:

 

Capture.PNG

 

I do not see all of the memory allocations in the DETT like I do with Number To Hexadecimal String.

=====================
LabVIEW 2012


0 Kudos
Message 1 of 14
(4,365 Views)

What does "DETT" stand for?

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
0 Kudos
Message 2 of 14
(4,353 Views)

Well, you're really comparing apples to oranges, so you can't really conclude anything. In the first code you have to have allocations - the array of strings has to be created. This has nothing to do with the coercion dot. In the second implementation there is only one array allocation - the single numeric array.
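As a loose text analogy (Python, not a claim about how LabVIEW lays things out internally): an array of short strings is one small object per element, while a numeric array is a single contiguous buffer.

# Loose analogy only: many small per-element allocations vs. one buffer.
import array

hex_strings = ["48", "65", "6C", "6C", "6F"]                # N separate small strings
numeric = array.array("B", [0x48, 0x65, 0x6C, 0x6C, 0x6F])  # one contiguous byte buffer
print(len(hex_strings), len(numeric))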

 

As for the coercion dot, this was reported as a bug here: http://forums.ni.com/t5/BreakPoint/Monthly-bugs-August-2008/m-p/758921#M6325

 

As for "efficiency", please provide a few more details. What do you mean by a binary string, and converting that to an array of hex integers? Please provide an example.

0 Kudos
Message 3 of 14
(4,351 Views)

 


@Ben wrote:

What does "DETT" stand for?

 

Ben


 

Maybe Desktop Execution Trace Toolkit?

http://sine.ni.com/nips/cds/view/p/lang/en/nid/206790 

Message 4 of 14
(4,338 Views)

Your first solution creates an array of strings (each only 2 characters long); since they're shown in an indicator, they'll need to allocate memory.

 

What is it you're really trying to do?

 

Are you trying to read a binary string and present the hex value? Then this is one solution. If it doesn't have to be a string as input, you can make it really, really simple by just having an integer control connected to an integer indicator and changing their "Format & Precision".

 

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
Message 5 of 14
(4,335 Views)

@smercurio_fc wrote:

Well, you're really comparing apples to oranges, so you can't really conclude anything. In the first code you have to have allocations - the array of strings has to be created. This has nothing to do with the coercion dot. In the second implementation there is only one array allocation - the single numeric array.

 

As for the coercion dot, this was reported as a bug here: http://forums.ni.com/t5/BreakPoint/Monthly-bugs-August-2008/m-p/758921#M6325

 

As for "efficiency", please provide a few more details. What do you mean by a binary string, and converting that to an array of hex integers? Please provide an example.


Thanks. I was not aware of how LabVIEW does the memory allocations. I thought it would create a large chunk of memory for the array in a single allocation and then fill it in, but apparently it needs to do an allocation for each element. By the way, I tried another test that seems to confirm that: I had a VI with an array-of-strings control on the front panel that contained a few thousand elements. That control was not connected to anything, and I saw the same results as I did with the first snippet.

 

The code in the first snippet is from an application I am working on. I am converting a string of binary data (a binary string) to an array of hex strings, one element for each byte in the string. Is the code I posted the most efficient way of accomplishing that goal? I ask because I have seen code that looked like it could be done more simply, but it was done the way it was for performance reasons.

 

I guess I could try different ways of accomplishing what this code does, benchmark them against each other, and use the Desktop Execution Trace Toolkit to look for performance issues. I know you should not use Build Array in a loop; the code snippet replaces a section of code that does just that and is much, much faster.
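Something along these lines is the shape of the benchmark I have in mind (a Python sketch, not LabVIEW, so the numbers won't transfer; it just contrasts growing the array element by element with producing the whole result in one pass):

# Hypothetical benchmark sketch: grow-in-a-loop vs. single pass.
import timeit

data = bytes(range(256)) * 1000          # arbitrary test payload

def grow_in_loop():
    out = []                             # analogous to Build Array inside a loop
    for b in data:
        out.append(f"{b:02X}")
    return out

def single_pass():
    return [f"{b:02X}" for b in data]    # whole result produced at once

print("loop:  ", timeit.timeit(grow_in_loop, number=10))
print("single:", timeit.timeit(single_pass, number=10))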

=====================
LabVIEW 2012


0 Kudos
Message 6 of 14
(4,321 Views)

The end goal is to send a binary string over the network. I have to encode it using something like hex or base64 or whatever. Hex is the simplest.

 

Here is the encoder and decoder.

 

The encoder is about twice as fast as the decoder. So maybe I will expand on the question and ask: what is the most efficient way, in terms of memory and speed, to encode and decode binary to a hex representation?
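For reference, the round trip the encoder and decoder have to perform looks roughly like this (a Python sketch of the idea, not the actual VIs; the payload is made up):

# Sketch of the hex round trip: binary data -> hex string -> binary data.
payload = b"\x00\x1f\x80\xff"            # arbitrary binary data
encoded = payload.hex()                  # '001f80ff' - safe to send as text
decoded = bytes.fromhex(encoded)         # back to the original bytes
assert decoded == payload
print(encoded)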

=====================
LabVIEW 2012


0 Kudos
Message 7 of 14
(4,311 Views)

Start by looking over the Bit Twiddler challenge.

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation LabVIEW Champion Knight of NI and Prepper LinkedIn Profile YouTube Channel
Message 8 of 14
(4,293 Views)

 


@SteveChandler wrote:

The end goal is to send a binary string over the network. I have to encode it using something like hex or base64 or whatever. Hex is the simplest.

 

Here is the encoder and decoder.

 

The encoder is about twice as fast as the decoder. So maybe I will expand on the question and ask: what is the most efficient way, in terms of memory and speed, to encode and decode binary to a hex representation?


 

You meddle around too much, I think. Just use the function I posted and add your Number To Hexadecimal String. Done.

On the receiving end, I assume you want the number, or do you want the binary string?

To reconvert to a binary string I'd simply loop through all the bits of the received number and build a string of '0' or '1'.
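In text form, something like this (a Python sketch of the idea; the 8-bit width is just an assumption, use whatever the protocol calls for):

# Sketch: walk the bits of a received number, most significant first,
# and build a '0'/'1' string. Width is an assumed parameter.
def to_bit_string(value, width=8):
    return "".join("1" if (value >> i) & 1 else "0" for i in reversed(range(width)))

print(to_bit_string(0x6C))               # '01101100'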

 

/Y

G# - Award winning reference based OOP for LV, for free! - Qestit VIPM GitHub

Qestit Systems
Certified-LabVIEW-Developer
0 Kudos
Message 9 of 14
(4,274 Views)

I looked at the code you posted, but it doesn't do anything. Even if it did, I need a hex-encoded string and not a numeric. As I said, I am sending this over the network.

 

=====================
LabVIEW 2012


0 Kudos
Message 10 of 14
(4,257 Views)