LabVIEW


How to change raw char string to 16-bit 2's complement integer?

Solved!

Hi,

 

Please see the following raw char string:

 

óèóóóóþþóèþóóóþóþóóèþóóóóóóóóóþóóóóóèóóóóóþóóþþóóóþóèóóþóèþóèóþóóþóóèþóóóþèóóþóþóþóóóþóóóþóèóþþèþóþó

 

How can I convert it to 16-bit signed integers (2's complement)?

 

Thanks,

Ott

Message 1 of 5
(3,526 Views)
Solution
Accepted by topic author Ott

What do you expect for the output?  Something like this?
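[The attached snippet image is not reproduced here. As a rough check, assuming each character is a single Latin-1 byte: the first two, ó (0xF3) and è (0xE8), read big-endian give 0xF3E8 = 62440, which as a signed 16-bit value is 62440 - 65536 = -3096.]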



Message 2 of 5
(3,517 Views)

Yes, how do I achieve it?

Thanks,

Ott

Message 3 of 5
(3,510 Views)

@Ott wrote:

Yes, how do I achieve it?

Thanks,

Ott


[Snippet attachment: original.png]

Jim

Message 4 of 5
(3,493 Views)

I used the Unflatten From String primitive: wired an array of I16 to the data type input and a FALSE to "Contains Array or String Size". You can also save off that picture I gave you earlier; it is called a snippet. Just drag your saved copy onto a clean block diagram and you will have the code.
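[For readers outside LabVIEW, here is a minimal sketch of the same conversion in Python. It assumes the characters map one-to-one to bytes (Latin-1) and that the data is big-endian, which matches LabVIEW's default flattened byte order; the helper name raw_string_to_i16 is purely illustrative.

import numpy as np

def raw_string_to_i16(raw: str, byteorder: str = ">") -> np.ndarray:
    """Reinterpret a raw char string as signed 16-bit integers.

    raw       -- string whose characters are really raw bytes (Latin-1 assumed)
    byteorder -- ">" big-endian (LabVIEW flatten default), "<" little-endian
    """
    data = raw.encode("latin-1")                      # one char -> one byte
    return np.frombuffer(data, dtype=np.dtype(byteorder + "i2"))

# e.g. the first four characters of the posted string:
print(raw_string_to_i16("\xf3\xe8\xf3\xf3"))          # [-3096 -3085]

If the resulting values look wrong, the byte order is the first thing to check; pass "<" for little-endian data.]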



Message 5 of 5
(3,490 Views)