LabVIEW

Conversion from ASCII to 16-bit integer number

Hi, I receive data from a message-based measurement device via a serial port, using the VISA Write, Read, and related functions. So I have read a string of 8-bit characters. My problem is that the output message contains both text messages (characters) and 16-bit integer data. I can split the string into these two parts, but I don't know how to convert the data read as 8-bit characters into 16-bit integer numbers. So my question is: how can I convert ASCII characters to a 16-bit integer? Thank you for your help.
Message 1 of 3

You will find all the functions you need in this palette (attached picture). The Type Cast function should work: type cast the part of the string containing the data to a U16 array. The Swap Bytes function may also be needed if the device uses the opposite byte order.
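For readers more familiar with text-based languages, here is a rough Python sketch of what the LabVIEW Type Cast + Swap Bytes combination does. The byte values and the assumption that two U16 values are packed into four bytes are hypothetical examples, not taken from the original poster's device:

```python
import struct

# Hypothetical raw bytes as returned by VISA Read: two 16-bit
# values packed into four 8-bit characters.
raw = b"\x01\x02\x03\x04"

# LabVIEW's Type Cast interprets the string as big-endian data:
big_endian = struct.unpack(">2H", raw)    # yields (0x0102, 0x0304)

# If the device actually sends little-endian data, the bytes in each
# word must be swapped (LabVIEW: Swap Bytes):
little_endian = struct.unpack("<2H", raw)  # yields (0x0201, 0x0403)

print(big_endian, little_endian)
```

The key point either way: the 16-bit values are reconstructed from pairs of raw bytes, not parsed from readable digits.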

 

Message Edited by t06afre on 03-23-2009 11:59 AM


Message 2 of 3

I read it differently. The way I understand the question, the function needed is "Decimal String To Number". This is the right one to use if, for example, you can read the answer on screen when you connect to the device with HyperTerminal, i.e. the measurement device produces its results as ASCII text such as "123". If, when you connect with HyperTerminal, the results are strange sequences of characters, the chances are that a Type Cast and/or some sort of bit manipulation is the correct method.
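The distinction between the two cases can be sketched in Python (the reply values here are hypothetical illustrations, not output from the poster's device):

```python
import struct

# Case 1: the device sends readable ASCII text, e.g. the characters "123".
# LabVIEW's "Decimal String To Number" is equivalent to a plain parse:
ascii_reply = "123"
parsed = int(ascii_reply)                      # 123

# Case 2: the device sends raw binary data, which looks like garbage in
# a terminal. The same value 123 as a big-endian 16-bit word is two
# bytes, and must be type cast rather than parsed:
binary_reply = b"\x00\x7b"
cast = struct.unpack(">H", binary_reply)[0]    # 123

print(parsed, cast)
```

Looking at the reply in HyperTerminal (or a LabVIEW string indicator in hex display) tells you which of the two situations you are in.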

 

It's difficult to describe these things accurately, as people interpret words differently. An example screenshot showing the string read from the device (in a string indicator) would help, and let us know whether the string indicator is set to normal display or to hex display. Don't forget to give us the expected answer too!

 

Rod.

Message 3 of 3