How to convert an ASCII character to a decimal integer

I made a constant at one end of the Type Cast block and wrote 'A' in it, then at the other end I wired a numeric indicator, and the front panel displayed an 8-digit number!
Message 11 of 17

@shrouk'13 wrote:
I made a constant at one end of the Type Cast block and wrote 'A' in it, then at the other end I wired a numeric indicator, and the front panel displayed an 8-digit number!

This is insufficient information to describe the problem. Type Cast has two inputs and one output. Please attach a small VI.

Message 13 of 17

Here it is.

Message 14 of 17

There are many things wrong with this. You are trying to typecast a single character (8 bits) to an I32 numeric (32 bits). This makes absolutely no sense!

 

(For efficiency, Type Cast has no error output, so it cannot tell you when it receives garbage.)
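
LabVIEW is graphical, so there is no text code to show, but the distinction can be sketched in a few lines of Python (a stand-in for the diagram, not LabVIEW itself). Converting asks for the character's value; typecasting reinterprets its raw bytes. If the lone byte lands in the high-order position of a big-endian 32-bit word (LabVIEW flattens data big-endian; the zero padding here is an assumption for illustration), you get a huge, meaningless-looking number:

    # Converting vs. reinterpreting one ASCII character (Python sketch).
    import struct

    c = 'A'

    # Conversion: ask for the character's code -- this is the answer you want.
    print(ord(c))                          # 65

    # Reinterpretation: pad the single byte to four and read them as a
    # big-endian 32-bit integer; 'A' (0x41) ends up in the top byte.
    raw = c.encode('ascii') + b'\x00' * 3  # assumed zero padding
    print(struct.unpack('>i', raw)[0])     # 1090519040 (0x41000000)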

Message 15 of 17

I converted the representation to U8 instead of I32, and it did not work!

Message 16 of 17

You didn't convert the data to a U8. You typecast it to an I32 and then coerced the result into the U8 indicator.
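
A quick Python sketch of why that fails (whatever the exact LabVIEW narrowing rule is, the 'A' byte is lost either way):

    # Narrowing an already-wrong I32 does not recover the character.
    bad_i32 = 0x41000000      # assumed typecast result: 'A' in the top byte

    print(bad_i32 & 0xFF)     # 0   if the narrowing truncates to the low byte
    print(min(bad_i32, 255))  # 255 if the narrowing saturates at the U8 maximum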

 

You need to right-click the constant wired into the top of the Type Cast function and change its representation to U8.
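
Once that constant is a U8, the byte counts match and the typecast is trivial. A one-line Python equivalent of that route (a sketch, not the LabVIEW fix itself):

    # One byte reinterpreted as one byte is just its value.
    print('A'.encode('ascii')[0])   # 65 -- same idea as Type Cast to U8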

 

That is what Apok showed you in message #2.  You already marked his message as the solution, so I don't understand why you are still struggling with this.

 

Message 17 of 17