LabWindows/CVI

Decimal to Hex Value

I have an issue where I read in a value:
 
buffer[1] = -94 (decimal value), and I am trying to convert it to a hexadecimal number.
 
My problem is that when I look at the value in the CVI debugger I see the hexadecimal value 0xA2, but when I try to printf
the same value using %x I get the value FFA2.
 
I know one value is an 8 bit number and the other is a 16 bit number, but I am trying to figure out how to extract this value from the buffer and store it in another variable, and I want the 8 bit version (0xA2). How can I do this?
 
I know this is going to be simple, but I am drawing a blank right now.
 
Thanks
 
 

If you have a 16 bit variable storing a bit pattern of, say, 1111111110100010, the interpretation of that pattern depends on the type of the variable: if it is unsigned, the pattern reads as 0xFFA2; if it is signed, it reads as -94. In either case, if you simply copy this value to an 8 bit variable using a plain assignment statement, only the lower 8 bits are copied.

How the resulting bit pattern 10100010 is then interpreted again depends on the type of the 8 bit variable. The default CVI char is signed, so it will have the value -94; an unsigned char will show as 0xA2, or 162.
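
A minimal sketch of what that looks like in code (the array and variable names here are just placeholders for your own):

#include <stdio.h>

int main (void)
{
    char buffer[2];
    unsigned char lowByte;

    buffer[1] = -94;                         /* bit pattern 10100010 */

    /* printf promotes the char argument to int with sign extension,
       so %x prints the extended pattern (FFA2 where ints are 16 bits,
       FFFFFFA2 where they are 32 bits): */
    printf ("%x\n", buffer[1]);

    /* Assigning (or casting) to an unsigned char keeps only the
       lower 8 bits and interprets them as an unsigned value: */
    lowByte = (unsigned char) buffer[1];
    printf ("%x\n", lowByte);                /* prints a2 */

    return 0;
}

The cast in the assignment is not strictly required, since the conversion to unsigned char happens anyway, but it makes the intent explicit.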

JR
