LabVIEW


How to convert numeric to hexadecimal string (LabVIEW 2018)

Hi sir/ma'am,

I have to convert my 5 bytes of numeric data to hexadecimal. I'm using the Number To Hexadecimal String function, but in ChipScope a 3 is added in front of every byte, so I'm getting wrong data.

Issue:

In the ChipScope tool I'm receiving the data shown below, but an extra 3 is added to each numeric value.
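A likely explanation for the extra 3: Number To Hexadecimal String produces ASCII *text*, and the ASCII codes of the digit characters '0'..'9' are 0x30..0x39, so a byte viewer sees a 3 in front of every digit. A minimal Python sketch of this behaviour (the five byte values are illustrative, not from the original post):

```python
def to_ascii_hex(data: bytes) -> bytes:
    """Mimic Number To Hexadecimal String: each byte -> two ASCII hex chars."""
    return data.hex().upper().encode("ascii")

raw = bytes([0x12, 0x34, 0x56, 0x78, 0x9A])   # 5 data bytes (illustrative)
ascii_hex = to_ascii_hex(raw)

print(ascii_hex)            # b'123456789A'  -- readable text
print(ascii_hex.hex(" "))   # 31 32 33 34 35 36 37 38 39 41
# On the wire the receiver sees 0x31 0x32 ... instead of 0x12 0x34 ...,
# i.e. every decimal digit arrives with a 3 in front (ASCII '1' == 0x31).
```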

 

Message 1 of 5

Hi saikiran,

 

Which data is your "chipscope" expecting?

Can you provide a clear example?

 

Does it expect a "binary string"?

[attachment: check.png]

(Why do you use the I8 datatype for your numerics? Any good reason?)
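If ChipScope does expect raw bytes (a "binary string" in LabVIEW terms), the usual fix is to skip the hex-string formatting entirely and send the bytes unchanged; LabVIEW's Byte Array To String is just this identity view of the data. A Python stand-in (byte values illustrative):

```python
# Assumed input: five U8 values to be sent as-is.
data = [0x12, 0x34, 0x56, 0x78, 0x9A]

binary_string = bytes(data)   # 5 bytes on the wire, no ASCII encoding

print(binary_string.hex(" ")) # 12 34 56 78 9a -- exactly the source bytes
```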

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 5

Hi GerdW,

My ChipScope is expecting either decimal or hexadecimal data.

Thank you for responding.

Message 3 of 5

That sounds like a problem on the receiving side, or in the communication settings; the sending side seems right.

 

Are you using the expected baudrate (parity, handshaking, etc.)?

Message 4 of 5

@saikiranbusharaju wrote:

my chipscope is expecting  any decimal (or) hexadecimal data. 


That is still not clear.  You can express a number as a decimal or hexadecimal value, but it is still a raw 8-bit value.  Or you do a conversion so that the hexadecimal values are in ASCII (I call this ASCII Hex, you can read the data in a text editor).
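The distinction can be made concrete with a small sketch (a Python stand-in for the LabVIEW string functions; the value is illustrative):

```python
# One raw 8-bit value, shown in its two possible wire representations.
value = 0xAB

raw = bytes([value])                       # raw binary: 1 byte, 0xAB
ascii_hex = format(value, "02X").encode()  # "ASCII Hex": 2 bytes, 'A' 'B'

print(raw.hex())        # ab   -- one byte on the wire
print(ascii_hex)        # b'AB' -- a byte viewer shows 0x41 0x42
# A text editor can read ascii_hex as "AB"; raw is unreadable as text.
```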

 

So where is this value supposed to be read?  Do you have any documentation on how to write to the software?



Message 5 of 5