LabVIEW


Binary to ASCII from a 10-bit ADC

Hi,
 
I have a problem converting the 10-bit ADC binary output from a microcontroller to ASCII format.
Can someone guide me on how to solve this?
 
Regards
MF Hussin  
Message 1 of 18

There is no such thing as "ASCII format".  Any formatted output is necessarily ASCII, so you should probably be a bit more specific.

  • What does the original 10-bit data look like? (10 booleans, a U16 masked to 10 bits, a binary string, etc.)
  • What should the output look like? (a binary string, human-readable formatted output, etc.)

It would be best if you could attach a simple example that contains typical input data in a control (as a default value) or in a diagram constant. Also explain what the output should look like. Thanks! 🙂
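
To illustrate the distinction, here is a minimal sketch in text form (Python, since LabVIEW diagrams cannot be posted as plain text; the reading 0x02A7 is just a made-up value):

    # A made-up 10-bit ADC reading, masked to 10 bits of a 16-bit integer.
    reading = 0x02A7 & 0x03FF                  # 679 decimal

    # Three very different "formats" for the same value:
    raw_bytes    = reading.to_bytes(2, "big")  # b'\x02\xa7' -- 2 raw bytes on the wire
    binary_text  = format(reading, "010b")     # '1010100111' -- ten ASCII '0'/'1' characters
    decimal_text = str(reading)                # '679' -- human-readable formatted output

    print(raw_bytes, binary_text, decimal_text)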

Message 2 of 18

Hi,

Our group project is to get the output from a 10-bit ADC in binary format, i.e. 1010..., and then transmit it through TCP/IP to the PC. But I have not been able to combine the two programs, TCP and binary. I then need to convert the data to ASCII to display the waveform.

Here I attach my TCP program.

I am thinking of adding the binary example program to my TCP program, but I don't know how to start.

Hope you can advise me on this

Cheers

Message 3 of 18
Sorry, after looking at your program, I don't see any place that looks like "binary data" from an ADC. Could you point out the section of code you are talking about?


@MF Hussin wrote:
... Then convert it to ASCII to display the waveform. ...
Waveform graphs don't display "ASCII data". You just need to cast your raw string to the correct numeric data type, possibly build a waveform, and send it to the waveform graph.
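
In rough text form, the cast looks like this (a Python sketch; LabVIEW's Typecast uses big-endian byte order, and the three values here are only stand-ins for real received data):

    import struct

    # Stand-in for the raw string received from TCP, holding big-endian DBLs
    # (this is the byte layout LabVIEW's Typecast produces).
    raw_string = struct.pack(">3d", 1.0, 2.5, -0.75)

    n = len(raw_string) // 8                   # 8 bytes per DBL
    samples = struct.unpack(f">{n}d", raw_string)
    print(samples)                             # (1.0, 2.5, -0.75) -- ready to plot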
Message 4 of 18
Hi,

Can you advise me on how to change the 8-bit binary input to ASCII?

Example: 0000 0000.

Thanks

Hope to get a reply from you.

Message 5 of 18
Hi,

Here is some additional information.

I will explain in a bit more detail.

How am I going to change the 8-bit binary data transferred over RS232 to ASCII?

Say:

0000 0000 in binary represents 0 in decimal, and so on.

So how do I do this in ASCII form?

Please help me out.

Thanks.
Message 6 of 18
Hi Hussin,

you should use the Byte Array to String function:

[screenshot: Byte Array to String]
This will convert an array of U8 integers into a string of ASCII characters.
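
In text terms the equivalent is a one-liner (a Python sketch; the byte values are arbitrary, and latin-1 is used so that every value 0-255 maps to exactly one character):

    u8_array = [72, 105, 33]                   # arbitrary U8 values
    text = bytes(u8_array).decode("latin-1")   # each byte becomes one character
    print(text)                                # 'Hi!'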

Ton


Message 7 of 18
Dear Guys,

I have developed this program.
At the server, it converts the ASCII text to binary and sends it through TCP.
At the client, it converts that binary string back to ASCII and displays it.
The problem is that it reports "not enough memory".

Can somebody look at my program and advise me about it?

Thanks.
Message 8 of 18

Simply have a look at your code: nothing makes much sense.

  1. Server: Why are you formatting your numbers as binary text (a series of ASCII characters '0' or '1') instead of sending the raw binary string?
  2. Server: You are casting the size to a string (4 bytes) but then only sending the last byte, formatted as binary text. You are stripping 32 bits of potential information down to 8 bits, then using 64 bits to transfer them.
  3. Server: You are casting 200 DBLs (1600 bytes) and taking only the last byte, which you send as binary formatted text, again spending 8 bytes on one byte of information.
  4. Client: For the size, you are only reading four characters of binary formatted text, i.e. only 4 bits of information (!). You are then casting four binary formatted characters to an I32, giving you a random, very large number.
  5. Client: Now you try to read a very large number of characters (see above), but you actually sent only 8 bytes of binary formatted text, so that's all there is.
  6. Client: You scan these 8 bytes as binary and convert them to a U8, which you cast back to a string with 1 (!) character: 1 byte!
  7. Client: For some reason, you cast this single byte to an array of DBLs. Unfortunately, there isn't even enough information for a single DBL element.

You seem to have a difficult time understanding the difference between a binary string, binary formatted text, typecasting, formatting to string, and scanning from string, and you seem to mix these operations randomly. They mean very different things!
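
The difference is easiest to see at the byte level. A minimal sketch in Python (the value 200 is arbitrary; struct.pack/unpack stand in for LabVIEW's Typecast):

    import struct

    x = 200

    # Typecast / flatten: reinterpret the number's raw bytes. Compact, not readable.
    flat = struct.pack(">I", x)               # b'\x00\x00\x00\xc8' -- 4 bytes for a U32

    # Format to string as binary text: readable, but 8 ASCII bytes per data byte.
    binary_text = format(x, "08b")            # '11001000' -- 8 characters

    # Scan from string: parse the binary text back into a number.
    scanned = int(binary_text, 2)             # 200

    # Typecast back: reinterpret the raw bytes as a number again.
    (restored,) = struct.unpack(">I", flat)   # 200

    assert scanned == restored == x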

Your FOR loops have autoindexing input tunnels but plain output tunnels, so you are only retaining the information from the last iteration.

All you have to do is typecast your size and your data each to a string, send them over the network, and typecast them back to the original datatypes on the receiving end. Try it! 🙂
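
A minimal sketch of that scheme (Python again; the socket calls are omitted, and the three sample values are arbitrary):

    import struct

    data = [1.5, -2.0, 3.25]                  # stand-in for the array of DBLs

    # Sender: flatten the data, then prepend its size as a 4-byte integer.
    payload = struct.pack(f">{len(data)}d", *data)
    message = struct.pack(">i", len(payload)) + payload

    # (send `message` over the TCP connection here)

    # Receiver: read 4 bytes for the size, then exactly that many data bytes.
    (size,) = struct.unpack(">i", message[:4])
    raw = message[4:4 + size]
    received = struct.unpack(f">{size // 8}d", raw)
    assert list(received) == data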

You seem to have used the shipping example "simple data client|server", which basically does exactly what you want already. For some reason, you mutilated the data handling beyond recognition. 😉 Go back to the virgin example code; it works and already does everything you are trying to do. Really! 😄


Message 9 of 18
Dear Altenbach,

To be honest, I don't get the information you are trying to explain.
If you don't mind, can you show me the code for how to do it?
Usually I just learn from the code and pick up the understanding behind it.

Hopefully, you can help me out.

Thanks.
Hussin
Message 10 of 18