02-19-2007 09:06 PM
If you have LabVIEW, launch the example finder and look for the examples that I mentioned. (simple data client and simple data server). These are good examples for exactly your problem. You must have already looked at them, because you seem to have used them as a starting point for your version.
Do they work? Let me know what you don't understand or what you want to work differently.
02-20-2007 08:45 AM
Hi,
I just want to ask you:
Is this the right way to do it?
To convert from ASCII to binary?
Cheers!!
02-20-2007 11:27 AM
Hello Hussin,
I want to jump into this discussion to explain some vocabulary and the relations between these terms.
A string is a row of bytes, stored in memory one after another. Each byte can have 256 different states. ASCII means that every letter of the alphabet is assigned to a certain state that a byte can represent, e.g. 0x41 for the letter "A" or 0x61 for the letter "a". This relationship is defined by the ASCII table. Even though the ASCII table defines some control codes below 0x20 and some special characters above 0x7F, an ASCII string usually means a human-readable string, consisting of words or digits which make sense to a human reader.
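(For readers following along without LabVIEW open: the same byte-to-letter mapping can be poked at in any text language. A minimal sketch in Python, purely as an illustration:)

```python
# Each raw byte value maps to one character via the ASCII table.
data = bytes([0x41, 0x61, 0x31])   # three raw byte values
print(data.decode("ascii"))        # -> "Aa1": 0x41 = 'A', 0x61 = 'a', 0x31 = '1'
print(hex(ord("A")))               # -> "0x41": the lookup in the other direction
```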
A binary string consists of a row of bytes which have to be interpreted with knowledge of the data structure and how it is stored. In your "binary server.vi" there is no ASCII string at all! The content of the "Binary String" indicator has to be interpreted as packages of 8 bytes, each package containing the value of a double-precision number. This is only readable by the computer, not by humans. If you want to make it a readable ASCII string, you need to use the conversion VIs from the "String/Number Conversion" palette.
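To make the 8-bytes-per-DBL point concrete, here is a rough Python analogy (a sketch only; in LabVIEW this corresponds to the Type Cast node and the String/Number Conversion VIs):

```python
import struct

# A "binary string" of doubles is just 8 bytes per value; it means nothing
# without knowing the layout.
values = [1.5, -2.25, 100.0]
binary = struct.pack(">3d", *values)   # 3 doubles, big-endian -> 24 raw bytes
print(len(binary))                     # -> 24: 8 bytes per DBL, not readable as text

# A human-readable ASCII string is made by formatting the numbers as text
# (the analogue of the String/Number Conversion palette):
ascii_string = " ".join("%g" % v for v in values)
print(ascii_string)                    # -> "1.5 -2.25 100"
```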
I hope I made it more clear to you what ASCII means.
Your attached VI shows exactly the right way to transmit data via TCP, since the Type Cast node converts ANY type of data to a string. So you only have to replace the case structure with a control of the datatype your ADC data comes in, and you are done.
Please read altenbach's first reply again; then you can get more specific advice on handling the 10-bit ADC data.
Greets, dave
02-20-2007 11:57 AM
@MF Hussin wrote:
Is this the right way to do it?
To convert from ASCII to binary?
Dave already gave you some good tips. Typecasting is not really a "conversion"; it is more a "reinterpretation" of data in memory.
Going down to the basics, all data in a computer is just a long sequence of bits, and looking at them will not tell you anything unless you know what they represent.
For example, your DBL array contains 200 DBLs (each 64 bits), and thus represents a long sequence of 64 x 200 bits.
If I simply give you the 12800 bits, I also need to tell you that they are an array of DBLs. (They could represent anything else, for example a size-400 array of SGLs, a size-800 array of I16, a complicated cluster, an image, etc.) With typecasting, you basically tell the computer what the data is supposed to be. In simple terms (leaving out complications such as endian-ness for the moment), typecasting just looks at the sequence of bits in memory and, without modifying the bits, looks at them in a different way.
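Here is the same idea as a small Python sketch (my own illustration, not LabVIEW): the identical 8 bytes read as one DBL, two SGLs, or four I16s, matching the 200/400/800 arithmetic above:

```python
import struct

raw = struct.pack(">d", 1.0)      # one DBL -> 8 bytes in memory
print(struct.unpack(">d", raw))   # (1.0,)            read as 1 DBL
print(struct.unpack(">2f", raw))  # (1.875, 0.0)      read as 2 SGLs
print(struct.unpack(">4h", raw))  # (16368, 0, 0, 0)  read as 4 I16s
# The bits never change; only the interpretation does. That is all a
# typecast does (ignoring endian-ness, as noted above).
```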
In your particular case, TCP does not need to know what the data represents; it just transfers long sequences of bits. For convenience, it accepts the string datatype because it is the most universal.
The process is simple:
Take your data and typecast it to a string, send it via TCP, then take the string and typecast it back to a DBL.
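In Python terms (a hypothetical sketch; host, port, and array size are made up for the example), the whole round trip looks like this:

```python
import socket, struct

HOST, PORT = "127.0.0.1", 6340
data = [0.5 * i for i in range(200)]       # stand-in for the 200 ADC readings

# Server side: listen for one connection.
server = socket.socket()
server.bind((HOST, PORT))
server.listen(1)

# Client side: connect (works in one script because the OS queues the
# connection until accept() is called).
client = socket.socket()
client.connect((HOST, PORT))
conn, _ = server.accept()

conn.sendall(struct.pack(">200d", *data))  # "typecast" 200 DBLs to 1600 bytes

received = b""
while len(received) < 1600:                # TCP may deliver the data in pieces
    received += client.recv(4096)
result = struct.unpack(">200d", received)  # "typecast" the bytes back to DBLs

assert list(result) == data                # round trip preserved the values
conn.close(); client.close(); server.close()
```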
02-22-2007 07:03 AM
Hi,
I already made a modification to the previous program.
Can you advise me on this?
To be honest, I don't quite get what you guys are trying to explain, but I'm trying my best to solve this problem.
Thanks
Hussin.