01-16-2005 08:00 PM
This is commonly used for variable-length data - the four-byte value tells the receiver how many bytes are in the following chunk.
It's necessary for the Y case - you could have 101 values in one packet and 102 in another (maybe not in your particular case, but in general, when you transmit an array, you include the element count). The same thing applies to strings.
I see no reason for it anywhere else. As long as both ends know that a double is 8 bytes, then just transmit the 8 bytes.
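For the fixed-size case, the Java side might look something like this (a minimal sketch - the host and port are placeholders for your own setup):

import java.io.DataOutputStream;
import java.net.Socket;

// Both ends agree a double is exactly 8 bytes, so no count is sent.
Socket sock = new Socket("localhost", 5000);   // hypothetical host/port
DataOutputStream dout = new DataOutputStream(sock.getOutputStream());
dout.writeDouble(98.6);   // 8 bytes, big-endian - which is LabVIEW's flatten order
dout.flush();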
I take it you did not design this protocol? (Otherwise you wouldn't be asking ME about it...)
If you have liberty to change the protocol, then I would consider using a cluster:
Bundle the Skin Response, Heart Rate, Resp. Rate, and the four waveforms into a cluster.
(Use a typedef for easier changes).
Flatten the cluster into a data string.
Get the string length (unless you can guarantee the length will be the same every time).
Flatten the length into a string.
Transmit the length string and the data string.
On the receiving end, receive 4 bytes, unflatten into an I32 (data length).
Then receive that many bytes, unflatten them into a cluster, and there's your whole data block. No muss, no fuss.
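On the Java side, the whole two-part send might look something like this (a sketch, reusing the "dout" setup from the fragment above; the field values are stand-ins for your real data):

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;

// Build the data block first, so its length is known before sending.
ByteArrayOutputStream buf = new ByteArrayOutputStream();
DataOutputStream block = new DataOutputStream(buf);
block.writeDouble(0.87);   // e.g. Skin Response
block.writeDouble(72.0);   // e.g. Heart Rate
block.writeDouble(16.0);   // e.g. Resp. Rate
// ... the four waveforms would follow here ...
byte[] packet = buf.toByteArray();

dout.writeInt(packet.length);   // part one: the four-byte length
dout.write(packet);             // part two: the data itself
dout.flush();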
I am having a hard time reconciling the fact that I am determining the length to be 2052 bytes of waveform data on the server and only reading 4 bytes of it on the client side.
You are misreading the code. It reads four bytes to determine HOW MANY MORE BYTES to read. It then reads that many more.
01-17-2005 06:31 PM
CoastalMaineBird wrote: the four-byte value tells the receiver how many bytes are in the following chunk.
You are misreading the code. It reads four bytes to determine HOW MANY MORE BYTES to read. It then reads that many more.
01-17-2005 07:05 PM
--- Four bytes was used because that's the length of an I32 - the native integer type.
As long as your data size doesn't exceed 64k, you can use two bytes if you want - just typecast the string length number to a U16, and flatten THAT to a string and send it. Read TWO bytes on the other end, unflatten it to a U16, then read THAT MANY more bytes, and you're there.
You could figure out a way to use three bytes, I suppose, but you'd be the first person to do it, because no one else has ever needed to. ;->
In your case, I submit that the advantage to sending a two-byte count instead of a four-byte count is dwarfed by the increased complexity of the code. Saving two bytes out of the thousand you're sending is just not worth it. Send the four bytes.
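If you did want the two-byte count anyway, the Java side is a one-line change (a sketch, reusing the "dout" and "packet" names from the earlier fragment):

// A two-byte (U16) length prefix only works if the packet
// can never exceed 65,535 bytes.
dout.writeShort(packet.length);   // low 16 bits, big-endian - read it as a U16 in LabVIEW
dout.write(packet);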
It is not much of a big deal, but knowledge is a wonderful thing!
--- Indeed.
If this is the case, then I assume the way I have it set up now is that I can create one big packet on a Java server, send the length of the packet and then the actual data packet and then the LabVIEW client will read 4 bytes to determine how many more to read.
--- Yes. BUT....
If this is the case, then I assume the way I have it set up now is that I can create one big packet on a Java server, send the length of the packet and then the actual data packet and then the LabVIEW client will read 4 bytes to determine how many more to read.
--- Sounds simple, doesn't it? But, as I mentioned before, you have to pay attention to the endian-ness of your data. I strongly suggest that your first Java-LabVIEW test NOT be of your whole cluster, but instead, send a simple four-byte integer, namely the value 1. That's right, the number 1. If you receive it and get a 1, you're golden. If you receive it and get the value 16,777,216, you have an endian problem. If you get something else, you have some other problem. If you don't know about endian problems, ask.
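In Java, that test is one line (again assuming the "dout" stream from the earlier sketches):

// Endian sanity test: send the I32 value 1.
// DataOutputStream writes big-endian ("network order"), which is also
// LabVIEW's flatten order, so the client should unflatten these four
// bytes to exactly 1. If it reads 16,777,216 instead, the bytes
// arrived swapped (hex 01 00 00 00) and you have an endian problem.
dout.writeInt(1);
dout.flush();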
P.S. Give your CPU a break now and then. Your loop has no WAIT function in it, so you're composing a signal and transmitting it over and over, as fast as possible, doing nothing else. Insert a WAIT TILL NEXT MSEC MULTIPLE with a constant of 100 inside your transmitting loop. That throttles it back to a 10-Hz transmission rate. Your other programs will love you for it.
01-18-2005 08:50 PM
The code I see sets "x" to 15. If you're receiving a "12", something's wrong...
I am assuming that the form in which it will be received by the LabVIEW client will be a string (the default for all TCP/IP connections??)
Not only is it the default, there is no other option. Strings are the only thing sent and received by the TCP functions. That's what all the flattening and unflattening is about.
However, I also tried to use a cluster that contains a single string connected to an Unflatten From String, and then unbundled it (because this is the format I am using in my other VIs that you have already seen) to be displayed in a second string indicator. But this does not work.
Judging by your LabVIEW code, you're still not understanding something. You receive 4 bytes. That is the length of a native integer. If you got the number 12 (or 15) to work, then it's because your Java transmitter sent the value as a four-byte value (not unreasonable).
But if you want to send a string, you can't just cram any old string into a four-byte value. That's where the two-part transmission comes in - the first four bytes tell the receiver how many MORE bytes are following. The receiver knows to receive four bytes, convert it to an integer (displaying it as a string is not useful), and call it N. The transmitter then sends N more bytes. The receiver receives N more bytes, then converts the data from string to whatever. Every transmission is in two parts, every reception is in two parts.
If you want to transmit a string from Java, you'll have to send the length first, followed by the string itself.
Something like (Forgive me if my Java's a bit rusty):
String myString = "This is a test string";
// "pout" being your connection PrintStream
pout.print(myString.length()); //send out the length
pout.print(myString); //send out the string.
On the LabVIEW receiving side, receive four bytes, and unflatten them into an I32 (display it as "N" if you like). Then receive N more bytes, and display it directly as a string.
Bundling a string (or anything else) does nothing as far as the byte structure goes - it doesn't make the result any longer (or shorter).
Whenever you get lost, go back to the working LabVIEW server, where you flatten the huge cluster and send it. Display that string using hex display, and you can see exactly what gets sent. Duplicate that in your Java server.
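On the Java side, a quick hex dump of the outgoing bytes gives you the same view to compare against (a sketch; "packet" is whatever byte array you are about to send):

// Print each outgoing byte in hex, to compare byte-for-byte with
// the string shown in LabVIEW's hex display.
for (int i = 0; i < packet.length; i++) {
    int v = packet[i] & 0xFF;   // mask to treat the byte as unsigned
    System.out.print((v < 16 ? "0" : "") + Integer.toHexString(v).toUpperCase() + " ");
}
System.out.println();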
01-19-2005 02:35 PM
@CoastalMaineBird wrote:
The code I see sets "x" to 15. If you're receiving a "12", something's wrong...
That is my fault; I originally had the value of "12" being sent.
Judging by your LabVIEW code, you're still not understanding something. You receive 4 bytes. That is the length of a native integer. If you got the number 12 (or 15) to work, then it's because your Java transmitter sent the value as a four-byte value (not unreasonable).
Ok, it is making a lot more sense now. I had a hard time understanding why I needed two TCP reads and writes. I remember now that you need to send a length first, then the actual data, in Java/C++ etc.
If you want to transmit a string from Java, you'll have to send the length first, followed by the string itself.
Something like (Forgive me if my Java's a bit rusty):
String myString = "This is a test string";
// "pout" being your connection PrintStream
pout.print(myString.length()); //send out the length
pout.print(myString); //send out the string.
On the LabVIEW receiving side, receive four bytes, and unflatten them into an I32 (display it as "N" if you like). Then receive N more bytes, and display it directly as a string.
I will try the logic of this code in Java and see if I can get it to work... It sounds like it will! Not only am I a LabVIEW newbie but Java is new to me as well. I guess I get a double whammy.
Bundling a string (or anything else) does nothing as far as the byte structure goes - it doesn't make the result any longer (or shorter). Whenever you get lost, go back to the working LabVIEW server, where you flatten the huge cluster and send it. Display that string using hex display, and you can see exactly what gets sent. Duplicate that in your Java server.
Ok, it may be a stumbling block for me to get this to work, but at least I know now that the bundling "logic" in LabVIEW causes no "incompatibility" errors, if you know what I am getting at.
Thank you very much for your help in this matter. I will post again if I get stuck. Thanks again for all the help.
01-19-2005 02:56 PM
I strongly suggest you do not jump whole hog into using your app's real cluster. First, get an I32 across the fence (you've already done that).
Then send a string.
Then send a cluster of an I32 and a string. (Change both ends).
Then add a DOUBLE to the cluster, and make that work.
Then add an array of doubles, and make that work.
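At that final step, the Java side of the cluster might look like this (a sketch with placeholder values, assuming LabVIEW's default flatten format: big-endian, with embedded strings and arrays carrying their own I32 counts; "dout" is the socket stream as in the earlier sketches):

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;

ByteArrayOutputStream buf = new ByteArrayOutputStream();
DataOutputStream block = new DataOutputStream(buf);

block.writeInt(42);                  // the I32
String s = "hello";
block.writeInt(s.length());          // embedded string: its own I32 count first...
block.writeBytes(s);                 // ...then its bytes
block.writeDouble(3.14);             // the DOUBLE
double[] wave = {1.0, 2.0, 3.0};
block.writeInt(wave.length);         // embedded array: its own I32 count first...
for (int i = 0; i < wave.length; i++) {
    block.writeDouble(wave[i]);      // ...then each 8-byte element
}

byte[] packet = buf.toByteArray();
dout.writeInt(packet.length);        // then the usual two-part send
dout.write(packet);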