10-21-2022 07:20 AM
Hi Community,
I have an odd problem, and I've looked over the UDP examples floating around on the forums as well as the pre-canned examples from NI. I have two VIs working in a Server/Client arrangement, parsing some bits from the Sender to light LEDs on the Client; this uses port 5800 on localhost. Then, on port 5900, I'm trying to send a string of data made up of bytes, where on the Client end I simply use String To Byte Array followed by Index Array to parse them. I built the parsing inside the VI first to confirm that String Control -> String To Byte Array -> Index Array -> indicators works, but when I switch over to the string that comes across from the Server VI, a decimal 30 is added to each byte. I apologize for only describing the VI; I can't share the VIs themselves (but I've added an image). I was going to try a from-scratch version tomorrow: essentially one VI sending to two ports on a receiving VI, so two ports out and two ports in, each with its own while loop.
What am I missing with UDP that could be adding this overhead value?
The parsing from the Test String control works perfectly, but when I feed in the identical string coming from the Server over UDP, 30 is added: send "1" and the array shows 31. When I attach a String Indicator, however, the string itself is perfect, e.g. ABCDE from the Sender shows up as ABCDE at the Client.
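For anyone following along outside LabVIEW, here is a minimal sketch of the same send/parse flow in plain Python (port 5900 and localhost are taken from the post; everything else is illustrative, not the actual VIs):

```python
import socket

# Receiver ("Client" side): bind to the port and wait for the datagram.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 5900))

# Sender ("Server" side): fire the string at the same port, like UDP Write.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto("ABCDE".encode("ascii"), ("127.0.0.1", 5900))

data, _addr = receiver.recvfrom(1024)   # like UDP Read
print(list(data))    # [65, 66, 67, 68, 69] -- the ASCII codes, not 1..5
print(data[0])       # 65 (hex 0x41): indexing one byte, like Index Array
```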
Thank you in advance; I'm still climbing the LV learning curve, having only picked it up a few weeks back.
10-21-2022 07:33 AM
Honestly, I did not understand very well what you are doing, but I'll note that the character "1" is one byte whose numerical value is 31 when expressed in hex (49 in decimal). Maybe this can put you on the right track?
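In text form (a quick Python check, just to illustrate the point):

```python
>>> ord("1")         # the character "1" as a number
49
>>> hex(ord("1"))    # ...i.e. 0x31 -- the digits "0".."9" live at 0x30..0x39
'0x31'
```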
10-21-2022 12:55 PM - edited 10-21-2022 12:56 PM
Hi Plum,
@PlumPine wrote:
when I feed in the identical string coming from the Server over UDP, 30 is added: send "1" and the array shows 31.
Another hint: read the Wikipedia article on "ASCII". Even though that encoding dates from the 1960s, it is still important today…
On your image:
10-21-2022 12:59 PM
OH MY GOODNESS crew, I have to completely admit a rookie mistake here. I absolutely did not double-check the display format of the string field, and it was decimal of course, which confirms the ASCII hint a few of you pointed to, and you were perfectly correct. I changed the string to hex display format and voila! Geez, I apologize; it was a long week with many distractions and I thought I had an actual issue in the block diagram.
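In text form, the distinction looks roughly like this (an illustrative snippet, not LabVIEW, showing one received byte under two display radixes):

```python
raw = b"1"                      # exactly what arrives over UDP
print(raw[0])                   # 49 -> what a decimal display shows
print(format(raw[0], "02X"))    # 31 -> what a hex display shows: same byte
```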
Problem solved!
Thank you all for contributions and keep coding 🙂
10-21-2022 02:30 PM
As a very first step, you need to eliminate that glaring race condition due to the blatant overuse of a local variable. Since there is no data dependency, both "code islands" (the train on the left and the hairball on the right) will execute in parallel, and there is obviously no guarantee that the local variable read of "In" occurs after the terminal write of "In". Most likely the local variable will return a previous (stale) value. Get rid of the local and branch the wire to both sinks (the terminal and Index Array) to ensure things operate correctly.
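For readers more used to text languages, a rough analogy of that race (Python threads standing in for the two parallel code islands; the names are made up for illustration):

```python
import threading, time

shared = ""                      # plays the role of the "In" local variable

def writer():                    # the "train" that eventually writes the terminal
    global shared
    time.sleep(0.01)             # arbitrary delay: ordering is not guaranteed
    shared = "ABCDE"

def reader():                    # the "hairball" reading the local variable
    print("reader saw:", repr(shared))   # may print '' (stale) or 'ABCDE'

t1, t2 = threading.Thread(target=writer), threading.Thread(target=reader)
t1.start(); t2.start()
t1.join(); t2.join()

# The fix mirrors the advice above: create a real data dependency by passing
# the value directly (in LabVIEW: branch the wire) instead of sharing state.
```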
There are some other glaring errors, for example:
10-21-2022 03:11 PM
@altenbach wrote:
There are much simpler ways to get the value of three bits.
Three (of many) possibilities to get the lower three bits. For the bottom two, you should use an array indicator with a caption describing the three elements.
(Your code is basically the one on top; using a "not equal zero" comparison would have eliminated the "1" constant.)
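For the text-language crowd, here are three equivalent ways to pull out the lower three bits (illustrative Python, not the posted diagrams):

```python
x = 0b10110101   # incoming value

# 1) Mask each bit and test against zero ("not equal zero" -- no "1" constant needed).
bits = [(x & (1 << i)) != 0 for i in range(3)]     # [True, False, True]

# 2) Mask off the lower three bits first, then split them.
low3 = x & 0b111                                   # 0b101 == 5
bits2 = [(low3 >> i) & 1 for i in range(3)]        # [1, 0, 1]

# 3) Spell it out bit by bit.
bit0, bit1, bit2 = x & 1, (x >> 1) & 1, (x >> 2) & 1
print(bits, bits2, (bit0, bit1, bit2))
```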