09-06-2018 06:50 PM
We can't see what your terminals are set to. What do you have set for "End read on termination character" and "Termination char"?
Because if you have termination enabled, you're just rolling the dice until that character comes in as part of one of your bits of data, and cuts off your VISA read early, and then all other subsequent reads are offset by that amount.
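To see why that offset accumulates, here is a small Python simulation of the failure mode (the function name and frame size are illustrative, not real VISA calls): a fixed-size binary read that ends early whenever the termination byte shows up as ordinary data, leaving the leftover bytes to shift every later read.

```python
# Simulation of how an enabled termination character corrupts fixed-size
# binary reads. TERM and FRAME_SIZE are illustrative values.

TERM = 0x0A          # newline, a common termination character
FRAME_SIZE = 8       # pretend the device sends 8-byte frames

def visa_like_read(stream, count, term_enabled):
    """Read up to `count` bytes, stopping early at TERM if enabled,
    the way a VISA read with 'End read on termination character' does."""
    out = bytearray()
    while len(out) < count and stream:
        b = stream.pop(0)
        out.append(b)
        if term_enabled and b == TERM:
            break            # read ends early; the rest stays buffered
    return bytes(out)

# Two frames; the first happens to contain 0x0A as ordinary data.
stream = list(bytes([1, 2, 0x0A, 4, 5, 6, 7, 8]) + bytes(range(0x20, 0x28)))

first = visa_like_read(stream, FRAME_SIZE, term_enabled=True)
second = visa_like_read(stream, FRAME_SIZE, term_enabled=True)
print(len(first))   # first read is cut short at the 0x0A data byte
print(second)       # second read starts mid-frame: every value is offset
```

With termination disabled, both reads would return clean 8-byte frames.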
09-06-2018 07:09 PM
Welp, I in fact did have a termination character set, doh. Turned that false and this VI has been running perfectly for about 10 minutes now without a glitch! It's perfect, thank you everyone for your help. I've marked crossrulz' suggestion as the solution, but extra thanks to you as well Kyle97330.
09-06-2018 07:21 PM
@lavadisco wrote:
Welp, I in fact did have a termination character set, doh. Turned that false and this VI has been running perfectly for about 10 minutes now without a glitch! It's perfect, thank you everyone for your help. I've marked crossrulz' suggestion as the solution, but extra thanks to you as well Kyle97330.
Just an FYI, if you believe that the answer was split between two (or more) posts, you don't have to choose - you can mark more than one as the solution. 🙂 (I feel the solution wouldn't have worked without Kyle's input...)
09-06-2018 07:22 PM
@billko wrote:
Just an FYI, if you believe that the answer was split between two (or more) posts, you don't have to choose - you can mark more than one as the solution. 🙂 (I feel the solution wouldn't have worked without Kyle's input...)
Didn't know that, thanks! I've marked Kyle's post as a solution as well.
09-07-2018 05:57 AM
@lavadisco wrote:
I put this between the output of my Visa Read and the input to Scan from String. It works great! BUT... after a given number of cycles, it suddenly goes wonky. I can see the LSBs changing as each new string comes in (once per second) and it grabs anywhere from ~10 to ~50 strings successfully, but then something glitches and the numbers get all messed up and don't ever return to normal unless I stop and restart the VI. Any idea why?
Ultimately, what this comes down to is the lack of a robust protocol at the serial message level. You are just sending out X bytes every second with no way to tell where a message actually starts or whether the data arrived intact. When using a binary/hex/raw protocol, a typical method is to use a start byte (0x02, STX, is common), a byte stating how many bytes are in the message, the data, and then a checksum (add all of the data bytes, ignoring overflow) or a CRC (Cyclic Redundancy Check). You then need to implement this protocol on both sides of the bus.
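As a minimal sketch of that framing scheme in Python (the exact format here - STX, one length byte, payload, 8-bit additive checksum - is just one reasonable choice; both ends of the link have to agree on whatever you pick):

```python
# Build and parse frames of the form: STX | length | payload | checksum,
# where checksum is the 8-bit sum of the payload bytes (overflow ignored).

STX = 0x02

def build_frame(payload: bytes) -> bytes:
    checksum = sum(payload) & 0xFF           # add bytes, ignore overflow
    return bytes([STX, len(payload)]) + payload + bytes([checksum])

def parse_frame(buffer: bytes):
    """Scan for STX and validate length and checksum. Returns
    (payload, remaining_bytes), or (None, remaining_bytes) if no
    complete valid frame has arrived yet."""
    i = buffer.find(bytes([STX]))
    if i < 0:
        return None, b""
    if len(buffer) < i + 2:
        return None, buffer[i:]              # wait for the length byte
    length = buffer[i + 1]
    end = i + 2 + length + 1                 # index just past the checksum
    if len(buffer) < end:
        return None, buffer[i:]              # wait for more bytes
    payload = buffer[i + 2 : i + 2 + length]
    if sum(payload) & 0xFF != buffer[end - 1]:
        return None, buffer[i + 1:]          # bad checksum: resync past STX
    return payload, buffer[end:]

frame = build_frame(b"\x12\x34\x56")
payload, rest = parse_frame(b"\x99" + frame)  # leading garbage is skipped
```

The key property is that the receiver can resynchronize: garbage before the STX is skipped, and a corrupted frame fails the checksum instead of silently shifting every later read.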
And, yes, do make sure the termination character is turned off when dealing with binary/hex/raw data. It has bitten me more than once.
09-07-2018 04:20 PM
Hard to tell if the "glitches" are due to corrupted received data or due to the processing, but I suspect something happens during transmission. Is the number of received bytes the same as the number of requested bytes?
Can you define "glitch"? What are the exact symptoms? (value is zero, value is random, error generated, etc.)
From looking at the severely truncated code picture, all you do is take a binary string, do a lot of Rube Goldberg gymnastics, and end up with a series of integers. Is that right? This really screams for simplification, even if you leave all the rest of the code in place.
Can you isolate it to a tiny VI that generates a random (or well-defined) 38-byte string, pipes it through your algorithm, and displays the various outputs? Attach that VI here.
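For comparison, here is a text-based analogue of that tiny test harness in Python: generate a well-defined 38-byte frame and unpack it into integers in one step. The layout used here (one status byte, one counter byte, and nine big-endian 32-bit values, totaling 38 bytes) is purely an assumed example; the real layout is whatever the instrument actually sends.

```python
import struct

def parse(frame: bytes):
    """Unpack a 38-byte frame into integers in one call, replacing the
    string-slicing gymnastics with a single format string."""
    if len(frame) != 38:
        raise ValueError(f"expected 38 bytes, got {len(frame)}")
    # ">BB9I" = big-endian: 1 status byte, 1 counter byte, nine uint32s
    # (1 + 1 + 9*4 = 38 bytes)
    return struct.unpack(">BB9I", frame)

# Well-defined test frame, built with the same format string
frame = struct.pack(">BB9I", 1, 7, *range(100, 109))
print(parse(frame))
```

Feeding known frames through the parser like this makes it easy to tell whether the "glitch" lives in the parsing logic or in the bytes coming off the wire.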