09-06-2018 01:09 PM - edited 09-06-2018 01:09 PM
@lavadisco wrote:
crossrulz, what is the second to last element in the chain in the graphic you posted? The module just before the ASCII output. Can't find it.
"concatenate string" (resized to height=1)
09-06-2018 01:56 PM - edited 09-06-2018 01:56 PM
@lavadisco wrote:
crossrulz, what is the second to last element in the chain in the graphic you posted? The module just before the ASCII output. Can't find it.
Here is a new snippet, this time in LabVIEW 2016, with the functions labeled, and a fix (set the width of the "Hexadecimal String" to 2 to make sure that 0 is in there).
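For readers following along in text rather than from the LabVIEW snippet, the conversion chain (loop over the raw bytes, Number To Hexadecimal String with a width of 2, Concatenate Strings) behaves roughly like this sketch. Python is used purely for illustration; the width-2 fix corresponds to the `%02X` format, which keeps the leading zero:

```python
def raw_to_ascii_hex(raw: bytes) -> str:
    """Convert raw bytes to space-separated ASCII hex pairs."""
    # Width 2 ("%02X") is the fix mentioned above: it keeps the
    # leading zero, so 0x0A becomes "0A" rather than "A".
    return " ".join("%02X" % b for b in raw)

print(raw_to_ascii_hex(b"\x01\xAB\x0F"))  # -> 01 AB 0F
```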
09-06-2018 02:46 PM
Thanks! So would the raw data input here be the output from my Visa Read module?
09-06-2018 03:24 PM - edited 09-06-2018 03:28 PM
@crossrulz wrote:
@lavadisco wrote:
Was looking for a quick solution here, but the quickest thing to do given my greater experience in C vs LabVIEW is to just write an ASCII conversion on the microcontroller that I can comment out later, which I already did and it works. However, in the future when I have time I'd like to re-write the VI to accommodate the hex stream.
Doing a raw to ASCII Hex is pretty simple in LabVIEW. Do note that this is very inefficient compared to directly parsing the raw data, but it will get the data in the format you seem to like.
Just to clarify a bit: it's inefficient because it takes up twice as much memory. The byte value "AB" takes up one byte. To represent it as an ASCII "A" and ASCII "B" requires two bytes. To me, it's an exercise in futility because to parse the data you would have to re-convert it into raw data anyway. Just leave it the way it is and save some unnecessary work later.
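The doubling described here can be made concrete in a couple of lines (Python purely for illustration, since the thread's code is LabVIEW):

```python
raw = bytes([0xAB])           # one byte as it arrives from the instrument
ascii_hex = "%02X" % raw[0]   # the same value as ASCII text: "AB"

assert len(raw) == 1          # 1 byte in raw form
assert len(ascii_hex) == 2    # 2 bytes as ASCII hex -- twice the memory

# And to actually use the value later, you convert it right back:
value = int(ascii_hex, 16)
assert value == 0xAB
```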
I guess the only thing this would be useful for is if you wanted to format the raw data to print it out on a piece of paper (or to a human-readable text file). Waitaminit. I guess that would be pretty useful after all. I wouldn't store the data for manipulation later that way, though.
09-06-2018 03:27 PM - edited 09-06-2018 03:28 PM
@crossrulz wrote:
Here is a new snippet, this time in LabVIEW 2016, with the functions labeled, and a fix (set the width of the "Hexadecimal String" to 2 to make sure that 0 is in there).
I put this between the output of my Visa Read and the input to Scan from String. It works great! BUT... after a given number of cycles, it suddenly goes wonky. I can see the LSBs changing as each new string comes in (once per second) and it grabs anywhere from ~10 to ~50 strings successfully, but then something glitches and the numbers get all messed up and don't ever return to normal unless I stop and restart the VI. Any idea why?
09-06-2018 03:31 PM
@lavadisco wrote:
@crossrulz wrote:
Here is a new snippet, this time in LabVIEW 2016, with the functions labeled, and a fix (set the width of the "Hexadecimal String" to 2 to make sure that 0 is in there).
I put this between the output of my Visa Read and the input to Scan from String. It works great! BUT... after a given number of cycles, it suddenly goes wonky. I can see the LSBs changing as each new string comes in (once per second) and it grabs anywhere from ~10 to ~50 strings successfully, but then something glitches and the numbers get all messed up and don't ever return to normal unless I stop and restart the VI. Any idea why?
What you're describing seems really Rube-ish and fragile. WHAT ARE YOU TRYING TO DO???
09-06-2018 04:27 PM
I see that I answered the "wrong question", but have to say I agree with Billko that this entire exercise doesn't make sense. Now I understand (but may still be wrong) that you have a binary data stream, format unspecified (could be U8's, Dbls, pointers, pictures, etc.). Why change its format, why not just dump it into a binary file?
Bob Schor
09-06-2018 05:27 PM
Bob Schor wrote:
I see that I answered the "wrong question", but have to say I agree with Billko that this entire exercise doesn't make sense. Now I understand (but may still be wrong) that you have a binary data stream, format unspecified (could be U8's, Dbls, pointers, pictures, etc.). Why change its format, why not just dump it into a binary file?
Crossrulz' suggestion was very simple and elegant, and it worked up until the point it glitched. If that had worked, it would have been exactly what I was looking for. 3 little things that I could slip in between my Visa Read function and my Scan From String function. Easy. Any idea why it glitches out? If I can figure that out that's my ideal solution.
When you say "why not just dump it into a binary file", that's meaningless to me. I know what a file is, I know what binary is and how to manipulate it, but in the context of LabVIEW I don't know what you mean, and it sounds like it's a lot more work than Crossrulz' solution.
09-06-2018 05:38 PM
Show your VI where you are actually gathering the data. Byte/hex/decimal/character conversions aren't going to cause "glitches".
Problems with your serial wiring, or the way you are acquiring the serial data in LabVIEW are the likely cause.
My first two guesses are that you are opening and closing the serial port on every read, or you are using "Bytes at Port".
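As a hypothetical illustration of why a "Bytes at Port"-style read eventually glitches while a fixed-count read does not (Python, with a simulated byte stream and 4-byte frames standing in for the serial buffer and the 38-byte messages in this thread):

```python
# The device sends fixed-length frames back-to-back.
stream = b"\x01\x02\x03\x04" * 3     # three 4-byte frames

# Fixed-count reads: take exactly one frame's worth each time,
# so every chunk lines up with a frame boundary.
frames = [stream[i:i + 4] for i in range(0, len(stream), 4)]
assert all(f == b"\x01\x02\x03\x04" for f in frames)

# "Bytes at Port" style: read whatever happens to be buffered at that
# instant (say 3 bytes, then 5). The chunks drift off the frame
# boundaries, and every read after that parses garbage.
chunks = [stream[0:3], stream[3:8]]
assert chunks[0] != b"\x01\x02\x03\x04"   # partial frame: data looks "glitched"
```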
09-06-2018 06:11 PM - edited 09-06-2018 06:33 PM
Here's a pic of the serial read functionality up to Scan From String. Bytes to Read on the Visa Read function is set to 38, which is the length of the raw hex string that comes in once a second.
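Once the hex conversion is in place, the Scan From String step is essentially parsing ASCII hex tokens back into numbers. In text form, that amounts to something like this (Python for illustration; the 3-byte message here is made up, standing in for the 38-character string described above):

```python
line = "0A 1B 2C"  # hypothetical ASCII-hex message after conversion
# Parse each two-character hex token back into an integer value --
# the round trip Billko pointed out earlier in the thread.
values = [int(token, 16) for token in line.split()]
print(values)  # -> [10, 27, 44]
```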