09-11-2010 03:42 PM
Is it coming as binary data such as a set of 8 bytes that can be typecast into a double precision number? Or as a human readable string like 1.234 that can be converted to a number using the string to number conversion functions?
09-11-2010 03:44 PM
Since our coder is kind of inept, I believe it's going to be a set of 8 bytes. I'm not quite certain yet, but once I know I'll reply.
For right now, let's assume it's going to be 8 bytes.
09-11-2010 04:08 PM
Then you will use the Type Cast function on the Numeric >> Data Manipulation palette and wire a constant that is a double datatype into the top.
If your coder is inept, then you might have to teach him how to do his job. If he hasn't done anything yet, then perhaps you should decide what you want to get and tell him how you want him to send the data to you.
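Since I can't post a block diagram here, a quick Python sketch of what that Type Cast does to the 8 bytes may help you sanity-check the data off to the side. LabVIEW's Type Cast interprets the bytes big-endian, which is the `">d"` format below; this is just an illustration, not part of the VI:

```python
import struct

def bytes_to_double(raw: bytes) -> float:
    """Interpret 8 raw bytes as a big-endian IEEE-754 double,
    which matches what LabVIEW's Type Cast produces for a DBL."""
    if len(raw) != 8:
        raise ValueError("expected exactly 8 bytes")
    return struct.unpack(">d", raw)[0]

# Round-trip check: pack a known value, then recover it.
packed = struct.pack(">d", 1.234)
print(bytes_to_double(packed))  # 1.234
```

If the number comes out as garbage, the first thing to suspect is byte order: swap `">d"` for `"<d"` (little-endian) and see if it makes sense.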
09-11-2010 04:39 PM
Okay, so is something like this alright?
I am still relatively new to LabVIEW, so I don't know how I'm going to get the end results to show up on the three different waveforms (for instance, one is 0.5-4 Hz, the second is 4-7 Hz, and the third is 8-12 Hz).
Also, the first thing it's hooked up to is a Write To Spreadsheet File, where it's just going to be (hopefully) printing out the value of the double at any given time.
09-11-2010 05:39 PM
I placed a Write to Text File VI in there to try to debug what's supposedly going into the flatten... but nothing is writing to the text file, which is a little unnerving. Perhaps I'm not doing this correctly?
Thank you so much for your help by the way.
09-11-2010 06:20 PM
It looks like it is sending characters. Is it sending a "5", "55", or "555"? So you need the 2nd option I listed, which would be using the string conversion functions in the String >> String/Number Conversion palette.
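In text form, what those conversion functions do amounts to a string-to-float parse. A minimal Python sketch, assuming the device sends the reading as plain decimal text:

```python
def parse_reading(text: str) -> float:
    """Convert a human-readable numeric string (e.g. "1.234") to a float,
    much like LabVIEW's Fract/Exp String To Number function."""
    return float(text.strip())

print(parse_reading("1.234"))    # 1.234
print(parse_reading(" -5.40 "))  # -5.4
```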
09-11-2010 11:02 PM
Well, since the code isn't done, I plugged two Zigbees into my laptop. I was sending just "55555555" from COM4 to COM5 (the LabVIEW Zigbee). I hit the send button a couple of times, which is why it shows up in the Terminal a couple of times.
We're going to take a break for the weekend to allow the coder to finish his software. 😕 If he's not done with it by the end of the weekend, we're going to have to have a talk with him. I really want to have some actual data to play around with.
09-14-2010 03:03 PM
We have the code completed. All it took was a warning from the Dean of the school that he may be removed from the project. 😕
Anyway, now we know that the data we're getting in from the XBee is going to be a packet that looks like 01010101 (for example).
We also picked up a book called "LabVIEW for Everyone". It showed us a VI called "Boolean Array to Digital", and from there we can wire up a "Digital to Analog" VI; then it would just be an issue of splitting the waves up to show the different frequencies on the waveform charts. (We can worry about saving the data later; we just want to get it showing.)
So here's the problem we're having now: since the serial read is being flooded with bits, we need to find a way to pull out the usable bits, put them into an array, and then feed that array into "Boolean Array to Digital". One of the main worries is that we're dealing with three different waveforms, so how are we going to differentiate between wave #1, wave #2, and wave #3 after the conversions?
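For what it's worth, unpacking one received byte into the boolean array that "Boolean Array to Digital" expects can be sketched in Python like this. The MSB-first bit order is an assumption; your microcontroller may well send LSB first, so check which end comes first:

```python
def byte_to_bool_array(value: int) -> list:
    """Unpack one received byte (e.g. 0b01010101) into a list of 8
    booleans, MSB first -- roughly the array you would feed into
    LabVIEW's Boolean Array to Digital."""
    return [bool((value >> shift) & 1) for shift in range(7, -1, -1)]

print(byte_to_bool_array(0b01010101))
# [False, True, False, True, False, True, False, True]
```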
09-15-2010 08:29 PM
Got to do some live testing of the code; the microcontroller seems to actually be converting SOMETHING into digital and then sending it to LabVIEW. I set up a probe on the VISA read, and the only thing I'm getting out of the read buffer is "U", all the way up to "UUUUUUUUUUUUU". It's always U's.
I've attached the VI. I've also been looking at the X-CTU terminal while the code is running, and all we're getting there is U's too. So I guess LabVIEW is working; the code isn't.
Thank you Ravens Fan for all your help.
09-15-2010 09:12 PM
You're welcome. Good luck getting the other code to work.
For differentiating between human-readable ASCII data, you could use commas to separate it, or have a prefix in front to distinguish the data points (A100.0,B-5.40,C0.002 for example).
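A quick Python sketch of parsing that prefixed, comma-separated format; the A/B/C channel labels are just the hypothetical ones from the example above:

```python
def parse_packet(line: str) -> dict:
    """Split a prefixed, comma-separated packet like "A100.0,B-5.40,C0.002"
    into a {channel: value} dictionary."""
    values = {}
    for field in line.strip().split(","):
        channel, number = field[0], field[1:]
        values[channel] = float(number)
    return values

print(parse_packet("A100.0,B-5.40,C0.002"))
# {'A': 100.0, 'B': -5.4, 'C': 0.002}
```

In LabVIEW the same job would fall to the string functions (Match Pattern / Spreadsheet String To Array and friends), but the logic is the same: split on the commas, peel off the channel letter, convert the rest to a number.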
If it is binary data where you have a series of 8 bytes that represent a double number, then you could use the knowledge that if you get 24 bytes, the first 8 can be typecast to the 1st waveform, the next 8 to the 2nd waveform, and the final 8 to the 3rd waveform. With binary data, there is no way you could easily use any kind of delimiter byte, since the delimiter byte could just as easily be a valid byte within the 8 bytes of binary data.
But it's not out of the question. You could still use A12345678B12345678C12345678. (Here A, B, and C represent the ASCII characters; 12345678 represents the 8 consecutive bytes that get typecast to a double.) If you read 27 bytes, and the 1st, 10th, and 19th are definitely the characters A, B, and C respectively, then you know you have a good data packet. Then you can break out the intervening 8 bytes between them and typecast each set to a double. If those 3 bytes aren't A, B, and C, then you can just throw out the packet. Of course, it can be difficult to get things back on track again, so you can store the bytes in a shift register, append the next set of bytes that come in, and work your way through them until you get the good pattern of bytes. No guarantees that this is perfect, because there is still a very slight chance the A, B, and C characters could show up as valid bytes in the middle of the 3 sets of 8, but it's highly unlikely.
To guard against this, you could add another 1 or 2 bytes to the end and do a checksum on the previous 27 bytes.
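Putting the marker check, the resync-via-buffer idea, and the checksum together, here is one possible sketch in Python. The single-byte modulo-256 checksum (making each packet 28 bytes) is an assumed scheme for illustration, not anything your coder has implemented; in LabVIEW the `buf` variable plays the role of the shift register:

```python
import struct

MARKERS = (b"A", b"B", b"C")
PACKET_LEN = 28  # 27 marker/data bytes + 1 checksum byte (assumed scheme)

def checksum(data: bytes) -> int:
    """Modulo-256 sum of the 27 payload bytes (one simple checksum choice)."""
    return sum(data) % 256

def try_parse(buf: bytearray):
    """Scan the buffer for one valid A..B..C packet.
    Returns (list of 3 doubles or None, remaining buffer)."""
    while len(buf) >= PACKET_LEN:
        frame = bytes(buf[:PACKET_LEN])
        markers_ok = (frame[0:1], frame[9:10], frame[18:19]) == MARKERS
        if markers_ok and frame[27] == checksum(frame[:27]):
            values = [struct.unpack(">d", frame[i + 1:i + 9])[0]
                      for i in (0, 9, 18)]
            return values, buf[PACKET_LEN:]
        del buf[0]  # bad frame: slide forward one byte and retry (resync)
    return None, buf
```

The `del buf[0]` line is the "work your way through them" step: on a bad frame it discards one byte and tries again, so a stream that starts mid-packet eventually locks back on to the A/B/C pattern.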