Reading numbers over serial

Solved!

I have an Arduino Uno hooked up through USB reading all 6 analog inputs and formatting them (I know it's not ideal for efficiency) as 14 bytes: [upper 8][lower 8] [upper 8][lower 8] ... [carriage return][line feed], so every transmission ends with 13 10. The problem is that sometimes a reading's low byte is actually 10, and then the array isn't parsed right and the buffer starts filling up (the number of bytes at the port skyrockets and latency follows close behind); basically everything spirals out of control and I have to reset. I forced the VISA Read to 14 bytes, but that still isn't helping. I removed the string stripping in case that was the issue, but it's not. The program already handles part of what I want: it always cuts off the 13 and 10 at the end since they aren't needed. Since every upper byte is either 0, 1, 2, or 3 (that's just what happens when you split a 10-bit number into two 8-bit bytes), can I just add some kind of detection?
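For anyone who doesn't want to rename the attachment, the sending side looks roughly like this (a minimal sketch of the packing described above, not the attached .ino itself; the baud rate and delay are placeholders):

// Illustrative Arduino sketch: read 6 analog inputs and send each 10-bit
// value as [upper 8][lower 8], then CR LF. The low byte can legitimately
// come out as 10 (0x0A), which is the same value as the line feed.
void setup() {
  Serial.begin(9600);
}

void loop() {
  for (int pin = 0; pin < 6; pin++) {
    int value = analogRead(pin);     // 0..1023, so the high byte is 0..3
    Serial.write(highByte(value));   // upper 8 bits
    Serial.write(lowByte(value));    // lower 8 bits -- can equal 10
  }
  Serial.write(13);                  // carriage return
  Serial.write(10);                  // line feed
  delay(10);
}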

 

It might be helpful to know that everything works perfectly fine in RealTerm, which keeps the 13 and 10 at the end of each packet.

 

(LabVIEW and Arduino code attached [you'll need to rename the Arduino file from .doc to .ino]. You shouldn't need to hook anything up to the Arduino to get it to run, so if you have one around for testing it should work fine.)

Message 1 of 4
Solution
Accepted by topic author Neywiny

Is your data being sent as binary?

 

If so, then disable the termination character. Since you know you will always read 14 bytes, wire 14 into the byte count of the VISA Read.

 

Do not mix a carriage return/line feed with binary data. Those characters work as termination characters when you are using human-readable ASCII characters, but they serve no purpose when your data is binary.
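In LabVIEW this is just the termination-character setting on VISA Configure Serial Port plus the byte count on VISA Read, but the same idea in the VISA C API looks roughly like this (a sketch, not your VI; the resource name, baud rate, and timeout are assumptions, and error checking is omitted):

// Sketch using the NI-VISA C API: disable the termination character and
// read exactly 14 bytes per packet.
#include <visa.h>
#include <stdio.h>

int main(void) {
    ViSession rm, serial;
    ViUInt32 got;
    unsigned char packet[14];

    viOpenDefaultRM(&rm);
    viOpen(rm, "ASRL3::INSTR", VI_NULL, VI_NULL, &serial);

    viSetAttribute(serial, VI_ATTR_ASRL_BAUD, 9600);
    viSetAttribute(serial, VI_ATTR_TERMCHAR_EN, VI_FALSE);  // don't stop on 0x0A
    viSetAttribute(serial, VI_ATTR_TMO_VALUE, 2000);        // 2 s timeout

    if (viRead(serial, packet, sizeof packet, &got) >= VI_SUCCESS && got == 14) {
        for (int i = 0; i < 6; i++) {
            // Upper byte first, as described in the original post.
            int value = (packet[2 * i] << 8) | packet[2 * i + 1];
            printf("A%d = %d\n", i, value);
        }
    }

    viClose(serial);
    viClose(rm);
    return 0;
}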

Message 2 of 4

I figured it would need a termination character to keep track of the packet's start and end, but you're right: disabling it made everything work fine. Thank you.

Message 3 of 4

@Neywiny wrote:

I figured it would need a termination character to keep track of the packet's start and end, but you're right: disabling it made everything work fine. Thank you.


Typically with binary transmission, the message will start with a specific character, such as 0x02, maybe have a byte stating the length of the message, then the data, followed by a checksum of some sort (either add all of the message bytes or do a CRC). You sync by reading 1 byte at a time until the start byte is read. Then read the rest of the message and verify the checksum. If it fails, throw away your data and try again.
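In C-style code the receive loop looks something like this (a sketch only: read_byte() is a placeholder for a blocking single-byte read from your port, and the additive checksum is just one of the options mentioned above):

// Illustrative framing reader: hunt for the start byte, then read a
// length-prefixed payload and verify a simple additive checksum.
#include <stdint.h>
#include <stdbool.h>

#define START_BYTE 0x02

extern uint8_t read_byte(void);   // placeholder: read one byte from the port

bool read_packet(uint8_t *payload, uint8_t *out_len) {
    // 1. Sync: discard bytes until the start byte appears.
    while (read_byte() != START_BYTE) {
        /* keep hunting */
    }

    // 2. Length byte tells us how much payload to expect.
    uint8_t len = read_byte();

    // 3. Read the payload and accumulate a simple additive checksum.
    uint8_t sum = 0;
    for (uint8_t i = 0; i < len; i++) {
        payload[i] = read_byte();
        sum += payload[i];
    }

    // 4. Compare against the transmitted checksum; on a mismatch the caller
    //    throws the data away and tries again from step 1.
    if (sum != read_byte()) {
        return false;
    }

    *out_len = len;
    return true;
}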


Message 4 of 4