LabVIEW


Need help preventing VISA Read from interpreting value at port

I am trying to build an interface that will read values from the UART of a microcontroller.  I do not want to send ASCII characters, as I find that very inefficient for my application, where sending the actual bytes is preferred.  It is easy enough to receive the bytes and convert them to hex or decimal; however, LabVIEW seems to interpret the values first.  When one of the bytes is near the low end of the ASCII table (e.g., 0x0A, 0x0D), LabVIEW treats it as a New Line, Carriage Return, etc., and this ends up shifting the data I am expecting: if I am reading 2 bytes and one of them is a character that gets interpreted as something other than a raw value, it appears to skip that byte and read the next one, which should be part of the next transmission.

 

Is there any way to prevent this?  I really just want LabVIEW to read the bits on the serial port and NOT do anything with them itself.  I will handle the post-processing of the data.
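For comparison, a text-language equivalent of what is being asked for, written as a minimal Python sketch with pyserial (the port name "COM3" and baud rate are placeholders), would read a fixed count of raw bytes with no termination character and combine them by hand:

import struct
import serial  # pyserial

# Placeholder port name and baud rate; substitute the actual settings.
port = serial.Serial("COM3", 115200, timeout=1.0)

raw = port.read(2)                        # read exactly 2 raw bytes; nothing is interpreted
if len(raw) == 2:
    value = struct.unpack(">H", raw)[0]   # big-endian: MSByte first, then LSByte
    print(f"bytes: {raw.hex()}  value: {value}")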

Message 1 of 27

That's pretty vague.  Would you mind posting a picture or the VI itself so we can better understand your problem?

 

Thanks!  🙂

 

Bill

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 2 of 27

Not really sure what is vague.  The end result is that I would like LabVIEW to read the bits at the port and just provide them to me without processing them itself.

 

One thing I am trying to do is read the value from a 16-bit ADC.  The uC correctly reads the value and can send it via UART to LabVIEW.  If the value is such that both bytes are above, say, 0x20, the value is correctly interpreted and the bytes are received in the correct order (MSByte first, then LSByte).

 

However, if one of the bytes is relatively low in value, say below 0x20, LabVIEW first interprets it as meaning something, such as New Line, Carriage Return, Backspace, etc., and does not appear to give me that byte.  Instead, since I am telling it to read and return 2 bytes, it grabs the next byte, which is the MSByte of the next value.  From then on I receive the LSByte and then the MSByte instead of the other way around.  This continues until another byte happens to be the right character to be ignored, and the order reverses again.
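To make the symptom concrete, here is a small self-contained sketch (Python, with made-up sample values) of what losing a single low-valued byte does to 2-byte framing:

# Three 16-bit samples sent MSByte first: 0x8CA0, 0x7D0A, 0x8CA1
stream = bytes([0x8C, 0xA0, 0x7D, 0x0A, 0x8C, 0xA1])

def read_pairs(data):
    """Pair consecutive bytes as (MSByte, LSByte) 16-bit values."""
    return [(data[i] << 8) | data[i + 1] for i in range(0, len(data) - 1, 2)]

print(read_pairs(stream))               # [36000, 32010, 36001] -- correct framing

corrupted = stream[:3] + stream[4:]     # suppose the 0x0A byte were silently swallowed
print(read_pairs(corrupted))            # [36000, 32140] -- pairs now straddle two samples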

 

I am not sure exactly which character causes this reversal (the program operates too quickly), but after testing a few values it seems to be one of the 'system' characters.

Message 3 of 27
The vague part is that there are a lot of different ways to read data into LabVIEW.  Maybe you're using one that is not suitable for the task.  Or maybe the code that was written isn't handling the data correctly.  That's why posting at least a picture would help a lot.  I don't even know your level of expertise, so bear with me and forgive me if I'm going over stuff you already know.  🙂
Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 4 of 27

LabVIEW will not interpret anything unless you tell it to.  First, you should have the termination character disabled in VISA Configure Serial Port.  If you don't, the VISA Read will stop reading as soon as it sees the termination character.  Second, the string that is returned can be displayed in any manner you choose.  The default is 'Normal' (ASCII), but you can simply right-click the indicator and choose Hex Display.
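For readers more comfortable with text-based VISA, the same two pieces of advice look roughly like the following PyVISA sketch (the ASRL resource name and baud rate are placeholders, and this is only an analogy to the LabVIEW settings described above):

import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("ASRL3::INSTR")   # placeholder serial resource name
inst.baud_rate = 115200

inst.read_termination = None      # analogue of wiring FALSE to the termination-character enable
raw = inst.read_bytes(2)          # counterpart of a VISA Read with a byte count of 2
print(raw.hex())                  # the 'Hex Display' view of the raw bytes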

 

If you do not have some sort of synchronization method in your code, of course you will sometimes receive the LSB before the MSB.  That has nothing to do with LabVIEW; it is a fault in your programming.  How can any program know which byte is the LSB and which is the MSB if you don't provide some mechanism to determine which is which?  Your device is constantly sending out data, and the time at which you start reading 2 bytes is random.
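One common synchronization scheme, sketched below in Python with pyserial and a made-up command byte, is strict command-response framing: request one sample, then read exactly the two bytes of the reply, so the MSByte/LSByte pairing can never drift.

import struct
import serial  # pyserial

REQUEST_SAMPLE = b"\x52"                  # hypothetical 'send one sample' command byte

def read_sample(port):
    """Request one 16-bit sample and return it, or None on a timeout/framing problem."""
    port.reset_input_buffer()             # drop any stale bytes left over from earlier replies
    port.write(REQUEST_SAMPLE)
    reply = port.read(2)                  # blocks until 2 bytes arrive or the timeout expires
    if len(reply) != 2:
        return None
    return struct.unpack(">H", reply)[0]  # MSByte first, then LSByte

port = serial.Serial("COM3", 115200, timeout=1.0)   # placeholder port settings
print(read_sample(port))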

Message 5 of 27

Attached is a capture.

 

If I set the value to something like 36000 (0x8CA0), no problem.  The Measured value settles on 36000, give or take a few LSBs of noise.

 

If I set the value to something like 32000 (0x7D00), the output tries to get to 32000 and then begins to feed back incorrect results.  I've seen values in the 40000s, then the next sample would be 17000, and so on.

 

I know the ADC is not reading an incorrect value: I've already tested it, and I can measure the voltage the ADC is sampling and it is not changing.  That leaves how LabVIEW interprets the bytes received on the port.  Instead of merely reporting the value it read, it appears to assume the data is a string of ASCII characters, and if a character shows up that is a control function rather than something to display, it performs that function and does not return the byte correctly.

 

*Edit: The >= and < signs in the description of the IF block are backwards, but the input is wired correctly.

Message Edited by am0n on 08-26-2009 10:38 AM
Message 6 of 27
The read is not random.  The uC only sends back the value after receiving a command to do so.  I have also set Enable Termination Char to False.
Message 7 of 27

Attached are two images.  One shows the Desired value at 34400 (0x8660).  As you can see, it reads and converts the value correctly.  I can let it sit there forever, and as long as the noise doesn't cause it to drift too low, I will always have 0x86 as the MSByte.

 

The second image is from when I went from 34400 down to 34330 (0x861A).  Now, because of noise, the LSByte wiggled too low and at some point the 'reversal' happened: the 0x86 is now being read second.  My only conclusion is that it is reading the MSByte of the next sample.  (If the current sample returned only one of its bytes, the LSByte of that sample would still be in the buffer.  After requesting the next two bytes there would be three in total in the buffer, and it would return the first two: the LSByte of the previous sample and the MSByte of the present sample.)
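If that is really what is happening, the stray byte should be visible in the receive buffer.  A small diagnostic sketch of that idea (Python with pyserial, placeholder port settings):

import serial  # pyserial

port = serial.Serial("COM3", 115200, timeout=1.0)   # placeholder port settings

waiting = port.in_waiting        # bytes already sitting in the receive buffer
if waiting % 2:                  # an odd count means the 2-byte framing has slipped
    print(f"{waiting} byte(s) waiting -- framing has slipped; flushing to realign")
    port.reset_input_buffer()    # discard the stray byte and start clean

sample = port.read(2)            # the next read is paired correctly again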

 

Now, since I have Enable Termination off, I am not sure what else would be causing this.

Message 8 of 27

Instead of converting the string to a byte array and casting it to a U16, why not use Hex String to Number?  It could be something funny happening when the first element in the array is 0.

 

Bill

Bill
CLD
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
Humble author of the CLAD Nugget.
Message 9 of 27
Because Hex String to Number doesn't work like that.  It requires that the string be hex-formatted text, e.g. the string 'F109'.  Since my string is the ASCII equivalent of the raw bytes, it doesn't convert to a number.
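The distinction, illustrated as a short Python sketch with a made-up value:

raw = b"\x86\x60"                    # the two raw bytes as they arrive from the UART (0x8660 = 34400)

# Interpreting the bytes themselves (what casting a byte array to a U16 does):
print(int.from_bytes(raw, "big"))    # 34400

# A hex-string parser expects the *text* "8660", i.e. the characters '8','6','6','0':
print(int("8660", 16))               # 34400, but only because the value was spelled out as text

# The raw bytes are not that text; they are the byte values 0x86 and 0x60:
print(list(raw))                     # [134, 96]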
Message 10 of 27