Bytes at Port problem on OS X

>What serial hardware are you using to communicate with your instrument, and what is the part number?
>What version of VISA and serial drivers are installed?

Ah, it's been a while--thought that was in my original post...

LabVIEW 8.2.1, NI-VISA 4.2, silabs "CP210x Macintosh OSX Driver v1.02"

We have a Silicon Laboratories CP2103 USB-to-UART bridge:

http://www.silabs.com/public/documents/tpub_doc/dshort/Microcontrollers/Interface/en/CP2103_Short.pdf

In a nutshell, there is a serial port in our product, along with the CP2103, so that the serial protocol is carried over USB. The host computer runs their (Silicon Labs) serial driver, which talks to the CP2103. Essentially, the host software thinks it's talking to a UART, and the processor in the product thinks it's talking to a UART, but the combination of the driver on the host computer and the CP2103 on the other end bridges the transaction over USB.

This works fine on the PC with LabVIEW 8.2.1, but not on the Mac (OS X 10.4.11, both PowerPC and Intel processors).

Note that if I write to the serial driver in C, bypassing LabVIEW (or as a framework callable from LabVIEW), it works fine. That is, I send a serial message to our device, and expect a 6-byte reply. Addressing the driver from C, I find that--after giving it enough time--6 bytes are waiting. I can read those bytes, and all is as expected.

From LabVIEW, though, the Bytes at Port property tells me that 7 bytes are waiting (again, on the PC it returns 6). If I try to read them, I can only read 6, and eventually get a time-out error waiting for the 7th. Since I expect a 6-byte reply, I can disregard the Bytes at Port value and just request 6 bytes. This works, but I get a warning that there may be additional bytes available at the port (of course, there are no additional bytes waiting).

In this case, I can easily work around it, because I'm always expecting a 6-byte reply, but in the future we may get packets of arbitrary size returned.

thanks,

Nigel
Message 11 of 19
Hi Nigel,
 
Have you tried sending/reading more than 6-byte replies to see if LabVIEW always adds an additional byte? 
 
Regards,
Andy L.
Applications Engineer
National Instruments
Message 12 of 19
>Have you tried sending/reading more than 6-byte replies

I don't have that option--the device sends only 6-byte replies... Let's see... I could code something up that doesn't read the first packet and requests another (for 12 bytes total waiting)--I'll try that. I'd like to put a large delay between bytes and see if the first byte received gives a reading of 2 bytes waiting, but I don't have access to that capability.
Message 13 of 19

@Andy L wrote:

Have you tried sending/reading more than 6-byte replies to see if LabVIEW always adds an additional byte? 




OK, I modified the code to send an arbitrary number of requests before reading the input. The Bytes at Port value is always 7 per request, but the number of reply bytes in the input buffer is always 6 per request. That is, if I send four requests before reading, Bytes at Port returns 28; if I read the input, I find 24 bytes in the buffer (the expected 6-byte reply, repeated four times).
Message 14 of 19

@Andy L wrote:

Have you tried sending/reading more than 6-byte replies to see if LabVIEW always adds an additional byte? 




I just found out that it's worse than that...

Most of the messages are commands to the devices, which return 6-byte acknowledge packets (again, "Bytes at Port" (BaP) tells me 7 bytes are waiting, but in fact only 6 are). Now I need to query information from the devices, which is returned as 518-byte reply packets. For these, BaP returns 1020, although only 518 bytes are available to be read. This is true on both my PowerMac G5 Quad and my MacBook (Intel Core 2 Duo), both running OS X 10.4.11 (Tiger).

On both computers, I've accessed the devices from C in a similar manner, and the driver returns the expected count of 518 bytes waiting in the input buffer ("ioctl(fd, FIONREAD, &numBytes);").

Again, BaP works fine under Windows. I don't see how this can be anything but a LabVIEW bug, or at least LabVIEW interacting with the driver in an unexpected way under OS X when ascertaining the number of bytes waiting.

OK, just got an idea... I'm wondering if the incoming bytes are getting stored in the input buffer as Unicode, and LabVIEW (BaP) is determining the available bytes by checking the buffer bounds instead of querying the driver. For instance, I notice that my 6-byte acknowledge packets have one 0xFF byte, and all other bytes are less than 0x7F. That would account for BaP being 7, if I'm right.

I just wrote a VI that reads the 518-byte packets a byte at a time, checking BaP each time. BaP starts out at 1020; I read the first byte, 0x02, and BaP is now 1019; read 0x06, BaP is 1018; read 0xFF, BaP is... 1016--this time the count decremented by 2. The next two bytes are 0x01 and 0x06, and BaP decrements by 1 for each, but the rest of the block is all 0xFF's, except for the last byte (checksum), which is 0xF1.

The math didn't come out exactly right for it to be simply that 0xFF counted as two for BaP, so I had to check closer: after reading the 66th byte, the BaP count _incremented_ by 9!

Here's the gist of it--518 bytes; all the omitted ones are 0xFF (each decrementing BaP by 2):

byte read (hex), BaP after read
02, 1019
06, 1018
ff, 1016
01, 1015
06, 1014
ff, 1012
...
ff, 892
ff, 901 (!)
ff, 899
...
ff, 1
f1, 0
Message 15 of 19
Hello earlevel,

It is very difficult to say whether the problem is with the VISA driver or with the USB-232 firmware/driver. You might want to try running a similar application using a terminal program like ZTerm (like HyperTerminal on Windows). You could verify how many bytes are actually received. This would help narrow down the problem.

You can also post an NI-SPY capture; you can refer to this knowledgebase on how to perform an NI-SPY capture. This may also help in narrowing down the problem.

Have a great day.
O. Proulx
National Instruments
www.ni.com/support
Message 16 of 19

@O_Proulx wrote:
It is very difficult to say whether the problem is with the VISA driver or with the USB-232 firmware/driver. You might want to try running a similar application using a terminal program like ZTerm (like HyperTerminal on Windows). You could verify how many bytes are actually received. This would help narrow down the problem.

You can also post an NI-SPY capture; you can refer to this knowledgebase on how to perform an NI-SPY capture. This may also help in narrowing down the problem.




I'm not sure you understand the situation. First, NI-VISA is not failing at anything, so I don't think NI-SPY will be useful--I already know exactly what is happening. (I did run NI-SPY, and the output is pretty boring, as expected.)

To sum it up: I'm completely sure that the device is sending the expected number of bytes, and that it is responding to the serial commands I send it. If I perform the exchanges using standard UNIX/POSIX calls, programmed in C, everything works as expected on both my Macs--the PowerMac G5 and the MacBook Pro (Intel-based) laptop. Everything works as expected from LabVIEW running on a PC.

With the LabVIEW/Mac combination, however, "Bytes at Port" does not work. There is no question that it is returning the wrong values. I can read the number of bytes I expect the device to send me, and there will be no more available.

As I mentioned, I suspect that NI-VISA is doing the wrong thing when determining the number of bytes waiting at the port. I think it's somehow measuring the buffer usage instead of asking the driver how many bytes are available.

I don't think that a serial terminal program will help--it's not that I can't read the replies from the devices, it's that "Bytes at Port" returns the wrong value.
Message 17 of 19
Hello earlevel,

NI-VISA uses the only way to get errors reported back on POSIX, which is to set the termios flags INPCK and PARMRK. With these flags set, a 0xFF byte in the input stream marks an error report, so a valid 0xFF data byte must be escaped: it appears in the buffer as two bytes (0xFF 0xFF) so it won't be read as an error marker. That is why each 0xFF adds a byte to the apparent size of the buffer. We apologize for the inconvenience, but this is the way we report errors on Mac OS. You can take a look at this document and this document for more information (search for INPCK or PARMRK; also note that 0377 is octal for 0xFF).

Do you absolutely need to send the FF, or can you modify the code in your device to send another value instead?

Have a great day.




O. Proulx
National Instruments
www.ni.com/support
Message 18 of 19

@O_Proulx wrote:

NI-VISA uses the only way to get errors reported back on POSIX, which is to set the termios flags INPCK and PARMRK. With these flags set, a 0xFF byte in the input stream marks an error report, so a valid 0xFF data byte must be escaped: it appears in the buffer as two bytes (0xFF 0xFF) so it won't be read as an error marker. That is why each 0xFF adds a byte to the apparent size of the buffer. We apologize for the inconvenience, but this is the way we report errors on Mac OS. You can take a look at this document and this document for more information (search for INPCK or PARMRK; also note that 0377 is octal for 0xFF).

Do you absolutely need to send the FF, or can you modify the code in your device to send another value instead?




The devices are in the field, so we can't change them. We'll have to avoid the use of "Bytes at Port" or do the serial transaction via external code. Thanks for the explanation.
Message 19 of 19