LabVIEW


VISA Read and Bytes at Port Timing Question

Hi,

 

I have a question that doesn't seem to be documented in the VISA Read function help. My application normally queries a serial instrument, waits, and then reads the port (with a Bytes at Port property node wired to the byte count input of the VISA Read). However, I also need to handle strings the instrument sends asynchronously, without my VI requesting any data. So in the False case of my VI (the True case is where I write a command to the instrument) I have a Bytes at Port property wired to the VISA Read function's byte count input, without a preceding VISA Write. This works fine if the \r\n-terminated string is sent in one packet. However, sometimes there is a slight delay (only a few milliseconds) between characters. When that happens, the VISA Read returns, but I don't get the entire intended string. (Of course, I know I have to keep reading in a loop until I get the \n and then assemble the received substrings into the complete string for processing.)
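Since a LabVIEW block diagram can't be shown inline here, a minimal Python sketch of the read-and-assemble loop described above may help. The `FakeSerial` class is purely illustrative: it stands in for the Bytes at Port + VISA Read pattern, where each poll returns only the bytes currently in the buffer, which may be just part of the terminated string.

```python
# Sketch of the "keep reading until LF" loop described above.
# FakeSerial stands in for Bytes at Port wired into VISA Read: each
# call to read_available() returns whatever "arrived" since the last
# call, which may be only part of the \r\n-terminated string.

class FakeSerial:
    def __init__(self, chunks):
        self._chunks = list(chunks)  # pre-split "packets", e.g. b"12." then b"34\r\n"

    def read_available(self):
        # Returns only the bytes currently sitting in the buffer.
        return self._chunks.pop(0) if self._chunks else b""

def read_until_lf(port, max_polls=100):
    """Accumulate partial reads until the \\n terminator arrives."""
    buf = b""
    for _ in range(max_polls):
        buf += port.read_available()
        if buf.endswith(b"\n"):
            return buf
    raise TimeoutError("terminator never received")

# The reply arrives split across two packets, as in the problem case:
port = FakeSerial([b"12.", b"34\r\n"])
print(read_until_lf(port))  # b'12.34\r\n'
```

The loop is terminator-driven rather than timing-driven, so it is immune to however long the inter-character gap happens to be.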

 

This is my question: What is the time delay between characters at which the VISA Read terminates? This is not specified. I assume it could be as little as just slightly more than 1 stop bit at the baud rate being used. Does anyone know? NI employees?

 

When a string of more than one character (byte) is sent, as soon as the stop bit time has expired, the next start bit is normally sent immediately. Is it possible that if the next start bit doesn't come by, say, the mid-bit position time at the baud rate being used, the VISA Read returns immediately? Or does it wait at least 1 character time (at the baud rate)? This should be documented. Furthermore, for future versions it might be useful to add an input to the VISA Read to specify in milliseconds how long to wait AFTER the 'byte count' number of bytes have been received before returning the string (or character).
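The "wait N milliseconds after the byte count is met" input proposed above could behave like the following sketch. This is a hypothetical API, not anything VISA actually provides; `StubPort` and `read_with_settle` are invented names for illustration.

```python
import time

class StubPort:
    """Stand-in for a serial port; returns queued chunks one per poll."""
    def __init__(self, chunks):
        self._chunks = list(chunks)

    def read_available(self):
        return self._chunks.pop(0) if self._chunks else b""

def read_with_settle(port, expected, settle_s=0.005, timeout_s=1.0):
    """Hypothetical read with a post-count settle time (not a real VISA input)."""
    # Phase 1: poll until `expected` bytes have arrived (or timeout).
    deadline = time.monotonic() + timeout_s
    buf = b""
    while len(buf) < expected and time.monotonic() < deadline:
        buf += port.read_available()
    # Phase 2: the proposed extra input -- keep polling for settle_s
    # longer to scoop up bytes that trickle in just after the count
    # was met (e.g. a slightly delayed \r\n).
    settle_end = time.monotonic() + settle_s
    while time.monotonic() < settle_end:
        buf += port.read_available()
    return buf

# "OK" meets the byte count; the delayed terminator is still captured:
print(read_with_settle(StubPort([b"OK", b"\r\n"]), expected=2))  # b'OK\r\n'
```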

 

Thanks for your help.

Ed

 

Message 1 of 5

 


@Edjsch wrote:

This is my question: What is the time delay between characters at which the VISA Read terminates? This is not specified. I assume it could be as little as just slightly more than 1 stop bit at the baud rate being used. Does anyone know? NI employees?


I don't understand the question. The VISA Read will get whatever is in the UART's buffer. It's the UART's responsibility to fill up its buffer, and it handles all of the communication protocol, not VISA.

 

Message 2 of 5

Yes, I know that today's UART chip sets have large buffers to work with non-real-time OSes like Windows (originally UARTs were only double-buffered, so they could receive one new character while the previous one waited to be read by the application). (I think Windows serial drivers probably have something close to a 1-millisecond interrupt rate; that is the timing that USB-to-serial-port converters use with the USB Bulk Transfer mode.)

 

Your point does shed some light on my question. What happens when the VISA driver reads the "UART" (really, serial port) buffer while another character is being received? The buffer (UART "holding" register) will be empty, but the Windows serial port driver surely will check whether the Rx buffer (shift register) is also empty (nothing currently being received) before returning. If there is a slight delay between characters, I guess, as you say, VISA isn't going to bother to wait a little and check again; that would be the job of the serial port driver (or virtual COM port driver, in the case of a USB-serial bridge/converter).

 

So I guess my question then becomes a question for the low-level serial driver, not the VISA Read function (although I don't know VISA's exact role as a serial driver). With that in mind, would you have any idea how long a serial driver for the UART might wait after the received character is transferred to the holding register (buffer)? I assume the hardware would immediately set the Rx buffer empty flag and simultaneously clear the holding buffer empty flag, then clear the Rx buffer empty flag again as soon as the next start bit is received. So the software driver would need to wait a little to see if another character is coming in. Do you have any idea what that delay might be? (I could, I suppose, do some checking with a scope, using a microcontroller to vary the delay between characters.)

 

Of course, in my application, I'll assume that delay could be infinite, and process the string when the LF is received.

 

Let me know if you have any knowledge about this. Thanks again!

Message 3 of 5

I was thinking about what I said in my previous post. Since the VISA driver is (I believe) the low-level driver interfacing with the UART hardware (which today, I believe, can have buffers as large as 16 KB), I don't think it's such a far-fetched question to ask how long the low-level (VISA) driver waits before determining that no more characters are being received and returning.

Message 4 of 5

I looked up the PC16550D data sheet (http://www.national.com/ds/PC/PC16550D.pdf). On p. 19 it says:

 

When RCVR FIFO and receiver interrupts are enabled, RCVR FIFO timeout interrupts will occur as follows:

A. A FIFO timeout interrupt will occur if the following conditions exist:

    - at least one character is in the FIFO

    - the most recent serial character received was longer than 4 continuous character times ago (if 2 stop bits are programmed, the second one is included in this time delay)

    - the most recent CPU read of the FIFO was longer than 4 continuous character times ago

 

The maximum time between a received character and a timeout interrupt will be 160 ms at 300 baud with a 12-bit receive character (i.e., 1 Start, 8 Data, 1 Parity and 2 Stop Bits).

B. Character times are calculated by using the RCLK input for a clock signal (this makes the delay proportional to the baudrate).

C. When a timeout interrupt has occurred it is cleared and the timer reset when the CPU reads one character from the RCVR FIFO.

D. When a timeout interrupt has not occurred the timeout timer is reset after a new character is received or after the CPU reads the RCVR FIFO.

 

So, this UART uses 4 character times to determine that no more characters are coming in. And the delay is baud-rate dependent. This makes sense because I see that at, say, 115200 baud I receive more "partial strings" than I do at 9600 baud (where the sending device has more time to send the next character)!
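The datasheet's numbers can be checked with simple arithmetic: the inter-character timeout is 4 character times, and one character time is the frame length in bits divided by the baud rate. A short sketch (the helper name is mine, not from the datasheet):

```python
def fifo_timeout_s(baud, bits_per_char, char_times=4):
    """Inter-character timeout = char_times * (bits_per_char / baud)."""
    return char_times * bits_per_char / baud

# Datasheet worst case: 300 baud, 12-bit frame
# (1 start, 8 data, 1 parity, 2 stop):
print(fifo_timeout_s(300, 12) * 1000)     # 160.0 ms -- matches the quoted figure

# Typical 10-bit frame (1 start, 8 data, 1 stop):
print(fifo_timeout_s(9600, 10) * 1000)    # ~4.17 ms at 9600 baud
print(fifo_timeout_s(115200, 10) * 1000)  # ~0.35 ms at 115200 baud
```

This also quantifies the observation above: at 115200 baud the UART gives up after well under a millisecond of silence, so a sender that pauses even briefly between characters is far more likely to produce a partial string than at 9600 baud.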

 

Kudos for making me investigate this further! Thanks for listening. Hope this may help others in the future.

 

Message 5 of 5