LabVIEW


How do I clear the receive and send buffers?

 @dsbuxi

This is how you flush the read buffer without losing the data, as the Flush function would.
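In text form, the trick is to read exactly the number of bytes currently pending, so the buffer ends up empty but the data is returned to the caller instead of being discarded the way a flush would discard it. A minimal Python sketch (a stand-in for the LabVIEW diagram; a fake in-memory buffer plays the role of VISA's "Bytes at Port" property):

```python
# Hypothetical sketch, not the actual VI: drain the receive buffer by
# reading however many bytes are pending, instead of calling Flush, so
# the stale data is preserved rather than thrown away.

class FakePort:
    """Stand-in for a serial/GPIB session with a receive buffer."""
    def __init__(self, pending: bytes):
        self._buf = bytearray(pending)

    @property
    def bytes_at_port(self) -> int:
        # Analogue of the VISA "Bytes at Port" property.
        return len(self._buf)

    def read(self, n: int) -> bytes:
        data = bytes(self._buf[:n])
        del self._buf[:n]
        return data

def drain_without_losing(port: FakePort) -> bytes:
    """Read everything currently pending; the buffer ends up empty,
    but the stale data is returned instead of discarded."""
    return port.read(port.bytes_at_port)

port = FakePort(b"stale response\r")
leftover = drain_without_losing(port)
print(leftover)              # the stale data, still available
print(port.bytes_at_port)    # 0 -> buffer is now clear
```

With real hardware you would wire the "Bytes at Port" property into the byte count input of VISA Read; the fake class above only mimics that behaviour.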

 



Besides which, my opinion is that Express VIs, like Carthage, must be deleted.
(Sorry, no LabVIEW "brag list" so far)
Message 11 of 15

Hi all,

 

I also have an issue with the buffer of my Vötsch VCL 7003 climate chamber. I am using GPIB to communicate with the chamber, and everything works fine if I read the exact number of bytes with the "VISA Read" function. If the byte count is right, I get the positive status code 1073676294 (http://digital.ni.com/public.nsf/allkb/C96C84C922DC3F978625632500482F78). But if the byte count is not exact, I get the negative error -1073807339 (http://digital.ni.com/public.nsf/allkb/E6DFA2B8D7E99F9886256C14005E82F7) and my VI no longer works as it should.

 

I already tried flushing the buffer with the "VISA Flush I/O Buffer" function before sending a command and reading the response. I used the mask values 64 and 128, but it didn't help.

Does anyone have an idea or a solution for clearing the buffer?

 

 

Kind regards, Roman

Message 12 of 15

First, why are you trying to flush the buffer?  It's not something you should need to do very often, only if you think you have some unknown amount of stale data.  If the instrument is used in a simple write-a-query-command, read-the-returned-data fashion, you should never have stale data in your buffers.

 

You are getting a timeout error, which means the data didn't come back within the specified time.  Did any data come back at all?  If you are not getting data, that implies there is something wrong with the command you wrote (or with the cabling).  You will really need to read the manual on how they specify the communication protocol and check that your LabVIEW implementation matches it.  Perhaps you are not sending the correct termination character at the end of your command?

 

On the reading side, you say it works okay if you specify the correct number of bytes to read.  Does the communication manual tell you how many bytes to expect?  Or does the response have a variable number of bytes and send a termination character to signal the end of a packet of data?

Message 13 of 15

Hello Ravens Fan,

 

thank you for your reply!

 

I am using different commands, each of which returns a different number of bytes from my climate chamber. And I already had the problem that, after a program crash, there was still data in the buffer and my VI didn't work because of the leftover bytes from the last response. That's why I would like to add some kind of exception handling.

 

I do get a reply even when reading an inappropriate number of bytes, so the problem is not the communication or a timeout. This is also stated in this document: http://digital.ni.com/public.nsf/allkb/E6DFA2B8D7E99F9886256C14005E82F7

 

  • If you are experiencing this error for a VISA Read, verify that you are not trying to read too many bytes. Read only 1 byte at a time while debugging.
    Note: If you do not get the error now, increment the number of bytes you read until you get the timeout error again. This tells you how many bytes that command sends back.
  • You can also use a Property Node to read the Number of Bytes at the Serial Port. Right-click the Property Node and select Select VISA Class»I/O Session»Serial Instr. Then right-click the Property Node and select Properties»Serial Settings»Number of Bytes at Serial Port.

Your suggestion to read byte after byte until a termination character appears is interesting; I will try it!
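For what it's worth, the KB's byte-by-byte probing can be sketched in Python against a fake instrument (the class and the sample response are invented for illustration; a real run would hit actual VISA timeouts rather than the simulated one):

```python
# Hypothetical sketch of the KB's debugging advice: read one byte at a
# time until a (simulated) timeout, counting bytes to learn how long
# the instrument's response actually is.

class VisaTimeout(Exception):
    """Stands in for VISA timeout error -1073807339."""

class FakeInstrument:
    def __init__(self, response: bytes):
        self._buf = bytearray(response)

    def read(self, n: int) -> bytes:
        if len(self._buf) < n:        # not enough data -> timeout
            raise VisaTimeout()
        data = bytes(self._buf[:n])
        del self._buf[:n]
        return data

def measure_response_length(inst: FakeInstrument) -> int:
    """Read 1 byte at a time until the read times out; the count tells
    you how many bytes that command sends back."""
    count = 0
    while True:
        try:
            inst.read(1)
        except VisaTimeout:
            return count
        count += 1

print(measure_response_length(FakeInstrument(b"23.5 C\r")))  # 7
```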

Message 14 of 15

@RomanD wrote:

     

Your suggestion to read byte after byte until a termination character appears is interesting; I will try it!


No.  If there is a termination character sent (often a carriage return or a line feed), what you do is configure your serial port to enable the termination character and specify what that character is.  Then when you do the VISA Read, you just request a sufficiently large number of bytes.  The VISA Read will terminate automatically when it receives the termination character.
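A hedged sketch of what termination-character handling buys you, with a fake port standing in for the instrument (the sample reply and the TERMCHAR value are made up; check the chamber's manual for the real terminator):

```python
# Hypothetical sketch: request a large byte count, but stop as soon as
# the termination character arrives -- which is what VISA Read does
# once the termchar is enabled on the session.

TERMCHAR = b"\n"   # often \n or \r; an assumption here

class FakePort:
    def __init__(self, data: bytes):
        self._buf = bytearray(data)

    def read_byte(self) -> bytes:
        b = bytes(self._buf[:1])
        del self._buf[:1]
        return b

def visa_style_read(port: FakePort, count: int = 4096) -> bytes:
    """Read up to `count` bytes, terminating early at TERMCHAR."""
    out = bytearray()
    for _ in range(count):
        b = port.read_byte()
        if not b:          # buffer exhausted
            break
        out += b
        if b == TERMCHAR:  # automatic termination, VISA-style
            break
    return bytes(out)

port = FakePort(b"25.0 C\nleftover")
print(visa_style_read(port))   # b'25.0 C\n' -- stops at the termchar
```

Note that the exact byte count no longer matters: any sufficiently large request returns exactly one response, which is why this removes the need to know each command's reply length in advance.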

     

If you have a program crash, then one of your problems will be figuring out why the program crashed.  In general you should never have to clear the buffers.  It should be sufficient to open and configure the port at the beginning of your program, write and read as needed in a loop, and close the port when your program ends.
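The open-once, loop, close-once structure described above can be sketched as follows (a Python stand-in for the LabVIEW VIs; the FakeSession class, the "TEMP?" command, and its reply are all invented for illustration):

```python
# Hypothetical sketch of the recommended session lifecycle: open and
# configure once, write/read in a loop, close once at the end.

class FakeSession:
    """Echo instrument: replies to each command with a canned answer."""
    def __init__(self):
        self.open_ = True
        self._replies = {"TEMP?": "23.5\r"}   # made-up command/reply

    def query(self, cmd: str) -> str:
        assert self.open_, "session is closed"
        return self._replies.get(cmd, "ERR\r")

    def close(self):
        self.open_ = False

def run(n_iters: int) -> list:
    sess = FakeSession()          # "VISA Open" + configure, once
    try:                          # write + read inside the loop
        return [sess.query("TEMP?") for _ in range(n_iters)]
    finally:
        sess.close()              # "VISA Close", once, at the end
```

Keeping the open and close outside the loop is the point: the buffers carry state only within one session, so there is nothing stale to clear on each iteration.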

     

If you want to clear the buffers after opening the port at the beginning, that is fine, but it generally shouldn't be needed.  If the port was actually closed, the buffers wouldn't have anything in them anyway.

     

If you want to clear the buffers before writing the next command and reading the response, that is okay, but it seems like a lot of unneeded work that wouldn't be necessary if the communication protocol were functioning properly.

Message 15 of 15