VISA Timeout - until start or end of transmission (received)

Hello all!

 

I am not clear about a small detail of VISA.

 

I am waiting for a message of a known length and have set the VISA timeout to 5 seconds. The message is potentially very long and is sent in packets, and I fail to receive it completely.

 

In this context, it is important to know exactly which interval the VISA timeout covers:

  1. Is it the time from the start of VI execution until the expected number of bytes has been received?
  2. Or is it the time until the very first byte is received, after which it waits for all the remaining bytes?

Of course, the most probable answer is option 1, which leads to further questions:

  1. Is there any kind of setting where I can specify a timeout for the start of transmission only?
  2. Or is there a way to set the maximum interval between consecutive packets? For example, if I expect 100 bytes and they are sent in packets of 20 bytes with at most 2 seconds between them, is it possible to reapply the timeout between packets? One suggestion is to read the packets in a loop, but the downside is that I am actually not sure that each packet has the same length...
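To make question 2 concrete, the packet loop I have in mind would look something like the sketch below in a text language (Python here, since LabVIEW is graphical; `read_some(timeout)` is a hypothetical helper standing in for a VISA Read, used only to show how the timeout restarts for every packet):

```python
def read_message(read_some, total_bytes, first_byte_timeout, inter_packet_timeout):
    """Accumulate packets until total_bytes arrive, reapplying the timeout
    after every packet instead of timing the whole transfer.

    read_some(timeout) is a hypothetical helper: it returns whatever bytes
    arrived within `timeout` seconds, or b'' if nothing arrived in time.
    """
    data = read_some(first_byte_timeout)          # wait for transmission to start
    if not data:
        raise TimeoutError("no response: transmission never started")
    while len(data) < total_bytes:
        chunk = read_some(inter_packet_timeout)   # timeout restarts per packet
        if not chunk:
            raise TimeoutError(f"stalled after {len(data)} of {total_bytes} bytes")
        data += chunk
    return data[:total_bytes]
```

Note that this works even when the packets have unequal lengths, because the loop counts accumulated bytes rather than packets.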

Complicated, I know.

 

Any suggestions?

 

Regards!

Message 1 of 3

There are two ways you could do this:

 

1) Read a single byte with your timeout for "Start Read", then perform the full read with your timeout for "End Read". You then need to prepend that first byte to the data from the second read.

2) Use an event to detect a byte on the port with the timeout for "Start Read", and then perform the full read with your timeout for "End Read".
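In text form, option 1 is roughly the following sketch (Python for illustration; `read_n(n, timeout)` is a hypothetical helper standing in for a VISA Read with the timeout property set before each call):

```python
def read_with_two_timeouts(read_n, total_bytes, start_timeout, end_timeout):
    """Option 1: read a single byte with the 'start' timeout, then read the
    remaining bytes with the 'end' timeout, and glue the two parts together.

    read_n(n, timeout) is a hypothetical helper standing in for VISA Read:
    it returns up to n bytes received within `timeout` seconds.
    """
    first = read_n(1, start_timeout)              # waits only for the first byte
    if len(first) != 1:
        raise TimeoutError("device never started transmitting")
    rest = read_n(total_bytes - 1, end_timeout)   # waits for the remainder
    if len(rest) != total_bytes - 1:
        raise TimeoutError("transmission started but did not complete")
    return first + rest                           # prepend the first byte
```

The point of the split is that the two timeouts can be tuned independently: a long one for a device that is slow to answer, and a shorter one for the body of the reply.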

 

Shane.

Message 2 of 3

Personally, I tend to see the VISA timeout as protection against the device not responding at all, so I typically write drivers for Option 2. My experience is that once a device starts responding, it will complete whatever it's sending. The process is easier if your device sends data in packets with a defined EOM character; otherwise you have to wait for a specific number of bytes, or for a pause indicating that it has stopped sending data.
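For illustration, EOM-terminated reading boils down to something like this sketch (Python; `read_byte(timeout)` is a hypothetical helper returning one byte or b'' on timeout — in VISA you would normally just enable the termination character so the read returns at the EOM):

```python
def read_until_eom(read_byte, eom=b'\n', start_timeout=5.0, byte_timeout=2.0):
    """Read until the EOM character arrives, restarting the timeout per byte."""
    data = bytearray()
    timeout = start_timeout
    while True:
        b = read_byte(timeout)
        if not b:
            raise TimeoutError(f"stopped after {len(data)} bytes without EOM")
        data += b
        if b == eom:
            return bytes(data)
        timeout = byte_timeout   # after the first byte, use the shorter gap timeout
```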

 

Of course there are exceptions, like an instrument I worked on years ago that would crash and start spewing trash out the serial port. Thankfully, there aren't many situations like that any more. Likewise, you typically don't have to worry about delays in the middle of responses any more; in general, serial devices are much better behaved now.

 

Specifically, what are you trying to talk to?

 

Mike...


Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion

"... after all, He's not a tame lion..."

Message 3 of 3