Why is the execution time of VISA Write not linearly proportional to the number of bytes sent?

Hello,

I am building an application with a function to detect the propagation delay of a serial connection. I am using VISA, and I followed the "Timing Template" example shipped with LabVIEW 6.1 to measure the elapsed time.

My application sends the tick count value as the data packet, receives it back over the loopback path, and compares it with the current tick count value. However, I found that the elapsed time for VISA Write to send out the whole data packet does not increase linearly with the size of the data. The execution time of VISA Write actually steps up at every 14-byte increase in packet size.

I subtract the execution time of VISA Write from the "tick count value received" (which increases linearly) to obtain the path delay, which should also be linear. However, the path delay I obtain shows a sudden drop whenever VISA Write's execution time steps up.
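The effect described above can be modeled outside LabVIEW. Here is a rough Python sketch (purely illustrative; the 14-byte step, the 1 ms step size, and the per-byte timing are assumed numbers, not measurements): subtracting a staircase-shaped write time from a linearly growing round-trip time produces exactly the sudden drops observed.

```python
def write_time_ms(n_bytes: int, fifo: int = 14, step_ms: float = 1.0) -> float:
    # Hypothetical staircase: the write time jumps by step_ms at every
    # multiple of the assumed 14-byte FIFO depth and is flat in between.
    return ((n_bytes - 1) // fifo + 1) * step_ms

def path_delay_ms(n_bytes: int, per_byte_ms: float = 0.0868) -> float:
    # The received-tick difference grows linearly with packet size
    # (per_byte_ms is an assumed transmission rate, plus fixed overhead)...
    round_trip = n_bytes * per_byte_ms + 2.0
    # ...so subtracting a staircase from a straight line leaves a
    # sawtooth: the "path delay" dips at every FIFO boundary.
    return round_trip - write_time_ms(n_bytes)
```

Evaluating this at a boundary shows the artifact: the modeled path delay rises smoothly from 1 to 14 bytes, then drops abruptly at 15 bytes when the staircase steps up.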

How do I correct this?

Any help is appreciated. Thanks in advance.
Message 1 of 3
Well, there's a certain amount of overhead in VISA as it translates from its own high-level API to the respective serial, GPIB, or other more hardware-specific API. VISA may well be doing some buffering or caching between these layers. I'm not sure this is something you need to 'correct,' as it may be VISA's intended behavior.

If you're just looking to benchmark response time from your instrument, you could just send something other than the tick count in your VISA Write node, and start your timer immediately after. True, you'd only be timing one half of your communications, but the receive side is probably the half you're interested in, anyway.
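The suggestion above — start the timer only after the write completes and time just the receive half — can be sketched in Python. (LabVIEW is graphical, so this is only an analogy: an `os.pipe()` stands in for the serial loopback, and the two I/O calls stand in for the VISA Write and VISA Read nodes.)

```python
import os
import time

def measure_receive_delay(payload: bytes) -> float:
    """Time only the receive half of a loopback, starting the clock
    immediately after the write completes."""
    r, w = os.pipe()               # stand-in for the serial loopback path
    os.write(w, payload)           # analogous to VISA Write
    t0 = time.perf_counter()       # start timing right after the write
    data = os.read(r, len(payload))  # analogous to VISA Read
    t1 = time.perf_counter()
    os.close(r)
    os.close(w)
    assert data == payload         # loopback integrity check
    return t1 - t0
```

This way the write-side staircase never enters the measurement at all, at the cost of only characterizing half the round trip.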
Message 2 of 3
Hi,

It might also have to do with the FIFO depth of the serial hardware. I don't think it is a coincidence that the receive FIFO depth of the COM port I have is 14 bytes; I assume this is a common FIFO depth for built-in serial ports. The FIFO depth affects when the serial hardware interrupts the CPU to retrieve the data.

You could try changing the FIFO depth and see how it affects the delays. You could also disable the FIFOs entirely. To modify the FIFO settings, go to the Device Manager, right-click the COM port, and select Properties. Click the Port Settings tab and then the Advanced button; you should see the FIFO settings there.

Hope this helps.

DiegoF
National Instruments.
Message 3 of 3