First, please be patient with me. I've only been working with VISA for a few days.
I'm using LabVIEW to write a program that drives a Keithley 2000-20 multimeter, which measures the voltage across ten diodes and two standard resistors. The measurement procedure calls for a ten-second settling period between switching to a channel and recording the data. This was easy when we had someone sitting in front of the multimeter: they'd close the channel, wait ten seconds (more or less), and write down the numbers. But I've been going crazy trying to figure out how to do this with VISA.
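In instrument terms, I think what the human was doing maps to something like this for each channel (command names are my reading of the 2000 manual, and the notes to the right are just my annotations, not part of the commands):

    :ROUTe:CLOSe (@n)    close channel n on the scanner card
    (wait ten seconds)
    :READ?               trigger a reading and fetch it

What I can't reproduce is that ten-second wait in the middle while letting the meter run its internal scan list, instead of closing channels one at a time myself.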
My little test VI currently sets up an internal scan list of twelve channels (ten diodes and two resistors), a reading count of twelve, and a sample count of one. I set the timer to ten seconds and the trigger source to the timer. What I think is happening is: the channel closes, the measurement is taken, and then the timer fires. But I want the timer interval to fall between the closing of the channel and the measurement.
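In case it helps, here's my best guess at the SCPI equivalent of what the VI is sending (again going by the 2000 manual, so the exact syntax may be off, and I'm not sure whether the driver's "reading count" maps to :TRIGger:COUNt or :SAMPle:COUNt; the notes to the right are mine):

    :ROUTe:SCAN:INTernal (@1:12)    internal scan list: ten diodes + two resistors
    :ROUTe:SCAN:LSELect INTernal    enable internal scanning
    :TRIGger:SOURce TIMer           let the timer generate the triggers
    :TRIGger:TIMer 10               ten seconds between triggers
    :TRIGger:COUNt 12               the reading count of twelve(?)
    :SAMPle:COUNt 1                 the sample count of one(?)
    :INITiate                       arm the trigger model and start the scan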
I tried the delay function, but it seems to fire only once, at the beginning. Or am I mistaken? Do I need to use the delay and timer functions in conjunction? Is what I'm trying to do even possible?
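If it matters, I assume the delay function ends up as the instrument's :TRIGger:DELay command (my guess from the manual, not something I've confirmed):

    :TRIGger:DELay 10    ten-second delay somewhere between the trigger and the reading?

I had hoped that would put the settling time between the channel closing and the measurement, but I clearly don't understand the trigger model well enough to say.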
I also can't get the read buffer to work, but I figured I'd start with the thing that's really making me tear my hair out.
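By "read buffer" I mean the meter's internal buffer, which I've been poking at with the :TRACe commands from the manual (quite possibly incorrectly):

    :TRACe:POINts 12            size the buffer for one full scan
    :TRACe:FEED:CONTrol NEXT    fill the buffer once, then stop
    :TRACe:DATA?                read everything back after the scan

But like I said, that's a question for another day.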
Thank you in advance for any help!
Katherine