Instrument Control (GPIB, Serial, VISA, IVI)


Keithley 2001 single read.vi takes 10+ seconds to read AC voltage

I have a lot of applications that require me to read DC voltage and current on a Keithley 2001 DMM.

I have been using 'Keithley 2001 Single Read.vi' to read DC voltage and current for years, and it only takes about a second to grab the data.  I now need to read AC voltage, and after configuring the 2001 meter with 'Keithley 2001 ACV Config.vi', it takes over 10 seconds to get a reading.

 

Why is this so slow?

 

The Single Read.vi defaults to 7.5 digits of resolution.  I had to reduce that to its minimum of 3.5 digits just to get it to read within 10 seconds.  There is no difference in the acquisition time whether I am reading 0.3 VAC or 116 VAC.  I do not know where the VIs originated, but they appear to be very clean and professionally written.  I acquired them from a fellow employee who no longer works here.

LabVIEW 7.1, IEEE-488 (GPIB)

Message 1 of 7
I don't have one of these instruments, but if you're using the same driver that's available here, you might want to experiment with the integration time setting in Keithley ACV Config. Using autoranging can also cause a long measurement time.
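Since a LabVIEW block diagram can't be pasted as text, here is a rough sketch in Python of the two settings in question, using raw SCPI commands. The command spellings follow the general Keithley 2001 :SENSe syntax but are written from memory, so check them against the 2001 manual before relying on them:

```python
# Sketch: the two settings most likely to slow down ACV readings on a
# Keithley 2001 are a long integration time (NPLC) and autoranging.
# Command syntax below is an assumption based on the 2001's SCPI dialect.

def acv_fast_config(nplc=1, range_volts=120):
    """Build a list of SCPI commands that select AC volts, shorten the
    integration time, and pin the range so no autorange settling occurs."""
    return [
        ":SENS:FUNC 'VOLT:AC'",               # select AC voltage function
        f":SENS:VOLT:AC:NPLC {nplc}",         # integration time, in power-line cycles
        ":SENS:VOLT:AC:RANG:AUTO OFF",        # autoranging adds settling delays
        f":SENS:VOLT:AC:RANG {range_volts}",  # fixed range covering the expected signal
    ]

# With a real instrument this would be sent over GPIB, e.g. with PyVISA
# (not run here; the address is hypothetical):
# import pyvisa
# dmm = pyvisa.ResourceManager().open_resource("GPIB0::16::INSTR")
# for cmd in acv_fast_config():
#     dmm.write(cmd)
# reading = float(dmm.query(":READ?"))
```

In the LabVIEW driver these same settings correspond to the integration time and range inputs on the ACV Config VI.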
Message 2 of 7
Hello Dennis:
Two simple questions:
1) Keithley 2001 Single Read.vi reads a negative current when the actual current is NOT negative.
     Detail: An HP 6632A driver sets the voltage to 10 V, then disconnects it. The Keithley DMM is supposed to read the current through a circuit with a capacitor and a resistor.
     The whole program was working fine until I replaced the 6632A's generic GPIB commands with its driver. Please advise.
 
2) How can I increase my precision to 5 digits after the decimal with Keithley 2001 Single Read.vi? The readings I get are in engineering format with a precision of 2. Please help.
 
thank you,
labVIEWrookie
 
Message 3 of 7

Could you post the code you were using to program the power supply before you started using the driver? I'm also assuming that if you go back to that code, the polarity is correct?

If it's the LabVIEW indicator that you wish to change, then you can right-click on it and select Format & Precision, and change it to whatever you want. If it's the instrument itself, then the DCI Config and DCV Config functions have a resolution control. To determine which one you need to change, you can probe the wire coming out of the GPIB Receive Message read buffer output in the Single Read function. This is the string that gets converted to a DBL and displayed.
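To illustrate the distinction, here is a small sketch in Python. The raw reading string below is hypothetical, but it shows that the string-to-double conversion keeps the full precision the instrument sent, and only the display format decides how many digits you actually see:

```python
# Sketch: the meter returns its reading as an ASCII string; converting it
# to a double keeps all the digits. The raw string below is a made-up
# example of a 2001-style read buffer.
raw = "+1.2345678E+00"

value = float(raw)                 # full precision survives the conversion
assert abs(value - 1.2345678) < 1e-9

# The same value shown two ways, like changing Format & Precision on an
# indicator: engineering notation with 2 digits vs. 5 digits after the decimal.
print(f"{value:.2e}")   # 1.23e+00
print(f"{value:.5f}")   # 1.23457
```

So if the probe on the read buffer wire shows plenty of digits, only the indicator's Format & Precision needs changing; if the string itself is short, the resolution control on the Config VI is the one to adjust.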

Message 4 of 7
Thanks, problems solved, BUT an error message appears while loading the program onto a new computer:
"The subVI could not be found when loading this VI. It may have been moved, deleted, or its name may have changed."
The subVI is "HP6xxxA Set Voltage/Current.vi".
However, I have managed to load it by copying the VI from the program I was writing.
Will this problem always occur?
Please let me know if I sound too vague... cuz I normally do 🙂
 
thx. .... labVIEWrookie
 
Message 5 of 7

You won't have the problem in the future if, when you load the top-level VI, you browse to the location of the subVI, and then save when you exit the top-level VI.

By the way, there is a standard location for instrument drivers. When you download a zip file with a driver, unzip it and then move the folder under the LabVIEW\instr.lib folder. You should see an HP34401 folder there already. The instructions on the driver download pages also tell you this. Once you do this, the driver functions will appear on the Instrument I/O»Instrument Drivers palette.
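The install step amounts to a single folder move. Here is a sketch in Python; the paths in the example call are assumptions for a typical LabVIEW 7.1 install and should be adjusted for your machine:

```python
# Sketch: move an unzipped driver folder under LabVIEW's instr.lib so its
# VIs appear on the Instrument Drivers palette. Paths in the example call
# are hypothetical.
import shutil
from pathlib import Path

def install_driver(driver_folder: Path, instr_lib: Path) -> Path:
    """Move a driver folder (e.g. an unzipped 'Keithley 2001' directory)
    into instr.lib and return its new location."""
    dest = instr_lib / driver_folder.name
    shutil.move(str(driver_folder), str(dest))
    return dest

# Example (hypothetical paths):
# install_driver(
#     Path(r"C:\Downloads\Keithley 2001"),
#     Path(r"C:\Program Files\National Instruments\LabVIEW 7.1\instr.lib"),
# )
```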

Message 6 of 7
Dennis:
 
Thanks a lot!! You ROCK!!!
 
~ labVIEWrookie
Message 7 of 7