
Modded Agilent 34970A routines that people are sharing? Plus other questions.

Hello,

I have downloaded the Agilent 34970A drivers provided by NI. I am using a USB-RS485 converter.
I am curious if others have customized them, or even created their own... I am looking for alternative techniques for data acquisition through LabVIEW... I am a serious newbie!

I would like to be able to set the integration time (when reading DC voltages) so that I can get improved resolution.


Question:
Is it possible to execute a logging application but still allow other channels on the 34970A to be accessed or modified manually (so that I can still use the 34970A by hand while doing a long-term datalog)?


Thank you for your time


Message 1 of 4

I haven't used the internal DMM that much, but there is the HP34970A Conf Voltage function that allows you to set the integration time. The default is the minimum, which is 400 msec. The VI also lets you set the range; the default is the maximum, so you will probably want to change that if you want more resolution.
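For reference, this is roughly what that VI does at the SCPI level. Here's a minimal sketch in Python with PyVISA rather than LabVIEW (the resource string and channel number are placeholders for your setup, not anything taken from the NI driver):

```python
import pyvisa

rm = pyvisa.ResourceManager()
# Placeholder address -- use whatever resource your converter exposes.
dmm = rm.open_resource("GPIB0::9::INSTR")

# Configure channel 103 for DC volts, then raise the integration time.
# NPLC 100 integrates over 100 power-line cycles for the best resolution.
dmm.write("CONF:VOLT:DC (@103)")
dmm.write("VOLT:DC:NPLC 100,(@103)")

# Fix the range at 10 V instead of leaving it at auto/maximum.
dmm.write("VOLT:DC:RANG 10,(@103)")
```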

I don't think the instrument allows you to view/modify other channels while a scan is in progress. You should probably check with Agilent to be sure.

Message 2 of 4

I have developed a very cool Excel workbook to take the output from the Agilent "Benchlink" software. It has macros that create graphs and tables that I use in reports... But I am now thinking of writing a LabVIEW program to run the 34970A, because the Agilent software sucks. I am willing to share my program and ideas. I am pretty advanced with LabVIEW. I use a USB-to-HPIB (GPIB) interface... but it really doesn't matter what you use to connect your computer to the instrument in LabVIEW... it's pretty flexible and easy. I have written "drivers" and modified them... It's easy!

But to answer your question: the best way to answer this is to download the Agilent programming manual (it may also be a section in the regular manual). I haven't done this yet... but if Agilent says you can do it, you probably can. And really, if you can't do it with the manual keys on the equipment, it probably can't be done. LabVIEW will only let you use the remote commands that the equipment manufacturer programmed the device to be able to read and write.
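For example, once you have a command out of the programming manual, sending it looks the same no matter what software sits on top. A quick sketch in Python/PyVISA (the GPIB address is a placeholder):

```python
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("GPIB0::9::INSTR")  # placeholder GPIB address

# *IDN? is a standard IEEE 488.2 query every SCPI instrument answers.
print(inst.query("*IDN?"))

# Anything else you send has to come straight out of the 34970A
# programming manual -- the instrument only understands the commands
# Agilent built into its firmware.
print(inst.query("SYST:VERS?"))  # SCPI version query, from the manual
```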

Message Edited by dpopovich on 03-20-2006 11:47 AM

"Opportunities multiply as they are seized."
- Sun Tzu
Message 3 of 4
Thank you for your responses.


I was able to modify the Agilent 34970A Advanced Scan successfully. I didn't realize that within the Agilent 34970.vlib:Conf Voltage.vi you had to set the value as default (Data Operations > Make Current Value Default) in order for settings such as NPLC to hold the desired value.
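If anyone wants to confirm that the setting actually stuck on the instrument (rather than just on the VI's front panel), it can be read back over the bus. A quick sketch with PyVISA (address is a placeholder):

```python
import pyvisa

dmm = pyvisa.ResourceManager().open_resource("GPIB0::9::INSTR")  # placeholder

# Read back the integration time on channel 103 to verify it held.
print(dmm.query("VOLT:DC:NPLC? (@103)"))  # should report 100 power-line cycles
```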

I went through various combinations of mean pt-pt and median pt-pt, and the lowest noise level I could obtain was approximately 0.9 µV p-p DC at 6.5 digits, NPLC=100. I am currently using 3 ft of four-wire, Teflon-coated AWG22 (2-pair) shielded wire (Belden 88723); the shield wire is soldered to the 34902A shield plane. The result was obtained with the +/- shorted at the measuring end. I don't think this is horrid given the cable length... I am curious as to what others see as a noise floor.

Originally, with my configuration, I was measuring the following:
Channel 103 - Bridge output (Auto, NPLC=100)
Channel 104 - 10.000 Vdc bridge excitation (Auto, NPLC=100)
Channel 116 - Temperature (thermocouple, type J)
I set the sample time to 10 s and monitored the outputs... once in a while the output of channel 103 would spike from the mV level to some higher voltage (between 2 and 10 Vdc)... upon removing the monitoring of channel 104, this problematic output went away. The connections are good (there are no frayed wires), and I figured that a sample time greater than 10 s would be plenty... Any suggestions on why this would happen? (A sketch of the setup is below, in case it helps.)
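In case it helps anyone reproduce this, the scan setup above corresponds roughly to the following SCPI sequence (a sketch in Python/PyVISA, not my actual LabVIEW code; the address and sweep count are placeholders):

```python
import pyvisa

dmm = pyvisa.ResourceManager().open_resource("GPIB0::9::INSTR")  # placeholder
dmm.timeout = 120000  # ms; the scan below takes ~50 s to finish

# Channel configuration described above.
dmm.write("CONF:VOLT:DC (@103,104)")   # bridge output and 10 V excitation
dmm.write("VOLT:DC:NPLC 100,(@103,104)")
dmm.write("CONF:TEMP TC,J,(@116)")     # type-J thermocouple

# Scan the three channels every 10 s.
dmm.write("ROUT:SCAN (@103,104,116)")
dmm.write("TRIG:SOUR TIM")
dmm.write("TRIG:TIM 10")
dmm.write("TRIG:COUN 5")               # e.g. five sweeps
dmm.write("INIT")

print(dmm.query("FETC?"))  # readings, returned after the scan completes
```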


Thanks for your time



Message 4 of 4