Instrument Control (GPIB, Serial, VISA, IVI)

Help With Agilent 34908A

You first write a 1, then a 0, then a 1, then a 0. You are controlling a single bit, and that bit is either a 0 or a 1, so serial pattern generation is all that is possible. To put out a pattern of multiple bits, you would use the digital output of the 34907 directly and not go through the mux; the 34907 has parallel output. The 34908 closes only a single channel at a time, so parallel output through it is not possible.
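If it helps, here's a rough Python/PyVISA sketch of writing a parallel byte to one 34907 port. It assumes the 34907 is in slot 2 (so port 1 is channel 201), the factory GPIB address of 9, and the usual SOUR:DIG:DATA:BYTE command; adjust those to match your setup:

import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::9::INSTR")   # assumed address; use yours

# Write an 8-bit pattern to port 1 of the 34907A in slot 2 (channel 201).
# All eight bits change together, which is the parallel output described above.
dmm.write("SOUR:DIG:DATA:BYTE 170,(@201)")  # 170 = 0b10101010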
Message 11 of 16
I have connected a function generator to one of the channels on the 34908A mux to see if I can read the pattern. I am using the Read Digital VI, but for some strange reason it doesn't work until I run EZ Voltage to take a voltage measurement on the channel. Also, when I run EZ Voltage the Agilent box tells me that all the channels in the slot are off, and that the one channel I am communicating with is at VDC. Must this happen in order to use the Read Digital VI?
Thanks
AP
Message 12 of 16
Let me try to understand what you're trying to do. You have a function generator connected to one of the mux channels, the common point of the scanner is connected to one of the digital I/O pins, and you try to read the digital state? If that's the case, let's start with the connections. Say the 34908 is in slot 1 and the 34907 is in slot 2, and that you have channel 1 of the mux connected to the function generator and bit 0 of port 1 connected to the 34908 common.

The first thing to do is to use HP34970A Switch.vi to close the channel. The Channel List needs a value of 101 (board 1, channel 1), Set/Check is set to Set, and Open/Close is set to Close. Then use HP34970A Digital Output.vi with the slot number set to 2 and Check/Set set to Check. When you run this VI, it will detect the logic state of bit 0. If your function generator is set to output 5 VDC, you will read a logic 1. If it is set to 0 VDC, you will read a logic 0. If it is set to output a digital pattern, you will read either a logic 1 or a 0, but you will not read a pattern.

Whenever you do a read, all you will be able to do is detect the logic state at the moment the read command is issued. Pattern detection is nearly impossible: you would need to put your digital read in a constant loop, the loop would have to cycle very precisely to catch every change in your input pattern, and it would also need to cycle at least as fast as the pattern. The 34970 is not the fastest box out there; it is really designed for static monitoring. You also have problems running a loop on Windows with any kind of deterministic timing. The best you can hope for is 1 ms, and that is with the timed loop available in LabVIEW 7.1. If the time between logic-level changes of your digital input is less than 1 ms (i.e., the pattern runs faster than 1 kHz), you will only be able to detect some of the logic states. If pattern recognition of a single digital input is your goal, the 34970 is probably the wrong instrument.
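If you ever want to bypass the driver VIs, the same sequence in raw SCPI looks roughly like this Python/PyVISA sketch. The GPIB address, the channel numbers (101 for the mux channel, 201 for port 1 of the 34907), and the exact command forms are my assumptions from the standard 34970A command set, so check them against the manual:

import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::9::INSTR")                   # assumed address

dmm.write("ROUT:CLOS (@101)")                               # close mux channel 1 in slot 1
port = int(float(dmm.query("SENS:DIG:DATA:BYTE? (@201)")))  # read port 1 of the 34907A in slot 2
print("bit 0 is", port & 1)                                 # logic state at the instant of the read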
Message 13 of 16
Are you saying that if the frequency of the function generator were 10 kHz, LabVIEW would not be able to read the digital pattern, even if I looped the digital read VI with a time delay of 0.1 ms between each iteration?
Message 14 of 16
No, you wouldn't, and it's not really a limitation of LabVIEW. Timing on a Windows/Mac/Linux-based system is simply not that accurate. I also looked up the specs, and the box itself can't be sampled that fast: the spec on digital I/O is 95 readings/s. Limit the function generator to a frequency of less than 100 Hz and you might have a chance.
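To put numbers on it, here's a simple polling-loop sketch (Python/PyVISA, same assumed address and command as above). Even ignoring Windows timing jitter, each query is limited by the box's roughly 95 readings/s digital I/O spec, so samples land about 10 ms apart and anything toggling much faster than a few tens of hertz will be missed:

import time
import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::9::INSTR")            # assumed address

samples = []
t0 = time.time()
while time.time() - t0 < 1.0:                        # poll as fast as possible for one second
    value = int(float(dmm.query("SENS:DIG:DATA:BYTE? (@201)")))
    samples.append((time.time() - t0, value & 1))    # timestamp, state of bit 0

print(len(samples), "reads in one second")           # roughly 95 at best, often fewer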
Message 15 of 16

Hi,

I see this is an old topic, but I have a 34907A that I'm looking to use in a serial data application. I need to use 4 outputs configured as a clock, a strobe, serial data in, and a data latch. As far as I can tell, how fast the data is transmitted is not an issue. Here's an excerpt from the datasheet:

 

Serial Data
Following initialization, the 12-bit digital word representing the desired output current is applied to the SDI pin. The serial data should appear starting with the most significant bit (MSB, bit 1, D11) and ending with the least significant bit (LSB, bit 12, D0). With each data bit present and stable on the SDI line, the CLK must be toggled through a low-to-high transition to register that bit. Twelve rising clock edges, at rates up to 500 kHz, are required to clock all 12 digital bits into the DTL2A-LC's input register.

Latching Data and Presenting It to the D/A
After loading the LSB, the serial data word is latched by bringing the Control Strobe (pin 7) high and then toggling the Latch Data pin (pin 4) through a high-low-high sequence. Approximately 100 μs later, the output current will settle to its final desired value.

 

The following steps describe a typical timing sequence when using the 4 digital inputs and a programming language such as C. Using 4 bits of a typical 8-bit port, assign BIT_0 to the Control Strobe (CS, pin 7), BIT_1 to Latch Data (LD, pin 4), BIT_2 to Serial Data In (SDI, pin 5), and BIT_3 to the Clock (CLK, pin 6).
1. Initialize with Control Strobe, Latch Data, and Clock high:
BIT_0 = 1, BIT_1 = 1, BIT_2 = X (don’t care), BIT_3 = 1
2. Bring the Control Strobe low.
BIT_0 = 0
3. Apply the MSB (D11) of the serial data word to Serial Data In.
BIT_2 = 0 or 1
4. Toggle the Clock high-low-high.
BIT_3 = 1 to 0 to 1
5. Apply D10 of the serial data word to Serial Data In.
BIT_2 = 0 or 1
6. Toggle the Clock high-low-high.
BIT_3 = 1 to 0 to 1
7. Repeat the process for remaining data bits D9 through D0.
8. Drive the Control Strobe high.
BIT_0 = 1
9. Toggle the Latch Data input high-low-high.
BIT_1 = 1 to 0 to 1.
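Roughly what I have in mind, assuming the 34907A can be bit-banged this way, is something like the following Python/PyVISA sketch (untested). The GPIB address, the use of port 1 at channel 201, and the SOUR:DIG:DATA:BYTE command form are my assumptions from the 34970A documentation, so they'd need to be checked:

import pyvisa

CS, LD, SDI, CLK = 0x01, 0x02, 0x04, 0x08      # BIT_0..BIT_3 assignments from the excerpt

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("GPIB0::9::INSTR")      # assumed address

def write_port(value):
    # Drive all four control lines at once via port 1 of the 34907A (slot 2, channel 201).
    dmm.write(f"SOUR:DIG:DATA:BYTE {value},(@201)")

def send_word(word):
    # word: 12-bit value for the DTL2A-LC, shifted out MSB (D11) first.
    state = CS | LD | CLK                      # step 1: CS, LD, and CLK high
    write_port(state)
    state &= ~CS                               # step 2: CS low
    write_port(state)
    for i in range(11, -1, -1):                # steps 3-7: D11 down to D0
        state = (state & ~SDI) | (SDI if (word >> i) & 1 else 0)
        write_port(state)                      # data bit present and stable on SDI
        write_port(state & ~CLK)               # clock low...
        write_port(state)                      # ...then high again (rising edge registers the bit)
    state |= CS                                # step 8: CS high
    write_port(state)
    write_port(state & ~LD)                    # step 9: latch data low...
    write_port(state)                          # ...then high

send_word(0x5A5)                               # example 12-bit word

Each write is a separate bus transaction, so the effective clock rate would be nowhere near the 500 kHz ceiling, but as I said, speed isn't a concern here.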

 

Is something like this doable, or should I be looking elsewhere?

Thanks,

Bryan

Message 16 of 16