Instrument Control (GPIB, Serial, VISA, IVI)

Continuous acquisition using Agilent U2356A and LabVIEW

Hello Steve,

Thanks for letting me know that you have already updated the firmware. We have tested the instrument driver and the separate commands that Agilent recommended with a U2331A and a U2355A, and did not see any problems, so this does not appear to affect the entire series. We have notified Agilent about this discrepancy.

 

Thanks,

 

NathanT

Message 11 of 33

Hi Nathan,

 

I've been in contact with the Agilent support centre regarding this issue and found out that the snippet of code I was given to try with the Interactive IO was not quite correct: each line (VOLT:RANG 10, etc.) needs a channel designator at the end (e.g. VOLT:RANG 10, (@101)). Adding that resolved the issue and stopped the DAQ from falling back to its default values.
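For anyone following along, the corrected sequence looks roughly like this when sent over the bus. This is a minimal Python/PyVISA sketch rather than LabVIEW; the VISA address and channel numbers are placeholders, and the exact command spellings should be verified against the U2300A programming guide:

```
import pyvisa

rm = pyvisa.ResourceManager()
# Placeholder address -- substitute the actual VISA resource for the U2356A.
daq = rm.open_resource("USB0::0x0957::0x1418::MY12345678::0::INSTR")

# Every configuration command needs a channel-list designator at the end,
# e.g. (@101); without it the DAQ silently falls back to its defaults.
daq.write("VOLT:RANG 10, (@101)")

# Check the instrument's error queue to confirm the command was accepted.
print(daq.query("SYST:ERR?"))
daq.close()
```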

 

However, this did not solve my problem with the LabVIEW program, since it uses the ROUT set of SCPI commands. Now the very first measurement I take is correct, but the remaining 124 are zero. I also loop through the acquisition portion for a user-selected amount of time, and each pass returns zero; the only correct measurement is the very first one. I feel that I'm very close and there is just something small that I'm missing. Please have a look at my program (replace the continuous acquisition program from the examples folder of the LabVIEW driver and put the channel selection in the example folder as well) and let me know if there are any glaring (or minute) issues I may have overlooked. I'm using LabVIEW 8.5 for this code.

 

Thanks in advance,

Steve

 

 

Message 12 of 33

One thing I forgot to mention: if the DAQ power is cycled, then the first time I run the program it operates exactly as I intended, but every run after that is only correct for the very first measurement. Setting the reset input on the Initialize VI to True seems to have no impact; I would have expected resetting the instrument to be similar to cycling the power.

 

Steve

Message 13 of 33

Hello Steve,

Thank you for the additional information. If you run one of the examples that uses Configure Channel instead of Configure Route, does it work consistently? It is very interesting that power-cycling the device makes the program run as intended. The reset VI just sends the reset command, and that behavior is determined by the device's firmware. In many instruments resetting is very similar to power-cycling, but evidently not in this case.
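For reference, all the reset input really does is send the standard IEEE 488.2 reset command over the bus (often with a status clear alongside it). A rough Python/PyVISA sketch of the equivalent traffic, with a placeholder address:

```
import pyvisa

daq = pyvisa.ResourceManager().open_resource("USB0::...::INSTR")  # placeholder address

daq.write("*RST")          # IEEE 488.2 reset; firmware decides how closely this matches a power cycle
daq.write("*CLS")          # clear the status and error queues
print(daq.query("*IDN?"))  # confirm the device still responds
```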

 

Cheers,

 

NathanT

Message 14 of 33

Hello Nathan,

 

I found some errors in my program that were not causing LabVIEW to stop working, but they were causing the DAQ to return to its default state, wiping out all of my settings. I also believe the other issue I was having (only the very first measurement of the first block was correct; everything else was zero) is a grounding issue. Since the DAQ is not plugged into the thermocouple board that I made, most of the inputs are floating; I only shorted out a handful of them. Accordingly, the program works properly when I limit it to 4 or 5 channels, but above that the issue presents itself. I've attached the fixed VI for your reference.

 

I do have another concern, however. I changed the scan speed to a static 10 kHz instead of trying to calculate the maximum theoretical speed of the system, since that calculation wasn't tracking the input; it seemed to be repeating the initial block of data. In any case, I don't think we will ever need to measure at 10 kHz on a single channel, and using all 32 differential channels yields a rate a little faster than 1 Hz, which is one of the common sampling rates here (along with 1 sample every 6 seconds). The issue I'm having now is that the DAQ's buffer fills up after approximately 30 seconds of acquisition, even though I have the # of points set to 1000, so it outputs one block of data every 100 ms (meaning the computer has to retrieve each block in under 100 ms to keep up).

 

I believe the computer cannot keep up with the DAQ, and I was wondering if you know of any way for me to retain the 10 kHz / 1000-samples-per-block setup I'm currently using while still keeping the buffer emptied, since the tests we run can go on for a number of hours. Ideally I'd like it to be able to run indefinitely, but I'll take what I can get.

 

I hope this makes sense to you; my mind is on way too many projects at once, and I apologize for rambling.

 

Steve

Message 15 of 33

Steve,

 

Here are a couple of suggestions for getting rid of the buffer overflow issue:

- Read this.

- Use DMA rather than interrupts.

- Read more data at a time: set your # of points to 10,000, or to -1 to read all of the data available in the buffer for a continuous task (there is also a Read All Available Samples property node). See the sketch after this list.

- Increase the buffer size. This will just delay the inevitable: you are clearly filling the buffer faster than it is being emptied, so a larger buffer only increases the time you have before the error hits.
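To make the "read more data at a time" point concrete: the read loop has to drain the buffer at least as fast as samples arrive, so keep it free of slow work. One standard way to do that is a producer/consumer structure (in LabVIEW, two parallel loops connected by a queue). Here is a rough Python sketch of the idea; the fetch command and the handle_block routine are placeholders, not the actual device API:

```
import queue
import threading

data_q = queue.Queue()
stop = threading.Event()

def producer(daq):
    # Tight loop: do nothing but pull blocks off the instrument, so the
    # device buffer is emptied as fast as data arrives.
    while not stop.is_set():
        block = daq.query("WAV:DATA?")  # placeholder fetch command
        data_q.put(block)

def consumer():
    # Slow work (parsing, file writes, graphing) happens off the acquisition path.
    while not stop.is_set() or not data_q.empty():
        try:
            block = data_q.get(timeout=0.5)
        except queue.Empty:
            continue
        handle_block(block)  # hypothetical parse/log/display routine

# threading.Thread(target=producer, args=(daq,), daemon=True).start()
# threading.Thread(target=consumer, daemon=True).start()
```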

 

---

Peter Flores
Applications Engineer
Message 16 of 33

Peter,

 

Thanks for the suggestions. You are right: I basically have no choice other than to increase the buffer size to 10,000 points (my sample rate). I can then break that block of data into 1000-sample blocks and write to file/display/graph the blocks corresponding to the user-selected sampling rate (usually 1 sample every 6 seconds, sometimes 1 Hz). The only issue I can foresee is that when I expand to all 32 channels I can't get a steady output of data from the DAQ (somewhere around 2 Hz when all channels are enabled), which is going to complicate things.
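Just to spell out the bookkeeping I have in mind (a rough Python sketch; in the real program this will be LabVIEW array operations):

```
SAMPLE_RATE = 10_000   # samples per second per channel
BLOCK = 1_000          # samples per 100 ms sub-block

def split_blocks(samples):
    # One 10,000-sample read covers 1 s; split it into ten 1000-sample blocks.
    return [samples[i:i + BLOCK] for i in range(0, len(samples), BLOCK)]

def block_average(block):
    # Average each block down to a single reading (one value per 100 ms),
    # then keep only the readings matching the user-selected output rate.
    return sum(block) / len(block)
```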

 

I need a faster computer!

 

Steve

Message 17 of 33

Hello Steve,

Peter linked to documents that explain how to increase the buffer size with NI-DAQmx, which is the device driver for NI DAQ products. Since you are using an Agilent USB DAQ device with NI-VISA, you will need to use the "VISA Set I/O Buffer Size" VI. Please let us know if that helps.
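If it helps, that VI is a thin wrapper around the VISA viSetBuf() call, and the mask values come from the VISA specification (visa.h): VI_READ_BUF = 1 and VI_WRITE_BUF = 2 for the formatted-I/O buffers. A rough PyVISA equivalent, with the address and size as placeholders (note that not every interface accepts every mask):

```
import pyvisa

daq = pyvisa.ResourceManager().open_resource("USB0::...::INSTR")  # placeholder address

VI_READ_BUF = 1   # formatted-I/O read buffer mask, from the VISA spec
VI_WRITE_BUF = 2  # formatted-I/O write buffer mask

# Equivalent of the "VISA Set I/O Buffer Size" VI: set the read buffer to 64 kB.
daq.visalib.set_buffer(daq.session, VI_READ_BUF, 65536)
```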

 

Thank you,

 

NathanT

Message 18 of 33

Hello Nathan,

 

Thanks for the information. I just tried to incorporate the VISA Set I/O Buffer Size VI into my program, but no matter which buffer I select (transmit, receive, or both) it comes back with an error stating "Invalid buffer mask specified." I suppose I need to find out which mask numbers correspond to which buffers on the U2356A I'm using.

 

I've done some IVI-COM programming with this DAQ, and I do recall setting the buffer size through that interface, but I'm not quite sure how I would go about extracting the buffer mask number.

 

Steve

Message 19 of 33

So I decided to take a slightly different route in my programming. Instead of using the continuous acquisition features of the DAQ, I've decided it will make my life a lot easier to just use the MEAS? (@<channel list>) command, along with the VOLT:RANG, POL and STYP commands, to set the parameters of the channels. Although this doesn't give me the extreme resolution that averaging 1000 samples does, it is still acceptable for us (I was getting ~10 µV by averaging, and the DAQ by itself now runs at ~1.2 mV; considering that my range for temperature measurement is ±5 V, that should be OK).
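In rough code form, the new approach is just a simple polling loop (shown as a Python/PyVISA sketch; the address and channel list are placeholders, and the command spellings are the ones from this thread, to be verified against the programming guide):

```
import time
import pyvisa

daq = pyvisa.ResourceManager().open_resource("USB0::...::INSTR")  # placeholder address

# One-time channel setup, using the command forms discussed in this thread.
daq.write("VOLT:RANG 10, (@101:108)")

while True:
    reading = daq.query("MEAS? (@101:108)")  # one polled scan of the channel list
    print(reading)
    time.sleep(0.2)  # throttle the loop; 5 Hz is plenty for our tests
```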

 

My program is now bottlenecked by the speed of the computer if I run the code wide open. I'm most likely going to throttle it back to 5 Hz, since even that is much faster than what we need, and it makes my data analysis easier.

 

Thanks to all who have offered assistance; I'm sure I'll be posting back here soon when I run into problems with the new program.

 

Steve

Message 20 of 33