NI TestStand


a quest for AI speed

SW: TestStand 2.01, LabVIEW 6.1, and NI-DAQ 6.9
Hardware: PCI-6036E, SCXI-1000 chassis, module 2: SCXI-1102, module 3: SCXI-1520, module 4: SCXI-1520, and adapter modules for measuring temperature and force. A cable connects the PCI-6036E to module 3 (strain gauges).
In MAX I set up sixteen virtual channels for strain gauges and sixteen virtual channels for temperature measurement.
In the TestStand sequence, in the PreUUT loop, I have a LabVIEW VI calling:
AI Group Config.vi with one strain gauge virtual channel and one temperature virtual channel, device = 3 (the PCI-6036E), and group = 0;
AI Buffer Config.vi with scans per buffer = 3 (the smallest value it accepts without an error), number of buffers: no change, allocation mode: no change.

In the TestStand main sequence, to read temperature and force, I have a VI calling:
AI Clock Config.vi with clock frequency: 10,000; which clock: scan clock 1; clock source: internal;
AI Control.vi with minimum pretrigger scans: 0; control code: Start; total scans to acquire: 3; number of buffers to acquire: 1;
AI Single Scan.vi with read newest data, returning an array of two real numbers.
This VI in the main sequence takes 0.4 s to obtain one temperature point and one force measurement. That seems too long to me. What am I doing wrong? How can I improve the AI to read the two channels faster?
Message 1 of 4
Hello,

I might be able to shed some light on this issue. There is a bit of overhead when configuring your DAQ board, so the "config" VIs will take longer than expected. The actual acquisition portion, however, shouldn't take long at all. Typically, if you are running a time-sensitive application, you will want to do all your configuring before you need to acquire data. Once you are ready, use AI Control (Start) to arm the acquisition. Since your board can handle up to 200 kS/s, you can acquire data at that rate (5 µs per scan rather than 0.4 s), but only if you use a buffered transfer and do not reconfigure for each acquisition.

It seems you might be using a loop and reconfiguring your acquisition each time through it, which limits performance because of all the configuring. Unfortunately, there is no way to eliminate the overhead of the configuration VIs; you have to design your final application around this limitation.
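The structure this suggests can be sketched in pseudocode (Python here, since LabVIEW diagrams can't be shown in text; `configure`, `start`, and `read_scan` are hypothetical stand-ins for AI Group Config/AI Control/AI Single Scan, not real driver calls):

```python
class FakeDaqSession:
    """Hypothetical stand-in for a DAQ session; not a real NI-DAQ API."""

    def __init__(self):
        self.config_calls = 0
        self.configured = False

    def configure(self, channels, scan_rate_hz):
        # Configuration carries fixed overhead, so it should run once, up front.
        self.config_calls += 1
        self.channels = channels
        self.scan_rate_hz = scan_rate_hz
        self.configured = True

    def start(self):
        # Arm the acquisition (like AI Control with control code = Start).
        assert self.configured

    def read_scan(self):
        # Return the newest scan: one value per configured channel.
        return [0.0] * len(self.channels)

# Configure ONCE (e.g. in the TestStand PreUUT loop), then read many times
# in the main sequence without touching the config VIs again.
session = FakeDaqSession()
session.configure(channels=["temperature", "force"], scan_rate_hz=10_000)
session.start()

readings = [session.read_scan() for _ in range(100)]

print(session.config_calls)  # 1 -- the config overhead is paid only once
print(len(readings))         # 100
```

The point is purely structural: all the expensive setup lives outside the read loop, so each loop iteration costs only a scan read.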

There is however another option. LabVIEW 7.0 and the new NI-DAQ 7.0 driver (only works with LabVIEW 7) have optimized the single point acquisition process. NI-DAQmx (part of NI-DAQ 7.0) defines and enforces a state model, which eliminates unnecessary reconfiguration of input limits, timing, triggering, and accessories (all done behind the scenes) and thus provides higher performance. It is currently over 10 times faster than NI-DAQ 6.9.

Your best bet, though, is to design for a buffered acquisition in which you don't have to reconfigure each time. Hope that helps. Have a good day.
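The "read newest data" behavior of a buffered acquisition can be mimicked with a simple ring buffer (again an illustrative Python sketch, not driver code): the board fills the buffer continuously, and the application pulls only the latest scan.

```python
from collections import deque

class RingBuffer:
    """Minimal ring buffer standing in for the driver's acquisition buffer."""

    def __init__(self, scans_per_buffer):
        self._buf = deque(maxlen=scans_per_buffer)

    def append_scan(self, scan):
        # Hardware side: each new scan overwrites the oldest when full.
        self._buf.append(scan)

    def newest(self):
        # Application side: "read newest data" returns the latest scan only.
        return self._buf[-1]

buf = RingBuffer(scans_per_buffer=3)   # 3 scans, as in the original setup
for i in range(10):                    # the hardware keeps acquiring...
    buf.append_scan({"temperature": 25.0 + i * 0.1, "force": 1.0 * i})

newest = buf.newest()
print(newest["force"])   # 9.0 -- only the most recent scan matters
```

Because the hardware keeps the buffer full on its own clock, the application never waits for a conversion; it just reads whatever arrived last.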

Ron
Applications Engineer
National Instruments
Message 2 of 4
Ron,
Thank you for your assistance.
I am connecting the board to the SCXI modules, so only channel CH0 on the PCI-6036E is used now.
My goal is to read one temperature and one force measurement (different channels on the modules) each time I call AI Single Scan. I have a few basic questions about analog input.
Question 1: I really don't need a buffer; I just want the newest value. Am I correct? With the PCI-6036E and SCXI channels, at what point do I set the scan rate (in MAX, in the PreUUT loop...?), assuming I don't want to call the config VIs for every reading?
Question 2: In AI Clock Config, why is 10,000 the maximum clock frequency I can enter? (A higher number gives me an error.)
Message 3 of 4
Hi,

I find the easiest way to program using NI-DAQ, SCXI and LabVIEW is to create a Virtual Channel in Measurement and Automation Explorer (MAX). You can do this by right-clicking on Data Neighborhood and creating a new virtual channel.

Then you can use any of the Analog Input shipping examples (Help >> Find Examples >> Hardware Input and Output >> Analog Input >> General) and use the Virtual Channel names you created as the channel string values. Usually the channel input is an array of strings, so you can select both of your virtual channels (temperature, etc.). If it does not let you enter your virtual channel names in the channel string box, right-click it and select "Allow Undefined Names"; after that, any channel name is accepted.

As for the maximum value in AI Clock Config, there shouldn't be a maximum value in general. It could be that the 1520 has a maximum of 10,000 because of the filters that are enabled on the module. This is typical of signal conditioning modules. The user manual for the 1520 should indicate these bandwidth limits (ni.com/manuals).
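As a rough sanity check on the numbers (my own back-of-the-envelope arithmetic, not something taken from the 1520 manual): on an E-series board, one scan clock tick samples every channel in the scan list once, so the per-channel rate equals the scan rate, while the board's aggregate rate is scan rate × channel count and must stay under the 200 kS/s limit.

```python
def aggregate_rate(scan_rate_hz, num_channels):
    """Total samples/s the board must deliver for a given scan clock.
    Each scan clock tick samples every channel in the scan list once."""
    return scan_rate_hz * num_channels

board_max = 200_000   # PCI-6036E maximum, as stated earlier in the thread
scan_rate = 10_000    # the ceiling hit in AI Clock Config
channels = 2          # one temperature + one force channel

print(aggregate_rate(scan_rate, channels))               # 20000
print(aggregate_rate(scan_rate, channels) <= board_max)  # True
```

So a 10 kHz scan clock with two channels uses only 20 kS/s of the board's bandwidth, which supports the idea that the 10,000 ceiling comes from the module's filter bandwidth rather than from the board itself.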

Anyway, using Virtual Channels in MAX and the shipping examples in LabVIEW will give you a good headstart in programming this type of application.

Hope that helps.

Ron
Message 4 of 4