11-24-2009 03:19 PM
Hello,
I am working with a new cDAQ-9172. The modules in use for analog acquisition are an NI 9211 (2 channels) and an NI 9219 (1 channel, high speed). I have created a while loop merely for acquiring these signals, but I am unable to accurately predict the time needed to complete one iteration. For instance, the time taken for 10 samples at a sampling rate of x Hz was assumed to be (10/x) + ADC delay. However, this math does not seem to hold true for most rates: the actual time taken is less than predicted for low rates like 1 Hz, and more than expected for higher rates. Besides, over multiple iterations the actual time taken remains almost constant at higher sampling frequencies (1 kHz, 10 kHz, etc.). I would be interested in knowing the reason for this.
Could you please tell me exactly how to calculate the time for an iteration? Have I missed out on some vital time-dependent factor?
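For reference, the naive timing model described above can be sketched as follows. This is a sketch of the poster's arithmetic only; the ADC delay parameter is a hypothetical placeholder, not a datasheet value for the NI 9211 or NI 9219.

```python
# Naive model of the time one finite acquisition "should" take:
#   t = samples / rate + adc_delay
# adc_delay is a hypothetical placeholder, not a datasheet figure.

def predicted_iteration_time(samples, rate_hz, adc_delay_s=0.0):
    """Predicted time (s) to acquire `samples` points at `rate_hz`."""
    return samples / rate_hz + adc_delay_s

# At 1 Hz, 10 samples nominally take 10 s of acquisition alone, so any
# fixed software overhead is negligible by comparison. At 1 kHz and up,
# the acquisition itself takes only milliseconds, so fixed per-iteration
# overhead (task setup, start, stop) dominates -- which is consistent
# with the loop time appearing constant at high sampling rates.
for rate in (1, 100, 1000, 10000):
    print(rate, predicted_iteration_time(10, rate))
```

The last observation in the model is the key: once the sample clock is fast, the acquisition term 10/x shrinks toward zero and whatever fixed overhead the software adds per iteration sets the floor on loop time.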
11-24-2009 03:55 PM
Hello Jerry,
So there are a lot of factors that can affect your loop rate. To start off, could I get some more detail about your software setup? Specifically:
1. What programming language and version are you using?
2. Are you using the DAQ Assistant or the DAQmx VIs?
3. Are you using hardware timing, and is the acquisition finite or continuous?
11-24-2009 10:23 PM
The required details are:
1. Programming language: LabVIEW 8.0
2. The DAQ Assistant is used
3. Hardware timing, finite acquisition
Thank you.
11-25-2009 09:29 AM
Hi Jerry,
So using the DAQ Assistant isn't going to result in consistent loop times in most cases. If the Assistant is returning all of the samples you request in one iteration, then on every iteration it has to commit the settings to hardware, start the task, read the samples, and then stop and clear the task. The time this takes varies with what system resources are currently available. If you want a repeatable loop time, using the DAQmx VIs directly is more reliable: you can configure and start the task once before the loop, and only perform the read inside it.
A setup similar to this is what I use if I want to measure loop time:
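(The LabVIEW snippet referenced above is an attachment that doesn't survive in this thread. As a rough text-based analogue only, not the original diagram, the same idea of timestamping each iteration and differencing can be sketched in Python:)

```python
import time

def measure_loop_times(n_iterations, work):
    """Call `work()` n_iterations times and return each iteration's
    elapsed time in seconds -- analogous to wiring a tick count into
    a shift register in LabVIEW and subtracting across iterations."""
    times = []
    for _ in range(n_iterations):
        start = time.perf_counter()
        work()  # stand-in for the DAQmx Read (or DAQ Assistant) call
        times.append(time.perf_counter() - start)
    return times

# Example: time a dummy workload standing in for a hardware read.
durations = measure_loop_times(5, lambda: sum(range(10000)))
print(min(durations), max(durations))
```

Comparing the minimum and maximum iteration times this way makes the per-iteration overhead and its jitter visible directly, rather than inferring it from the sample count and rate.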