Multifunction DAQ


A/D Sample Rate on PCI-6220

I am using 'on demand' sampling on a PCI-6220 card and I would like to know what sampling time per channel I will be getting. There doesn't appear to be a setting, so am I to assume it will be 4 µs per channel (250 kS/s)?

 

I have an input signal containing noise spikes which are affecting the reading, and I would like to do some averaging. I need to know the sampling time so that I can decide how many readings I can safely take within the 10 ms time frame in my application.
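As an illustration of the kind of spike-tolerant averaging described above, here is a minimal Python sketch (not from the thread; the function names and sample values are made up). A median is often more robust against isolated spikes than a plain mean:

```python
# Illustrative sketch: combining several on-demand readings to suppress
# noise spikes. The voltage values below are invented for the example.

def average_readings(readings):
    """Plain mean of a list of voltage readings."""
    return sum(readings) / len(readings)

def median_reading(readings):
    """Median of the readings; more robust against isolated spikes."""
    s = sorted(readings)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2.0

readings = [1.02, 0.99, 1.01, 4.80, 1.00]  # one spike at 4.80 V
print(average_readings(readings))  # mean is pulled up by the spike
print(median_reading(readings))    # median stays near 1.0 V
```

With the single 4.80 V spike in the list, the mean lands well above the true level while the median ignores the outlier, which is why a median (or a trimmed mean) is worth considering when spikes are the dominant noise.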

 

Thanks

 

Tony

Message 1 of 6
The sample rate when using on-demand acquisition is determined by the time between iterations of your loop, and as such it is highly variable because you are not using a real-time OS. You are using software timing, and the rate will not be 250 kS/s, or anywhere close to it.
Message 2 of 6

Hi,

 

Thanks for the reply.

 

I realise that I will not be able to read the ADC at that speed, but my question is really: what is the conversion time of the ADC itself? To put it another way, if I ask the board to read 8 channels into a buffer using a single call to DAQmxReadAnalogF64(), how long will it take to sample each channel?

 

Hope that makes sense.

 

Tony

Message Edited by Software Monkey on 11-18-2009 10:11 AM
Message 3 of 6
Sorry, I misunderstood. I believe the time to perform a single scan will be whatever you set it to. If you set it to the maximum, then I think 4 µs between channels would be accurate.
Message 4 of 6
For a multiplexed device, the driver uses an algorithm to determine a default conversion rate if one is not explicitly set.  I can't remember the exact algorithm off the top of my head, but I think it's something like the maximum conversion rate of the ADC with a pad of a few microseconds to allow for maximum settling.  When using hardware timing, this pad gets shrunk as necessary to meet the overall sample rate.  For on-demand (software) timing, I think the pad is always there.  This algorithm is documented somewhere in the DAQmx help file (I forget exactly where right now).

Regardless, the easiest way to find out which rate is being used is to read back the convert rate property.  If you don't like the default used by the driver, you can override the property and set it to something faster or slower as necessary to achieve the desired rate and accuracy.
Message 5 of 6

The default rate is described in NI-DAQmx Key Concepts >> Timing and Triggering >> Timing, Hardware Versus Software >> Clocks:

 AI Convert Clock—The clock on a multiplexed device that directly causes ADC conversions. The default AI Convert Clock rate uses 10 µs of additional settling time between channels, compared to the fastest AI Convert Clock rate for the device. When the Sample Clock rate is too high to allow for 10 µs of additional settling time, the default AI Convert Clock rate uses as much settling time as is allowed by the Sample Clock rate.

Brad

---
Brad Keryan
NI R&D
Message 6 of 6