11-18-2009 08:54 AM
I am using 'on demand' sampling on a PCI-6220 card and I would like to know what sampling time per channel I will be getting. There doesn't appear to be a setting, so am I to assume it will be 4 µs per channel (250 kS/s)?
My input signal contains noise spikes that are affecting the reading, so I would like to do some averaging. I need to know the sampling time so that I can decide how many readings I can safely take within the 10 ms time frame in my application.
Thanks
Tony
11-18-2009 10:10 AM - edited 11-18-2009 10:11 AM
Hi,
Thanks for the reply.
I realise that I will not be able to read the ADC at this speed, but my question is really: what is the conversion time of the ADC? To put it another way, if I ask the board to read 8 channels into a buffer using a single call to DAQmxReadAnalogF64(), how long will it take to sample each channel?
Hope that makes sense.
Tony
11-18-2009 08:14 PM
The default rate is described in NI-DAQmx Key Concepts >> Timing and Triggering >> Timing, Hardware Versus Software >> Clocks:
AI Convert Clock—The clock on a multiplexed device that directly causes ADC conversions. The default AI Convert Clock rate uses 10 µs of additional settling time between channels, compared to the fastest AI Convert Clock rate for the device. When the Sample Clock rate is too high to allow for 10 µs of additional settling time, the default AI Convert Clock rate uses as much settling time as is allowed by the Sample Clock rate.
Brad