Multifunction DAQ


Interchannel Delay

Problem: How can I check the interchannel delay of my DAQ task in LabVIEW? I want to know what interchannel delay NI-DAQ set automatically, so that I can change it if I need to give the board's instrumentation amplifier more time to settle.
Solution: In LabVIEW, place an instance of Clock Config.vi right after AI Config.vi and set the "which clock" input to "channel clock". The "actual clock rate specification" output will show the interchannel delay that NI-DAQ configured for the specified sample rate.
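
As a side note, on the newer NI-DAQmx driver (rather than the Traditional NI-DAQ VIs named above) the channel clock is exposed as the AI convert clock. A minimal sketch of reading and adjusting it with the nidaqmx Python package; the device name, channel list, and rates below are placeholders, not values from this thread:

```python
# Sketch using the NI-DAQmx Python API (nidaqmx), not the Traditional NI-DAQ
# VIs described above. "Dev1/ai0:15" and the rates are placeholder values.
import nidaqmx
from nidaqmx.constants import AcquisitionType

n_channels = 16
scan_rate = 50_000.0  # Hz, per-channel scan (sample) clock

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0:15")
    task.timing.cfg_samp_clk_timing(rate=scan_rate,
                                    sample_mode=AcquisitionType.CONTINUOUS)

    # The AI convert clock is DAQmx's channel clock; its period is the
    # interchannel delay the driver picked for this sample rate.
    delay_us = 1e6 / task.timing.ai_conv_rate
    print(f"automatic interchannel delay: {delay_us:.2f} us")

    # To give the instrumentation amplifier more settling time, slow the
    # convert clock. Spreading the channels across the whole scan period
    # (n_channels * scan_rate) is the slowest rate that still fits.
    task.timing.ai_conv_rate = n_channels * scan_rate
```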

---------------------
This sounds like an answer until you actually try to do it. All you get is a couple of clock rates, a divisor, and a period. Is interchannel delay simply the period?

Does sampling 16 channels at 50 kHz (20 µs) with a period of 0.8 µs give about 7 µs of "free time" during the 20 µs?
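
A quick back-of-the-envelope check of those numbers (a Python sketch; all values are taken from the question above):

```python
# Free time per scan for the example in this thread: 16 channels, 50 kHz scan
# clock, 0.8 us channel clock period.
n_channels = 16
scan_rate = 50_000.0        # Hz -> one scan every 20 us
channel_period = 0.8e-6     # s, channel clock period (interchannel delay)

scan_period = 1.0 / scan_rate                # 20 us
busy_time = n_channels * channel_period      # 16 * 0.8 us = 12.8 us
free_time = scan_period - busy_time          # about 7.2 us left idle per scan

print(f"time spent converting per scan: {busy_time * 1e6:.1f} us")
print(f"free time per scan: {free_time * 1e6:.1f} us")
```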
Yes, that is correct. There are two clocks at work here: the scan clock and the channel clock. The period of the channel clock is the delay between channels, also known as the "interchannel delay". This way, all the channels can be sampled as close to the edge of the scan clock as possible, and, like you said, the rest of the 20 µs is just "free time".

Some people find this useful because it roughly simulates simultaneous sampling: if all the channels are sampled quickly at the beginning of the scan, for all practical purposes they might as well have been sampled at the same time. Others need to slow down the channel clock (that is, increase the interchannel delay) if they run into incorrect measurements due to settling-time issues.
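
To make that trade-off concrete, here is a small sketch using the same 16-channel, 50 kHz example (the numbers come from this thread, not from any particular board's specifications): the interchannel delay can sit anywhere between the fast channel clock discussed above and the point where the channels fill the entire scan period.

```python
# Range of usable interchannel delays for 16 channels scanned at 50 kHz.
# 0.8 us is the fast setting from this thread; the upper bound is reached
# when the channels are spread across the whole scan period.
n_channels = 16
scan_rate = 50_000.0                     # Hz, scan clock
fast_delay = 0.8e-6                      # s, fast channel clock -> near-simultaneous sampling

scan_period = 1.0 / scan_rate
max_delay = scan_period / n_channels     # 1.25 us: channels fill the scan, maximum settling

print(f"fast interchannel delay: {fast_delay * 1e6:.2f} us (lots of free time)")
print(f"slowest delay that still fits: {max_delay * 1e6:.2f} us (no free time left)")
```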

I hope this helps. Let me know if you have any questions.

Russell
Applications Engineer
National Instruments
http://www.ni.com/support