

Convert Clock duty cycle

I am wondering about the duty cycle of the AI Convert clock.

Here is my situation:

I am running an analog input on channels 1 and 2, and I have a synchronized digital output at 20 times this frequency.  This digital output controls a multiplexer.  The multiplexer needs a few commands to get set up, and then I allot some time for settling.

The data collected from Ch 1 looks perfect.  However, the data from Ch 2 looks very suspicious to me.  It looks like it is being sampled right in the middle of the MUX setup sequence.

Here is what I think is going on:
Digital Output: 1 MHz
Analog Input: 50 kHz (2 channels)

The Mux sequence goes something like this:
Nothing (1 cycle), set up differential input Ch A (3 cycles), set up differential input Ch B (3 cycles), nothing (13 cycles)

Now, if the analog input convert clock just multiplies the sample clock by 2 (to generate the Ch A clock and the Ch B clock), then it makes sense that Ch B would be sampling right in the middle of the MUX setup.

Is there a way to push both of the converts (Ch A and Ch B) toward the end of the cycle?  Essentially, I would like them to sample at nearly the same time and then wait for the rest of the sampling period while the MUX is set up and the signals settle.

I can attach my VI if anyone is interested, but it is quite complicated at this point in time.

Thanks,
Drew

Message 1 of 7

Drew,

It would seem that you're getting the first channel measurement before the mux does anything, and the second in the middle of it changing.

DAQmx offers a property called DelayFromSampClk.Delay, as well as DelayFromSampClk.DelayUnits. They are both located in the DAQmx Timing property node under More->AI Convert. By delaying the convert clock toward the end of your sample clock interval, you can make sure you're getting both channels after the MUX has changed.
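
For reference, the same two properties are available in the NI-DAQmx text APIs. Here is a minimal sketch in C, assuming an AI task handle named "task" that already has a sample clock configured (the 18 us value is only an illustrative placeholder):

    DAQmxSetDelayFromSampClkDelayUnits(task, DAQmx_Val_Seconds);  /* express the delay in seconds */
    DAQmxSetDelayFromSampClkDelay(task, 18.0e-6);                 /* hold conversions off until late in each sample period */

These calls map one-to-one to the DelayFromSampClk.Delay and DelayFromSampClk.DelayUnits items in the LabVIEW property node.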

Hope that helps.

Daniel D.

Message 2 of 7
Unless I can change the duty cycle (or change the delay on a channel-by-channel basis), it seems that I am just shifting the problem.  If I use the delay, it will fix Ch 1 but will shift Ch 2 into sampling while the MUX is setting up.  Does this make sense?

I don't think that a simple delay is the answer.

Message 3 of 7

The convert clock is not restricted to being a 2x multiple of the sample rate (in the case of 2 channels to sample).  Poke around further in the DAQmx Timing property node and you should find a way to set the rate.  In combination with the previous suggestion about the delay, you should be able to shift the 2 ADC conversions down toward the late end of the sample cycle.
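
For anyone working in the text APIs, the property in question is the AI convert clock rate. A hedged sketch in C (the 1 MHz figure is only an example; query the maximum first and stay within it):

    float64 maxConvRate = 0.0;
    DAQmxGetAIConvMaxRate(task, &maxConvRate);  /* fastest convert clock the board supports */
    DAQmxSetAIConvRate(task, 1.0e6);            /* e.g. 1 MHz -> 1 us between the two channel conversions */

Combined with the delay property mentioned above, this packs both conversions close together near the end of each sample interval.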

-Kevin P.

Message 4 of 7
Hello SuperSnake428,

Just to elaborate on what has already been said, the default convert clock behavior for most of our DAQ boards is a 10 us interchannel delay.  This keeps the sampling relatively close to simultaneous with respect to the sample clock edge.  It is possible to set the convert clock to "round robin" sampling, which (in this case) would divide the sample clock by 2 to give the longest possible time between channels.  In order for that to happen, though, you have to specify it.  You can also specify the convert clock yourself if you wish.

All of these properties can be read or written with the DAQmx Timing property node, in the location Daniel D. mentioned above (More->AI Convert).

And I agree with Kevin P.: poking around that property node should let you fine-tune exactly the behavior you are looking for.  If you are unsure what a property does, the context help (Ctrl+H) gives a good summary.

So the moral of the story is that delaying the convert clock should not shift the second channel into the MUX setup unless you have specifically changed the sampling to "round robin". I hope this helps; let us know if this solution works for you.
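
If you want to verify what your task is actually doing, the convert clock rate (and therefore the interchannel delay) can simply be read back. A small sketch in C, again assuming an already-configured AI task handle named "task":

    float64 convRate = 0.0, interchannelDelayUs = 0.0;
    DAQmxGetAIConvRate(task, &convRate);       /* current AI convert clock rate, in Hz */
    interchannelDelayUs = 1.0e6 / convRate;    /* e.g. 100 kHz -> 10 us between channels */

With the default interval scanning this typically comes out around 10 us; a value equal to the sample period divided by the number of channels would indicate round-robin behavior.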


Neal M.
Applications Engineering       National Instruments        www.ni.com/support
Message 5 of 7
Neal & Others,

So are you saying that the default convert clock actually looks like my "Desired AI Convert Clock" in the attached picture?  That is what I am trying to achieve.

Drew




Message 6 of 7
Hello SuperSnake428,

Theoretically, yes, that's what we are saying.  At least, that is how it is calculated and seen by the card.  In reality, things are a bit more confusing.  I used the AI Convert Clock Rate property on my M Series card and discovered that for 2 channels read at 50 kHz, my convert clock was running at 100 kHz (10 us between channels).  This means that the convert clock looks like your "Actual Convert Clock" signal.  This is not because the sample clock is being divided by two, but because that is the rate the hardware uses by default.  You can increase this rate (by writing to that property) up to 1 MHz (1 us between channels); I found the limit by using the AI Convert Maximum Rate property.  You have to be careful when increasing this rate, since you can start running into issues such as ghosting between channels.
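
For anyone who prefers the text APIs, here is a rough end-to-end sketch of the approach using the NI-DAQmx C API.  The device name, channel range, voltage range, and the 17 us delay are placeholders only (pick a delay that leaves enough time for your MUX setup and settling), and error checking is omitted for brevity:

    #include <NIDAQmx.h>
    #include <stdio.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64 maxConvRate = 0.0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "Dev1/ai1:2", "", DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);

        /* 50 kS/s sample clock -> a 20 us period for the 2-channel scan */
        DAQmxCfgSampClkTiming(task, "", 50000.0, DAQmx_Val_Rising,
                              DAQmx_Val_ContSamps, 1000);

        /* Check the fastest convert clock the board supports, then run the
           two conversions back to back (1 us apart at 1 MHz). */
        DAQmxGetAIConvMaxRate(task, &maxConvRate);
        printf("Max convert rate: %.0f Hz\n", maxConvRate);
        DAQmxSetAIConvRate(task, 1.0e6);

        /* Hold the first conversion off until ~17 us into the 20 us period,
           so both conversions happen after the MUX has been set up and settled. */
        DAQmxSetDelayFromSampClkDelayUnits(task, DAQmx_Val_Seconds);
        DAQmxSetDelayFromSampClkDelay(task, 17.0e-6);

        DAQmxStartTask(task);
        /* ... DAQmxReadAnalogF64() as usual ... */
        DAQmxStopTask(task);
        DAQmxClearTask(task);
        return 0;
    }

This is only a sketch of the same settings the LabVIEW Timing property node exposes; in LabVIEW you would set them exactly as shown earlier in the thread.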

Here are a couple links that may clarify some:

What Is the Difference Between Interval Scanning and Round Robin Scanning?

Minimum and Maximum Values for the Interchannel Delay Setting

How Can I Check the Interchannel Delay of My DAQ Task Using LabVIEW?

You can then use the Delay From Sample Clock property to delay the convert clock off of the sample clock edge if you wish. I hope this clarifies things.
Neal M.
Applications Engineering       National Instruments        www.ni.com/support
Message 7 of 7