Multifunction DAQ


Random channel skew on a USB-6251


The AI and AO timing engines divide down the sample clock timebase (20 MHz by default)

I'm confused now. Is the timebase 10 MHz or 20 MHz? Or is the 10 MHz clock a different signal?

 


If you don't synchronize the tasks using a shared start trigger or shared sample clock

Shared start trigger AND shared sample clock, or just one of the two? I think I am already using the same clock for the AIs and the AO. If AO starts dividing the 20 MHz down at a different time (phase), that does not matter to me, because I am only interested in the relative phase of my three AIs. But if the AO is updated between two AIs, that is obviously another story, as the two AIs would then be one sample apart from each other. Is this what is happening?

That still does not completely explain what I am seeing, though. If I send a 1 kHz sine wave (or any other frequency, it doesn't matter) to the AO and read 3 AIs, I see (for example, as this changes at every execution) the first two channels *EXACTLY* in phase (how can this happen with a mux'd ADC???) and the third channel leading by *EXACTLY* 1 sample with respect to the other two.

If I repeat the experiment, starting only the AIs and sending the stimulus from the sound card, I see exactly what I expect: three sine waves delayed by 0.1 samples each, in scan order. (I am using a sampling frequency of 100 kHz and a conversion frequency, I hope this is the correct name for it, of 1 MHz, i.e. the maximum allowed.)

I'll post some plots later today, I don't have the board with me now.

 

Thank you

 

Giovanni

Message 11 of 16

Hi Dan,



I'm not sure my suggestion to synchronize AI and AO is correct.


I think it has to do with it somehow, but I still don't have a clear picture...

 


Since you're trying to measure interchannel delay, you need to be measuring a signal which is changing during the time between the sampling of each of your individual AI channels. To make this happen, you'll want to be running your generation much faster than your acquisition.


Not sure what you mean here... I can see a fractional time delay with any signal (say, a sine wave) as long as it's not DC (obviously) or a step (with a step you can only appreciate whole-sample delays)...

 

See also my reply to Brad K.

 

The synchronization I was proposing would have held AO at the same voltage while your AI channels were sampled. This would not show interchannel delay.


Why not? As long as it changes between different samples (i.e. it is not DC)...

 


If all you want to know is what that delay is, you can query DAQmx, and it should provide this information to you.

DAQmxGetAIConvRate

DAQmxSetAIConvRate

DAQmx(Get/Set)DelayFromSampClkDelay
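
For anyone following along, here is a minimal C sketch (not a tested program) of querying and forcing these properties. The device/channel names, the rates, and the omitted error checking are assumptions for the setup described in this thread:

#include <stdio.h>
#include <NIDAQmx.h>

int main(void)
{
    TaskHandle ai = 0;
    float64 convRate = 0.0, delay = 0.0;

    /* Three AI channels, as in the experiment described above. */
    DAQmxCreateTask("", &ai);
    DAQmxCreateAIVoltageChan(ai, "Dev1/ai0:2", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(ai, "", 100000.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 1000);

    /* Ask DAQmx what it chose for the convert clock and the delay. */
    DAQmxGetAIConvRate(ai, &convRate);
    DAQmxGetDelayFromSampClkDelay(ai, &delay);
    printf("convert rate = %g Hz, delay from sample clock = %g\n",
           convRate, delay);

    /* Optionally force the 1 MHz convert clock mentioned earlier. */
    DAQmxSetAIConvRate(ai, 1000000.0);

    DAQmxClearTask(ai);
    return 0;
}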


Great advice, I'll try this later.

Giovanni
Message 12 of 16

Hi Giovanni,

 

Let me start by explaining the timing signals that I think you need to be concerned about, and how these will affect what you are attempting to do.  Here are the relevant signals:

 

AI Sample Clock: On devices such as the USB-6251, which have a single multiplexed ADC, this clock starts the clock that actually acquires data from each of your channels. It occurs at the 'rate' you set in DAQmxCfgSampClkTiming.

AI Convert Clock: This clock controls when each individual channel is sampled. After a sample clock pulse occurs, the hardware waits DelayFromSampClk, then issues one convert clock pulse per channel in your task at the rate specified by AIConvRate (the setting I mentioned in my previous post). The rate of this clock dictates your interchannel delay.

AI Start Trigger: When this signal asserts, your hardware waits StartTrigDelay, then issues your first sample clock. All subsequent sample clocks follow at the rate you specified when you configured timing.

 

What you end up with is something that looks like this (hopefully the formatting of this ASCII art doesn't get too messy):

 

AI Start Trigger:  __n_______________________________________________.....
AI Sample Clock:   ___n______________n______________n________________.....
AI Convert Clock:  ____n___n___n______n___n___n______n___n___n_______.....
                       ai0 ai1 ai2    ai0 ai1 ai2    ai0 ai1 ai2
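
In code, those three signals map onto DAQmx calls roughly as follows. This is a sketch using the 100 kHz / 1 MHz figures from earlier in the thread and an assumed, already-created AI task handle 'ai':

/* AI Sample Clock: 100 kHz scan rate from the onboard timebase. */
DAQmxCfgSampClkTiming(ai, "", 100000.0, DAQmx_Val_Rising,
                      DAQmx_Val_ContSamps, 1000);

/* AI Convert Clock: 1 MHz, so ai0/ai1/ai2 are converted 1 us apart,
   i.e. 0.1 sample of skew between adjacent channels at 100 kS/s. */
DAQmxSetAIConvRate(ai, 1000000.0);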

 

The AO signals we need to concern ourselves with are as follows:

AO Start Trigger: Much like the AI Start Trigger, this signal starts the clock that updates the channels in your AO task.

AO Sample Clock: The clock which actually updates the DACs on your AO channels. Note that each channel has its own DAC, so all channels are updated simultaneously.

 

Now, since AO doesn't have this concept of a convert clock, if we synchronize sample clocks on AI and AO, AO will update its value when the sample clock occurs. If we take noise and slew rate out of the picture, your AO channel will hold this value until the next sample clock. While it is being held, your AI task will scan through its channels at the rate specified by your convert clock. But since AO is being held, all channels should read back the same value. In reality, the slew rate of your AO channel may dictate that your first AI channel reads the AO value before it has completed the transition to its new value, and noise makes it very unlikely that all three of your AI channels actually read back exactly the same value; but those differences are not due to interchannel delay.

 

In my mind, the only way to measure the time between the conversions of two channels is to measure a signal that changes between those conversions, so that you can observe some phase difference. I think that sharing a start trigger between AI and AO, and perhaps using DAQmxSetStartTrigDelay to control the relationship between the AI and AO sample clocks, is a good idea. However, I still think that to measure interchannel delay, your AO channel will need to be updated at a higher rate than your AI convert clock.
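
A sketch of that synchronization scheme in the C API follows; "/Dev1/ai/StartTrigger" is the standard DAQmx terminal name, and the task handles 'ai' and 'ao' are assumed to be created and timed already:

/* Arm AO on AI's start trigger so the two sample clocks have a fixed
   relationship; starting the AI task then starts both together. */
DAQmxCfgDigEdgeStartTrig(ao, "/Dev1/ai/StartTrigger", DAQmx_Val_Rising);

/* Optionally shift AO's first sample clock relative to the trigger. */
DAQmxSetStartTrigDelayUnits(ao, DAQmx_Val_Seconds);
DAQmxSetStartTrigDelay(ao, 0.0);

DAQmxStartTask(ao);   /* armed, waiting for the trigger */
DAQmxStartTask(ai);   /* emits ai/StartTrigger; both tasks start */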

 

I hope that helps explain things.  If I've misunderstood what you're attempting to do, please let me know.

Dan

 

Message 13 of 16
Hi Dan

Let me start by explaining the timing signals that I think you need to be concerned about, and how these will affect what you are attempting to do.  Here are the relevant signals:[...]


What you said is exactly the picture I had in mind; I simply didn't think about it long enough 😞 Of course, as you say, if AO doesn't change during the acquisition of the 3 AIs, there's no way to see any skew. That was a silly mistake of mine. And of course things work fine when I'm using an external source (the sound card), because that signal can change between the acquisition of two AIs.

So the fact that I'm seeing two different "phases" (i.e. some inputs delayed by 1 sample) in my silly experiment is only due to the DAC being updated in between two AIs, and a proper sync will solve the problem, right?

 


 I hope that helps explain things.  If I've misunderstood what you're attempting to do, please let me know.


I'll try with the sync and let you know.

Thank you once again for your patience

Have a nice weekend

 

Giovanni

Message 14 of 16

Giovanni,

 

With the setup you have described, proper synchronization should indeed prevent the DAC from updating during the acquisition of your 3 AI channels. The one caveat that has bitten me when I've tried this is that the DAC does not update instantaneously. So, if you don't delay the sampling of your first AI channel a bit after you update AO, you can catch the DAC in the middle of its update. This can make it appear as though your first channel is one sample (or a partial sample) behind. If you synchronize your AI and AO sample clocks, then I would recommend using DAQmxSetDelayFromSampClkDelay to push the convert clock out by the specified delay after the sample clock occurs. If you do all of this, I would expect that you will no longer see the 'phase' issue you observed previously.
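
As a sketch, assuming the AO task is slaved to the AI sample clock, that delay could be applied as below. The 2 us figure is only a placeholder; check the 6251's DAC settling specifications for a real number:

/* AO updates on AI's sample clock... */
DAQmxCfgSampClkTiming(ao, "/Dev1/ai/SampleClock", 100000.0,
                      DAQmx_Val_Rising, DAQmx_Val_ContSamps, 1000);

/* ...and AI's convert clock is pushed out past the DAC update. */
DAQmxSetDelayFromSampClkDelayUnits(ai, DAQmx_Val_Seconds);
DAQmxSetDelayFromSampClkDelay(ai, 2.0e-6);  /* placeholder settling time */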

 

Good luck with your application!

Dan

 

 

Message 15 of 16
Solution
Accepted by topic author Pasu
Hello Dan,
Just to let you know that I 'almost' solved my problem by using a hardware trigger (MATLAB kindly lets me use the PFI0 line in a very straightforward way for this purpose). Now I get consistent results and can avoid clumsy zero padding/cropping to obtain my acquisition.
I said 'almost' because I still get the first of the three channels '1 sample behind', but this is probably because the DAC takes too long to settle/update, as you were pointing out in your last post.
Anyhow, in my application I'm not interested in the output's phase (which is delayed further anyway as it travels through the actual measurement equipment), so I'm happy with this solution.
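
(For readers using the C API rather than MATLAB, the equivalent external-trigger fix might look like this sketch; "/Dev1/PFI0" is the standard terminal name, with both tasks otherwise configured as discussed above:)

/* Both tasks wait for the same external edge on PFI0. */
DAQmxCfgDigEdgeStartTrig(ai, "/Dev1/PFI0", DAQmx_Val_Rising);
DAQmxCfgDigEdgeStartTrig(ao, "/Dev1/PFI0", DAQmx_Val_Rising);

DAQmxStartTask(ao);
DAQmxStartTask(ai);   /* nothing runs until the PFI0 edge arrives */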
Thank you very much once again, to you and all the people who joined this discussion.
Take care
 Giovanni
Message 16 of 16