06-29-2008 10:28 PM
07-01-2008 09:41 AM
Hi Gustep,
Just to clear things up on how the convert clock is chosen, please refer to the KnowledgeBase article found here. By default, the convert clock rate is based on the device's maximum conversion rate with an extra 10 µs of delay added between channel conversions to allow for settling. Above a certain sample rate there is no longer room for that extra delay, and the convert clock rate is then simply the sample clock rate multiplied by the number of channels being scanned.

I think the biggest point of confusion about the delay between the sample clock and the convert clock is that the delay is three ticks of the master timebase, not three ticks of the convert clock. The timebase runs at a much higher frequency than the convert clock, so a three-tick delay is only a small fraction of a convert clock period and does not noticeably affect the convert clock timing. A rough numeric sketch of the selection rule is below.

Hope this helps,
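To make that selection rule concrete, here is a small Python sketch of how the default convert clock rate could be worked out. The function name, the `max_convert_rate_hz` parameter, and the example numbers are my own illustrations based on the description above; this is not NI-DAQmx code.

```python
# Sketch of the convert-clock selection rule described above (an illustration,
# not the NI-DAQmx implementation): prefer 10 us of extra settling time
# between channel conversions; if the requested scan is too fast to leave
# room for that delay, fall back to sample_rate * number_of_channels.

SETTLING_DELAY_S = 10e-6  # extra inter-channel settling time


def default_convert_rate(sample_rate_hz, num_channels, max_convert_rate_hz):
    """Return an approximate AI convert clock rate in Hz."""
    # Preferred convert period: one conversion at the device's maximum
    # conversion rate plus 10 us of settling time.
    preferred_period = 1.0 / max_convert_rate_hz + SETTLING_DELAY_S

    # Time available per channel within one sample clock period.
    available_period = 1.0 / (sample_rate_hz * num_channels)

    if available_period >= preferred_period:
        # Enough room for the extra delay between channels.
        return 1.0 / preferred_period
    # Not enough room: convert exactly as fast as the scan requires.
    return sample_rate_hz * num_channels


if __name__ == "__main__":
    # Example: 4 channels at 10 kS/s on a device with a 250 kS/s max rate.
    print(default_convert_rate(10_000, 4, 250_000))
```

In this hypothetical example the 10 µs delay still fits, so the convert clock comes out around 71 kHz rather than the 40 kHz minimum the scan would require.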
07-02-2008 06:13 PM - edited 07-02-2008 06:19 PM