01-25-2010 01:32 PM
The 5133 manual indicates that I can provide an external sample clock through the PFI line.
However, in 'normal' operation it appears that the 5133 derives its sample clock by dividing down a 100 MHz reference.
So does the manual really mean an external *reference*, or really an external *sample clock*?
For example, if I wanted a 10 MHz sample rate, should I provide a 100 MHz external reference and let LabVIEW do the divide-by-N to get the 10 MHz sample rate (by making the appropriate settings with the NI-SCOPE VIs or in SignalExpress)...
OR....
Should I provide a 10 MHz clock, and LabVIEW somehow recognizes this, ignores all the other settings/VIs, and simply uses what it sees on the PFI line as the sample clock?
How exactly does this work?
---
Brandon
01-26-2010 05:24 PM
Hi Brandon,
It can best be thought of as an external sample clock. That is the terminology used in the specifications document (page 6) and in the High-Speed Digitizers Help (Devices » NI 5132/5133 Overview » Clocking).

It is admittedly confusing that the external sample clock can also be decimated down, but in this case either option you presented would work. If you simply want a 10 MHz sample clock, it is probably easiest to provide it to the 5133 and use it directly, without any decimation. If you do not have access to that clock but do have a faster one, you can provide the faster clock and use the decimation feature. Either method is up to you; the decimation option simply exists for users who want to sample slower but do not have a slower clock.

Hope this helps,
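The divide-by-N relationship described above can be sketched with a little arithmetic. This is a hypothetical helper for illustration only, not an NI-SCOPE API call; the actual decimation is configured through the NI-SCOPE driver or SignalExpress.

```python
# Sketch of the divide-by-N clocking arithmetic described in the answer above.
# Plain arithmetic for illustration; not part of the NI-SCOPE API.

def decimation_factor(ext_clock_hz: float, sample_rate_hz: float) -> int:
    """Return the integer divide-by-N needed to derive sample_rate_hz
    from an external clock; raise if no integer divisor exists."""
    n = ext_clock_hz / sample_rate_hz
    if n < 1 or abs(n - round(n)) > 1e-9:
        raise ValueError(
            f"{sample_rate_hz} Hz is not an integer division of {ext_clock_hz} Hz"
        )
    return round(n)

# Option 1 from the question: supply 10 MHz directly, no decimation (N = 1).
assert decimation_factor(10e6, 10e6) == 1

# Option 2: supply a 100 MHz clock and let the driver divide by N = 10.
assert decimation_factor(100e6, 10e6) == 10
```

Either way the digitizer ends up sampling at 10 MHz; the factor N just determines how much the external clock is divided down.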