Hi Chris,
There are two rates associated with the ADCs and the DACs: the actual ADC/DAC sample rate and the ADC/DAC IQ rate. The ADCs sample at 100 MSps by default, and the DACs sample at 200 MSps by default. The IQ rate is determined by the amount of interpolation or decimation used.
In your example, dividing the decimation factor by 100 MHz (the actual ADC sampling rate) gave you the dt of the ADC IQ rate. Flipping the division, 100 MHz divided by the decimation factor gives the IQ rate of the data. The math is similar for the DACs.
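To make the arithmetic concrete, here is a quick Python sketch (the 100 MHz value is the default from above; the decimation factor of 50 is just a made-up example):

    # dt is decimation / sample rate; IQ rate is sample rate / decimation
    adc_rate = 100e6                      # default ADC sample rate, 100 MSps
    decimation = 50                       # example decimation factor
    adc_iq_rate = adc_rate / decimation   # 2 MSps IQ rate
    adc_iq_dt = decimation / adc_rate     # 500 ns between IQ samples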
On the host, the decimation and interpolation factors are configured when you configure the input (ni5640R ADC Configure DDC.vi) or the output (ni5640R DAC Configure for Quadrature Mode.vi). One difference between the ADC and the DAC is that there is a fixed 4X interpolation on the DAC, so when an interpolation factor is configured, the math must take the 4X factor into account as well.
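Assuming the fixed 4X simply multiplies the interpolation factor you configure, as described above, the DAC math would look like this in the same sketch (the factor of 25 is again just an example):

    # Total interpolation includes the fixed 4X stage on the DAC
    dac_rate = 200e6                      # default DAC sample rate, 200 MSps
    interpolation = 25                    # example configured interpolation factor
    total_interp = 4 * interpolation      # fixed 4X always applies
    dac_iq_rate = dac_rate / total_interp # 2 MSps IQ rate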
The 100 MHz for the ADC and the 200 MHz for the DAC are configured in ni5640R Configure Timebase.vi. Each is specified as a divisor of the 200 MHz board clock: the DAC clock is divided by 1, and the ADC clock is divided by 2. By default, the RTSI clock is divided by 16.
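So everything derives from the one board clock; in the same sketch:

    # Default divisors of the 200 MHz board clock
    board_clock = 200e6
    dac_clock = board_clock / 1     # 200 MHz
    adc_clock = board_clock / 2     # 100 MHz
    rtsi_clock = board_clock / 16   # 12.5 MHz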
The frequencies you mention in the Project Explorer tell the Xilinx compiler how to compile the code. They are the maximum frequencies at which each clock domain will run, and the Xilinx software ensures that every part within a clock domain meets its timing constraints. If you want to run a section of code in a single-cycle timed loop (SCTL) faster than one of those settings, you should increase the setting in the Project Explorer as well. On the other hand, if you write code so that the RTSI clock will actually run at 12.5 MHz, but the Xilinx compiler tells you that you are not meeting timing (the default setting is 50 MHz, I think), you can try lowering the number to 20 MHz, which gives the parts more time to propagate their signals on the FPGA.
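The reason lowering the constraint helps is that the timing budget per cycle is just the clock period. A quick sketch of the arithmetic:

    # Propagation time available per cycle = 1 / constraint frequency
    for f_mhz in (50, 20, 12.5):
        period_ns = 1e3 / f_mhz    # MHz -> ns per cycle
        print(f"{f_mhz} MHz constraint -> {period_ns:g} ns per cycle")

At a 50 MHz constraint the logic has 20 ns to settle each cycle; at 20 MHz it has 50 ns.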
Jerry