Unfortunately I'm on an assignment away from any LabVIEW terminal and can't look at the code. Based on the screenshot, it appears you're using the so-called 'Method 2' where you count cycles of your test signal during a known time interval.
The first thing I'd normally investigate is quantization error, though I have my doubts in this particular case. First, you describe a consistent bias in your result, whereas quantization error typically dithers between the two values surrounding the correct one. Also, the size of the error (about 1 part in 39000) doesn't really correspond to a typical measurement duration. Nonetheless, I'll pose a couple of questions and we'll see if we can get anywhere...
What is the duration of the time interval you use to gate the cycle counting? (It's likely equal to the pulse width of the gating pulse. To explain a 27 Hz error on a 1.05 MHz signal, quantization alone would require a pulse width of about 0.037 sec; see the quick arithmetic sketch below.) How does the measurement error vary when you change this interval?
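Here's a back-of-the-envelope check of that quantization math, written out in Python just so the numbers are explicit. The figures come straight from your post; nothing here is measured:

```python
# Quantization check for 'Method 2' (count signal cycles during a known gate).
# Assumes a +/-1 count uncertainty over the gating interval, so the frequency
# resolution is roughly 1 count / gate_time.

signal_freq = 1.05e6   # Hz, the test signal from the post
observed_error = 27.0  # Hz, the consistent bias being reported

# Invert the resolution relation to find the gate time at which
# quantization alone could explain the observed error.
implied_gate_time = 1.0 / observed_error
print(f"Implied gate time: {implied_gate_time:.4f} s")  # ~0.0370 s

# Relative size of the error, for comparison with timebase specs later.
relative_error = observed_error / signal_freq
print(f"About 1 part in {1.0 / relative_error:,.0f}")   # ~1 in 38,889
```

If your actual gate time is nowhere near 0.037 sec, that's further evidence the bias isn't quantization.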
Have you tried the referenced 'Method 3' yet? What dividers have you tried, and how does the error vary when you change the divider?
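If it helps to sweep dividers programmatically instead of rewiring the LabVIEW diagram each time, here's a minimal sketch using the nidaqmx Python API (the text-based cousin of the DAQmx VIs). The counter name "Dev1/ctr0", the expected frequency range, and the divisor list are all assumptions you'd swap for your own setup:

```python
# Sketch: sweep 'Method 3' (large range / 2 counters) dividers and watch
# how the measured frequency moves. Placeholder names, not your config.
import nidaqmx
from nidaqmx.constants import CounterFrequencyMethod, Edge, FrequencyUnits

def measure_freq_method3(divisor: int, samples: int = 10) -> float:
    """Average several large-range 2-counter readings for one divisor."""
    with nidaqmx.Task() as task:
        task.ci_channels.add_ci_freq_chan(
            "Dev1/ctr0",                       # counter to use (assumed name)
            min_val=1.0e6, max_val=1.1e6,      # expected range around 1.05 MHz
            units=FrequencyUnits.HZ,
            edge=Edge.RISING,
            meas_method=CounterFrequencyMethod.LARGE_RANGE_2_COUNTERS,
            divisor=divisor,                   # the divider under test
        )
        readings = [task.read() for _ in range(samples)]
    return sum(readings) / len(readings)

for div in (4, 10, 100, 1000):
    print(f"divisor={div:5d}: {measure_freq_method3(div):.1f} Hz")
```

If the bias tracks the divider, that points at the measurement method; if it stays put no matter what, suspect a timebase instead.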
Finally, is there a chance that the 6608 actually is more accurate than your other devices?
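For what that last question is worth, a 27 Hz bias on 1.05 MHz works out to roughly 26 ppm, which sits comfortably inside the tolerance of an ordinary crystal timebase. (The ±50 ppm figure below is a typical spec I'm assuming for illustration, not a number from any datasheet here.)

```python
# Quick check: is the discrepancy within ordinary timebase tolerances?
discrepancy_ppm = 27.0 / 1.05e6 * 1e6
print(f"Discrepancy: {discrepancy_ppm:.1f} ppm")  # ~25.7 ppm

typical_xtal_tolerance_ppm = 50.0  # assumed garden-variety crystal spec
print("Within a typical crystal timebase spec:",
      discrepancy_ppm < typical_xtal_tolerance_ppm)  # True
```

In other words, a slightly off timebase on the *other* device could account for the entire discrepancy by itself.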