I know I can select one of several "input ranges" in LabVIEW for my M-series DAQ card, such as +/-10 V. However, is this the allowable input range for the ADC, or the allowable input range for the PGIA?
To clarify, let me describe the following scenario: say I have a 100 Hz, 10 Vpp sine wave on AI1 and a 100 Hz, 9 Vpp sine wave on AI9 of my PCI-6281 card. Both tones are in phase.
This card can measure at most eight channels in differential mode (AI0 through AI7), or sixteen channels in single-ended mode (AI0 through AI15). If I select differential mode for this M-series card, then AI1 is measured differentially against AI9. In that case, the PGIA would presumably subtract the two signals and pass only the equivalent of a 100 Hz, 1 Vpp tone to the ADC.
So to get the best ADC resolution, could I select a +/-1 V range in LabVIEW? After all, that is the largest signal I expect the ADC to see. Or is there some requirement to select the full +/-10 V range in this case, even though the ADC would never see anything that large? At 18 bits, the +/-10 V range spreads 2^18 codes over 20 V, about 76 uV per code, whereas +/-1 V would give about 7.6 uV per code, so a large chunk of the resolution would otherwise remain unused.
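For concreteness, here is a minimal sketch of the configuration I am asking about, written against the NI-DAQmx C API rather than as a LabVIEW diagram (assuming the card shows up as "Dev1"; error checking omitted for brevity):

```c
/* Sketch only: does setting min/max to +/-1 V here size the range for
   the PGIA's *output* (the 1 Vpp difference signal), or must it cover
   the +/-10 V that each input carries with respect to ground? */
#include <NIDAQmx.h>
#include <stdio.h>

int main(void)
{
    TaskHandle task = 0;
    float64 data[1000];
    int32 read = 0;

    DAQmxCreateTask("", &task);

    /* In differential mode on an M-series card, AI1 is paired with AI9.
       The range requested here is +/-1 V, the largest signal I expect
       the ADC to see after the PGIA subtracts the two inputs. */
    DAQmxCreateAIVoltageChan(task, "Dev1/ai1", "",
                             DAQmx_Val_Diff, -1.0, 1.0,
                             DAQmx_Val_Volts, NULL);

    /* Sample well above the 100 Hz tone: 10 kS/s, 1000 finite samples. */
    DAQmxCfgSampClkTiming(task, "", 10000.0, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, 1000);

    DAQmxStartTask(task);
    DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                       data, 1000, &read, NULL);
    DAQmxStopTask(task);
    DAQmxClearTask(task);

    printf("Read %d samples\n", (int)read);
    return 0;
}
```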
Please advise.