01-30-2018 03:14 AM
Hello all,
I would like to use the built-in amplifier of a DAQ device. I am currently using the NI 9206, for example, but I may change it in the future.
What I would like to know is how to read the possible gains from the datasheets, so that I can select my hardware correctly.
I've read that by defining the min/max input range, the gain is selected automatically.
What I am not clear about is how to know the gain, and how I can confirm the signal was really amplified.
https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z000000P8XxSAK
The above article has a very specific table of exactly what I need to know, but does it apply to all DAQ devices?
For example, the datasheet of the NI 9206 that I have says the input ranges are ±200 mV, ±1 V, ±5 V, and ±10 V.
So, if I select the ±200 mV range (by setting the min/max inputs), is my gain automatically 25 (from the table in the link above)?
Is there any way to confirm it?
Any additional info and tips on this topic are highly appreciated as well!
Thanks in advance,
Vlad
01-30-2018 07:02 AM
Well, the 'gain' or input range can be set with property nodes, or by defining the needed (expected) range; the driver is smart enough to choose the best matching one.
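To illustrate the "best matching range" idea, here is a minimal sketch of the kind of coercion a DAQmx-style driver might perform. This is NOT the actual NI driver logic; the range list is just the NI 9206 values quoted in the question.

```python
# Hypothetical sketch of range coercion: given a requested min/max, pick the
# smallest device range that still covers it. Not actual NI driver code.

NI_9206_RANGES = [0.2, 1.0, 5.0, 10.0]  # bipolar input ranges in volts (+/-)

def coerce_range(v_min, v_max, ranges=NI_9206_RANGES):
    """Return the smallest bipolar range (+/-r volts) covering [v_min, v_max]."""
    needed = max(abs(v_min), abs(v_max))
    for r in sorted(ranges):
        if r >= needed:
            return r
    raise ValueError(f"Requested +/-{needed} V exceeds the device maximum")

# Requesting -0.15 V .. +0.15 V lands in the +/-0.2 V range:
print(coerce_range(-0.15, 0.15))  # -> 0.2
```

In a real program you would just pass the expected min/max to the channel-creation call and let the driver do this coercion for you; you can then query the channel's actual range properties to see what it chose.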
Carefully reading the specification usually reveals the possible ranges of a device.
Whether the signal is amplified or attenuated isn't obvious to the customer (without the schematics).
Usually the internal ADC runs over a constant range defined by a reference voltage; sometimes this voltage can be changed. Assume a 1.25 V to 5 V reference voltage (and range) for common DAQs. High-end (6.5-digit and better) and/or older gear usually works with 7 V to 10 V references; it's the art of designing a stable, low-noise reference that defines the accuracy you can get.
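Following that reasoning, if we assume the ADC always digitizes a fixed ±5 V span (an assumption; check your device's specifications), the programmable-gain amplifier has to scale each input range up or down to that span, which is where the gain values in the linked table come from:

```python
# Sketch under an assumed fixed +/-5 V ADC span: gain = ADC span / input range.
# The +/-0.2 V range then needs a gain of 5 / 0.2 = 25.

ADC_SPAN = 5.0  # assumed ADC full-scale reference in volts

for input_range in (0.2, 1.0, 5.0, 10.0):  # NI 9206 bipolar ranges
    gain = ADC_SPAN / input_range
    print(f"+/-{input_range:g} V range -> gain {gain:g}")
```

Note that the ±10 V range gives a "gain" of 0.5, i.e. attenuation, which is why the previous paragraph says it isn't always amplification.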