How does changing a DAQ board from a 0-10 volt range to a -10 to 10 volt range affect the minimum voltage change that can be detected?
A There will be no effect on the minimum voltage change that can be detected.
B The minimum voltage change that can be detected will be split in half.
C The minimum voltage change that can be detected will vary inversely with the resolution of the board.
D The minimum voltage change that can be detected will be doubled.
I think D???
From what I just read in the 3-hour intro lesson: the board's resolution is a number of bits n, and those bits give 2^n discrete levels spread across the input range (0-10 V or -10 to 10 V). So if you widen the range from 0-10 V to -10 to 10 V, the same 2^n levels now have to cover twice the span, which means each step (the minimum voltage change that can be detected) doubles. Am I right?
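As a quick sanity check of that reasoning, here is a short Python sketch computing the ideal step size (LSB = span / 2^n) for both ranges. The 12-bit depth is a made-up example value, not from the question; substitute your board's actual resolution.

```python
# Step size (LSB) of an ideal ADC is span / 2**bits.
bits = 12           # hypothetical resolution, just for illustration
levels = 2 ** bits  # number of discrete codes the board can output

for v_min, v_max in [(0.0, 10.0), (-10.0, 10.0)]:
    span = v_max - v_min
    lsb = span / levels  # smallest detectable voltage change
    print(f"{v_min:+.0f} to {v_max:+.0f} V: LSB = {lsb * 1000:.3f} mV")
```

Doubling the span from 10 V to 20 V doubles the LSB (about 2.44 mV to about 4.88 mV for 12 bits), which matches answer D.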