OK, so here I am again. I am almost done with this application. I measure a helium concentration, which usually varies from approximately 1 x 10^-6 to as much as 1 x 10^-3. I need to create an analog output that varies from 0 to 5 Vdc based on this concentration. The problem is that I don't know the exact minimum or maximum that I might ever read. So I have two questions. First, how do I determine a value to use for the min/max to ensure the best resolution possible? Second, how do I implement the scaling (0 volts = 1 x 10^-6 and 5 volts = 1 x 10^-3)? I know this is not a complex thing, but I'm burned out after 18 straight days of programming and debugging this application. Any help is ALWAYS greatly appreciated.
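To make the question concrete, here's roughly what I'm picturing, written as a minimal sketch in C (my actual application isn't in C, and the CONC_MIN/CONC_MAX limits are just placeholders for whatever min/max I end up choosing). It shows a straight linear mapping of concentration to volts, plus a log10 variant in case that gives better resolution over the three decades. Is this the right general idea?

```c
#include <stdio.h>
#include <math.h>

/* Assumed full-scale limits -- placeholders, not final values. */
#define CONC_MIN 1.0e-6   /* maps to 0 Vdc */
#define CONC_MAX 1.0e-3   /* maps to 5 Vdc */
#define V_MIN    0.0
#define V_MAX    5.0

/* Linear scaling: output voltage proportional to concentration. */
double scale_linear(double conc)
{
    double v = V_MIN + (V_MAX - V_MIN) * (conc - CONC_MIN) / (CONC_MAX - CONC_MIN);
    if (v < V_MIN) v = V_MIN;   /* clamp readings outside the chosen range */
    if (v > V_MAX) v = V_MAX;
    return v;
}

/* Log scaling: voltage proportional to log10(concentration), so each
   decade (1e-6..1e-5..1e-4..1e-3) gets an equal share of the 0-5 V span. */
double scale_log(double conc)
{
    double v = V_MIN + (V_MAX - V_MIN) *
               (log10(conc) - log10(CONC_MIN)) / (log10(CONC_MAX) - log10(CONC_MIN));
    if (v < V_MIN) v = V_MIN;
    if (v > V_MAX) v = V_MAX;
    return v;
}

int main(void)
{
    /* A few sample readings to compare the two mappings. */
    double readings[] = { 1.0e-6, 5.0e-5, 1.0e-4, 1.0e-3 };
    for (int i = 0; i < 4; i++)
        printf("conc = %.1e   linear = %.3f V   log = %.3f V\n",
               readings[i], scale_linear(readings[i]), scale_log(readings[i]));
    return 0;
}
```

With the linear mapping, almost everything below 1 x 10^-4 ends up crammed into the bottom half volt, which is why I'm wondering whether something like the log version is what people normally do here.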