02-09-2011 04:09 PM
Hi,
I’m generating a voltage output (0-10 V) to simulate a temperature range of 0-100 °C. The external third-party controller normally accepts a 0-10 V signal and displays a temperature reading proportional to its voltage input, i.e. 1 V should equal 10 °C. When I inject 1 V into the controller from the DAQ, it reads near enough 10 °C (just slightly over). However, as I increase the voltage, the error grows with each ten degrees, almost as if it needs a gain adjustment.
I noticed that the external controller does have a gain pot and an offset pot just before its own ADC. My question is: should I be trying to use those to trim the signal, or could I create an artificial gain and offset in LabVIEW to compensate?
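If you go the software route, the compensation is just a two-point linear fit: note what the controller displays at a low and a high commanded voltage, solve for the gain and offset of its input stage, and pre-distort the voltage you ask the DAQ to output. The sketch below shows the arithmetic in Python purely for illustration (the readings used are hypothetical); in LabVIEW it would be the same handful of scale-and-offset operations on the block diagram.

```python
def fit_gain_offset(v_low, read_low, v_high, read_high):
    """Fit reading = gain * volts + offset from two calibration points.
    Readings are in deg C; volts are the commanded DAQ output."""
    gain = (read_high - read_low) / (v_high - v_low)  # deg C per volt
    offset = read_low - gain * v_low                  # deg C at 0 V
    return gain, offset

def corrected_volts(target_degc, gain, offset):
    """Voltage to command so the controller displays target_degc."""
    return (target_degc - offset) / gain

# Hypothetical calibration points: controller shows 10.2 deg C at 1 V
# and 103.5 deg C at 10 V (made-up numbers for illustration only).
gain, offset = fit_gain_offset(1.0, 10.2, 10.0, 103.5)
print(corrected_volts(50.0, gain, offset))  # volts to output for a 50 deg C display
```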
Many thanks
02-09-2011 06:10 PM
Do you measure the correct voltage at the output of your DAQ, before the third-party device?
Alan
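One quick way to check this is to step the DAQ analogue output through known voltages and verify each one with a DMM at the controller's input terminals. A minimal Python/nidaqmx sketch of that check is below; the original setup is LabVIEW, and the channel name "Dev1/ao0" is only a placeholder for whatever your hardware uses.

```python
import time
import nidaqmx

# Step the analogue output from 0 V to 10 V in 1 V increments so each
# point can be compared against a DMM reading at the controller input.
with nidaqmx.Task() as task:
    task.ao_channels.add_ao_voltage_chan("Dev1/ao0", min_val=0.0, max_val=10.0)
    for volts in range(0, 11):
        task.write(float(volts))
        print(f"Commanded {volts} V - record the DMM reading now")
        time.sleep(5)          # time to note the meter reading
    task.write(0.0)            # leave the output at a safe level
```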