LabVIEW


voltage output gain

Hi,

 

I’m generating a voltage output (0-10 V) to simulate a temperature range of 0-100 °C. The external third-party controller accepts a 0-10 V signal and displays a temperature reading proportional to its voltage input, i.e. 1 V should equal 10 degrees. When I inject 1 V into the controller from the DAQ, it reads close enough to 10 degrees (just slightly over); however, as I increase the voltage, the error grows at each ten degrees, almost as if it needs a gain adjustment.

I noticed that the external controller has a gain pot and an offset pot just before its own ADC. My question is: should I be trying to use those to trim the signal, or could I create an artificial gain and offset in LabVIEW to compensate?
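If the relationship between commanded voltage and displayed temperature really is linear, a software-side correction amounts to fitting a gain and offset from a few calibration points and then inverting that fit before writing to the DAQ. Here is a minimal sketch of that idea in Python rather than LabVIEW; the calibration numbers and function names are illustrative assumptions, not values from this thread:

```python
# Hypothetical calibration data: (DAQ output in volts, temperature the
# controller displayed). These numbers are made up for illustration.
points = [(1.0, 10.2), (5.0, 51.5), (9.0, 92.8)]

def fit_linear(pts):
    """Least-squares fit of displayed_temp = gain * volts + offset."""
    n = len(pts)
    sx = sum(v for v, _ in pts)
    sy = sum(t for _, t in pts)
    sxx = sum(v * v for v, _ in pts)
    sxy = sum(v * t for v, t in pts)
    gain = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    offset = (sy - gain * sx) / n
    return gain, offset

def volts_for_temp(temp_c, gain, offset):
    """Invert the fit: the voltage to command so the display shows temp_c."""
    return (temp_c - offset) / gain

gain, offset = fit_linear(points)
corrected_v = volts_for_temp(50.0, gain, offset)  # volts to write to the DAQ
```

The same two-coefficient fit can be built in LabVIEW (e.g. with the Linear Fit VI) and applied to each setpoint before the analog write. The trade-off versus the pots: a software correction only fixes the behaviour for this DAQ channel, whereas trimming the controller's own gain and offset corrects it for any signal source.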

Many thanks

Message 1 of 2

Do you measure the correct voltage at the output of your DAQ before the third-party device?

 

Alan

Message 2 of 2