In a recent test on an aluminum tie rod, we used strain gauges that were better suited to steel (we knew this at the time), arranged in a full-bridge configuration and coated in place with Dimax. The tie rod was installed on an engine, the gauges were balanced, and the engine was run up to speed. We allowed the tie rod to soak to the temperature at which it would be run, rebalanced the gauges, and then took our measurements. However, when we looked at the results, we had to perform a temperature calibration and correct the results accordingly, because the temperature effects on the measurements were very large. Is this normal?
We are curious because, although the gauges were not ideal for the material, we expected the errors to be only minimal. The Dimax we used to coat the gauges has a temperature rating twice that of the application we put it in.
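For context on the scale of the effect we are asking about, here is a rough back-of-the-envelope sketch of the apparent strain you would expect from the gauge/material mismatch. The coefficients and temperature change below are typical handbook values and assumptions, not figures from our actual test.

```python
# Rough estimate of thermally induced apparent strain when a
# steel-compensated gauge is bonded to aluminum.
# All numbers here are assumed, typical values for illustration only.

CTE_ALUMINUM = 23e-6     # /degC, typical for aluminum alloys (assumed)
CTE_STEEL_STC = 11e-6    # /degC, self-temperature compensation of a steel-matched gauge (assumed)
DELTA_T = 60.0           # degC change relative to the balance temperature (assumed)

# Apparent strain is driven by the mismatch between the specimen's
# thermal expansion and the expansion the gauge is compensated for.
apparent_strain_per_degC = CTE_ALUMINUM - CTE_STEEL_STC
apparent_strain = apparent_strain_per_degC * DELTA_T

print(f"Apparent strain: about {apparent_strain_per_degC * 1e6:.0f} microstrain per degC, "
      f"or {apparent_strain * 1e6:.0f} microstrain over {DELTA_T:.0f} degC")
```

With these assumed numbers the mismatch works out to roughly 12 microstrain per degree Celsius, so even modest temperature drift after rebalancing could produce an error comparable to the mechanical strain we were trying to measure, which may be why the correction came out so large.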
I would be grateful for any help on this matter, or to hear from anyone who has experienced similar issues.
Thanks