02-18-2014 01:29 PM
I have a DUT with a specified accuracy, and I want to make a measurement to verify that the device meets that specification.
So I will input a signal from a calibrated source to the device.
The question is: what is the minimum accuracy specification required for the calibrated source? I assume the minimum accuracy would be expressed in terms of the DUT's specified accuracy. This matters because analog output boards vary in accuracy, and I need to choose a board with the appropriate accuracy for my test.
Someone once told me there is a Six Sigma Black Belt rule of thumb that the source or measurement device should be at least 3 times more accurate than the input being driven or the output being measured, but I have not been able to confirm this.
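To make the ratio concrete, here is a rough sketch of the arithmetic I have in mind, assuming that 3:1 rule of thumb and a made-up DUT spec of ±1.0 mV (the numbers are only for illustration):

```python
# Rough sketch: required source accuracy under an assumed 3:1 ratio.
# The DUT spec value below is hypothetical, not a real device spec.

def required_source_accuracy(dut_accuracy, ratio=3.0):
    """Source must be 'ratio' times more accurate than the DUT,
    i.e. its error budget is at most dut_accuracy / ratio."""
    return dut_accuracy / ratio

dut_spec_mv = 1.0  # hypothetical DUT accuracy spec (+/- mV)
print(required_source_accuracy(dut_spec_mv))  # -> ~0.33 mV or better
```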
Any help would be much appreciated.
Thanks.
02-18-2014 02:43 PM