05-10-2016 08:03 AM
Hello All,
We use a PXIe-6556 (HSDIO). We are trying to measure the leakage current of a device at a particular pin by applying 3.3V and measuring the current drawn by the pin.
Our observation is that the current measurement accuracy is worse than 10%.
On replacing the DUT pin with a resistor of known value (with its other end connected to a different channel configured to sink the current), we observe that the accuracy improves only if the sinking channel forces 1V instead of 0V. The measured values are in the attached table.
We also came across the Quadrant Behaviour graph in the 6556 specification sheet (attached). Could someone please explain this graph? We believe the improvement in accuracy with the sinking channel voltage is due to this quadrant behaviour; please help.
Thanks.
05-11-2016 06:20 PM - edited 05-11-2016 06:21 PM
1) To explain the Quadrant Behavior graph:
The Quadrant Behavior graph shows the conditions under which the PXIe-6555/6556 will meet its published specifications when performing PMU operations. Based on your description, you are performing a "Force Voltage, Measure Current" operation, and we can use the graph to see whether the device will meet its published specifications under your conditions.
First, draw a vertical line at the voltage you are forcing. You are forcing 3.3V, so draw a vertical line at 3.3V on the Quadrant Behavior graph. Now look at what portions of that line are bounded by the "Guaranteed" dotted line. At 3.3V the line is bounded by "100% of the current range" and "-100% of the current range". This means that in the 2uA current range you are using, the device will be able to force 3.3V and measure any current from -2uA to 2uA within the specifications of the device. Looking at the specifications, you should be guaranteed +/- 1% measure-current accuracy when performing this "Force Voltage, Measure Current" operation within 5 degrees Celsius of the self-calibration temperature.
2) Suggested steps for debugging your problem:
First, open NI MAX and ensure that your device is within its external calibration interval. In NI MAX, click on your device to see the date on which it was last externally calibrated, along with a "Recommended Next Calibration" date indicating when the calibration interval expires. If today's date is later than the "Recommended Next Calibration" date, then the specifications for your device are not guaranteed. The device should still be usable, and I would still expect better accuracy than what you are seeing.
Second, open NI MAX and ensure that your device is self calibrated. In NI MAX, click on your device to see the date and temperature at which it was last self calibrated, as well as the current temperature of the device. If the current temperature has deviated from the temperature at which the last self calibration was performed, run the self-calibration routine. You may have to refer to the documentation for this operation, but if you are using LabVIEW it should be as simple as opening a blank VI, opening a session to the device (niHSDIO Initialize), performing self calibration (niHSDIO Self Calibrate), and closing the session to the device (niHSDIO Close). NOTE that self calibration takes 15 minutes to complete, and during this time you will not be able to use your device. Also, disconnect all cabling from the PXIe-6556 before performing self calibration, or if this is not possible, ensure that no signals are present on the cables connected to the PXIe-6556.
After confirming that your external calibration and self calibration are up to date, and that the temperature of the device matches the self-calibration temperature, test the DUT again. You should achieve 1% measure-current accuracy, assuming self calibration completed successfully. If self calibration completed successfully and you still see problematic measurements, there may be some issue with your setup contributing to the measurement error.
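For the known-resistor sanity check you described, the expected current and the resulting error percentage are simple to compute by hand or in a short script. This is a minimal sketch with made-up numbers (not the values from your attached table): with one channel forcing V_force and the sink channel at V_sink, the current through a resistor R is I = (V_force - V_sink) / R.

```python
# Sanity-check arithmetic for the known-resistor test: compare the current
# the PMU reports against the current Ohm's law predicts.

def expected_current(v_force, v_sink, r_ohms):
    """Current through the resistor: (V_force - V_sink) / R."""
    return (v_force - v_sink) / r_ohms

def percent_error(measured, expected):
    """Absolute error of the measurement, as a percentage of expected."""
    return abs(measured - expected) / abs(expected) * 100.0

# Illustrative numbers: 3.3V forced, sink channel at 1V, 1 MOhm resistor.
i_exp = expected_current(3.3, 1.0, 1e6)       # about 2.3 uA expected
print(f"expected: {i_exp:.3e} A")             # expected: 2.300e-06 A
print(f"error: {percent_error(2.32e-6, i_exp):.2f} %")  # error: 0.87 %
```

If the error computed this way stays near 1% with the sink channel at 1V but grows much larger at 0V, that is consistent with the operating point moving relative to the guaranteed region of the Quadrant Behavior graph on the sinking channel.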
Hopefully this helps,
Brandon Carson