From my colleague:
This answer applies to most of the 12-bit E Series products (E-1, E-2, E-3, E-4, 602x). There are three main errors that affect accuracy when an external reference is used: gain, linearity, and offset. The linearity of the DAC remains fixed regardless of which reference is used. The offset also remains constant regardless of the reference, but its significance depends on the range: a +/-1 mV error on a +/-10 V range is insignificant, while the same +/-1 mV error on a +/-5 mV range is very noticeable.
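To make the offset point concrete, here is a rough Python sketch (the 1 mV offset and the two ranges are just the numbers from above, not measured values) comparing the same fixed offset error against different output ranges:

# Illustration only: a fixed +/-1 mV offset error evaluated against two ranges.
OFFSET_ERROR_V = 1e-3                 # fixed DAC offset error, in volts (assumed)

for full_scale_v in (10.0, 5e-3):     # +/-10 V range vs. +/-5 mV range
    relative = OFFSET_ERROR_V / full_scale_v
    print(f"+/-{full_scale_v} V range: offset is {relative:.2%} of full scale")

The first case prints 0.01%, the second 20.00%, which is why an offset that is invisible on the +/-10 V range dominates on the +/-5 mV range.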
The gain error, as a percentage, will also change with the external reference. When the board is calibrated, the gain error of the system is corrected; this includes errors from the DAC, the resistor networks, the buffers, and the internal reference. When an external reference is used, the signal routing is different, and a few of the buffers used with the internal reference are bypassed. Because the calibration compensated for those buffers, their absence is what contributes to the gain error. With a large external reference, the error attributable to these buffers may add up to several millivolts, which amounts to only a fraction of a percent. As the external reference voltage decreases, the relative error increases, because the denominator in (residual buffer error / external reference voltage) gets smaller.

It is possible to calibrate the gain error with an external reference, but the user would have to know exactly what reference voltage is being applied. Also, NI-DAQ does not do this; the user would have to write the routine themselves. The other drawback is that the granularity of the gain adjustment is independent of the external reference, so the board may still have a rather large residual error after this calibration. For example, suppose an 8-bit caldac adjusts the gain by 0.2 mV per caldac LSB (a made-up number for illustration); with an external reference of 10 mV, you could still be left with a 2% (0.2 mV / 10 mV) error, which may be unacceptable. The granularity of adjustment varies by board.
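As a rough illustration of both effects, here is a short Python sketch (the residual buffer error and the 0.2 mV-per-LSB caldac step are assumed numbers, in the spirit of the made-up figure above); both the uncorrected gain error and the calibration granularity scale inversely with the external reference voltage:

# Illustration only: relative gain error and caldac granularity vs. external reference.
RESIDUAL_BUFFER_ERROR_V = 2e-3   # assumed residual error from the uncalibrated buffers, volts
CALDAC_STEP_V = 0.2e-3           # assumed gain adjustment per caldac LSB, volts

for ext_ref_v in (10.0, 1.0, 10e-3):
    gain_error = RESIDUAL_BUFFER_ERROR_V / ext_ref_v
    granularity = CALDAC_STEP_V / ext_ref_v
    print(f"external reference {ext_ref_v} V: "
          f"uncorrected gain error ~{gain_error:.2%}, "
          f"adjustment granularity ~{granularity:.2%} per caldac LSB")

With the assumed numbers, a 10 V reference gives only a 0.02% uncorrected gain error, but a 10 mV reference turns the same 2 mV residual into 20%, and even a single caldac step is 2%, matching the example above.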