When you quantize your signal (using an A/D converter), you get discrete values with a resolution of 1 LSB (least significant bit), so in an ideal system you cannot get better resolution than 1 LSB. Dithering is a technique that lets you increase the effective resolution (to less than 1 LSB) by adding a small amount of noise to your signal before quantization. Think of a DC value that lies somewhere between two adjacent ADC codes: without any added noise, it will always be rounded to the same nearest code, so you get a small but systematic error. If you add noise to the DC value, the individual samples will vary, but their average will converge toward the true DC value.
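A quick sketch of that averaging effect (hypothetical numbers; the quantizer is modeled as simple rounding to whole LSBs, and the dither is assumed to be uniform noise 1 LSB wide):

```python
import numpy as np

rng = np.random.default_rng(0)
dc = 2.3  # true DC value in LSB units, between codes 2 and 3

# Without dither: every sample rounds to the same code -> systematic error.
no_dither = np.round(np.full(10_000, dc))
print(no_dither.mean())  # 2.0, a fixed 0.3 LSB error no matter how long you average

# With ~1 LSB of uniform dither added before quantization, the samples
# toggle between codes 2 and 3 in the right proportion, so the average
# converges toward the true value.
dithered = np.round(dc + rng.uniform(-0.5, 0.5, 10_000))
print(dithered.mean())  # close to 2.3
```

With rectangular dither of exactly 1 LSB width, the quantized output toggles between the two neighboring codes with probabilities proportional to the fractional part, which is why the mean is unbiased.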
So dithering does NOT reduce noise (it usually reshapes it to be more "white"). Its main purpose is to increase resolution (e.g. for DC measurements), but it also linearizes your A/D converter by removing systematic rounding errors that would otherwise show up as, e.g., harmonic distortion.
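The linearization can be seen by sweeping a DC input across several codes (again a hypothetical sketch with a rounding quantizer and uniform 1-LSB dither): without dither the averaged transfer curve is a staircase, with dither it becomes a straight line.

```python
import numpy as np

rng = np.random.default_rng(0)
inputs = np.linspace(0.0, 4.0, 41)  # DC levels in LSB units
k = 20_000                          # samples averaged per level

# No dither: the transfer curve is a staircase, with up to 0.5 LSB
# of systematic error at the midpoints between codes.
staircase = np.round(inputs)

# With dither and averaging, the mean output tracks the input linearly.
linearized = np.array([
    np.round(v + rng.uniform(-0.5, 0.5, k)).mean() for v in inputs
])

print(np.max(np.abs(staircase - inputs)))   # ~0.5 LSB worst-case error
print(np.max(np.abs(linearized - inputs)))  # small residual, set by averaging length
```

The same mechanism is what turns signal-correlated rounding error (harmonics) into an uncorrelated, broadband noise floor.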
Dithering is only needed if the noise in your measurement system and in your input signal is so low (below 1 LSB) that these systematic errors appear. Are you sure you need it at all? What do you see in your measurement results that suggests dithering is needed?