LabVIEW


How do I precisely measure mV with an NI 9205?

I am building a test program with LabVIEW 8.6 where I need to precisely measure mVdc for about 10-15 seconds, taking a reading every 10ms, and then record the data. This is "pure DC". I am using a cDAQ-9174 chassis with an NI 9205. My measurements vary by about 1mV, and I need better accuracy than that. How can I set up my DAQ to eliminate this variance?

Message 1 of 4

There are several things you can do to increase your stability and accuracy, but be aware that nothing will work if your input signal is noisy.  They are:

 

  1. Set the range of the device so it barely covers your input signal range (e.g., if your signal is ±1.8V, set the device to the ±2V range). The default range for this device depends on how you program it, but is typically about ±5V. The device itself can be set as low as ±200mV, which gives over an order of magnitude more resolution.
  2. Oversample and average (see the sketch after this list). You only need a point every 10ms (100Hz), but the device is capable of 250kHz. Sample as fast as you can and average each block down to your output rate; assuming random noise on the inputs, averaging N samples improves the effective resolution by roughly a factor of sqrt(N).
  3. As mentioned previously, make sure your grounds are connected and your signal lines are properly shielded.
  4. Use differential input mode to get rid of common-mode noise.
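
For items 1, 2, and 4, here is a minimal sketch of the same configuration written against the nidaqmx Python API (the same DAQmx driver LabVIEW sits on), since graphical code does not paste well into a forum post. The channel name cDAQ1Mod1/ai0, the ±200mV range, and the 15-second duration are assumptions; adjust them to your hardware and signal.

import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, TerminalConfiguration

OVERSAMPLE_RATE = 250_000                 # NI 9205 maximum sample rate, S/s
OUTPUT_RATE = 100                         # one averaged point every 10 ms
BLOCK = OVERSAMPLE_RATE // OUTPUT_RATE    # 2500 raw samples per output point
DURATION_S = 15                           # total acquisition time, s

with nidaqmx.Task() as task:
    # Item 4: differential input; item 1: range barely wider than the signal
    task.ai_channels.add_ai_voltage_chan(
        "cDAQ1Mod1/ai0",                             # assumed channel name
        terminal_config=TerminalConfiguration.DIFF,  # DIFFERENTIAL in older nidaqmx releases
        min_val=-0.2, max_val=0.2)                   # +/-200 mV range
    task.timing.cfg_samp_clk_timing(
        rate=OVERSAMPLE_RATE,
        sample_mode=AcquisitionType.CONTINUOUS,
        samps_per_chan=OVERSAMPLE_RATE)              # 1 s of driver buffer
    task.start()

    # Item 2: oversample at 250 kS/s, average each 10 ms block to one point
    averaged = []
    for _ in range(OUTPUT_RATE * DURATION_S):
        raw = task.read(number_of_samples_per_channel=BLOCK)
        averaged.append(float(np.mean(raw)))

np.savetxt("mv_log.csv", averaged)        # record the averaged readings

In LabVIEW 8.6 the same settings map onto the DAQmx Create Virtual Channel, DAQmx Timing, and DAQmx Read VIs, with the averaging done on each block you read.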

Let us know if we can help further. When asking for help, give us as many details as possible (signal levels, wiring diagrams, etc.).

Message 2 of 4

Hello DFGray,

I am facing the same sort of resolution problem with an NI 9205 in a CompactRIO.

 

In my application, I am using magnetic sensors that give a single-ended voltage output. My requirement is to achieve 1mV resolution with the voltage range set to ±10V.

 

I have tried all four ranges of the NI 9205, even ±200mV, but I only get a resolution of 10mV.

 

The averaging solution you gave above seems good to me; I'll try it.

But there is another solution in my mind: using a 10Hz low-pass filter. It reduces the system noise, but I need to make sure whether it is the right way to do noise rejection.

 

Moreover, for alternating signals, I need to measure from 10Hz to 3kHz. If I use a 60Hz band-stop filter for that, will it work for me?

Kind Regards

Sohaib Kianii

Message 3 of 4

The 9205 is a 16-bit device, so even the ±10V range should give you about 300µV of resolution (20V span / 2^16 ≈ 305µV; a quick check for all four ranges follows the list below). Your statement that even the ±200mV range gives you 10mV resolution leads me to believe you have a problem somewhere else, in either your circuit or your code. Two possible issues:

 

  1. The unit under test is itself quantized at about 10 mV.
  2. You are reducing the bit depth of the 9205 from 16 bits to 8 bits somewhere in your code.
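
To make the numbers concrete, here is the ideal code width (1 LSB) for each of the four 9205 input ranges, computed in plain Python (no hardware needed). The 10mV step you are seeing is far coarser than any of them, which points at the sensor output or the code rather than the module.

# Ideal LSB size of a 16-bit ADC on each NI 9205 input range
BITS = 16
for full_scale in (0.2, 1.0, 5.0, 10.0):      # +/- full-scale, volts
    span = 2 * full_scale                     # e.g. +/-10 V -> 20 V span
    lsb_uv = span / 2**BITS * 1e6
    print(f"+/-{full_scale:>4} V range: 1 LSB = {lsb_uv:6.1f} uV")

# Prints roughly 6.1 uV, 30.5 uV, 152.6 uV, and 305.2 uV respectively.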

Eliminating noise is an art and depends heavily on both your hardware and software. You should do the usual things: shield your cabling and eliminate ground loops. In addition, you can read single-ended outputs in differential (double-ended) mode, which gives you some common-mode rejection. Using a 60Hz stop-band filter to eliminate powerline noise should work, but you will lose any signal you may have at 60Hz. You can also try some sort of Kalman filtering; looking at the data you would gather for such a filter will tell you whether your noise is consistent or random. You will need to look at your noise and systematically eliminate the sources, whether hardware or software.
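
On the filtering question: a 10Hz low-pass is fine for the pure-DC case, but it would also remove the 10Hz-3kHz signal you say you need, so for that case a narrow notch at the powerline frequency is the usual approach. Below is a sketch of a software 60Hz notch applied after acquisition using SciPy, with a before/after comparison of the standard deviation so you can see how much of your noise is actually powerline pickup. The sample rate and the simulated input are placeholders; substitute the array you read from the 9205.

import numpy as np
from scipy import signal

FS = 10_000                        # sample rate of the acquired data, Hz (placeholder)

# Placeholder waveform: DC level + 2 mV of 60 Hz pickup + broadband noise.
# Replace 'data' with the samples read from the 9205.
t = np.arange(0, 1.0, 1 / FS)
data = 0.5 + 0.002 * np.sin(2 * np.pi * 60 * t) + 0.0005 * np.random.randn(t.size)

# Narrow IIR notch centred on 60 Hz; Q=30 gives roughly a 2 Hz-wide stop band
b, a = signal.iirnotch(w0=60, Q=30, fs=FS)
filtered = signal.filtfilt(b, a, data)   # zero-phase filtering, so no time shift

print(f"std before notch: {1e3 * np.std(data):.3f} mV")
print(f"std after  notch: {1e3 * np.std(filtered):.3f} mV")

If most of the noise disappears after the notch, it is consistent (powerline) rather than random, and shielding and grounding will pay off more than averaging. Keep in mind that the notch also removes any real signal content at exactly 60Hz.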

 

Message 4 of 4