
RMS with Labview, Oscilloscope and Multimeter

Hi,

 

I am trying to understand which one returns the most accurate measurement of an RMS voltage: LabVIEW, an oscilloscope, or a multimeter. I attach below three images of the measurements I get.

(Attached images: Capture.JPG, SCRN0513.PNG, Capture2.JPG)

 

In these pictures the differences are within an acceptable range, but in some actual tests, where a voltage divider is used and these values are multiplied by a ratio, that divergence cannot be disregarded. In my case, the "Cycle RMS" function comes quite close to the instruments. Are there any other options for computing the RMS in LabVIEW? Also, how can one handle Error -20308 from Cycle RMS without interrupting the run of the code?

 

Kind regards,

Michail

 

PS: I attach the code I have built so far with the great help and advice I have received from the forum members.

 

Message 1 of 4

I would go with the True RMS meter, since it is looking at the total work done. The other methods require grabbing complete cycles and doing the math: one sample off from a complete waveform and the number is off. Of course, sampling at a higher rate will reduce the amount of error due to a missed point here or there.
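Ben's point about incomplete cycles is easy to demonstrate outside LabVIEW. Here is a minimal NumPy sketch (the sample rate and frequency are made up for illustration, not taken from the original poster's setup): cutting a capture a few samples short of a whole number of periods visibly shifts the computed RMS.

```python
import numpy as np

fs = 10_000                # sample rate in Hz (assumed for illustration)
f = 50.0                   # signal frequency in Hz
t = np.arange(1000) / fs   # 1000 samples = exactly 5 cycles at 50 Hz
sine = np.sqrt(2) * np.sin(2 * np.pi * f * t)   # a 1 V RMS sine

def rms(x):
    return np.sqrt(np.mean(x ** 2))

full = rms(sine)        # record spans an exact whole number of cycles
short = rms(sine[:-7])  # the same record cut a few samples short
print(full, short)      # full is ~1.0 V; short is visibly off
```

The "short" estimate is off by a few tenths of a percent here; the shorter the record (fewer whole cycles), the larger that error gets.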

 

Just my 2 cents,

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation. LabVIEW Champion, Knight of NI, and Prepper.
Message 2 of 4

I would also feel inclined to just use the DMM if for nothing else but simplicity.  Granted, make sure you do have a True RMS DMM if you go that route.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 3 of 4

I don't think there is a straight answer to your question since it depends on your signal and what exactly it is you want to measure.

The hardware accuracy is also very critical, and your hand-held DMM may, for example, have a bandwidth limitation that will give you inaccurate results for higher-frequency signals.

 

'LabVIEW' by itself is not a source of error, since it's a programming language that allows you to do whatever maths you want once you've acquired a good signal.

 

For True RMS you should use the Basic DC-RMS VI from the waveform palette. That one will correctly calculate the True RMS value for the samples you provide. However, as mentioned by Ben, if you are measuring a periodic signal but don't hit an exact number of periods, you may introduce a small error (uncertainty); you can mitigate that by enabling, for example, a Hanning window.
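The benefit of the window can be sketched in NumPy (the sample rate and the deliberately awkward frequency are invented for illustration; this shows the general windowed-RMS idea, not the internals of the Basic DC-RMS VI):

```python
import numpy as np

fs = 10_000                       # sample rate in Hz (assumed)
f = 52.3                          # a frequency that does NOT fit the record
t = np.arange(1000) / fs          # 1000 samples = 5.23 cycles: non-integer
x = np.sqrt(2) * np.sin(2 * np.pi * f * t)   # a 1 V RMS sine

# Plain RMS suffers from the partial cycle at the end of the record
plain_rms = np.sqrt(np.mean(x ** 2))

# Weighting the squared samples with a Hanning window suppresses that error
w = np.hanning(len(x))
windowed_rms = np.sqrt(np.sum(w * x ** 2) / np.sum(w))

print(plain_rms, windowed_rms)    # the windowed estimate is much closer to 1.0
```

The Hanning weights taper the partial cycle at the record edges to near zero, which is exactly why the windowed estimate lands closer to the true 1 V RMS.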

 

(Attached screenshot: billede.png)

Note the same problem may hit your hand-held DMM if your frequency isn't 'spot on'.

 

I noticed that you also used the Extract Single Tone Information VI. That one is not computing the True RMS of your signal, but rather the amplitude of your fundamental tone. When you divide that amplitude by SQRT(2) you get a very accurate value of the RMS of your fundamental tone. That may often be more interesting than the True RMS, and it is a measurement your DMM can't make for you.
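The amplitude-of-the-fundamental idea can be sketched with an FFT in NumPy (frequencies and amplitudes below are made up; this illustrates the principle only — the Extract Single Tone Information VI uses a more refined tone estimate than a raw FFT bin):

```python
import numpy as np

fs = 10_000
t = np.arange(1000) / fs
# Fundamental of amplitude 1.0 at 50 Hz plus a small 3rd harmonic (made-up values)
x = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 150 * t)

# Single-sided amplitude spectrum; the strongest bin is the fundamental
spectrum = np.abs(np.fft.rfft(x)) * 2 / len(x)
fundamental_amplitude = spectrum[np.argmax(spectrum)]

fundamental_rms = fundamental_amplitude / np.sqrt(2)   # amplitude / sqrt(2)
true_rms = np.sqrt(np.mean(x ** 2))                    # includes the harmonic
print(fundamental_rms, true_rms)   # true RMS is slightly higher
```

The fundamental RMS ignores the harmonic entirely, while the True RMS picks it up — which is the distinction made above.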

 

The difference between the RMS of your fundamental and the True RMS value of your signal may be negligible if your signal is a single tone with relatively low DC offset, low noise and low distortion. The RMS being a 'square' function, you can say that as long as your distortion, DC offset and noise are each less than 1/1000 (0.1 %), the resulting error will be less than (1/1000)^2 = 1E-6, i.e. about 1 ppm.
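That 1 ppm figure follows from the way uncorrelated RMS components combine as a root-sum-of-squares; a quick numeric check (with an assumed single impurity at 0.1 %):

```python
import math

v_fund = 1.0     # RMS of the fundamental
v_imp = 0.001    # RMS of one impurity (noise/offset/distortion) at 0.1 %

# Uncorrelated components combine as a root-sum-of-squares
true_rms = math.sqrt(v_fund ** 2 + v_imp ** 2)

relative_error = (true_rms - v_fund) / v_fund
print(relative_error)   # about 0.5e-6, i.e. roughly half a ppm
```

The first-order expansion sqrt(1 + e) ≈ 1 + e/2 explains the factor of one half: a 0.1 % impurity contributes about half of (0.1 %)² to the relative error, comfortably under the 1 ppm bound stated above.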

 

Message 4 of 4