LabVIEW


Mean PtByPt calculations

I get it. So after the mean goes to zero, each iteration is actually doing 0 + pi - pi, so it stays zero. Good to learn something about double precision numbers.
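
To see it outside LabVIEW, I mocked up the same kind of update in Python. It assumes Mean PtByPt keeps a single running sum internally, which is only my guess from the behaviour we're seeing:

```python
import math
from collections import deque

def running_mean(samples, n):
    """Rolling mean kept as a single running sum, updated as sum + new - oldest.
    This is only my guess at what Mean PtByPt does internally."""
    window = deque()
    total = 0.0
    for x in samples:
        window.append(x)
        total = total + x                    # a small x is rounded away if total is huge
        if len(window) > n:
            total = total - window.popleft()
        yield total / len(window)

# A stream of pi values with one extreme outlier in the middle.
samples = [math.pi] * 10 + [1e18] + [math.pi] * 20

for i, m in enumerate(running_mean(samples, n=5)):
    print(i, repr(m))
# On an IEEE-754 double, once the outlier has left the window the sum is stuck
# at exactly 0.0: every later iteration computes 0 + pi - pi, which is 0 again.
```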

Message 11 of 13

Well, the reasons are clear. I'm still a bit fuzzy about why the DBL can't reclaim its lower range and higher resolution once it has been infected by a large number, but there is usually a logical explanation.
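
The effect itself is easy to isolate (Python here, but this is just ordinary DBL arithmetic):

```python
import math

# pi's low bits are rounded away at the moment it is added to 1e18
# (the spacing between doubles near 1e18 is 128), so subtracting 1e18
# back out cannot recover them. On a typical IEEE-754 machine:
print(repr((1e18 + math.pi) - 1e18))   # 0.0
print(repr(math.pi))                   # 3.141592653589793
```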

 

When it comes to the differences between the functions, there may be a reason for not having the same fix in the StDev version: the calculated StDev must relate to the mean, and there is no solution for that yet. The StDev shows similar behaviour, and fixing one output but not the other would be wrong. Posting another snippet to show this (and, as a lesson learned: please zoom in on the y-scale!).

 

MeanStDevOddity.png
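
For anyone who wants to poke at it outside LabVIEW, here is a rough Python sketch of the same idea. It assumes the StDev PtByPt VI keeps running sums of x and x² internally, which is only a guess on my part, but it shows why both outputs go bad together:

```python
import math
from collections import deque

def running_mean_stdev(samples, n):
    """Rolling mean and population StDev from running sums of x and x*x
    (only a guess at what the PtByPt VIs really do)."""
    window = deque()
    s = sq = 0.0
    for x in samples:
        window.append(x)
        s, sq = s + x, sq + x * x
        if len(window) > n:
            old = window.popleft()
            s, sq = s - old, sq - old * old
        k = len(window)
        mean = s / k
        var = max(sq / k - mean * mean, 0.0)   # clamp: rounding can push this below zero
        yield mean, math.sqrt(var)

samples = [1.0, 2.0, 3.0, 4.0, 5.0] * 6
samples[12] = 1e18                             # one corrupted sample, like a comms glitch
for i, (m, sd) in enumerate(running_mean_stdev(samples, n=5)):
    print(i, m, sd)
# Once the outlier has left the window, both running sums are stuck at 0.0 on a
# typical IEEE-754 double: the mean reads 0 (true value 3) and the StDev reads 0
# (true value about 1.41) from then on.
```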

 

As for the fix of the problem in the Mean PtByPt VI: it isn't really a solution, it just makes the problem smaller. Replace the large value with 0.5E5 in the first snippet and look at the values with 15 digits of precision. They still change, so there is no magic in the 1E5 constant.
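
Seen in isolation (Python, ordinary DBLs), the same rounding is there with 0.5E5, just much further down in the digits:

```python
import math

# Doubles near 5e4 are spaced about 7e-12 apart, so the bits of pi below that
# are lost while it shares an accumulator with 5e4. The recovered value
# typically differs from pi only around the 12th significant digit.
print(repr(math.pi))                  # 3.141592653589793
print(repr((5e4 + math.pi) - 5e4))    # agrees to roughly 12 digits, then drifts
```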

 

So what should be done? Is this a bug, or just expected behaviour given the definition of a DBL? The fact that it is (sort of) fixed in Mean PtByPt suggests a bug, but that fix might have been a hasty decision. Can anyone shed some light on how this is handled in other languages? Should we have a function that reclaims the full resolution of a DBL?
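
Partly answering my own question: as far as I know, plain DBL arithmetic behaves the same way anywhere IEEE 754 is used; what some languages add is a more careful summation on the side. Python, for example, has math.fsum for an accurate sum of a whole array, and compensated (Kahan/Neumaier) summation is the classic streaming trick for keeping the bits a big addend would otherwise wipe out. A rough sketch of the latter on a rolling mean (my own sketch, nothing from the PtByPt VIs):

```python
import math
from collections import deque

def compensated_running_mean(samples, n):
    """Rolling mean using Neumaier's compensated summation: a second variable
    collects the low-order bits that the main accumulator rounds away.
    Sketch only; it rescues this particular case, it is not a universal cure."""
    window = deque()
    total, comp = 0.0, 0.0

    def add(value):
        nonlocal total, comp
        t = total + value
        if abs(total) >= abs(value):
            comp += (total - t) + value    # low bits of value were rounded away
        else:
            comp += (value - t) + total    # low bits of total were rounded away
        total = t

    for x in samples:
        window.append(x)
        add(x)
        if len(window) > n:
            add(-window.popleft())
        yield (total + comp) / len(window)

samples = [math.pi] * 10 + [1e18] + [math.pi] * 20
print(list(compensated_running_mean(samples, 5))[-1])   # back to ~3.141592653589793
```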

 

For those interested: we were bitten by this due to (probably) a glitch in serial communication, which made extreme values appear among otherwise modest values and, from then on, infected our running mean.

 

 



CLA
www.dvel.se
Message 12 of 13

I think this problem would not occur if we just kept the newest n elements and took the average of those elements every time. However, this might be slow if the rolling window is very large.
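
In Python terms, something like this (just to show the idea, not LabVIEW code):

```python
import math
from collections import deque

def rolling_mean_recompute(samples, n):
    """Keep only the newest n samples and recompute the mean from scratch each
    time. There is no long-lived running sum to get contaminated, but every
    update costs O(n) instead of O(1)."""
    window = deque(maxlen=n)      # old samples fall out automatically
    for x in samples:
        window.append(x)
        yield sum(window) / len(window)

samples = [math.pi] * 10 + [1e18] + [math.pi] * 20
print(list(rolling_mean_recompute(samples, 5))[-1])   # ~pi, the outlier leaves no trace
```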

 

On another thought, this problem happens when we do: new element (small) - oldest element (large) + sum (large). The small new element vanishes when it is added to the large values. What if we do sum (large) - oldest (large) + new (small) instead? Would that work better?
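
A quick Python mock-up to try both update orders (my own sketch, not the actual VI):

```python
import math
from collections import deque

def rolling_mean(samples, n, order):
    """Rolling mean via one accumulator, updated in one of two orders:
    'A': total = (new - oldest) + total
    'B': total = (total - oldest) + new"""
    window = deque()
    total = 0.0
    for x in samples:
        window.append(x)
        if len(window) <= n:
            total += x
        else:
            oldest = window.popleft()
            total = (x - oldest) + total if order == 'A' else (total - oldest) + x
        yield total / len(window)

samples = [math.pi] * 10 + [1e18] + [math.pi] * 20
print(list(rolling_mean(samples, 5, 'A'))[-1])   # typically ends stuck at 0.0
print(list(rolling_mean(samples, 5, 'B'))[-1])   # typically ends stuck at pi/5, not pi
# Either way the small samples were already rounded away while 1e18 sat in the
# accumulator, so neither order gets back to the true window mean of ~pi.
```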

 


 

I just realized that averaging all the elements, instead of taking the difference between the new and the oldest one, takes about the same memory and is just slower, because either way all the elements in the rolling window need to be stored.

Message 13 of 13