09-13-2013 03:10 PM
@altenbach wrote:
Then do this...
As always a smart solution, though I see a possibility of problems arising when the arrays get large and/or the numbers involved are large, with overflow of the sum. It's the fastest running average for sure, but I'd go for Avg - Avg/N + New/N to keep the calculations around the Avg value.
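In text form the update looks like this (a minimal Python sketch of the same idea, not LabVIEW; the function name is mine):

    def update_avg(avg, new, n):
        # 'avg' is the mean of the first n-1 points; the result is the mean
        # of the first n points. Algebraically the same as avg + (new - avg) / n.
        return avg - avg / n + new / n

    avg = 0.0
    for n, x in enumerate(data, start=1):   # 'data' stands in for the incoming points
        avg = update_avg(avg, x, n)

Every intermediate value stays near the magnitude of the average itself, so nothing grows the way a raw running sum does.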
/Y
09-13-2013 04:28 PM
I'm thinking maybe you could help me with another question:
I have a 3x10 input array.
I take row 0, apply a high-pass filter, take the log, and then plot it.
In the second round I have another 3x10 array and apply the same processing to its row 0.
So my plot would have 20 points from row 0 (eventually I would do this for all 3 rows).
Is this different from passing in a 3x20 array at once, doing the processing, and then plotting?
Or is there a way I can save the values already plotted on the graph and just append the new results at the end?
I am trying to avoid saving all the values, so I don't want to use the 3x20 array. I want to do the calculations on 3x10 chunks of data while getting the same results.
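To make the question concrete, here is a minimal Python/SciPy sketch of the two approaches (just a stand-in for the LabVIEW VIs; the filter design and all names are assumptions). The log and plot steps are pointwise, so they chunk trivially; the filter is the only step with memory, and the chunked run matches the all-at-once run exactly when that internal state is carried from one chunk to the next:

    import numpy as np
    from scipy import signal

    # A made-up first-order Butterworth high-pass; any IIR/FIR filter behaves the same way.
    b, a = signal.butter(1, 0.1, btype='highpass')

    rng = np.random.default_rng(0)
    full = rng.normal(size=20)              # one row of the 3x20 case, for brevity
    chunk1, chunk2 = full[:10], full[10:]

    # All 20 points in one pass (zero initial filter state).
    y_full = signal.lfilter(b, a, full)

    # Two 10-point passes, explicitly carrying the filter state between chunks.
    zi = np.zeros(max(len(a), len(b)) - 1)      # same zero initial state as above
    y1, zf = signal.lfilter(b, a, chunk1, zi=zi)
    y2, _ = signal.lfilter(b, a, chunk2, zi=zf)

    print(np.allclose(np.concatenate([y1, y2]), y_full))   # True: chunked == all-at-once

Without the carried state, the second chunk would start the filter from rest and the points just after the seam would differ; with it, only the latest 3x10 chunk plus a few state values need to be kept, and the new log values can simply be appended to the plot.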
Thanks!!!!!!!!!
09-13-2013 05:05 PM
@Yamaeda wrote:
As always a smart solution, though I see a possibility of problems arising when the arrays get large and/or the numbers involved are large, with overflow of the sum. It's the fastest running average for sure, but I'd go for Avg - Avg/N + New/N to keep the calculations around the Avg value.
/Y
Well, most often we deal with measurement data, which is maybe 16 bits' worth of information, while DBL has a 53-bit mantissa. There is plenty of headroom to grow before we run into problems unless the data is pathological in some way. Once N gets big and "avg" or "new" are very close to zero, you run into similar problems from the other end.
Overall, it might be better to use a finite history, e.g. average only the last N points or so.
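A finite history could look like this (a minimal Python sketch, not LabVIEW; the class name and interface are assumptions): keep the last N points in a ring buffer and maintain their running sum.

    from collections import deque

    class MovingAverage:
        """Mean of the most recent n points (finite history)."""
        def __init__(self, n):
            self.buf = deque(maxlen=n)
            self.total = 0.0

        def add(self, x):
            if len(self.buf) == self.buf.maxlen:
                self.total -= self.buf[0]   # oldest point is about to be evicted
            self.buf.append(x)
            self.total += x
            return self.total / len(self.buf)

Because the sum never covers more than N points it stays bounded, and if rounding drift in the running total ever becomes a concern, recomputing sum(self.buf) once in a while resets it exactly.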