08-30-2008 08:13 AM - edited 08-30-2008 08:14 AM
Hi CC,
I made three changes.
1) The first point does not get duplicated.
2) Added a zero-phase shift (by reversing the array, re-filtering, and reversing back).
3) Since I am not duplicating the first point, I have to delete one fewer point at the end.
Note:
If you want to get rid of the garbage at the back end, do the reflecting on the last point as well.
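For reference, here is the same trick sketched in Python/SciPy rather than G. The filter choice, cutoff, order, and pad length are illustrative, not taken from the VI:

```python
import numpy as np
from scipy.signal import butter, lfilter

def zero_phase_filter(x, cutoff=0.1, order=2, pad=20):
    """Zero-phase low-pass with reflection padding at both ends.

    Same idea as above: reflect the signal about its first and last
    points (without duplicating them), filter forward, then reverse,
    re-filter, and reverse back so the phase shifts cancel.
    """
    b, a = butter(order, cutoff)                 # illustrative low-pass
    head = 2 * x[0] - x[1:pad + 1][::-1]         # reflect about first point
    tail = 2 * x[-1] - x[-pad - 1:-1][::-1]      # reflect about last point
    padded = np.concatenate([head, x, tail])
    y = lfilter(b, a, padded)                    # forward pass (phase shift)
    y = lfilter(b, a, y[::-1])[::-1]             # backward pass cancels it
    return y[pad:-pad]                           # drop the padded points
```

SciPy's filtfilt does the reverse-filter-reverse pass with odd (point-reflection) padding built in, so in practice one call replaces all of the above.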
Ben
08-30-2008 01:40 PM
Ben, thanks for the lesson!
This is something I'll put into use on Monday. I hope you won't ask for royalties!
08-31-2008 07:46 AM
No, no royalties, but if it gets written up, send me a link. Now let me tell you how I was driven to find this technique. (Oh no, now Ben is going to tell us a story!)
As the back-end of a web-based app (it won an E-Business of the Year award about eight years ago) that did Vendor Managed Inventory (VMI), they wanted automatic e-mails sent when it was time to send out a new truckload of stuff. Since the app depended on pre-existing detectors to monitor the levels inside silos etc., the quality of the data was pretty bad as a rule (cheap detectors, lots of noise, and unbelievable readings, e.g. 500 feet of product in a 100-foot silo). Since the usage of stuff varied (de-icer only used in bad weather, materials only used during some batches...), we needed to look at usage and changes in usage. Readings were typically captured four times a day, so we did not have a lot of data to throw away.
So...
1) We needed to filter the data to get rid of bogus readings.
2) We needed clean data, since the first and second derivatives told us what usage was and when it changed (sketched below).
But with the phase-shift issue of filters and the garbage produced by the zero-phase-shift filter, I would have been a day late in sending out the trucks.
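To make the derivative step concrete, here is a toy sketch; all of the numbers are made up, with four readings a day as in the real app:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hypothetical silo levels: four readings a day (dt = 6 h), noisy detector.
rng = np.random.default_rng(0)
hours = np.arange(0, 24 * 14, 6.0)                 # two weeks of readings
raw = 100.0 - 0.25 * hours + rng.normal(0, 2.0, hours.size)

b, a = butter(2, 0.1)                              # illustrative low-pass
smooth = filtfilt(b, a, raw)                       # zero-phase smoothing

usage = -np.diff(smooth) / 6.0         # 1st derivative: units used per hour
usage_change = np.diff(usage) / 6.0    # 2nd derivative: change in usage
```

A sustained step in usage_change is the sort of thing that says "usage just changed, time to rethink the truck schedule."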
So using the technique above I was able to keep up-to-date guesstimates of the usage and watch for changes in usage.
Another interesting hack that went into that app was how we got rid of the bad readings in this context. I built a utility that let the account managers for each customer tailor a "Z-Factor" for each site. It would present the known history and let the account manager specify how many "blips" they saw in the data. It would then interactively characterize the usage in three regimes (non-use, use, re-fill) to find a multiple of std-devs that would result in as many blips being thrown away as the manager specified.
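The tuning step amounted to something like this sketch. It handles a single regime only, for brevity (the real utility characterized non-use, use, and re-fill separately), and the names and data here are made up:

```python
import numpy as np

def fit_z_factor(history, blip_count):
    """Find the multiple of std-dev that flags exactly `blip_count` blips.

    Ranks each reading's deviation (in std-dev units) and sets the
    threshold just under the Nth largest, so N points get thrown away.
    """
    scores = np.abs(history - history.mean()) / history.std()
    ranked = np.sort(scores)[::-1]            # biggest deviations first
    return ranked[blip_count - 1] * 0.999     # just below the Nth score

def is_blip(reading, history, z_factor):
    """Tag a new reading for human approval if it exceeds the Z-Factor."""
    return abs(reading - history.mean()) > z_factor * history.std()

history = np.array([50.0, 49.6, 49.1, 48.7, 60.0, 48.2, 47.8, 47.3])
z = fit_z_factor(history, blip_count=1)   # manager says "I see one blip"
print(is_blip(80.0, history, z))          # True  -> hold for approval
print(is_blip(49.0, history, z))          # False -> looks like a good reading
```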
When the app was doing its normal data collection, it would evaluate each point to determine if it looked like a good reading or a bad one. Bad values would get tagged for human approval before becoming official. The test ran great and all was good to go, until the lawyers got involved. It turned out that the departments of weights and measures in some states would not accept the usage numbers. Bummer.
The application (BridgeVIEW 2.1) is still in operation today (that division got spun off as its own company), making sure there is de-icer ready when your plane takes off.
Done with the story,
Ben