08-11-2011 10:48 AM
Hi all,
I have a question about high-pass filtering in LabVIEW. It's based purely on observation, as I've never been taught about filters directly.
After high-pass filtering a signal with, for example, a 2nd-order IIR Butterworth filter, there's a transition near time = 0 from a 'false' high magnitude down to the expected level, as in the attached (albeit obtained from a Fortran program, as I don't have anything else to hand at the moment). To reduce its significance relative to the rest of the filtered signal, you can increase the sampling time, change the filter settings, or even record extra samples with the intention of truncating them.
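For reference, here's a minimal Python/SciPy sketch of what I mean (the sampling rate, cutoff, and test signal are made up for illustration; it's not my actual Fortran or LabVIEW code):

```python
import numpy as np
from scipy import signal

fs = 1000.0                         # sampling rate, Hz (illustrative)
t = np.arange(0.0, 2.0, 1.0 / fs)
# 'laminar-like' test signal: non-zero mean plus a slow variation
x = 1.0 + 0.05 * np.sin(2.0 * np.pi * 0.5 * t)

# 2nd-order Butterworth high-pass, 10 Hz cutoff (illustrative)
b, a = signal.butter(2, 10.0 / (fs / 2.0), btype="highpass")

# The filter state starts at zero, so the first sample looks like a
# step of height x[0]: the output is large near t = 0 and then
# settles towards zero.
y = signal.lfilter(b, a, x)
print(y[:3])    # large 'false' startup values
print(y[-3:])   # settled values near zero
```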
Is there an established method of getting around this? The reason I ask is that I'm trying to discriminate accurately between laminar and turbulent portions of a velocity signal, but near time = 0 our method returns false positives for turbulent structures even in a completely laminar signal. I presume this will always be the case, but it's better to ask first and conclude afterwards!
Thanks.
08-11-2011 10:53 AM
All filters have an initial transient. You need to wait until it has settled out or try to compensate for it in some way. Because the transient also depends on the input signal (including any noise), exact compensation is generally not feasible.
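In SciPy terms (since a LabVIEW diagram can't be pasted here), one common approximate compensation is to initialise the filter's internal state as if the input had been sitting at its first sample forever, so the filter never sees a startup step. The parameters below are illustrative, and this only helps if the signal really does start near a steady value:

```python
import numpy as np
from scipy import signal

fs = 1000.0                         # same illustrative parameters
t = np.arange(0.0, 2.0, 1.0 / fs)
x = 1.0 + 0.05 * np.sin(2.0 * np.pi * 0.5 * t)
b, a = signal.butter(2, 10.0 / (fs / 2.0), btype="highpass")

# lfilter_zi gives the internal state for the steady-state response
# to a unit step; scaling by x[0] pretends the input sat at x[0]
# for all t < 0, which removes most of the startup transient.
zi = signal.lfilter_zi(b, a) * x[0]
y, _ = signal.lfilter(b, a, x, zi=zi)

# For offline analysis, forward-backward filtering with edge padding
# (filtfilt) also suppresses startup effects and removes phase lag,
# at the cost of no longer being a causal filter.
y_fb = signal.filtfilt(b, a, x)
```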
Lynn
08-11-2011 03:42 PM
Clicked the wrong button there.
Thanks for that anyway; it's what I thought.