LabVIEW


smooth a single looping step for a biped robot

Hi,
I am capturing walking data and saving a single step, which is looped to produce a continuous walking pattern. I only know a bit about filters and windowing. The problem is that the hip/knee/ankle data does not quite match up from the start to the end of the step. The attached VI weights selected hip data from the start/end of the step towards the mid point between the start/end points. This introduces 2 nasty bumps (a high-frequency signal). If I then do further Fourier, polynomial, interpolation etc., I get nasty noisy spikes in the data caused by the bumps. I need something to provide a smooth blend over a selected area without changing the untouched section of the walking pattern.
Many thanks in advance.
Regards, Chris

There's video and more detail at:
http://www.robotic-systems.co.uk/
Message 1 of 7
Chris,

Please could you give me a little more information on what you are attempting with the VI that you have attached, and whether you have made any progress.

Kind Regards

Tristan J
Applications Engineer
National Instruments
Message 2 of 7
Hi Tristan,
I need a VI to take a waveform, typically 200 scans over 1.5 secs, and provide a smoothing/weighting action to join the start and end of the wave. Waves will be played as a loop and provide continuous walking for a biped robot. That is why it is important to smooth/join the start/end. I also need to minimise the effect of the VI on the wave and would like the function to affect as few start/end samples as possible. I have tried looking through some of the filtering VIs but don't have any electronic filtering experience. The VI should not introduce any high-frequency bumps at the joining point, as the windowed data will be experimented on using other functions like polynomial, Fourier etc. I imagine the solution will split the wave data into 3 different sections:
1, 1st 10 scans of adjusted data.
2, roughly 180 scans of untouched data.
3, last 10 scans of adjusted data.
Please let me know if you require more info.
Regards Chris
Message 3 of 7
Just butting in here with a fairly simple idea that doesn't get into any filtering theory. Basically, you would just do a weighted average of the data points near the cycle boundary, and let the weighting factor ramp from 0 to 100% over the set of smoothed points.

For example: suppose n is the number of points in the cycle so that (ideally) data(0) = data(n) = data(2n)...

let smoothed(0) = 0.50*raw(0) + 0.50*raw(n)
smoothed(1) = 0.55*raw(1) + 0.45*raw(n-1)
smoothed(2) = 0.60*raw(2) + 0.40*raw(n-2)
...
or generally, for -10 <= j <= +10
smoothed(j) = (0.50+0.05*j)*raw(j) + (0.50-0.05*j)*raw(n-j)
smoothed(n-j) = (0.50+0.05*j)*raw(n-j) + (0.50-0.05*j)*raw(j)

Do you think something like this would produce sufficiently smooth blending?
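
In case it's useful, here's the same ramp written out as a rough text-mode sketch (Python/NumPy rather than LabVIEW, just because it's easier to post; the array handling and the 10-point blend width are my assumptions, so treat it as a sketch rather than anything definitive):

import numpy as np

def blend_cycle_boundary(raw, half_width=10):
    """Ramped weighted average across the cycle boundary.

    raw        -- one cycle of samples, with raw[0] ideally equal to raw[-1]
    half_width -- how many points on each side of the boundary get adjusted
    """
    raw = np.asarray(raw, dtype=float)
    n = len(raw) - 1                  # index of the last sample, so raw[0] ~ raw[n]
    smoothed = raw.copy()
    step = 0.5 / half_width           # weight ramps from 0.50 up to 1.00
    for j in range(half_width + 1):
        w = 0.5 + step * j            # weight on the "local" sample
        smoothed[j]     = w * raw[j]     + (1.0 - w) * raw[n - j]
        smoothed[n - j] = w * raw[n - j] + (1.0 - w) * raw[j]
    return smoothed

# e.g. hip = blend_cycle_boundary(hip_cycle)   # hip_cycle: ~200 scans of one step

With half_width=10 the weight steps by 0.05 per point, which matches the 0.50/0.55/0.60... progression above.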
Message 4 of 7
Oops, I didn't proofread and my "less than" and "greater than" symbols got interpreted as HTML. Here goes again on the last part:

or generally, for -10 <= j <= +10
smoothed(j) = (0.50+0.05*j)*raw(j) + (0.50-0.05*j)*raw(n-j)
smoothed(n-j) = (0.50+0.05*j)*raw(n-j) + (0.50-0.05*j)*raw(j)
Message 5 of 7
Hi Kevin,
Thank you for the advice; your idea was pretty much how I (think I) constructed the posted VI, except that I weight towards the mid point between the start/end points. Still, your idea would simplify the code I've written. I am unsure if I have introduced an undesirable effect in my code. If you run my VI you can see a pronounced bump either side of the join. I think this occurs because the weighting introduces more pull as scans approach the mid point. I will have a look at doing it your way but expect similar results. If the trend of the original plot were nearly a straight line on the X axis I don't think the problem would occur.
Regards Chris
Message 6 of 7
The trouble with shooting from the hip is when you shoot your own foot! As I think about it now, I realize that my earlier suggestion isn't the right way to go about it. My previous post described a type of blending operation that would really only be appropriate for a waveform that ought to be symmetric in time about the beginning/end of the cycle. Each index of the smoothed data is based on combining raw data that is equidistant in time from the beginning/end. That doesn't seem like what you really needed after all.

In fact, giving more thought to your actual robotics app, it probably wasn't such a good idea for me to think only about position. Your derivatives are going to matter too.

Generally, the more I think about it, the more I wonder if part of the problem is a matter of "getting painted into a corner" by trying to work with exactly one cycle of data. There's an implicit assumption that one can capture one cycle, then replay it repetitively to produce continuous motion. But if your beginning and ending data don't match up, then perhaps your cycle isn't really representative after all. Maybe the first step or two that a person takes when starting to walk is less typical of his/her gait than the later steps, and that's why the endpoints don't match.
Is there an option to capture several cycles of input motion data? Perhaps by ignoring the first cycle or two, and median-averaging the next 3-5, you might get raw data that's more representative and that will hopefully make for a smaller discontinuity.
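
To make that concrete, here is roughly what I mean, sketched in Python/NumPy rather than LabVIEW (the 2-D array layout and the assumption that every captured step has already been resampled to the same number of scans are mine, not anything you've posted):

import numpy as np

def representative_cycle(captured_cycles, skip_first=2):
    """Median-average several captured gait cycles into one representative step.

    captured_cycles -- 2-D array, one captured step per row, each already
                       resampled to the same number of scans (e.g. 200)
    skip_first      -- how many start-up cycles to ignore as unrepresentative
    """
    cycles = np.asarray(captured_cycles, dtype=float)
    steady = cycles[skip_first:]          # drop the less-typical first steps
    return np.median(steady, axis=0)      # point-by-point median across the rest

The median is less sensitive to one odd step than a straight mean would be.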

Smoothing idea v2.0 (with lots of hand-waving and no details or guarantees):
Find the best-fit cubic over the last 15 points plus the first 5 points. Find another over the last 5 points plus the first 15 points. These two cubic equations now give you 2 distinct predicted values for the 10 points nearest to the transition. One prediction is biased more by the behavior at the end of the cycle, the other by the behavior at the beginning.
You could now replace those 10 points with a ramped weighted average of the two predictions. By using a cubic, your motion derivatives should be reasonable. For example, acceleration is still allowed to vary (linearly) over that interval; you aren't artificially constraining yourself to constant or zero accel.
I guess you might also want to blend from actual raw data to predicted data as well -- perhaps that could be a separate weighted average. Something like this:

...
{raw data region}
start blend from raw to end-biased cubic
{blend region}
end blend from raw to end-biased cubic
start blend from end-biased to start-biased cubic
{blend region}
end blend from end-biased to start-biased cubic
start blend from start-biased cubic to raw
{blend region}
end blend from start-biased cubic to raw
{raw region}
...

Implementation is left as an exercise for the reader...
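
(Well, having typed all that, here is one very rough text-mode starting point anyway, in Python/NumPy rather than LabVIEW. It only does the middle end-biased-to-start-biased blend; the raw-to-cubic tapers at either edge, the 15/5/10 window sizes as defaults, and the use of polynomial fitting from NumPy are all my own assumptions, with no guarantees.)

import numpy as np

def cubic_blend_boundary(raw, fit_span=15, overlap=5, blend=10):
    """Two-cubic blend across the cycle boundary (central blend region only).

    raw -- one cycle of samples; raw[0] should follow on from raw[-1]
    fit_span/overlap/blend -- the 15/5/10 point windows described above
    """
    raw = np.asarray(raw, dtype=float)
    n = len(raw)

    # Time axis running continuously across the boundary:
    # negative indices are the end of the cycle, non-negative indices the start.
    t_end   = np.arange(-fit_span, overlap)      # last 15 points + first 5
    t_start = np.arange(-overlap, fit_span)      # last 5 points + first 15
    y_end   = np.concatenate([raw[-fit_span:], raw[:overlap]])
    y_start = np.concatenate([raw[-overlap:],  raw[:fit_span]])

    # Best-fit cubics: one biased by end-of-cycle behaviour, one by start-of-cycle.
    c_end   = np.polyfit(t_end,   y_end,   3)
    c_start = np.polyfit(t_start, y_start, 3)

    # Replace the 'blend' points nearest the transition with a ramped
    # weighted average of the two cubic predictions.
    t_blend = np.arange(-(blend // 2), blend - blend // 2)   # e.g. -5 .. +4
    w = np.linspace(1.0, 0.0, len(t_blend))                  # weight on end-biased cubic
    blended = w * np.polyval(c_end, t_blend) + (1.0 - w) * np.polyval(c_start, t_blend)

    out = raw.copy()
    out[t_blend % n] = blended        # negative indices wrap back to the end of the cycle
    return out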
Message 7 of 7