11-12-2009 01:57 AM
Hi there,
Looking to bounce ideas off anyone who's done this. The whole project is a robot that tells you the source of a sound. I've got it telling me the angle of the sound source (which right now is the loudest sound in the room, clapping, shouting, whatever). I'm trying to narrow what it reacts to.
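For context, the angle part boils down to cross-correlating pairs of mic channels to get the time delay of arrival. A rough Python sketch of that idea (function name and sign convention are my own, just for illustration):

```python
import numpy as np

def tdoa(sig_a, sig_b, fs):
    """Delay of sig_b relative to sig_a, in seconds (positive = b lags a)."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    # index of zero lag in the 'full' output is len(sig_b) - 1
    return ((len(sig_b) - 1) - np.argmax(corr)) / fs
```

With the delay in hand, the bearing follows from the mic spacing and the speed of sound (roughly arcsin(c * delay / spacing) for a far-field source).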
I made a bandpass filter (though I haven't tested its detection capabilities yet) so it only responds to a certain band of sound, which is fine, but pattern matching would be more robust. For instance, maybe 4 claps no more than 0.5 seconds apart?
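The kind of pattern matching I mean, sketched in Python (the frame length and energy threshold are made-up numbers, just to illustrate the idea): detect onsets as rising edges of short-time energy, then check that four of them land close enough together.

```python
import numpy as np

def detect_onsets(signal, fs, frame_len=1024, threshold=0.2):
    """Return times (s) where short-time energy first rises above threshold."""
    n_frames = len(signal) // frame_len
    energy = np.array([
        np.mean(signal[i * frame_len:(i + 1) * frame_len] ** 2)
        for i in range(n_frames)
    ])
    above = energy > threshold
    # rising edges: above the threshold now, but not in the previous frame
    rising = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return rising * frame_len / fs

def matches_clap_pattern(onsets, n_claps=4, max_gap=0.5):
    """True if some run of n_claps onsets has every gap <= max_gap seconds."""
    if len(onsets) < n_claps:
        return False
    run = 1
    for gap in np.diff(onsets):
        run = run + 1 if gap <= max_gap else 1
        if run >= n_claps:
            return True
    return False
```

A real clap detector would want a band-limited energy measure and an adaptive threshold, but the run-of-gaps logic is the same.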
My hardware is pretty good, I have the M-Audio Fast Track Ultra with 3 Behringer condenser mics, so timing/synchronization between channels shouldn't be an issue.
I'm just not sure how to do something like this. If I use just a filter, any frequency in that band can trigger it, which would be fine, but we don't have a remote to emit a specific frequency. I'm also having difficulty with "create waveform" and "play waveform" in LabVIEW. By difficulty I mean it did not play a sound when I created a 10 kHz, amplitude-50 waveform. So at least 4 shouts or at least 4 claps is probably fine.
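One thing I'm double-checking on the waveform side: most float audio paths expect samples in [-1, 1], so an "amplitude 50" waveform would clip hard (or be rejected) if it's interpreted on that scale. A quick sanity check of what I'm trying to generate, sketched in Python/NumPy rather than LabVIEW (playback itself not shown):

```python
import numpy as np

fs = 44100           # sample rate, Hz
dur = 1.0            # seconds
f = 10_000           # 10 kHz test tone

t = np.arange(int(fs * dur)) / fs
# amplitude 0.5 keeps the samples safely inside [-1, 1]
tone = 0.5 * np.sin(2 * np.pi * f * t)
```

Also worth remembering that 10 kHz is near the top of what small speakers reproduce well, so "no audible sound" doesn't always mean the waveform is wrong.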
Any ideas?
Thanks!
Crystal
11-12-2009 04:07 AM
Hi,
Ideas that came to my mind:
fetch 100 ms of sound from each mic
calculate the (complex) FFT of each channel
for each FFT bin of interest, calculate amplitude and direction (leaving about 1k-2k values)
calculate the difference to the previous FFT bins (n-1, or n-x?) so constant background sources cancel out
define a trigger level and create a list of occurrences (bin 3, positive slope, timestamp)
feed this list to a state machine to detect the number of pulses
look at what correlation functions can do for you...
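The steps above could be sketched roughly like this in Python (band edges, trigger level, and block layout are all made-up numbers for illustration; a real version would tune them per bin):

```python
import numpy as np

def spectral_events(blocks, fs, band=(800, 1200), trigger=5.0):
    """Yield timestamps where in-band energy jumps relative to the last block.

    blocks: iterable of equal-length 1-D arrays (e.g. 100 ms of one channel).
    band:   frequency range of interest, Hz.
    """
    prev = None
    for i, block in enumerate(blocks):
        spec = np.abs(np.fft.rfft(block))
        freqs = np.fft.rfftfreq(len(block), d=1 / fs)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        energy = spec[in_band].sum()
        # difference to the previous block: steady background cancels out
        if prev is not None and energy - prev > trigger:
            yield i * len(block) / fs      # timestamp of the rising edge
        prev = energy

def count_pulses(timestamps, n=4, max_gap=0.5):
    """Tiny state machine: True once n events arrive, each within max_gap s."""
    count, last = 0, None
    for t in timestamps:
        count = count + 1 if (last is None or t - last <= max_gap) else 1
        last = t
        if count >= n:
            return True
    return False
```

Keeping the difference to the previous block is the cheap version of background cancellation; a running average of the spectrum would be a bit more robust against slowly varying noise.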
sounds like fun 🙂