05-23-2009 05:13 AM
Hi all!
I have two signals. One is the original, with about 10k samples and a duration of 1 s.
The second is a demodulated signal, almost the same as the original but with only about 200-500 samples.
To be able to compare the two signals, I need samples of both signals at the same points in time.
Then I need to subtract one signal from the other to get the deviation and calculate the error.
How can I interpolate the second signal to get samples at the same times as the first signal?
Thanks in advance
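For reference, here is a minimal sketch in Python (not LabVIEW) of the comparison I have in mind, with made-up sample counts and a placeholder sine waveform standing in for the real signals:

```python
import numpy as np

# Hypothetical example: both signals cover the same 1 s interval,
# but with different numbers of samples.
fs_ref = 10_000                      # original signal: ~10k samples over 1 s
n_demod = 300                        # demodulated signal: ~200-500 samples over the same 1 s

t_ref = np.linspace(0.0, 1.0, fs_ref, endpoint=False)      # time base of the original signal
t_demod = np.linspace(0.0, 1.0, n_demod, endpoint=False)   # time base of the demodulated signal

original = np.sin(2 * np.pi * 5 * t_ref)                   # placeholder waveforms
demodulated = np.sin(2 * np.pi * 5 * t_demod)

# Linearly interpolate the demodulated signal onto the original time base,
# then subtract to get the deviation and an RMS error figure.
demod_resampled = np.interp(t_ref, t_demod, demodulated)
deviation = original - demod_resampled
rms_error = np.sqrt(np.mean(deviation ** 2))
print(rms_error)
```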
05-23-2009 07:32 AM
I would say by doing what you ask: Interpolate using the interpolate function in the array category.
Herbert
05-23-2009 08:16 AM
Herbert wrote:I would say by doing what you ask: Interpolate using the interpolate function in the array category.
Herbert
The most common way of doing it is, I think, the one described here: http://en.wikipedia.org/wiki/Upsampling. But I think the interpolate idea is good too. If you need to insert more than one sample between existing ones, use the Ramp Pattern function.
I made something quick and dirty. Test it and see if you can use it. You will have to do something about the missing samples at the end.
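The attachment is a LabVIEW VI, but roughly the same "insert samples with a ramp" idea looks like this in Python (only a sketch of the approach, not the attached code; the function name and the integer factor are made up):

```python
import numpy as np

def upsample_linear(x, factor):
    """Insert (factor - 1) linearly interpolated samples between each pair of
    neighbours, similar to using a ramp pattern between points. The last
    original sample has no successor, so the tail is simply held constant
    here -- that is the "missing samples at the end" issue."""
    x = np.asarray(x, dtype=float)
    out = np.empty((len(x) - 1) * factor, dtype=float)
    ramp = np.arange(factor) / factor            # 0, 1/factor, ..., (factor-1)/factor
    for i in range(len(x) - 1):
        out[i * factor:(i + 1) * factor] = x[i] + ramp * (x[i + 1] - x[i])
    # pad the tail by repeating the last value so the length is len(x) * factor
    return np.concatenate([out, np.full(factor, x[-1])])

print(upsample_linear([0.0, 1.0, 0.0], 4))
```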
05-23-2009 09:24 AM
Thanks, but I don't understand what your example does.
It doesn't seem to be what I wanted, as the timing of the signals should remain unchanged at 1 second; only the number of samples should increase.
05-23-2009 01:20 PM - last edited on 10-10-2024 10:53 PM by Content Cleaner
Have you tried the Align & Resample Express VI?
05-23-2009 01:51 PM - edited 05-23-2009 01:54 PM
Yes, and after that I get an array of 990 elements, not 1000, which means the Align & Resample function is doing something wrong. Please see the attached VI.
Why, after linear interpolation, do I get a continuously increasing deviation between the two signals when I compare them? 😞
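One possible cause (just a guess): if the resampled array comes back with 990 points but is still treated as spanning exactly 1 s, the two time bases slowly drift apart, and the difference between two otherwise identical waveforms grows towards the end of the record. A hypothetical numeric illustration in Python, with a made-up 5 Hz tone:

```python
import numpy as np

f = 5.0                                   # placeholder tone frequency in Hz
t_ref = np.arange(1000) / 1000.0          # intended time base: dt = 1 ms
t_skew = np.arange(1000) * (1.0 / 990.0)  # time base implied by a 990-point record stretched over 1 s

reference = np.sin(2 * np.pi * f * t_ref)
skewed = np.sin(2 * np.pi * f * t_skew)   # same waveform sampled on the skewed time base

deviation = reference - skewed
# the deviation is tiny at t = 0 and largest at the end of the record
print(np.abs(deviation[:10]).max(), np.abs(deviation[-10:]).max())
```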
05-23-2009 03:22 PM