08-24-2007 10:02 AM
Sorry, not near a LV box to look at or try your code.
Can you please explain why increasing the sampling frequency gives a lower RMS value?
Because you're lying to the filter function. You actually sampled at 80 kHz, but you told the filter function that you sampled at 204.8 kHz. That's the meaning of the "sample frequency" input -- you identify the sample rate used to digitize the signal.
With a 10 kHz sine wave sampled at 80 kHz, you have 8 samples per cycle. When the filter function sees those 8 samples per cycle, and is told by you that the sample rate was 204.8 kHz, it has to treat that as a sine wave at 204.8 kHz / 8 = 25.6 kHz. Surely a 25.6 kHz sine wave will be attenuated more than a 10 kHz sine wave.
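You can see the effect outside LabVIEW too. Below is a minimal Python sketch (not your actual VI -- the filter here is a simple one-pole RC lowpass I picked for illustration, and the 20 kHz cutoff is an assumed value). The filter's smoothing coefficient is computed from the sample rate you *claim*, so lying about the rate shifts where the signal appears to sit relative to the cutoff:

```python
import math

def rms(xs):
    """Root-mean-square of a sample list."""
    return math.sqrt(sum(v * v for v in xs) / len(xs))

def one_pole_lowpass(xs, cutoff_hz, claimed_fs):
    """One-pole RC lowpass. The coefficient depends on the sample
    rate we *claim*, not the rate actually used to digitize xs."""
    dt = 1.0 / claimed_fs
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for v in xs:
        y += alpha * (v - y)
        out.append(y)
    return out

true_fs = 80_000.0   # rate actually used to sample
f_sig = 10_000.0     # 10 kHz test tone -> 8 samples per cycle
xs = [math.sin(2 * math.pi * f_sig * i / true_fs) for i in range(8000)]

# Same data, same filter cutoff -- only the claimed sample rate differs.
r_correct = rms(one_pole_lowpass(xs, 20_000.0, true_fs))      # tone "at" 10 kHz
r_wrong   = rms(one_pole_lowpass(xs, 20_000.0, 204_800.0))    # tone "at" 25.6 kHz
print(r_correct, r_wrong)
```

Claiming 204.8 kHz makes the same 8-samples-per-cycle tone look like 25.6 kHz, well above the 20 kHz cutoff, so `r_wrong` comes out noticeably smaller than `r_correct`.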
Trouble is, comparing your test cases 2 and 3, I would expect more attenuation for case 2 than for 3. *That* part I don't understand. But the comparisons from 1-->2, 1-->3, and 3-->4 all seem reasonable. Additional attenuation should be expected.
-Kevin P.