In creating my program to apply an FFT to a time-domain signal, I used the FFT Spectrum (Mag-Phase) VI, since I only need the magnitude to be graphed. I am currently using RMS averaging, but I don't understand how to calculate the standard deviation over a specific frequency range (with the stats VI I can get it for the whole range, of course). In this case, I'd like to know the standard deviation of the amplitude between 200 and 300 Hz. Any suggestions?
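To make the intent concrete, here is a minimal numeric sketch (in Python rather than LabVIEW) of the calculation I'm after: convert the 200-300 Hz range into bin indices using the spectrum's frequency spacing df, take that subset of the magnitude array, and compute the standard deviation of just that slice. The sampling rate, block size, and stand-in spectrum below are hypothetical placeholders; in my VI the magnitude and df would come from the FFT Spectrum (Mag-Phase) output.

```python
import numpy as np

# Hypothetical example values; in the actual VI these would come from
# the FFT Spectrum (Mag-Phase) output and its df (frequency resolution).
fs = 10_000.0          # sampling rate in Hz (assumed)
n = 4096               # samples per block (assumed)
df = fs / n            # frequency spacing between FFT bins
magnitude = np.abs(np.fft.rfft(np.random.randn(n))) / n  # stand-in magnitude spectrum

# Convert the frequency range of interest into bin indices.
lo = int(round(200.0 / df))
hi = int(round(300.0 / df))

# Standard deviation of the magnitude over 200-300 Hz only.
band_std = np.std(magnitude[lo:hi + 1])
print(f"std of magnitude, 200-300 Hz: {band_std:.4g}")
```

In LabVIEW terms, I assume the equivalent would be something like an Array Subset of the magnitude array (with the start index and length derived from df) wired into the stats VI, but I'd appreciate confirmation or a better approach.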