LabVIEW


Validity of Fourier spectrum obtained

Hi,
 
I have a query about the validity of the Fourier spectrum I have obtained.
 
I am currently using LabVIEW 7.1 and an S-series DAQ card (PCI-6133), which has 14-bit analog resolution. I am trying to sample a signal with a frequency of 500 Hz. I am using a sampling frequency of 10,000 Hz and a sample size of 100-200 samples.
 
I have used the Express VI for spectrum measurement with the following configuration:
Selected measurement: Magnitude (Peak)
View phase: wrapped, in radians
Windowing: Hamming
Averaging: Hamming
Weighting mode: Hanning
Number of averages: 10
Return signal only when averaging is complete
Output set to dB
 
The output, which I have plotted on a graph, suggests the presence of a signal at 500 Hz, but the magnitude shown is around -60 dB... which seems to suggest the reading is in the µV range. Even though I expect the signal to be weak, I do not think it would be that weak. Also, is such a reading possible for this S-series card, or is it just the result of the mathematical computation?
 
 
I am wondering whether such a signal reading is valid, or to what extent it is plausible.
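(For reference, here is a minimal sketch of what the Express VI is essentially computing, written in Python/NumPy rather than LabVIEW. The 1 V sine amplitude and the "0 dB = 1 V peak" reference are assumptions, since neither is stated above; the point is only that an on-bin 500 Hz tone sampled at 10 kHz with 200 samples and a Hamming window should read near 0 dB at full amplitude, so a -60 dB peak means the tone reaching the ADC is roughly 1000 times smaller than that reference.)

```python
import numpy as np

fs = 10_000          # sampling frequency (Hz), as in the post
n = 200              # record length (samples)
f0 = 500             # tone frequency (Hz)
amp = 1.0            # assumed tone amplitude in volts (not given in the post)

t = np.arange(n) / fs
x = amp * np.sin(2 * np.pi * f0 * t)

# Hamming window, scaled so an on-bin sine keeps its peak amplitude
w = np.hamming(n)
coherent_gain = w.sum() / n

spec = np.fft.rfft(x * w) / (n * coherent_gain)
mag_peak = 2 * np.abs(spec)                # single-sided peak magnitude
mag_db = 20 * np.log10(mag_peak + 1e-15)   # dB relative to 1 V peak

freqs = np.fft.rfftfreq(n, d=1 / fs)
print(f"peak bin: {freqs[np.argmax(mag_db)]:.0f} Hz, level: {mag_db.max():.1f} dB")
# A 1 V peak sine reads close to 0 dB here; a -60 dB peak therefore
# corresponds to a tone about 1000x smaller in amplitude.
```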
 
Thanks for your help!
 
 
Message 1 of 5
Hi water,

to check the integrity of your signal, it's probably easiest to plot it BEFORE doing the FFT. Do you really get a sine at ~500 Hz, or just white noise?
How can we help you if we don't know anything about your signal? You can attach VIs or pictures of the front panel/block diagram...
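(A quick sketch of the time-domain check suggested above, in Python/NumPy rather than on a LabVIEW front panel; the file name is a placeholder for however you save or pass the acquired samples.)

```python
import numpy as np
import matplotlib.pyplot as plt

fs = 10_000                                # sampling rate from the post (Hz)
samples = np.loadtxt("raw_samples.txt")    # placeholder: the raw DAQ readings

t = np.arange(len(samples)) / fs
plt.plot(t * 1e3, samples)
plt.xlabel("time (ms)")
plt.ylabel("amplitude (V)")
plt.title("Raw signal before the FFT - is it a ~500 Hz sine or noise?")
plt.show()

# Crude frequency estimate from zero crossings, as a sanity check
signs = np.signbit(samples).astype(np.int8)
crossings = np.count_nonzero(np.diff(signs))
duration = len(samples) / fs
print(f"estimated dominant frequency ~ {crossings / 2 / duration:.0f} Hz")
```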

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 2 of 5

Hi,

I think I have verified that the signal is due to some electrical coupling from my function generator... Anyway, is there any good way to reduce such noise?

 

Thanks for your help!

Message 3 of 5
Hello water,

you speak of noise on the order of -60 dB. You are using an ADC with 14 bits of resolution, which gives an ideal maximum of roughly 84 dB of dynamic range (if my calculation is right).
So at the moment you lose about 4 bits of accuracy to noise.
What do the specs of the DAQ card say about the accuracy of the ADC measurement? What kind of setup do you use?
The first (good 🙂) way is always to use better shielding...
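(The arithmetic behind that estimate, as a small sketch; the -60 dB figure is the level reported earlier in the thread, and 6.02 dB per bit is the usual rule of thumb.)

```python
import math

bits = 14
ideal_range_db = 20 * math.log10(2 ** bits)   # ~84 dB ideal dynamic range
noise_floor_db = -60                          # level reported in the thread
effective_bits = -noise_floor_db / 6.02       # ~10 bits actually usable

print(f"ideal dynamic range : {ideal_range_db:.1f} dB")
print(f"effective resolution: {effective_bits:.1f} bits "
      f"(about {bits - effective_bits:.0f} bits lost to noise)")
```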

Best regards,
GerdW


using LV2016/2019/2021 on Win10/11+cRIO, TestStand2016/2019
Message 4 of 5
water,

With 10,000 samples per second and 200-sample data sets, you get about 10 cycles of your 500 Hz signal. With the windowing, only a few of those cycles contribute at full weight to the FFT. Usually FFTs work better with longer sample sets. The other issue is that the VI is set to do 10 averages. I have not looked into the internals of the VI, but if it is averaging the data sets and the sampling is not synchronous with the 500 Hz signal, the averaging will tend to cancel the signal out rather than enhance it.
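(A small sketch of that cancellation effect, in Python/NumPy; vector averaging of complex spectra and a random, unsynchronized start time per record are assumptions here, since the thread does not say which averaging mode the Express VI is using.)

```python
import numpy as np

fs, n, f0, n_avg = 10_000, 200, 500, 10    # values from the thread
rng = np.random.default_rng(0)

# Vector-average the complex spectra of 10 records whose start times are
# not synchronized to the 500 Hz tone, so its phase differs per record.
avg = np.zeros(n // 2 + 1, dtype=complex)
for _ in range(n_avg):
    t = rng.uniform(0, 1) + np.arange(n) / fs
    x = np.sin(2 * np.pi * f0 * t)
    avg += np.fft.rfft(x * np.hamming(n)) / n_avg

single = np.fft.rfft(np.sin(2 * np.pi * f0 * np.arange(n) / fs) * np.hamming(n))
k = int(f0 / (fs / n))                     # FFT bin holding the 500 Hz tone
print(f"single-record peak   : {abs(single[k]):.1f}")
print(f"vector-averaged peak : {abs(avg[k]):.1f}  (typically much smaller)")
```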

Try running a dataset of 4096 or 8192 samples (assuming the signal frequency is stable over that time frame) and turn off the averaging. See if that makes a difference in what you see.
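(For what it's worth, the bin width of the spectrum is simply the sampling rate divided by the record length, so the longer records suggested above give much finer resolution; a few lines to see the numbers:)

```python
fs = 10_000                                # sampling rate from the post (Hz)
for n in (200, 4096, 8192):
    print(f"{n:5d} samples -> bin width {fs / n:6.2f} Hz, "
          f"{500 * n / fs:6.1f} cycles of the 500 Hz tone per record")
```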

Lynn
Message 5 of 5