05-05-2010 07:15 AM
Hello
I am trying to plot my sampled data, but I think I am doing something wrong.
I took 1000 samples of my analog signal at 1 kHz, dt=10, stored it in an array, and through FFT Spectrum.vi I tried to plot the data, amplitude vs. frequency. The analog input is a magnetic field meter; I was measuring the field of my PC for testing.
The problem, I think, is the scale of the frequency axis: in my opinion the peak should be at 50 Hz, not at 0.05 Hz.
I attached snapshots of my program and my results. Could you please check them and tell me if I am doing something wrong?
Regards
Solved! Go to Solution.
05-05-2010 11:45 AM
Garbage in, garbage out!
You are lying to LabVIEW, so it's giving bad data as a result.
If you read the help for the Timed Loop you will see that its dt is in units of the timing source... you are actually gathering data at 100 Hz.
Next:
"dt" for a waveform should be in seconds so for 100 Hz the dt should be 0.01 NOT "10".
So read the help, modify your code, and you should get much more reasonable results.
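To see why the wrong dt shifts the peak, note that the FFT frequency axis is built entirely from dt: the bin spacing is 1/(N*dt), so claiming dt=10 s instead of 0.01 s squeezes every frequency by a factor of 1000 (which is exactly how a 50 Hz peak lands at 0.05 Hz). Here's a minimal NumPy sketch of that effect, using a hypothetical 30 Hz test tone (any tone below Nyquist shows the same 1000x scaling):

```python
import numpy as np

fs = 100.0          # actual sample rate in Hz, per the Timed Loop setting
n = 1000            # number of samples
t = np.arange(n) / fs

# Hypothetical test signal: a 30 Hz tone standing in for whatever
# the magnetic field meter picked up.
x = np.sin(2 * np.pi * 30.0 * t)
spectrum = np.abs(np.fft.rfft(x))
peak_bin = np.argmax(spectrum)

# Correct frequency axis: dt = 1/fs = 0.01 s
f_correct = np.fft.rfftfreq(n, d=0.01)

# Wrong frequency axis: dt = 10 s, as in the original VI
f_wrong = np.fft.rfftfreq(n, d=10.0)

print(f_correct[peak_bin])   # peak reported at 30.0 Hz
print(f_wrong[peak_bin])     # same peak reported at 0.03 Hz
```

Same data, same peak bin; only the dt attached to the waveform decides where that bin is labeled on the frequency axis. Fix dt and the 0.05 Hz peak becomes 50 Hz.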
Ben