LabVIEW


I want my spectrum analysis VI to run faster

Hi guys,

With the help of kind people in the community I succeeded in building a VI that acquires a frequency spectrum using DAQ. But the speed is not satisfactory (the indicators refresh about every 10 seconds; 1 s or a smaller timespan would be appreciated). I need a high frequency resolution, so the ratio between the sampling frequency (2 kHz) and the number of samples (200k) is set to 0.01 Hz. I'm still learning about signal sampling and currently have no idea what I can do except build a new algorithm.

Best Regards

Message 1 of 17

You are setting your DAQ for 200k samples at an acquisition rate of 2 kSa/s; that takes 100 s to get the data. Your frequency resolution is set by the length in time of the FFT record: 1 s of data gives 1 Hz resolution, 10 s of data gives 0.1 Hz, 100 s of data gives 0.01 Hz. Your frequency bandwidth is 1/2 the acquisition rate, that is, 1 kHz.

 

If you want 0.01 Hz resolution you need 100 s of data; there is no way to make it faster. Not sure what you are asking.
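A quick sanity check of those numbers in Python/NumPy (just the arithmetic, not LabVIEW code):

```python
import numpy as np

fs = 2000          # sample rate, Sa/s
n = 200_000        # samples per record

print(n / fs)                          # 100.0 -> seconds to fill one record
print(np.fft.rfftfreq(n, d=1/fs)[1])   # 0.01  -> spacing between FFT bins, Hz
print(fs / 2)                          # 1000  -> usable bandwidth (Nyquist), Hz
```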

 

mcduff

Message 2 of 17

@ZeSuli wrote:

I'm still learning about signal sampling ...


I'll try to illustrate with some "back of the envelope" examples.  Imagine a single cycle of a Sine Wave.  Let P (for "Period") be the width (in time) of this single cycle.

  • What is the minimum number of points you need to sample in order to say "I have a sinusoid here"?  Well, one point wouldn't be enough, as it couldn't tell the difference between a sinusoid and a DC signal.  So you need two points (and hope that it doesn't "wiggle a lot" between those samples).  If we assume that this sinusoid represents the highest frequency you are interested in sampling, this leads to the requirement that you need to sample at twice this frequency (the Nyquist requirement).
  • Suppose you sample at some sampling Frequency, fs, take N samples (for a total Sampling Time of N/fs) and when you plot all of your samples, you see this Single Sinusoid.  I hope it is obvious to you that this represents the lowest frequency you can detect, a frequency of fs/N.
  • A corollary of the previous point is that this "lowest frequency" is also the Frequency Resolution of your sampling.

So you want to sample at 2 kHz and want a Frequency Resolution of 0.01 Hz, so 2000/N = 0.01, which gives N = 200K (which takes 100 seconds, the reciprocal of your frequency resolution).  You have to sample a long time to get a narrow frequency resolution.
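If you'd rather see that trade-off than take my word for it, here is a small Python/NumPy sketch (illustration only; the two test tones at 1.00 Hz and 1.01 Hz are made up).  With 10 seconds of data the tones land in the same bin; with 100 seconds they separate.

```python
import numpy as np

fs = 2000                                    # Sa/s, same rate as your DAQ task
for seconds in (10, 100):
    t = np.arange(seconds * fs) / fs
    x = np.sin(2*np.pi*1.00*t) + np.sin(2*np.pi*1.01*t)   # two tones 0.01 Hz apart
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1/fs)
    peaks = freqs[spec > 0.5 * spec.max()]   # bins holding most of the energy
    print(seconds, "s of data -> peaks at", peaks, "Hz")
```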

 

On the one hand, there's no getting around this requirement -- you need (sampled) data to analyze (although some have been known to "make up" their data).  I just looked at your code -- you say the indicators refresh every 10 seconds, but if you are reading 200k samples at 2 kHz, it should refresh every 100 seconds (a significant difference!).

 

If you are interested in how the spectrum changes as you collect the data, you can watch it "evolve" by using some tricks.  Suppose you record the first 100 seconds and compute the spectrum.  You now take one more second (so you have 101 seconds of data) and compute the spectrum of the last 100 seconds -- you'll have a "newer" (changed) spectrum.  Every second, take another second's worth of data and analyze the most recent 100 seconds of data.  You can watch the spectrum change (recalling that you are "looking backwards in time" over the last 100 seconds).
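In rough Python pseudo-code (just the bookkeeping; on_new_second is a made-up hook standing in for wherever your fresh second of samples arrives):

```python
import numpy as np
from collections import deque

fs = 2000
window = deque(maxlen=100 * fs)          # always holds the latest 100 s of samples

def on_new_second(new_samples):          # called once per second with fs fresh samples
    window.extend(new_samples)           # the oldest second falls off the other end
    if len(window) < window.maxlen:      # the first spectrum still needs the full 100 s
        return None
    return np.abs(np.fft.rfft(np.asarray(window)))   # 0.01 Hz bins, looking back 100 s
```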

 

Sounds really strange, doesn't it?  How can you "fix" this?

 

What makes it so difficult is the extremely high frequency resolution you are trying to achieve!  Do you really have data to analyze where you need a frequency resolution of 0.01 Hz?  You might be tuning something, and want to set the frequency very precisely, but then you "work slowly", every hundred seconds making a tiny adjustment and taking another 100 seconds of data.  Think about hearing -- we hear in the 100-10k Hz range (mainly), but most of us can't perceive pitch (frequency) precisely (which is why so many sing "off key", not quite on the right note).

 

Bob Schor

Message 3 of 17

In an unrelated thread I recently linked an old post of mine that might prove at least *partially* useful here.  It got several new thumbs up from people, so maybe it's a not-so-widely-known DAQmx feature that deserves some more publicity?

 

Let's target a 10 second data set to give 0.1 Hz resolution.  You can adjust from there as needed.

 

The *first* spectral analysis is gonna take 10 seconds.  There's no way around it if you need 0.1 Hz resolution.  However, there's a way to do your *subsequent* spectral analyses as often as you choose.  Each time you can retrieve the most recent 10 seconds of samples, a kind of sliding window.  The 10-second windows can overlap as much or as little as you choose.  (Of course, at a certain point it would get silly to recompute the spectral content 10 times a second when it's 99% the same input data each time since the sliding windows would have 99% overlap.)
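Here is roughly what that looks like with the Python nidaqmx package -- an untested sketch, with "Dev1/ai0" as a placeholder channel.  I believe the LabVIEW equivalents are the DAQmx Read property node's RelativeTo / Offset settings plus the Overwrite property.

```python
import time
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType, OverwriteMode, ReadRelativeTo

fs = 2000
window = 10 * fs                         # most recent 10 s -> 0.1 Hz bins

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")            # placeholder channel
    task.timing.cfg_samp_clk_timing(fs, sample_mode=AcquisitionType.CONTINUOUS,
                                    samps_per_chan=2 * window)  # buffer headroom
    # Let the circular buffer be overwritten freely, and make every Read return
    # the newest `window` samples regardless of when it is called.
    task.in_stream.over_write = OverwriteMode.OVERWRITE_UNREAD_SAMPLES
    task.in_stream.relative_to = ReadRelativeTo.MOST_RECENT_SAMPLE
    task.in_stream.offset = -window
    task.start()

    time.sleep(window / fs)              # the *first* full window still takes 10 s
    for _ in range(60):                  # then re-analyze as often as you like
        data = np.array(task.read(number_of_samples_per_channel=window))
        spectrum = np.abs(np.fft.rfft(data))    # 0.1 Hz resolution, sliding window
        time.sleep(1.0)                  # ~90% overlap between successive windows
```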

 

A thing I've done before is to then also extract just the latest 1 second or so from that dataset and do a lower-resolution spectral analysis on that smaller chunk.  This has less frequency resolution, but gives you at least a coarse view of things in something closer to real time.  
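That extra step is just one more slice off the same array -- continuing the sketch above:

```python
latest = data[-fs:]                      # most recent 1 s of the 10 s window
coarse = np.abs(np.fft.rfft(latest))     # only 1 Hz resolution, but near real time
```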

 

 

-Kevin P

 

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 4 of 17

Hi Bob Schor,

 

Yes, exactly, I have the same question about why the data refreshes every 10 s instead of 100 s. That confuses me about how the underlying algorithm really works! And the trick you provide is what I'm currently thinking about. As for why I need such a high resolution: I need to build a model based on the power frequency (in other words, the grid frequency; the model is very sensitive to changes in it). And the instrument (we don't have one for now) can even provide a real-time frequency with a resolution of 0.1 mHz. I'm looking more into it, and last but not least, thank you for your patient answer!

 

Best Regards

ZeSuli

Message 5 of 17

RE: data refresh after 10 sec instead of 100 sec.

 

Just a guess because I'm on LV 2016 and can't open your code.  But I know that 10 sec is the standard default timeout value for DAQmx Read functions.  So I'd venture that you haven't overridden the default 10 sec timeout and aren't noticing/reacting to errors.  So after 10 seconds, the Read function times out, asserts an (ignored) error, and returns whatever data it has accumulated up until the timeout.
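If that's the culprit, the fix is just to wire a timeout longer than the acquisition into DAQmx Read (or -1 to wait forever).  The same idea in the Python nidaqmx package, as an untested sketch with "Dev1/ai0" as a placeholder:

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType

fs, n = 2000, 200_000                    # 2 kSa/s, 200k samples -> 100 s

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")           # placeholder channel
    task.timing.cfg_samp_clk_timing(fs, sample_mode=AcquisitionType.FINITE,
                                    samps_per_chan=n)
    # The default read timeout is 10 s; ask for more than the 100 s this takes.
    data = task.read(number_of_samples_per_channel=n, timeout=120.0)
```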

 

 

-Kevin P

Message 6 of 17

Hi Kevin P,

 

Thanks for your reply, now and before. I think this is probably the reason and I will try changing the setting tomorrow. I keep wondering and searching for what algorithm is inside some of these special frequency measurement devices (which can rapidly give you the current value of the frequency with extremely high resolution). Please let me know if you have any ideas.

 

Best Regards

ZeSuli

Message 7 of 17

I keep wondering and searching for what algorithm is inside some of these special frequency measurement devices (which can rapidly give you the current value of the frequency with extremely high resolution).


Depends on your signal. If you have a clean sinusoidal signal, you can count zero-crossings to get the frequency. If you have a clean square-wave signal, you can count the rising or falling transitions to determine the frequency accurately and quickly.
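For a clean signal the counting idea is only a few lines.  A Python/NumPy sketch (the helper name and the 50.03 Hz test tone are made up for illustration):

```python
import numpy as np

def freq_from_crossings(x, fs):
    # Estimate the frequency of a clean sine by timing its rising zero-crossings.
    rising = np.flatnonzero((x[:-1] < 0) & (x[1:] >= 0))   # samples where it crosses up
    periods = np.diff(rising) / fs                         # time between crossings, s
    return 1.0 / periods.mean()                            # average frequency, Hz

fs = 2000
t = np.arange(1 * fs) / fs                                 # only 1 s of data
print(freq_from_crossings(np.sin(2*np.pi*50.03*t), fs))    # ~50 Hz estimate from 1 s
```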

 

mcduff

Message 8 of 17

@mcduff wrote:

I keep wondering and searching for what algorithm is inside some of these special frequency measurement devices (which can rapidly give you the current value of the frequency with extremely high resolution).


Depends on your signal. If you have a clean sinusoidal signal, you can count zero-crossings to get the frequency. If you have a clean square-wave signal, you can count the rising or falling transitions to determine the frequency accurately and quickly.


Assume two "perfect square waves", one of 1 Hz and the other of 1.01 Hz.  Start them off in phase and count the rising transitions.  They will remain the same for 99 seconds (being 1, 2, 3, ... 99), but the last second the faster one will register 2 counts (to go to 101) while the slower one registers only 1 (to go to 100).

 

Bob Schor

Message 9 of 17

@Bob_Schor wrote:

@mcduff wrote:

I keep wondering and searching for what algorithm is inside some of these special frequency measurement devices (which can rapidly give you the current value of the frequency with extremely high resolution).


Depends on your signal. If you have a clean sinusoidal signal, you can count zero-crossings to get the frequency. If you have a clean square-wave signal, you can count the rising or falling transitions to determine the frequency accurately and quickly.


Assume two "perfect square waves", one of 1 Hz and the other of 1.01 Hz.  Start them off in phase and count the rising transitions.  They will remain the same for 99 seconds (being 1, 2, 3, ... 99), but the last second the faster one will register 2 counts (to go to 101) while the slower one registers only 1 (to go to 100).

 

Bob Schor


Not sure I understand: the period of a 1 Hz wave is 1 s, and the period of a 1.01 Hz wave is about 0.99 s. If your timing device is accurate you should see the difference. (I know you like experiments, so I did one in Mathematica and LabVIEW, and I can see a difference after 1 cycle on a time plot.) For an FFT, frequencies are put into bins, so you need 100 s of data to get 0.01 Hz bins; otherwise the two frequencies will be binned together.
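Here's the kind of experiment I mean, sketched in Python/NumPy instead of Mathematica/LabVIEW (the helper name is made up; the zero-crossing times are interpolated between samples):

```python
import numpy as np

def first_period(x, t):
    # Time between the first two rising zero-crossings, interpolated between samples.
    i = np.flatnonzero((x[:-1] < 0) & (x[1:] >= 0))[:2]    # first two rising crossings
    frac = -x[i] / (x[i + 1] - x[i])                       # linear interpolation
    crossings = t[i] + frac * (t[1] - t[0])
    return crossings[1] - crossings[0]

fs = 2000
t = np.arange(3 * fs) / fs                                 # 3 s of samples is plenty
print(first_period(np.sin(2*np.pi*1.00*t), t))   # ~1.000 s
print(first_period(np.sin(2*np.pi*1.01*t), t))   # ~0.990 s -- visibly different
```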

 

mcduff

Message 10 of 17