
LabVIEW gobbling my memory??

Hi folks - I'm trying to do a cross-correlation on 2 large files: I must convert the I16's from the 2 files, scale them all by a range value, divide by a full-scale single-precision float, perform the cross-correlation, and display the result.

For 2 files, each with 8M 2-byte I16's, I get a -20001 error from the cross-correlation function, "not enough memory".

My machine has 2GB of RAM, but just before the function barfed, Windows Task Manager showed LabVIEW's allocated memory at 682MB.
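For scale, here's the back-of-the-envelope arithmetic I'm working from (a quick C sketch just to make the byte counts concrete; I'm assuming ~8 million samples per file and a full 2N-1 point cross-correlation result):

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    const size_t n = 8000000;  /* ~8M samples per file */

    printf("I16 per file : %zu bytes\n", n * sizeof(int16_t));  /* ~16 MB */
    printf("SGL per file : %zu bytes\n", n * sizeof(float));    /* ~32 MB */
    printf("DBL per file : %zu bytes\n", n * sizeof(double));   /* ~64 MB */

    /* a full cross-correlation of two n-point records is 2n-1 points long */
    printf("xcorr result : %zu bytes\n", (2 * n - 1) * sizeof(double));  /* ~128 MB */
    return 0;
}
```

So even a couple of extra buffer copies of the DBL data plus the result already lands in the same few-hundred-MB ballpark that Task Manager is showing.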

My 3 questions are:

1) Why is the function barfing when I still have almost another free gig of memory?
2) Why is LabVIEW sucking up so much memory?
3) Am I doing anything silly to create more copies than I need to do the job?

any insight would be appreciated, thanks, paul
Message 1 of 11
here's the vi...  
Message 2 of 11
Could you also attach the subVI?
 
First of all, you convert twice:
  1. From I16 to SGL at the division.
  2. From SGL to DBL at the cross-correlation subVI (notice the grey coercion dots!).

It would definitely be more efficient to make the "8192000" diagram constant DBL to avoid all these extra conversions and go straight from I16 to DBL.
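In text terms, the point is to do the conversion and the scaling in one pass, straight to DBL, so no intermediate SGL buffer is ever allocated (rough C sketch only; "range" and "full_scale" are stand-ins for the values in your VI, not actual names):

```c
#include <stdint.h>
#include <stdlib.h>

/* Convert I16 -> DBL and apply the scaling in a single pass. */
double *i16_to_dbl_scaled(const int16_t *raw, size_t n,
                          double range, double full_scale)
{
    double *out = malloc(n * sizeof *out);
    if (out == NULL)
        return NULL;                      /* caller handles the failure */
    for (size_t i = 0; i < n; i++)
        out[i] = (double)raw[i] * range / full_scale;
    return out;
}
```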

Starting with two DBL arrays of 8 million points each, your output array has 16 million elements (128MB in memory!), and then you send the entire thing to a graph! Most likely, your graph display is no more than 1000 pixels across, meaning you are trying to display tens of thousands of points for each column of pixels.
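The usual cure is to decimate before graphing, e.g. keep a min/max pair per pixel column so the trace still looks the same on screen. A rough C sketch of the idea (illustrative only, not a LabVIEW primitive):

```c
#include <stddef.h>

/* Reduce n samples to 2*pixels values (min and max per pixel column).
 * Caller guarantees 0 < pixels <= n and that out holds 2*pixels doubles. */
void minmax_decimate(const double *x, size_t n, size_t pixels, double *out)
{
    for (size_t p = 0; p < pixels; p++) {
        size_t start = p * n / pixels;
        size_t stop  = (p + 1) * n / pixels;
        double lo = x[start], hi = x[start];
        for (size_t i = start + 1; i < stop; i++) {
            if (x[i] < lo) lo = x[i];
            if (x[i] > hi) hi = x[i];
        }
        out[2 * p]     = lo;
        out[2 * p + 1] = hi;
    }
}
```

That way the graph only ever receives a few thousand points instead of 16 million.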

Have you done some profiling? If you run out of memory, make all the arrays only half size until things work. Look at the memory usage to determine how many copies of your data are in memory.

Message 3 of 11
here's the data reading sub-vi...

Thanks, I'll use a single conversion.  I was hoping that using SGL floats would cut the memory requirement, but as (I think) you're pointing out, I'm stuck with DBL for the graph.  Wonder if there's a polymorphic-ish way to use only SGLs in the graph?
Message 4 of 11
If you wire SGL to the graph, it'll display SGL (Edited: I originally said DBL - oops).  It adapts to the data type as far as I know.

ALL your values need to be SGL; otherwise LV will up-cast the non-DBL values to DBL and you'll be left with a DBL graph.

Make sure ALL values being sent to the graph are SGL and it'll automatically take on the SGL data type.

Or, to illustrate the point, put a "To SGL" conversion before the graph and see.

Hope this helps

Shane.

Message Edited by shoneill on 04-07-2006 04:20 PM

Using LV 6.1 and 8.2.1 on W2k (SP4) and WXP (SP2)
Message 5 of 11
Your SGL vs DBL problem is NOT with the graph, but with the crosscorrelation.vi; it only works with DBL, so you're stuck. 😉
Leave the SGL out of it completely! 🙂
 
The "Read data...vi" looks fine. Make sure that its front panel is closed when running the main VI. Alternatively, set its priority to subroutine to guarantee that its indicators don't keep extra data copies.

Message Edited by altenbach on 04-07-2006 09:27 AM

Message 6 of 11
Nice catch, Altenbach.

Didn't know that.

Don't have a current LV version.

Man, when can I upgrade?

Shane.
Using LV 6.1 and 8.2.1 on W2k (SP4) and WXP (SP2)
Message 7 of 11
Check out the tutorial Managing Large Data Sets in LabVIEW for some tips and tricks on how to handle your large data set.  In particular, your graph problem is addressed.
Message 8 of 11
Hey thanks, folks, for all your help.  Now I seem to be bumping up against "intrinsic" limits of certain signal processing functions.  The FFT, Cross Spectrum, and Cross Correlation functions seem to have built-in limits of 8M input array sizes.  It looks like I'll have to write the functions I need as DLLs and call them from LabVIEW...
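In case it helps anyone going down the same road, this is the kind of bare-bones C entry point I have in mind for the Call Library Function Node (a sketch only: the name and prototype are mine, the loop is the direct O(N*M) definition just to show the interface, and a usable version for 8M-point records would have to be FFT-based):

```c
#include <stddef.h>

#ifdef _WIN32
#define EXPORT __declspec(dllexport)
#else
#define EXPORT
#endif

/* Direct cross-correlation: out[k] = sum_i x[i] * y[i + k - (nx - 1)],
 * k = 0 .. nx+ny-2, so out must hold nx+ny-1 doubles.
 * Lag-ordering conventions differ between implementations, so I'd check
 * this against CrossCorrelation.vi on a small test case first. */
EXPORT void xcorr_direct(const double *x, size_t nx,
                         const double *y, size_t ny,
                         double *out)
{
    for (size_t k = 0; k < nx + ny - 1; k++) {
        double acc = 0.0;
        ptrdiff_t m = (ptrdiff_t)k - (ptrdiff_t)(nx - 1);   /* lag */
        for (size_t i = 0; i < nx; i++) {
            ptrdiff_t j = (ptrdiff_t)i + m;                 /* index into y */
            if (j >= 0 && (size_t)j < ny)
                acc += x[i] * y[(size_t)j];
        }
        out[k] = acc;
    }
}
```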

thanks again! paul
Message 9 of 11


@PaulOfElora wrote:
The FFT, Cross Spectrum, and Cross Correlation functions seem to have built-in limits of 8M input array sizes.

 

Well, I just did an FFT of a 16M-point DBL array and it worked just fine (of course this won't be a fast operation). Where did you get the information that there is an 8M size limit?

Message 10 of 11