LabVIEW


out of memory due to huge array

Hi,

I built an imaging data analysis program: the program needs to open files of 131MB, 262MB, 524MB and 1.048GB. The problem is that I receive an "Out of Memory" message from LabVIEW (and only afterwards from the Windows OS) for the 0.5GB and 1GB files.

Previously, I had this problem with the 262MB file, but I managed to open it by chunking my data into 2 or 4 sections. For the 524MB file I had to chunk the data into 32 or even 64 pieces and then process it as a 2D array of 32 or 64 lines. Up to here it works fine.

The problem is that after this first reading and processing of the data I need to do quite a bit of manipulation on it, and then I receive the "Out of Memory" message for the 524MB file. More specifically, I have a "slicing" feature in my program that takes this huge 2D array, rearranges it into a 2D array of #pixels x 4096 using the "Reshape Array" function, slices a portion of it using the "Array Subset" function, and repeats the imaging process on the complete array. The "slicing" feature sits in an event structure, waiting for the user to click the "slice" button on the front panel. Now, I manage to read the file and do the first processing, but then I receive the "Out of Memory" message saying it fails at the tunnel going into the event structure.
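Since the VIs are graphical, here is that slicing step as a rough Python/NumPy sketch (the file name, pixel width, and slice bounds are made-up placeholders):

```python
import numpy as np

# The huge array read from the file: 16-bit samples.
data = np.fromfile("scan.dat", dtype=np.int16)

n_pixels = data.size // 4096
image = data.reshape(n_pixels, 4096)   # "Reshape Array": #pixels x 4096
# Note: in LabVIEW, Reshape Array allocates a new buffer (see the reply
# below); NumPy's reshape returns a view, so no copy happens here.

portion = image[1000:2000, :]          # "Array Subset": the slice to re-image
```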

I attach two versions of my read-file subVI (it is a small part of my whole program). Version 2 uses "Build Array" and fails for 262MB and up. Version 3 reshapes the array and works for 262MB. For the 524MB file I had to change it again, remove the "Reshape" from the reading section, and repeat the processing several times, once per chunk.

Any suggestions on how I can proceed with the 524MB and 1GB files?

One more thing - I am not sure the system even goes into my virtual memory before it gives me the "Out of Memory" message - how can I check that?

Thanks in advance,
Eyal
Message 1 of 7
Reading in that much data is always a problem.

The Reshape Array function creates a new data space in memory for the reshaped array. So, after you read in all that data, it gets copied when it gets reshaped.

I've modified your example VI a bit. Instead of building the array in the For Loop and then reshaping it, try initializing the array first at its final size; then, as you read in the data, replace elements in the initialized array with the data.

This should be more efficient since the array does not get copied. Always try to initialize your arrays first. This reserves a block of memory for the array, and it won't need to be resized or copied if you operate on it correctly.
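In text form, the pattern looks roughly like this Python/NumPy sketch (the chunk count, chunk size, and file name are illustrative):

```python
import numpy as np

CHUNK = 4096     # samples per read (illustrative)
N_CHUNKS = 64    # chunks in the file (illustrative)

# "Initialize Array": reserve the full block once, at its final size.
data = np.empty((N_CHUNKS, CHUNK), dtype=np.int16)

with open("scan.dat", "rb") as f:
    for i in range(N_CHUNKS):
        # "Replace Array Subset": write each chunk into the preallocated
        # block in place, so nothing is resized or copied.
        data[i, :] = np.frombuffer(f.read(CHUNK * 2), dtype=np.int16)
```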

I don't know for sure if this will work since I don't have your data file to read. But it might get you going in the right direction.

Ed


Ed Dickens - Certified LabVIEW Architect - DISTek Integration, Inc. - NI Certified Alliance Partner
Using the Abort button to stop your VI is like using a tree to stop your car. It works, but there may be consequences.
Message 2 of 7
If you allocate a single array or data object in LabVIEW, LabVIEW always uses contiguous memory. This means that on the Windows platform, your practical limit for a single array is about 1GB, and it could be considerably less if your memory is full or fragmented. As you mentioned before, you can get around this somewhat by breaking your data into chunks. This allows LabVIEW to place it in separate spots in fragmented memory. A better solution is to read only a portion of your image into memory at once. This may require some interesting algorithms, but, as Ed said, dealing with large data sets is not trivial. If done properly, your solution should scale to any size image you care to look at.
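One way to express the "only a portion in memory at once" idea in text is a memory-mapped file, sketched here in Python/NumPy (a different mechanism than LabVIEW's file I/O, and the file name and slice bounds are placeholders):

```python
import numpy as np

# Map the file instead of reading it; the OS pages in only the regions
# that are actually touched, so even a multi-GB file never fully loads.
data = np.memmap("scan.dat", dtype=np.int16, mode="r")

# Still nothing loaded; trim to a whole number of 4096-sample rows.
image = data[: (data.size // 4096) * 4096].reshape(-1, 4096)

# Pull just the region of interest into real memory and process it.
region = np.array(image[5000:6000, :])
```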

One final note. If you haven't already done it, you may want to check out the web tutorial Managing Large Data Sets in LabVIEW. It has a lot of tips and tricks you can probably use.
Message 3 of 7
Ed, DFGray, and all,

Thanks a lot, both of you. I used Ed's advice and it helped me open the 0.5GB file. Now, let's face the real challenge - the 1.1GB file.... 😉

I couldn't just initialize an array of this size (0.5G elements of 16 bits each), probably due to the lack of an available memory block of this size. I then tried to do it by chunking again: divide my file into 4 or 8 pieces, and run a loop that initializes an array of the required size and reads the data (this new version of my read_file_5.vi is attached; as you can see, I have a loop inside a loop to perform the above). However, I could not read the whole file. At about 3/4 of the way through, I get an "Out of Memory" message from LabVIEW; this time it fails at the tunnel located at the left border of the outer loop. I would highly appreciate any ideas on how to solve this.

In addition: DFGray - you wrote "A better solution is to read only a portion of your image into memory at once. ...". I am not sure I understand what you mean by reading a portion of the image into memory - do you mean to read and process a portion at a time and repeat this for the number of portions I divide the file into? I would be glad if you could add a few more details...

And one last thing - I did read the document you referred me to (the chunking idea came from there). Do you think a "Functional Global Database" could help me here?

Thanks again,
Eyal
Message 4 of 7

In addition: DFGray - you wrote "A better solution is to read only a portion of your image into memory at once. ...". I am not sure I understand what you mean by reading a portion of the image into memory - do you mean to read and process a portion at a time and repeat this for the number of portions I divide the file into?
It sounds like this is what you need to do. The way you are currently reading the file still tries to get the entire file into memory at once.

Since you seem to be able to read in about 256MB with no problems, what you could try is to read in a 256MB chunk, process it, and save the results. Then read the next 256MB chunk, process it, and append its results to the first chunk's results. Keep doing that until you have processed the entire file. Notice that the "Read File" function has a "pos offset(0)" input terminal. This specifies where in the file to start reading. So on the first read this would be 0, on the second read 256MB, then 512MB, and so on.

Done this way, you will never have more than 256MB in memory at one time.
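The same loop in a rough Python sketch (the process function and file name are stand-ins; the explicit seek mirrors the "pos offset(0)" terminal, though a plain sequential read would advance the position anyway):

```python
import numpy as np

CHUNK_BYTES = 256 * 1024 * 1024   # 256MB per pass

def process(chunk):
    # Stand-in for the real analysis; one result per chunk.
    return chunk.mean()

results = []
offset = 0
with open("scan.dat", "rb") as f:
    while True:
        f.seek(offset)                  # plays the role of "pos offset(0)"
        buf = f.read(CHUNK_BYTES)
        if not buf:
            break
        results.append(process(np.frombuffer(buf, dtype=np.int16)))
        offset += CHUNK_BYTES           # 0, 256MB, 512MB, ...
```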

Ed

Message Edited by Ed Dickens on 06-02-2005 09:28 AM



Ed Dickens - Certified LabVIEW Architect - DISTek Integration, Inc. - NI Certified Alliance Partner
Using the Abort button to stop your VI is like using a tree to stop your car. It works, but there may be consequences.
Message 5 of 7

Hi,

I created a library to do the chunking of arrays automatically. You can find it here:

Fragmented Array Library:
http://decibel.ni.com/content/docs/DOC-9321
As the name suggests, it creates fragmented arrays: it presents a single large array to the user, but internally creates several smaller ones and manages them transparently. This should let you concentrate on your program and algorithms rather than the technical aspects of chunking.
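A minimal Python sketch of the concept (an illustration only, not the actual API of the linked library):

```python
import numpy as np

class FragmentedArray:
    """One logical array backed by many small blocks, so no single huge
    contiguous allocation is ever needed."""

    def __init__(self, size, block=2**20, dtype=np.int16):
        self.block = block
        # Many small allocations instead of one large contiguous one.
        self.blocks = [np.zeros(min(block, size - start), dtype=dtype)
                       for start in range(0, size, block)]

    def __getitem__(self, i):
        return self.blocks[i // self.block][i % self.block]

    def __setitem__(self, i, value):
        self.blocks[i // self.block][i % self.block] = value

arr = FragmentedArray(10_000_000)  # ten million int16s in ~2MB pieces
arr[9_999_999] = 42
```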

Regards,

Message 6 of 7

Oops, this thread is 5 years old 🙂 But since someone else might find it, it still makes sense to post here.

Message 7 of 7