
LabVIEW: Memory is full when creating arrays

Solved!

I am trying to analyze about 30 CSV files, snip out the section I want, and overlay the data onto a chart.  The data is read as text and converted to double before being indexed.  Each 1D array will be about 12k elements.  However, at the indexing tunnel on the For Loop, LabVIEW crashes and tells me there is not enough memory.  I did some research and thought maybe I could increase the page size or use virtual memory (not sure I fully understand what I read), but that seemed to do nothing.

 

Does anyone know how to address this issue?  I can't even get through two files, and I need 30.  I've attached the VI and an image of what I did with the memory.  The actual CSV file(s) are sensitive and cannot be posted.  It's just two columns of data.

 

Attachments: No Memory.JPG, memory allocation.JPG

Message 1 of 14

It looks like your CSV files are very large. If so, you will have to read the files line by line to avoid the situation where you read the entire file as a string and then convert it.

 

The process is straightforward until you get to files that are over 2 GB, and then you will have to do some extra work.
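
 

LabVIEW itself is graphical, so there is no text code to paste, but here is the line-by-line idea as a rough Python sketch (the file name is hypothetical, and I am assuming plain numeric, comma-separated rows):

# Read one line at a time so the whole file is never held in memory.
def read_column(path, col=1):
    values = []
    with open(path, "r") as f:
        for line in f:                    # one small line buffer at a time
            fields = line.strip().split(",")
            try:
                values.append(float(fields[col]))   # convert as you go
            except (ValueError, IndexError):
                continue                  # skip headers or malformed lines
    return values

data = read_column("run01.csv")           # "run01.csv" is a made-up name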

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation, LabVIEW Champion, Knight of NI and Prepper
Message 2 of 14

Hi, could you post it saved for LabVIEW 2014 SP1, please?

Thanks

CLA, CTA
Message 3 of 14

Here it is saved for 2011 so as to cover everyone.  The Read Delimited Spreadsheet VI may be broken.

 

Message 4 of 14

@Ben wrote:

It looks like your CSV files are very large. If so, you will have to read the files line by line to avoid the situation where you read the entire file as a string and then convert it.

 

The process is straightforward until you get to files that are over 2 GB, and then you will have to do some extra work.

 

Ben


Do you think this is where the memory is getting blown away?  Is there a way to release a file from memory after I've processed it, or does it stick around until the end?  Surely there's a way to let LabVIEW access more memory.

Message 5 of 14

A couple of things to try in order to reduce the amount of work per iteration (and therefore the memory usage); a rough sketch of all three follows the list:

1. Since you only care about one column, index that column out first (right after the Read From Delimited File).

2. Instead of multiple Delete From Array calls, just use a single Array Subset.

3. Combine your two For Loops.  There is no point in having them separate.
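
As a loose Python analogue of those three points (the file list, row range, and helper name here are all hypothetical):

import csv

def process_file(path, start, length, col=1):
    # 1: pull the single column of interest out first
    with open(path) as f:
        column = [float(row[col]) for row in csv.reader(f) if row]
    # 2: one Array Subset-style slice instead of repeated deletes
    return column[start:start + length]

# 3: a single loop over the files does the reading, trimming, and collecting
csv_paths = ["run01.csv", "run02.csv"]    # stand-in for the ~30 real files
curves = [process_file(p, start=0, length=12000) for p in csv_paths]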


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 6 of 14

How large are your CSV files?

 

Convince yourself.

 

Use Tools >>> Profile >>> Show Buffer Allocations to see where buffers are allocated.

 

and

 

Tools >>> Profile >>> Profile Buffer Allocations to see how much is in each buffer.

 

LabVIEW will generally allocate memory for a buffer once, unless it needs more space, in which case it will expand the buffer. Also keep in mind that buffers must be in contiguous memory, so if memory is fragmented you may not find a single block large enough. That is why I suggested the "line by line" approach: that way we use a buffer only large enough for a single line instead of one for the entire file.
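
 

There is no direct text-code equivalent of Show Buffer Allocations, but as a loose illustration of the "convince yourself" step, Python's tracemalloc reports the same kind of information (the file name is hypothetical):

import tracemalloc

tracemalloc.start()

# The whole-file-as-string approach: one giant buffer, then more for the split.
whole = open("run01.csv").read()
rows = [line.split(",") for line in whole.splitlines()]

snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:5]:
    print(stat)                            # top allocation sites and their sizes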

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation, LabVIEW Champion, Knight of NI and Prepper
Message 7 of 14

Can you also please attach a typical data file? (Zip it up if large; these things typically compress well.)

 

I would go a step further than Tim: just read it with "transpose=true" and get the "first row" output (which will give you the first column because of the transposition). In a next step you could use lower-level file I/O to read and process it in chunks for a smaller memory footprint.
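
Here is what chunked reading of the first column could look like, sketched in Python since LabVIEW is graphical (the chunk size and file name are assumptions, and the rows are assumed to be purely numeric):

# Read the file in fixed-size chunks instead of all at once.
def first_column_chunked(path, chunk_size=64 * 1024):
    values, leftover = [], ""
    with open(path) as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            lines = (leftover + chunk).split("\n")
            leftover = lines.pop()         # the last piece may be a partial line
            values.extend(float(line.split(",")[0]) for line in lines if line)
    if leftover:
        values.append(float(leftover.split(",")[0]))
    return values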

 

Converting a string to a floating-point numeric is a comparatively expensive operation. For large files, you should do things using binary files.
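
For example, convert each CSV to a flat binary file of doubles once, and every later read then skips the string parsing entirely (a Python sketch with made-up file names; assumes purely numeric rows):

from array import array

def csv_to_bin(csv_path, bin_path, col=1):
    a = array("d")                         # 64-bit doubles, like a LabVIEW DBL array
    with open(csv_path) as f:
        for line in f:
            a.append(float(line.split(",")[col]))
    with open(bin_path, "wb") as out:
        a.tofile(out)

def read_bin(bin_path):
    a = array("d")
    with open(bin_path, "rb") as f:
        a.frombytes(f.read())              # raw doubles straight in, no parsing
    return a

csv_to_bin("run01.csv", "run01.bin")       # slow, done once
data = read_bin("run01.bin")               # fast, every time after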

 

 

Message 8 of 14

So I have used some of the suggestions (except for the line-by-line reading; I haven't tried that yet) and have also added an area-under-the-curve calculation.  I am crashing every time I run this, even with one iteration.  LabVIEW is sending error reports, but maybe y'all can help.  I will only upload as 2015 because of the crash.

 

I tried looking at the buffer allocation tool and adjusted things, but it still crashed.  It was crashing before I did this as well.  Any thoughts?

Message 9 of 14

Also attached is a file of the data.

Message 10 of 14