08-24-2017 11:34 AM
I am trying to analyze about 30 CSV files, snip out the section I want, and overlay the data onto a chart. The data is read as text but converted to double before being indexed. The 1D array ends up being about 12k elements. However, at the indexer on the for loop, it crashes and tells me there is not enough memory. I did some research and thought maybe I could increase the page size or use virtual memory (I'm not sure I fully understand what I read), but that seemed to do nothing.
Anyone know how to address this issue? I can't even get through two files, and I need 30. I've attached the VI and an image of what I did with the memory settings. The actual CSV files are sensitive and cannot be posted; they're just two columns of data.
Solved! Go to Solution.
08-24-2017 11:56 AM
Looks like your CSV files are very large. If so, you will have to read the files line by line to avoid reading the entire file as a string and then converting.
The process is straightforward until you get to files that are over 2 GB, and then you will have to do some extra work.
Ben
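In case a text version of the idea helps: here is a minimal sketch (Python, purely as an analogy for the G code; the path and column index are made up, not taken from the posted VI). The point is that only one row of text is ever held in memory, instead of the whole file as a single string.

```python
# Rough text-code sketch of "read line by line, convert as you go".
# The column index and path are placeholders, not from the posted VI.
def read_column_line_by_line(path, col_index=1):
    values = []
    with open(path, "r") as f:
        for line in f:
            fields = line.rstrip("\n").split(",")
            try:
                values.append(float(fields[col_index]))
            except (ValueError, IndexError):
                continue  # skip the header or any malformed row
    return values
```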
08-24-2017 11:57 AM
Hi, could you post it on Labview 2014 SP1 version please
Thanks
08-24-2017 12:01 PM
Here it is in 2011 so as to cover everyone. The Read Delimited Spreadsheet VI may be broken.
08-24-2017 12:06 PM
@Ben wrote:
Looks like your CSV files are very large. If so, you will have to read the files line by line to avoid reading the entire file as a string and then converting.
The process is straightforward until you get to files that are over 2 GB, and then you will have to do some extra work.
Ben
You think this is where the memory is getting blown away? Is there a way to release it from memory after I've processed it? Or does it stick around in memory until the end? Surely there's a way to allow LabVIEW to access more memory.
08-24-2017 12:15 PM - edited 08-24-2017 12:18 PM
A few things to try in order to reduce the amount of work per iteration (and therefore the memory usage); there's a rough text sketch of points 2 and 3 after the list:
1. Since you only care about 1 column, index that column out first (right after the Read From Delimited File).
2. Instead of multiple Delete From Arrays, just use an Array Subset.
3. Combine your two FOR loops. There is no point in having them separate.
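As a hedged text analogy only (Python standing in for the block diagram; the start index, length, and per-element math are invented placeholders), points 2 and 3 amount to something like this:

```python
# Hypothetical sketch of points 2 and 3 above. The indices and per-element
# math are placeholders, not taken from the actual VI.
def trim_and_process(column, start=100, length=2000):
    # 2. One Array Subset (a slice) instead of several Delete From Array
    #    calls, so the data is copied once rather than once per delete.
    section = column[start:start + length]
    # 3. One combined loop instead of two separate FOR loops: do the
    #    per-element work and the accumulation in the same pass.
    processed = []
    running_sum = 0.0
    for x in section:
        y = x  # placeholder for whatever the first loop computes per element
        processed.append(y)
        running_sum += y
    return processed, running_sum
```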
08-24-2017 12:20 PM
How large are your CSV Files?
Convince yourself:
Use Tools >>> Profile >>> Show Buffer allocations to see where buffers are allocated, and
Tools >>> Profile >>> Profile Buffer allocations to see how much is in each buffer.
LV will generally allocate memory for a buffer once, unless it needs more space, in which case it will expand the buffer. Also keep in mind that buffers must be in contiguous memory, so if memory is fragmented you may not find a single block large enough. That is why I suggested the "line by line" approach: that way we use a buffer only large enough for a single line instead of the entire file.
Ben
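If it helps to put rough numbers on it, here is a hypothetical helper (Python, just for the arithmetic; the folder pattern is a placeholder, not part of the VI). The whole-file string needs roughly the on-disk size in one contiguous block, and every value parsed to double costs 8 bytes, so a 12k-element 1D double array is only about 96 kB; the expensive buffers are the whole-file string and the full 2D array, not the trimmed section you keep.

```python
# Hypothetical size check: how big is each CSV on disk?
import glob
import os

for path in glob.glob("*.csv"):  # folder/pattern are placeholders
    size_mb = os.path.getsize(path) / 1e6
    print(f"{path}: {size_mb:.1f} MB on disk")
```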
08-24-2017 01:12 PM
Can you also please attach a typical data file? (Zip it up if large; these things typically compress well.)
I would go a step further than Tim: just read it with "transpose=true" and get the "first row" output (which will give you the first column because of the transposition). As a next step you could use lower-level file I/O to read and process it in chunks for a smaller memory footprint.
Converting a string to a floating point numeric is a comparatively expensive operation. For large files, you should do things using binary files.
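For what the chunked approach looks like in text code, here is a minimal sketch (Python, assuming a plain two-column CSV; the chunk size and column index are placeholders and this is not the actual lower-level file VIs). The memory footprint stays at roughly one chunk plus the accumulated numeric column, instead of the whole file as one string.

```python
# Hypothetical chunked read: process the file one chunk at a time.
def read_column_in_chunks(path, col=1, chunk_bytes=1 << 20):
    values = []
    leftover = ""
    with open(path, "r") as f:
        while True:
            chunk = f.read(chunk_bytes)
            if not chunk:
                break
            lines = (leftover + chunk).split("\n")
            leftover = lines.pop()  # the last piece may be a partial line
            for line in lines:
                fields = line.split(",")
                try:
                    values.append(float(fields[col]))
                except (ValueError, IndexError):
                    continue  # header or malformed line
    if leftover:  # handle a final line with no trailing newline
        fields = leftover.split(",")
        try:
            values.append(float(fields[col]))
        except (ValueError, IndexError):
            pass
    return values
```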
08-24-2017 02:36 PM
So I have used some of the suggestions (except for the line-by-line reading, which I haven't tried yet) and have also added an area-under-the-curve calculation. It crashes each and every time I run it, even with 1 iteration. LabVIEW is sending error reports, but maybe y'all can help. I will only upload as 2015 because of the crash.
I tried looking at the buffer allocation tool and made adjustments, but it still crashed. It was crashing before I did this as well. Any thoughts?
08-24-2017 02:37 PM
Also, attached is a file of the data.