11-03-2010 12:19 PM
Ok, I see what you mean now - if I save all the data in a binary file I can load a pixel at a time and analyse it.
My analysis involves looking at the intensity distribution for each coordinate in the stacked array. For small datasets I can flick between the histograms for each pixel and analyse the distribution. I wanted to increase my sample size, hence the 5000 CSV files, and see the effect on the histogram.
This way might be slower but it should work. Cheers.
11-04-2010 08:10 AM
Could you zip the files you are trying to load and the code you are using to load them and post them? If this is a bug, I would like it reported so we can fix it.
Thanks!
11-04-2010 08:28 AM
I've attached the program and a folder containing 2 CSV files to keep the attachment size down. You can copy them to get 5000 files for testing. When the program is run and I select all 5000 files, only 2117 will load.
Let me know how you get on.
11-04-2010 09:39 AM - edited 11-04-2010 09:40 AM
@DFGray wrote:
Could you zip the files you are trying to load and the code you are using to load them and post them? If this is a bug, I would like it reported so we can fix it.
Thanks!
LabVIEW is a memory guzzler, but OK, RAM does not cost much today. Take a look at this VI. I did a test on my computer using the SGL data type: it used 185 MB, which is about 2x the data size (4 * 68 * 68 * 5000 bytes). My computer has 2 GB of memory, so that should be enough, but when I used the DBL data type I got an out-of-memory error. Debugging is turned off. Using XP SP3, 32-bit.
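Just to make that arithmetic explicit, here is the same estimate spelled out as a small Python sketch (the 68 x 68 x 5000 numbers come from this thread; nothing here is taken from the attached VI):

# Rough memory estimate for the stacked intensity array.
files = 5000          # number of CSV files
rows, cols = 68, 68   # pixels per frame

sgl_bytes = 4 * rows * cols * files   # SGL is 4 bytes per element
dbl_bytes = 8 * rows * cols * files   # DBL is 8 bytes per element

print("SGL: %.0f MB (about 185 MB with one extra copy)" % (sgl_bytes / 1e6))
print("DBL: %.0f MB (about 370 MB with one extra copy)" % (dbl_bytes / 1e6))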
11-04-2010 10:30 AM
I made 5000 copies of your data file and ran a modified version of the VI on the 5001 files. It works with DBL or SGL representation. With DBL the total RAM usage is almost 600 MB for LV (according to the OS) and 425.84 MB for the VI (according to the Profiler). It takes a minute or two to run, but everything works. This is on Mac OS X 10.5.8 with two quad-core Xeon processors and 4 GB of RAM, running LV 2010.
The main modification I made was to preallocate memory for the array and use Replace Array Subset inside the loop. See the image for details.
Lynn
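For anyone reading along without LabVIEW open: the preallocate-and-replace pattern described above is the same idea as this rough Python/NumPy sketch (the frame size and file names are placeholders, not taken from the attached VI):

import numpy as np

n_files, rows, cols = 5000, 68, 68   # placeholder sizes from the thread

# Initialize Array: allocate the full block once, up front.
stack = np.empty((n_files, rows, cols), dtype=np.float64)

for i in range(n_files):
    # Hypothetical file names; substitute the paths returned by your file dialog.
    frame = np.loadtxt("frame_%04d.csv" % i, delimiter=",")
    stack[i] = frame   # Replace Array Subset: write into the preallocated block

Growing the array inside the loop instead forces repeated reallocations and copies, which is where most of the extra memory and time goes.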
11-04-2010 10:59 AM
I made 2500 copies of each file and the code read them with no problems. I modified the array to output doubles, and got the same result (LabVIEW was using about a gigabyte of memory when done).
I was using LabVIEW 2009 SP1 32-bit running on Windows 7 64-bit, so this may cloud my results (the 32-bit memory space is a bit more open there). However, individual array sizes up to about 800 MB are possible using 32-bit Windows (I have done this), so something else must be going on.
What are the names of your files? Do you have a path length or naming issue?
Note that the header information in the files will lead to a few "corrupt" lines in your data arrays.
Using double the amount of calculated memory is expected, since you have a data set in the wire and another in the control.
If you have not read it yet, you may want to check out Managing Large Data Sets in LabVIEW.
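If it helps to see the header issue outside LabVIEW, a minimal Python sketch of the same precaution (the two-line header count is an assumption; check your own files):

import numpy as np

N_HEADER_LINES = 2   # assumption: set this to the actual header length in your CSVs

def load_frame(path):
    # Skip the header text so it does not become "corrupt" rows in the data array.
    return np.loadtxt(path, delimiter=",", skiprows=N_HEADER_LINES)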
11-04-2010 11:25 AM
@DFGray wrote:
Using double the amount of calculated memory is expected, since you have a data set in the wire and another in the control.
Of course, that is an inherent part of LabVIEW programming. But one more thing to mention: when working with arrays, LabVIEW requires a contiguous memory allocation for the data. If the Windows memory manager cannot deliver this, you will get an out-of-memory error, like I did. Windows will only report the total amount of free memory; that memory may be fragmented like a hard disk.
11-04-2010 01:01 PM - edited 11-04-2010 01:03 PM
I guess your problem is a RAM problem. Get some more RAM in your computer and you will be fine; RAM is cheap, sitting on your thumbs is expensive. However, the format of your data files has some minor flaws that cause artifacts in the output. The header is one thing, but your files also have an extra "end of line" at the end of the file, causing extra data to be added. I have made a VI that should take care of your problems. It works with 5000 files on my computer with the SGL data type, but not with DBL; that fails because Windows cannot allocate contiguous RAM for the indicator showing all the data. It is also quite fast: 42 seconds on my computer.
My VI is not the definitive way to do this, but it is better than your current setup. I made it in LV2010, but it is saved in 8.6. It should be correct, but test it to be sure.
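For reference, here is the extra end-of-line problem in text form: a Python sketch of the kind of cleanup that avoids the spurious data (assumptions: comma-delimited values, no quoting; this is not the attached VI):

def parse_frame_text(text):
    # Drop blank lines (including the one left by the trailing end-of-line),
    # otherwise they turn into a spurious extra row in the array.
    lines = [ln for ln in text.splitlines() if ln.strip()]
    return [[float(v) for v in ln.split(",")] for ln in lines]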
11-04-2010 01:32 PM
Sigh, I need more edit time! Anyway: as mentioned before, remember that every indicator/control needs its own copy of the data, so you can increase speed and reduce memory usage by removing any indicator that is not strictly needed. In everyday LabVIEW this may not be of great importance, but for memory-intensive applications like yours it can make a difference.
11-05-2010 07:10 AM
Thank you all for your suggestions and for having a look at the program. The modified version is allowing me to load more profiles (XP SP3 32-bit, 3 GB RAM). I will also look into the memory management a bit more.