11-09-2011 03:52 PM - edited 11-09-2011 03:54 PM
Hello,
I would like to know if there is any way to reduce the allocated memory of a small array (only a few elements) whose elements consist of a complex data structure.
The type definition my program works with is attached.
I am dealing with very limited RAM (768 MB) and can't add any more (the industrial PC is sealed with a sticker), and my program takes about 400 MB of RAM with approx. 6 elements in each array of the data structure below.
The variant data usually consists of several variables and some references.
Any help appreciated.
Karel
typ.ctl was saved in LV2010 SP1
11-09-2011 04:17 PM
You will need to share more code in order for anyone to be able to help you. I also recommend a search on this forum for "in-place" and "inplaceness" for hints on minimizing the number of times that data is copied, which will reduce memory use.
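(As a rough, non-LabVIEW illustration of what "in-placeness" buys you, here is a minimal Python/NumPy sketch contrasting an update that copies a large buffer with one that reuses it in place; the array size is arbitrary.)

import numpy as np

data = np.zeros(10_000_000)   # ~80 MB of doubles

# A "copying" update: the result requires a second ~80 MB buffer.
copied = data + 1.0

# An in-place update: the existing buffer is reused, no extra allocation.
data += 1.0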
11-10-2011 09:42 AM
Hi Bublina,
You can find some tips about optimizing the memory usage of your code in this KB article.
If you are looking for more specific help, please provide some further information about your application:
What is your host system? (the operating system running the VI, e.g. Windows 7, XP, maybe Real-Time?)
What is your LabVIEW version? (it is always good to know that 🙂 )
Maybe provide some of the code that uses your complex data structure? (modify, pass, copy, cross loop boundary)
Best Regards,
St.
11-10-2011 09:44 AM
Sorry, I can't post all the code this data structure runs through. The app is very big and has a lot of subVIs.
The data is passed through a shift register in the main loop and enters all subVIs through type-defined terminals.
Once this array is loaded from the HDD, the app doesn't consume any further significant amount of memory.
The memory consumption becomes visible when this array is stored in a binary file on disk. I will post a sample VI when I get to the office.
11-10-2011 10:36 AM
A complex data structure does not use more memory than the same amount of information arranged differently. The problem is most likely how you use it.
Your data structure is an array where each element contains other data structures of potentially variable size. Does the size of the main array change or is it constant? Do the sizes of the "inner arrays" change often? What is the typical number of elements?
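(Not LabVIEW, but the underlying issue is the same in any language; a minimal Python/NumPy sketch of why frequently resizing arrays is expensive compared to filling a preallocated one. The sizes are made up.)

import numpy as np

# Growing an array element by element reallocates and copies it on every iteration.
grown = np.empty(0)
for i in range(1000):
    grown = np.append(grown, i)   # builds a new, larger array each call

# Preallocating once and writing in place avoids all of those copies.
prealloc = np.empty(1000)
for i in range(1000):
    prealloc[i] = i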
11-11-2011 04:07 AM - edited 11-11-2011 04:08 AM
Thank you for all your suggestions.
I made a simple example. I started my program, made it load the data structure from disk, and then saved it back to disk, keeping the saving VIs open.
Then I copied the control that passes the data, opened a new VI, pasted the control, changed it to a constant, and ran the saving VIs again to save the constant.
As a result, I got two different sets of files. One set was very big; these were saved by my application.
The other set of files is small; it was produced by the same code, but using the copy-and-paste, control-to-constant approach I described.
If you run the VI called compare, you can see that both files contain the same data, loaded with the same code.
The example can be downloaded from a storage webservice located here.
http://www.uschovna.cz/zasilka/D4LT3YM6K59IN8NS-7R6
Click "example.rar" and the file download popup should appear.
If you have any trouble downloading it, PM me and I will re-upload it.
I believe that using some array functions or running this array through autoindexed loops somehow forced LabVIEW to allocate more and more memory, creating a memory gap that is not used.
Sadly, the code this data runs through (I only pass it through terminals; no queues, no globals or locals) is very big, and I need some hints as to what could be causing it.
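(For what it's worth, the "memory gap" idea is plausible in principle: containers that are grown incrementally often keep spare capacity beyond the data they actually hold. A small Python sketch of that general effect, not of LabVIEW's allocator; the exact numbers depend on the interpreter version.)

import sys

# Built in one go, the list is sized close to its contents.
exact = list(range(1000))

# Grown by repeated appends, the same data carries extra spare capacity.
grown = []
for i in range(1000):
    grown.append(i)

print(sys.getsizeof(exact), sys.getsizeof(grown))   # the grown list reports a larger size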
11-11-2011 04:18 AM
Update,
the posted code contains an unsaved VI (compare.vi)
I reuploaded the proper package again.
Please download here:
11-11-2011 05:20 AM - edited 11-11-2011 05:21 AM
Hi bublina,
so you are trying to load files with a combined size of ~180 MB into the RAM of a PC equipped with just 768 MB of RAM, and you wonder about RAM usage?
- You wonder about RAM usage when loading huge files into IMAQ data structures? IMAQ handles vision data - and that data may internally use a different representation than is used in the files, eating even more RAM...
- What result do you expect when comparing an array of 4 huge files with an array of 4 very small files using Compare Aggregates mode? It will be FALSE. Your VI is just a memory-eating Rube Goldberg construct...
- Do you really expect us to download a 69 MB file from your server just to look at your VI? Wouldn't it be sufficient to upload just 2 picture files, one "tiny" and one "not so huge"? (Btw, using 7-Zip reduces the file size to "just" 47 MB.)
11-11-2011 06:10 AM
Hi GerdW,
I understand the concept of RAM consumption: if you load a file from the HDD into RAM, it will consume a similar amount of memory in both.
I just wanted to be clear about the problem; that's why I uploaded those big files.
I attached the result you would get if you ran compare.vi.
In a nutshell, as stated before, if I pass the data (the tiny set) through some function, terminal, subVI or anything else it flows through, it becomes huge in memory.
Still, the data is the same... the purpose of the example is to show this.
I am trying to find the code where this happens.
I don't think that IMAQ picture data handling is related to this problem. The structure only contains the reference to the memory where the data is allocated.
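(To illustrate why flattened size and in-memory size can diverge when references are involved, here is a hypothetical Python sketch: the "handle" kept in the structure serializes to almost nothing, while the buffer it refers to still occupies its full size in RAM. The class name and sizes are made up for illustration.)

import pickle
import numpy as np

class ImageRef:
    """Hypothetical stand-in for an image reference: a small handle to a big buffer."""
    def __init__(self, name, pixels):
        self.name = name        # the part the cluster actually carries around
        self.pixels = pixels    # the actual image data lives elsewhere

image = ImageRef("cam0", np.zeros((2048, 2048), dtype=np.uint16))   # ~8 MB buffer

print(len(pickle.dumps(image.name)))   # flattening just the handle: a few dozen bytes
print(image.pixels.nbytes)             # the referenced pixel data: 8388608 bytes in RAM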
11-11-2011 06:27 AM
Hi altenbach,
the size of all arrays varies over time, as users can delete and add elements (in all of the arrays).
The typical number of elements is n < 10.