11-15-2013 05:12 AM
Hi,
In order to improve some code, I'm trying to understand how LabVIEW allocates memory when handling a large data set and trying to display it.
If you have a look at the attached code, the "Sine Waveform" function generates a sine tone with 6E6 samples with DBL type, so 64 bits per sample. In theory, this data set should need 48 MB of RAM.
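The expected footprint is easy to double-check with a quick back-of-the-envelope calculation (a sketch in Python; the 6E6 sample count and 64-bit DBL type are taken from the description above — note the small difference between decimal MB and binary MiB, depending on which unit a given tool reports):

```python
# Each DBL (double-precision float) sample occupies 8 bytes (64 bits).
samples = 6_000_000
bytes_per_sample = 8

total_bytes = samples * bytes_per_sample
print(total_bytes / 1e6, "MB")     # decimal megabytes -> 48.0
print(total_bytes / 2**20, "MiB")  # binary mebibytes  -> ~45.8
```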
If I delete the "signal out" indicator and check the memory used with the Profile Performance and Memory tool, I get the 48 MB value that I expected.
However, with the "signal out" indicator, the Profile Performance and Memory tool returns 96 MB. This value seems right, as the indicator needs an additional copy of the data set. What is unclear is that the Windows Task Manager reports around 250 MB for LabVIEW.exe. Before the execution of the VI, the LabVIEW.exe process used 110 MB, so running the VI requires around 140 MB, which is roughly three times the initial data set. I don't understand where this third copy comes from.
Why does LabVIEW need an additional copy of the data set, and why is it not shown in the Profile Performance and Memory tool?
Thanks,
Alex
11-15-2013 01:14 PM
I've found the Windows Task Manager to be a much better indicator of memory usage than the profiler. I do not know how the profiler works, but I have seen instances of memory leaks in my program that were obvious in the Task Manager but not in the profiler. In one of my programs I allocate a large dataset with a DVR and then reuse it; in the VI profiler I see a memory allocation, but over time, as that program runs, the memory usage reported by the profiler goes to zero!
So, here is my best guess: the VI profiler only shows some of the memory allocations, and it appears to under-report some of them. In your case, I believe the indicator actually makes two copies of the data, not one, which is why the Task Manager reports an extra memory set. (The extra copy may have to do with data history.)
Sorry this does not help, but what you are seeing is not unusual.
Cheers,
mcduff
11-18-2013 06:37 PM - edited 11-18-2013 06:39 PM
There is allocated memory and then there is allocated memory! LabVIEW does not immediately return every byte of memory to Windows when it is no longer used. It keeps its own books about which memory is still in use and which is not, and the Windows memory manager does a similar thing. The reason is that requesting a memory block from the global memory pool costs more time and resources than maintaining an intermediate list of almost-deallocated blocks of memory.

The VI profiler reports the memory that is really in use at any point according to the LabVIEW memory manager. The Windows Task Manager reports the memory that an application has requested from Windows, regardless of whether the application is really using that memory internally. Even Windows itself has different ideas about what memory is assigned to an application, depending on which level of Windows you check, since its own memory manager uses a similar optimizing allocation strategy.
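To illustrate the strategy (a toy sketch only — this is not how LabVIEW's memory manager actually works internally), here is an allocator that caches freed blocks on a free list instead of returning them, so the "OS-level" figure stays high even after the application has freed the memory:

```python
class CachingAllocator:
    """Toy allocator: freed blocks are cached on a per-size free list
    instead of being returned to the 'OS' immediately, so the memory
    the OS sees as allocated can exceed what is actually in use."""

    def __init__(self):
        self.free_lists = {}     # block size -> list of cached blocks
        self.os_allocated = 0    # what a Task-Manager-like view reports
        self.in_use = 0          # what the application really uses

    def alloc(self, size):
        cached = self.free_lists.get(size)
        if cached:
            block = cached.pop()     # reuse a cached block: cheap
        else:
            block = bytearray(size)  # request from the "OS": expensive
            self.os_allocated += size
        self.in_use += size
        return block

    def free(self, block):
        self.in_use -= len(block)
        # Keep the block for later reuse instead of giving it back.
        self.free_lists.setdefault(len(block), []).append(block)

a = CachingAllocator()
b1 = a.alloc(1000)
a.free(b1)
b2 = a.alloc(1000)      # served from the free list, no new OS request
print(a.os_allocated)   # stays at 1000: the OS-level figure never drops
print(a.in_use)         # 1000: what is actually in use right now
```

The point of the sketch is the gap between the two counters: after the free/alloc cycle, `os_allocated` never dropped, which mirrors why the Task Manager figure can sit well above what the profiler reports.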
If this caching were removed completely, any heavily memory-dynamic application like LabVIEW would have terrible performance.