LabVIEW


Memory Usage and RAM

Hello,
 
I would like to know why, when you open LabVIEW, the memory usage shown in Windows Task Manager is about 30 MB, but when you open a VI and then close it again, the usage stays at the new, higher value.  In my case it goes up to about 150 MB; this is due to the size and nature of the VI I open.
 
Also, if Task Manager shows memory usage slowly but continuously increasing while your LabVIEW application is running, is that a sign of some inefficiency in your programming, such as a reference left open and never closed?
 
Kev
Message 1 of 5
What is the "size and nature" of your VI?
 
Unbounded memory growth during a run can have many causes; a typical one is building arrays into uninitialized shift registers.
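LabVIEW code is graphical, so as a loose analogy only, here is what that failure mode looks like in Python: state that persists across calls and grows on every iteration, versus a bounded buffer. (The names below are illustrative, not from any NI API.)

```python
from collections import deque

# Unbounded: module-level state persists across calls, much like an
# uninitialized shift register accumulating an ever-growing array.
history = []

def log_sample(sample):
    history.append(sample)      # memory climbs on every call, forever
    return len(history)

# Bounded alternative: keep only the most recent N samples.
recent = deque(maxlen=1000)

def log_sample_bounded(sample):
    recent.append(sample)       # oldest samples are discarded automatically
    return len(recent)
```

The bounded version caps memory at a known size, which is usually what you want for a monitoring loop that runs indefinitely.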
 
If you are worried about keeping references open, why don't you close them explicitly and see if it makes a difference? 😮
 
Can you explain in a bit more detail what your VI is doing?
 
How does the memory usage look on the "VI Properties...Memory Usage" page?
 
Have you done any profiling of your VIs and subVIs?
 
What version of LabVIEW are you using?
Message 2 of 5

The application I have is this:

One main VI handles the user interface and test procedure, comprising 20 tests.  The functions run by each test are not particularly intensive.  Six or seven while loops run in parallel: the least important every 200 ms, the most important every 10 ms.  The memory properties of this VI are: Front Panel objects 881K, Block Diagram objects 12962K, Code 4699K, Data 62870K, Total 81413K.  In this VI I have tried to make as many units into subVIs as possible; I use only 1 or 2 arrays, but many local variables.

The second VI operates in the background, handling communications with an ECU by reading CAN messages from a buffer.  The most intensive part is a while loop running at 5 ms, which reads the ECU data, converts it with a subVI, and writes it to a global array that the main VI reads.

 

THE PROBLEM: after running for a long time, the 5 ms loop timing can no longer be maintained, and the buffer overflows because I cannot read quickly enough.  I read only 1 sample per loop iteration for timing purposes, since I have no timestamping of the data and use the loop timing to generate a timebase.

 

1) I don't think the subVI call in the 5 ms loop is set to subroutine priority, so that's worth trying.  I guess this is a must whenever the function needs no front panel interaction.

2) At the start of each of the 20 tests, I initialise an array (reused across all the tests) to a large size, e.g. 18000x30, which is used to store running test data.  Previously I initialised it to an empty array, so saving data constantly copied and resized the array, which I thought was the cause of the PROBLEM.

3) I know the memory stats for the main VI are much too large, but I have noticed that if I delete all the array initialisations, the Data figure drops to 2260K.  Is it the initialisation that causes the memory increase, or is it the use of local variables, each of which makes a separate copy of the same array?

4) Regarding open references: sometimes I have left them open, e.g. queue references or VI references, but I am now making sure to close them.  How much of a problem is it to leave a reference open, and what happens if I then open the same reference again?
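On point 4, a rough text-language analogy may help: a LabVIEW refnum left open behaves much like a file handle that is opened but never closed, lingering until cleanup.  Here it is sketched in Python, with file handles standing in for queue/VI references (the function names are made up for illustration):

```python
import tempfile

_path = tempfile.mkstemp()[1]   # a scratch file standing in for a queue/VI

def leaky_read():
    # Like obtaining a reference on every loop iteration and never
    # closing it: each call allocates a fresh handle that lingers.
    f = open(_path)
    return f.read()             # the handle is never explicitly closed

def tidy_read():
    # Close the reference as soon as you are done with it
    # (the equivalent of wiring Close Reference after use).
    with open(_path) as f:
        return f.read()
```

The usual LabVIEW pattern is the tidy one: obtain the reference once before the loop, use it inside, and close it once after, rather than obtaining and closing on every iteration.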

Thanks
 
Kev
Message 3 of 5
Kev,

You definitely want to avoid the use of local variables: each instance creates another copy of your data. Initialize your array to the maximum size of the data set it will hold, then use Replace Array Subset to put the data into the array, and use a shift register to pass it from one iteration of the loop to the next. Avoid Build Array or auto-indexing to enlarge an array. For small arrays they are OK, but when the arrays reach the sizes you are using, these functions can cause memory and speed problems.
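Since LabVIEW diagrams can't be shown here, the advice above can be sketched in Python, with a list of lists standing in for the 2-D array (the 18000x30 size comes from the earlier post; `run_test` and `read_sample` are made-up names): allocate once before the loop, then overwrite rows in place, the way Replace Array Subset overwrites inside a preallocated array carried in a shift register.

```python
N_SAMPLES = 18000    # maximum rows, as in the 18000x30 array above
N_CHANNELS = 30

def run_test(read_sample, n_iterations):
    # One allocation before the loop, like Initialize Array
    # feeding a shift register.
    data = [[0.0] * N_CHANNELS for _ in range(N_SAMPLES)]
    for i in range(n_iterations):
        # In-place overwrite of one row: no copy, no resize
        # (the analogue of Replace Array Subset).
        data[i] = read_sample(i)
    return data
```

The key point is that the allocation happens once, outside the loop; growing the array inside the loop would force repeated reallocation and copying.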

There is a white paper on the NI web site (I did not look it up just now) about how to handle large data sets. It has excellent information for what you are trying to do.

Lynn
Message 4 of 5

In addition to johnsold's good suggestions, I'd like to give you a specific tip regarding one of the things you mentioned:

2) At the start of each of the 20 tests, I initialise an array (reused across all the tests) to a large size, e.g. 18000x30, which is used to store running test data.  Previously I initialised it to an empty array, so saving data constantly copied and resized the array, which I thought was the cause of the PROBLEM.

Because arrays require contiguous blocks of memory, and because LabVIEW manages allocation/deallocation automatically, performance can degrade as you re-run your LabVIEW program over and over.  A colleague and I discovered a neat technique that was a lifesaver in our app: we had an RT application where the RT controller would completely crash after about 3-5 runs due to this memory issue.  Look at my post in this thread for a solution.

-Kevin P.
 
ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 5 of 5