
Memory usage in a while loop with an xy graph

Solved!

Hi all,

 

I am having memory usage issues with a VI that loops, records data, and plots it in an X-Y graph as it runs. I wrote it using shift registers and Array Subset functions, which I assumed would keep the overhead down, but apparently that is incorrect. If it runs for several days (normal, since this is mostly a VI for tracking and viewing a background process that runs continuously), it takes several minutes just to stop the loop. I assume it would eventually stop working entirely. I have searched for this issue, and I see suggestions to auto-index or pre-allocate. However, I also want to plot every point as the loop runs, so I need shift registers. Do I need some convoluted pre-allocate, array-shift, and Insert Into Array approach, or is there a better way? Will that even work?

 

 

Thanks in advance

Message 1 of 18

If your array grows without bound, it will use an unbounded amount of memory, especially if you're also displaying that data. Why do you need to display all of your data for days? Could you compress the data for display purposes, e.g. average it down to a lower sample rate? Save the extra data to a file that can be loaded if the user wants to look farther back into the past.

Cheers


--------,       Unofficial Forum Rules and Guidelines                                           ,--------

          '---   >The shortest distance between two nodes is a straight wire>   ---'


Message 2 of 18

Thanks for the response. I don't need to display all the data--I am continually subsampling the array to take only the last 1000 points (about 10 minutes). But the behavior of the VI, gobbling up memory and taking 15+ minutes to stop after running for several days, suggests that it's somehow allocating memory for everything anyway.

 

I found a mention online of the Array Subset function failing to release memory, so I thought maybe that is the problem. I'm not sure how to test this directly, since as far as I can tell only two 2x1000 arrays exist in my block diagram at any one time.

Message 3 of 18

Just to be clear, I subsample the array and then pass the result to an X-Y graph and, through a shift register, to the next iteration of the loop. So as far as I can see, there is no infinitely growing array--I'm just assuming that is the reason for the performance issues.
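LabVIEW code is graphical, so there is nothing textual to paste, but the bounded display buffer described above can be sketched in Python. The 1000-point size comes from the thread; everything else here is illustrative:

```python
from collections import deque

MAX_POINTS = 1000  # keep only the last ~10 minutes of samples, as in the thread

history = deque(maxlen=MAX_POINTS)  # oldest points fall off automatically

for i in range(2500):               # simulate 2500 loop iterations
    history.append((i, i * 2))      # (x, y) sample; stand-in for real readings

# memory stays bounded: the deque never holds more than MAX_POINTS entries
```

If the subsampling really does keep the working set at 1000 points like this, the buffer itself cannot be the leak, which is why the later replies look elsewhere.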

Message 4 of 18

It sounds like you're going to need to share your code. If your array is size-limited as you say, then there shouldn't be a memory issue there.

Please include a Snippet of your code, or attach your VIs.

Cheers




Message 5 of 18

We'll need to see the rest of the code to fully understand what is going on. Until then, a general observation:

 

You are monitoring a continuous process that runs for days. Decouple the data display from the data acquisition--no user is staring at the display for hours! Suggestion: collect the data, write it to a TDMS file, and view it offline with a TDMS file viewer (like the built-in LabVIEW one, Scout, Excel, or DIAdem). Without doing anything else, you change the scope of what the DAQ and file loops do and simplify the problem greatly.


"Should be" isn't "Is" -Jay
Message 6 of 18

Typically an application like this should use a Chart, not a Graph. A Chart maintains the circular buffer for you, but it does not give you control over the x-axis the way an X-Y Graph does. So I would like some clarification about what exactly this data is. Perhaps we can change things so that it works with a Chart, and then life becomes very simple.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 7 of 18

I should have stopped messing with this, or put the previous version into source control, after I posted my previous message, but I didn't. I tried switching to a chart--maybe that will fix the problem if the graph was the cause. I was plotting time on the x-axis, but since the readings are usually roughly equally spaced, I can live without that. The VI uses some old subVIs to talk to hardware and some shared variables to talk to VIs on this and other computers, so I replaced those with random number generators and local variables so that others can run it.

Message 8 of 18

You are correct that your array is not going to infinity. You're removing an element and inserting a single element, so the array is always size 1000. (By the way, this can be done with a single Replace Array Subset function). Your chart will accrue memory, but will max out at the Chart History Length setting on the chart.

 

But you are writing two data points to file every single loop iteration... to infinity and beyond. You are also opening and closing that file every single iteration, and the real problem is that the file gets bigger and bigger and takes longer and longer to open and close. It will eventually even limit your application's loop rate.

 

You should adapt your code to write chunks of data at a time, not every single loop iteration. You should also set it up to create a new file once the current data file reaches a certain size.
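Both suggestions--chunked writes and size-based rollover--can be sketched in Python. The chunk size, size limit, and file naming here are assumptions, not anything from the thread:

```python
import os

CHUNK = 100               # flush every 100 samples instead of every iteration
MAX_BYTES = 1_000_000     # start a new file around 1 MB (assumed limit)

pending = []              # samples buffered in memory between flushes
file_index = 0            # suffix for the current log file

def log_sample(x, y):
    """Buffer a sample; flush in chunks and roll to a new file when large."""
    global file_index
    pending.append(f"{x}\t{y}\n")
    if len(pending) >= CHUNK:
        path = f"log_{file_index:04d}.txt"   # hypothetical naming scheme
        with open(path, "a") as f:           # one open/close per chunk
            f.writelines(pending)
        pending.clear()
        if os.path.getsize(path) >= MAX_BYTES:
            file_index += 1                  # next chunk goes to a fresh file
```

In LabVIEW terms this corresponds to accumulating samples in a shift register and only hitting the file API once per chunk, so the per-iteration cost no longer grows with file size.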

Cheers




Message 9 of 18

I confess I have been manually archiving the log file when it grows large. I should make the VI poll the file size and automatically archive it when it hits a certain limit. I didn't think opening and closing the file would accumulate overhead as the loop runs, though--is that possible? Could the contents of the file be getting written into memory over and over and not cleared out for some reason?
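The "poll the file size and automatically archive" step described above is small enough to sketch directly. The size threshold and the timestamped archive name are assumptions for illustration:

```python
import os
import shutil
import time

def archive_if_large(path, max_bytes=1_000_000):
    """Poll the log's size; move it aside once it passes the limit.

    The threshold and the timestamped archive name are assumed, not
    anything prescribed in the thread.
    """
    if os.path.exists(path) and os.path.getsize(path) >= max_bytes:
        stamp = time.strftime("%Y%m%d_%H%M%S")
        shutil.move(path, f"{path}.{stamp}.bak")  # writer starts a fresh file
        return True
    return False
```

Calling this once per chunk (not once per iteration) keeps the check cheap, and the writer simply reopens the original path afterward, which recreates it empty.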

Message 10 of 18