03-03-2016 08:39 PM
Hi all,
I am having memory usage issues with a VI that loops, records data, and plots it in an X-Y graph as it runs. I wrote it using shift registers and "Array Subset" functions, which I assumed would reduce the overhead, but apparently this is incorrect. If it runs for several days (normal, since this is mostly a VI for tracking and viewing a background process that runs continuously), it takes several minutes just to stop the loop. I assume it would eventually stop working entirely. I have searched for this issue, and I see suggestions to auto-index or pre-allocate. However, I also want to plot every point as the loop runs, so I need shift registers. Do I need to use some convoluted pre-allocation, array shift, and then Insert Into Array approach, or is there a better way? Will that even work?
Thanks in advance
03-03-2016 10:10 PM
If your array grows without bound, it will use an unbounded amount of memory, especially if you're displaying that data. Why do you need to display all of your data for days? Could you compress the data for display, for example by averaging it down to a smaller sample rate just for display purposes? Save the full-rate data to a file that can be loaded if the user wants to see farther back into the past.
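The averaging idea can be sketched like this (Python, purely as an illustration of block-averaging for display; on a LabVIEW diagram the Mean or Decimate functions would play the same role, and the raw data would still go to disk untouched):

```python
def decimate_for_display(samples, block_size):
    """Average fixed-size blocks so the display holds far fewer points.

    Display-only compression: the full-rate data should still be logged.
    """
    n_blocks = len(samples) // block_size
    return [
        sum(samples[i * block_size:(i + 1) * block_size]) / block_size
        for i in range(n_blocks)
    ]

# e.g. 1 Hz readings averaged down to one point per minute for the graph
raw = list(range(600))                # 10 minutes of stand-in readings
display = decimate_for_display(raw, 60)
```

Ten minutes of 1 Hz data collapses to 10 plotted points, while the file keeps all 600.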
Cheers
--------, Unofficial Forum Rules and Guidelines ,--------
'--- >The shortest distance between two nodes is a straight wire> ---'
03-03-2016 10:38 PM
Thanks for the response. I don't need to display all the data--I am subsampling the array continually to take only the last 1000 points (about 10 minutes). But the behavior of the VI, gobbling up memory and taking 15+ minutes to stop after running for several days, suggests to me that it's somehow allocating memory for everything anyway.
I found a mention online of the Array Subset function failing to release memory, so I thought maybe that is the problem. I'm not sure how to test this directly, since as far as I can tell I only have two 2x1000 arrays in my block diagram at any one time.
03-03-2016 10:46 PM
Just to be clear, I subsample the array and then pass the result to an XY graph and to the next iteration of the loop through a shift register. So as far as I can see, there is no infinitely growing array. I'm just assuming that is the reason for the performance issues.
03-03-2016 11:17 PM - last edited on 05-06-2025 03:37 PM by Content Cleaner
It sounds like you're going to need to share your code. If your array is size-limited like you say, then there shouldn't be a memory issue there.
Please include a Snippet of your code, or attach your VIs.
Cheers
03-03-2016 11:20 PM
We'll need to see the rest of the code to fully understand what is going on. Until then, a general observation:
You are monitoring a continuous process that runs for days. Decouple the data display from the data acquisition--no user is staring at the display for hours! Suggestion: collect the data, write it to a TDMS file, and display it offline with a TDMS file viewer (like the built-in LabVIEW one, Scout, Excel, or DIAdem). Without doing anything else, you change the scope of what the DAQ and file loops do and simplify the problem greatly.
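The decoupling suggested above is the classic queued producer/consumer pattern. A minimal Python sketch of the idea (random numbers stand in for hardware, and a list stands in for the log file; in LabVIEW this would be two loops joined by a queue):

```python
import queue
import threading

# The queue decouples acquisition from logging: the DAQ loop never
# waits on disk, and the file loop drains at its own pace.
data_q = queue.Queue()

def daq_loop(n_samples):
    """Producer: acquire and enqueue (random numbers stand in for hardware)."""
    import random
    for _ in range(n_samples):
        data_q.put(random.random())
    data_q.put(None)                  # sentinel: acquisition finished

def file_loop(results):
    """Consumer: dequeue and log (appends to a list in this sketch)."""
    while True:
        sample = data_q.get()
        if sample is None:
            break
        results.append(sample)

logged = []
producer = threading.Thread(target=daq_loop, args=(1000,))
producer.start()
file_loop(logged)
producer.join()
```

The point is structural: the acquisition side stays fast and bounded, and everything slow (disk, display) happens on the other side of the queue.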
03-04-2016 04:47 AM
Typically an application like this should use a Chart, not a Graph. A Chart implements the circular buffer for you, but it does not give you control over the x-axis the way an XY Graph does. So I would like some more clarification about what exactly this data is. Perhaps we can change it somewhat so that it works with a chart, and then life becomes very simple.
03-04-2016 12:23 PM
I should have stopped messing with this, or put the previous version into source control, after I posted my previous message, but I didn't. I tried switching to a chart--maybe that will fix the problem if the graph was the cause. I was plotting time on the x-axis, but since the readings are usually roughly equally spaced, I can live without that. The VI uses some old subVIs to talk to hardware and some shared variables to talk to VIs on this and other computers, so I replaced those with random number generators and local variables so that other people can run it.
03-04-2016 12:32 PM - edited 03-04-2016 12:36 PM
You are correct that your array is not growing to infinity. You're removing an element and inserting a single element, so the array stays at size 1000. (By the way, this can be done with a single Replace Array Subset function.) Your chart will accrue memory, but it will max out at the chart's Chart History Length setting.
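The Replace Array Subset trick amounts to a pre-allocated circular buffer: allocate the array once, overwrite in place with a rotating index, and never resize. A Python sketch of the same idea (the class name and methods are illustrative, not any LabVIEW API):

```python
class RingBuffer:
    """Fixed-size circular buffer: the text analogue of keeping a
    pre-allocated array in a shift register and overwriting elements
    with Replace Array Subset at a rotating index. No reallocation
    ever happens after construction."""

    def __init__(self, size):
        self.data = [0.0] * size          # allocated once, like Initialize Array
        self.index = 0

    def push(self, value):
        self.data[self.index] = value     # in-place replace, O(1)
        self.index = (self.index + 1) % len(self.data)

    def ordered(self):
        """Oldest-to-newest view, e.g. for wiring to a graph."""
        return self.data[self.index:] + self.data[:self.index]
```

After any number of pushes the buffer holds exactly the last `size` points, which is the behavior a Chart's history gives you for free.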
But you are writing two data points to file every single loop... to infinity and beyond. You are also opening and closing that file every single loop, and the real problem is that the file will get bigger and bigger and take longer and longer to open and close. It will eventually even limit your application's loop rate.
You should adapt your code to write chunks of data at a time, not every single loop. You should also set it up so it creates new files once the current data file gets to a certain size.
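Both suggestions can be sketched together in Python (a hedged illustration only; `CHUNK`, `MAX_BYTES`, and `log_points` are made-up names, and a real LabVIEW implementation would keep the file refnum open across iterations):

```python
import os

CHUNK = 100               # points buffered before each disk write
MAX_BYTES = 10_000_000    # roll to a new file past ~10 MB

def log_points(points, basename="log"):
    """Buffer points and append them in chunks; start a new file when
    the current one passes MAX_BYTES. Illustrative sketch only."""
    buf = []
    file_no = 0
    path = f"{basename}_{file_no:03d}.txt"
    for t, y in points:
        buf.append(f"{t}\t{y}\n")
        if len(buf) >= CHUNK:
            with open(path, "a") as f:
                f.writelines(buf)          # one write per CHUNK points
            buf.clear()
            if os.path.getsize(path) > MAX_BYTES:
                file_no += 1               # archive: move on to a fresh file
                path = f"{basename}_{file_no:03d}.txt"
    if buf:                                # flush any leftover tail
        with open(path, "a") as f:
            f.writelines(buf)
    return file_no + 1                     # number of files written
```

Writing one chunk per hundred loops instead of two points per loop cuts file operations by orders of magnitude, and capping the file size keeps each open/close cheap.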
Cheers
03-04-2016 12:36 PM
I confess I have been manually archiving the log file when it grows large. I should make the VI poll the file size and automatically archive once it hits a certain limit. I didn't think opening and closing the file would build up overhead as the loop runs, though--is that possible? Could the contents of the file be getting written into memory over and over and not cleared out for some reason?