01-05-2010 03:31 PM
After I use the 'Delete' command from the Advanced File Functions and then create a new file with the same name, LabVIEW must still have the original in memory, because the file keeps getting bigger, which is not what I want. I have tried 'Request Deallocation' from Application Control --> Memory Control, but this does not work either.
Your help is appreciated, and thanks,
Vince
01-05-2010 04:47 PM
What kind of file? Can you post your code?
Deleting files only deals with files that are on disk. There technically is no such thing as a file in memory, only data. So if the file is getting bigger and bigger, it must be a situation where you are writing more and more data to a file. Without seeing your code and what you are doing with your files and any data you are manipulating in your VI, it is difficult to comment further.
01-05-2010 05:19 PM
First I am writing arrays to binary files, then later retrieving them to store them in an .hdf5 file for optical characterization of Transparent Conductive Oxides. So when I first create the .dat file, it is with the following while loop:
The internal 'Create File' is this:
Then I write data to the file and create a temp copy of it in case the VI is stopped mid-stream. After that I retrieve the data from the file, write it to the .hdf5 file, then delete it with this code:
After deleting these files, I restart the process to measure another location, but the new file is now twice the size of the original, and on the third pass it is three times larger.
Hopefully this answers enough questions.
Thank you,
Vince
01-05-2010 05:22 PM
01-05-2010 10:47 PM
None of what you posted shows where you actually write the data to the file. How is the data being generated and where is it written to the file? I would guess that you have an ever growing set of data within your code, and although you delete your file, you are maintaining all that old data in memory and wind up writing it all out again with any new data added to your dataset.
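To make the suspected failure mode concrete, here is a minimal Python sketch of it (names and the 100-byte payload are illustrative, not from the original VI): a buffer that persists between measurement cycles, like an uninitialized feedback node, is never cleared, so deleting the file on disk does not help; each rewrite contains all the previous data plus the new data.

```python
import os
import tempfile

# Suspected anti-pattern: a buffer that persists between measurement cycles
# (analogous to an uninitialized feedback node). Deleting the file on disk
# does NOT clear this in-memory buffer, so every rewrite contains old + new data.
buffer = bytearray()

def measure_and_save(path, new_data):
    buffer.extend(new_data)        # old samples are still in the buffer
    with open(path, "wb") as f:    # the deleted file is recreated ...
        f.write(buffer)            # ... containing old data plus new data

# Demonstration: the recreated file doubles, then triples, in size.
path = os.path.join(tempfile.mkdtemp(), "spot.dat")
sizes = []
for _ in range(3):
    measure_and_save(path, b"\x01" * 100)   # one "measurement" of 100 bytes
    sizes.append(os.path.getsize(path))
    os.remove(path)                         # 'Delete' the file, as in the VI
# sizes is now [100, 200, 300]
```

This matches the symptom exactly: the second file is twice the size of the first, the third is three times the size, even though each file was deleted in between.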
01-06-2010 08:32 AM
The data is coming from different spectrometers, and it is output as an array. Here is where the data is collected and written to the binary file for compression, so LabVIEW does not take up all the computer's memory when measuring a lot of spots on the sample:
and the 'Write 4d Binary' looks like this:
So here are the internals. How do I delete the data from memory, so that when I measure another spot on the sample it starts with a blank slate? Should I use the 'Flush File' function and then delete the newly created file?
01-06-2010 08:54 AM
Ravens Fan was correct!
You are not deleting your old data.
Your problem is the uninitialized feedback node.
It holds all your data from the past runs.
You need to feed an empty array into its initializer terminal.
Omar
01-06-2010 08:56 AM - edited 01-06-2010 09:01 AM
I deleted my previous message => it was all wrong or off-topic; sorry, I'm tired.
01-06-2010 08:57 AM
I have never seen a 4-D array in actual use before.
You are not clearing your data. In the second snippet you posted, you are continually appending new 3-D arrays to your 4-D array, and you are maintaining that 4-D array with the feedback node.
What you need to do, when you want to clear that 4-D array, is wire an empty 4-D array into that feedback node instead of the result of the Build Array function. You can use a Select function or a case structure to choose between the Build Array result on the wire and an empty 4-D array constant, depending on whatever Boolean condition you use to determine that the array needs to be emptied.
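Since LabVIEW is graphical, here is the same fix sketched in Python (the names are illustrative stand-ins, not the original VI's): the feedback node is modeled as persistent state, and a reset flag plays the role of the Select function that feeds an empty array back in instead of the Build Array result.

```python
# Sketch of the fix, assuming the feedback node can be modeled as state that
# persists between loop iterations. 'reset' stands in for the Boolean wired
# to the Select function; 'new_frame' stands in for one acquired 3-D array.

def make_accumulator():
    state = []                          # value held by the feedback node

    def step(new_frame, reset=False):
        nonlocal state
        base = [] if reset else state   # Select: empty array vs. previous wire
        state = base + [new_frame]      # Build Array: append the new frame
        return state

    return step

# Usage: three frames accumulate, then a reset starts a blank slate.
step = make_accumulator()
for frame in ("f1", "f2", "f3"):
    data = step(frame)
# len(data) == 3
data = step("f4", reset=True)
# len(data) == 1  -- the old frames are gone, so the next file write
#                    contains only the new spot's data
```

The key point is that the reset replaces the stored array itself; deleting the file on disk never touches this in-memory state, which is why the earlier 'Delete' and 'Request Deallocation' attempts had no effect.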