03-19-2010 01:05 PM
Hi,
My application collects a large array of data, 64x512x512 single-precision complex, so it is ~134 MB.
Once the measurement is done, I need to export this array to MATLAB using a script node for data processing.
At this point, LabVIEW's total memory usage shown in the Windows Task Manager is ~1.1 GB.
I also used the profiler to monitor the memory usage of each subVI.
First, I convert this 3D array to a 1D array and divide it into 4 segments.
These are then connected to the inputs of the MATLAB script node.
At this point there is an additional data copy, since my data is single-precision complex but the MATLAB script node inputs support only double precision.
When the MATLAB script node is executed, it gives a memory error message and stops.
At this point, LabVIEW's total memory usage is shown as 1.9~2 GB, which is the limit for 32-bit Windows XP.
I tried to do the data processing in LabVIEW instead, but those functions (e.g., the window function and FFT) are also double precision only, which means there is a data conversion and an additional data copy there as well. I also tried using double precision for all measurement data, hoping to minimize the additional copies, but in that case the overall memory usage is even higher.
At this point, I do not know why memory usage increases so much when the MATLAB script node executes. It seems there must be an additional data copy on top of the data conversion. How can I minimize the memory usage in this case?
If I instead save the data to a file for MATLAB, what is an efficient method?
Thank you.
Chang
03-22-2010 03:08 PM
Chang:
If you have to work with an array that large, your best bet is probably to write the data to a text file and import it from MATLAB.
The easiest way is to write the data after you have converted it to the 1D array. You can simply pass the array to a "Write to Text File" function.
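On the MATLAB side, a whitespace-delimited numeric text file can usually be read back with a one-liner. A minimal sketch (the file name here is just a placeholder):

    v = load('data.txt', '-ascii');   % returns the file's numbers as a double array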
03-22-2010 04:29 PM
Dear Caleb,
Thanks for your reply.
I've tried to save the data in text format, but the file was somehow corrupted.
I will try again. Which file format is efficient in terms of file size and of importing the data into MATLAB?
Thanks,
Chang
03-23-2010 07:00 PM - edited 03-23-2010 07:00 PM
Hello, Chang!
How was the file corrupted? Was it not readable in MATLAB, or was it an issue with writing the file in LabVIEW?
For reading in MATLAB, I can't really make a recommendation on file formats, but I believe an ASCII text file will work. Another way to write the text file in LabVIEW is with the "Write to Spreadsheet File" function: you can pass a 1D array in without needing to convert it to strings, and you can specify the delimiter (tab, comma, \n, etc.). Unfortunately, I don't know enough about MATLAB programming to be of much help with reading the file once you have it in MATLAB.
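That said, if the file is written with an explicit delimiter, something along these lines should work in MATLAB (a sketch only; match the delimiter to whatever LabVIEW wrote):

    M = dlmread('data.txt', '\t');   % read a tab-delimited numeric text file
    size(M)                          % sanity-check the dimensions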
Unfortunately, if you want to use the script node in LabVIEW, you're going to have to reduce the input array size. Judging by your initial description, there isn't much memory usage you can eliminate that you haven't already.
Please let me know if you have questions about writing to the text file, especially if I didn't explain something well enough. I hope we can get this project to work for you, one way or another.
03-24-2010 08:10 AM
Dear Caleb,
Thanks for your reply.
I was able to save the data in text format, i.e., one file for the real part and another for the imaginary part.
But the file size is large, so I guess I need to divide them.
By the way, the 3D array is converted to 1D.
When I save the data to a text file and recall it in MATLAB, which is more efficient, a 1D or a 2D array?
Another issue is that when the data is saved to a text file, there seems to be an additional data copy.
Is there any way to avoid this additional copy in memory?
Thanks,
Chang
03-26-2010 05:26 PM
Hello Chang:
When you write to the file, it shouldn't create the additional copy. I would assume that the 1D array would take less file space, since it would likely require fewer delimiting characters.
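Once the numbers are in MATLAB, either shape can be restored to the original dimensions with reshape. A minimal sketch, assuming a 1D file (note that LabVIEW flattens arrays row-major while MATLAB fills arrays column-major, so a permute may be needed to recover the original ordering):

    v = load('data.txt', '-ascii');   % the values as one long vector
    A = reshape(v, [64 512 512]);     % back to 3D
    % A = permute(A, [1 3 2]);        % adjust if the element ordering is off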
For the array manipulation, I forgot about one structure that may be useful to you: the In-Place Element Structure. Essentially, it allows you to perform operations on your array without creating an additional copy in memory. For more information on the In-Place Element Structure, please refer to the link above.
03-29-2010 08:55 AM
Hi Caleb,
Thanks for your update.
Currently, I am able to save the 512x512x64 double-precision complex data to 512 spreadsheet files and load them into MATLAB.
The data array in the main VI is passed to a subVI, which splits the array in two and swaps it; then, in a for loop, it saves each 2D array into two files, i.e., real and imaginary.
But I'm not sure whether there is an additional data copy when I pass the array data to this subVI.
My original data is single-precision complex, but it seems the spreadsheet file save function only accepts double-precision data, so I guess there is at least one additional data copy.
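(For reference, a minimal MATLAB sketch of how such real/imaginary file pairs might be reassembled into a single-precision complex array; the file names and slice count here are hypothetical:)

    nSlices = 64;                                      % hypothetical slice count
    C = complex(zeros(512, 512, nSlices, 'single'));   % preallocate as single complex
    for k = 1:nSlices
        re = load(sprintf('real_%03d.txt', k), '-ascii');
        im = load(sprintf('imag_%03d.txt', k), '-ascii');
        C(:,:,k) = complex(single(re), single(im));    % rebuild the complex slice
    end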
Thank you for letting me know about the In-Place Element Structure.
Chang
03-30-2010 01:52 PM
Hi Chang,
Caleb is out of the office today, but will be back tomorrow.
We can confirm that it WILL create an additional copy if the array is converted from single to double before writing to the file. Writing the file itself doesn't create a copy, but the data type conversion does.
Also, you won't be able to use the In-Place Element Structure for data type conversion. In-place operation requires a 1:1 input --> output mapping, i.e., a single-precision input has to go to a single-precision output. More broadly, the input data types and terminal counts must all match the outputs.