01-05-2011 04:59 AM
Hi,
I am dealing with a large TDMS dataset (5.6 GB) and I wish to perform analysis on it in MATLAB. Obviously, dealing with such a large dataset in one go would be impossible in MATLAB, so I need to split it into smaller chunks.
I have started playing with DIAdem to achieve this. I would like to create an automated script, as I may need to do this quite a few times in the future. I am new to VBS but have managed to extract a small amount of data from the Data Portal and store it in an array. I was wondering how I can then write this data into a separate TDMS file. My amateur code is attached!
Thanks in advance for any help,
Matt
01-05-2011 12:12 PM
Hi Matt,
The best way to do this is to apply the row filter you want during the importing process, so that you only get the array elements you're interested in as new channels in the Data Portal. You can do this with the DataFileLoadRed() command. Then you can simply save the subset values you wanted (the entire contents of each channel) to a new export file, which could be a much smaller TDMS file or even a *.mat file if you've downloaded and installed the free MATLAB DataPlugin we post at www.ni.com/dataplugins.
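If it helps, the export side of that workflow is only a couple of lines of VBScript. This is just a rough, untested sketch: the paths are placeholders, the DataFileLoadRed() arguments themselves are documented under that command in the DIAdem help, and the filter name the MATLAB DataPlugin registers is an assumption you should check against your own Save As dialog.

' Assumes DataFileLoadRed() has already been called with the row filter you
' want (see the DataFileLoadRed topic in the DIAdem help for its parameters),
' so the Data Portal now holds only the subset channels you asked for.
Dim sOutFile
sOutFile = "C:\Export\Subset.tdms"   ' placeholder output path

' Write everything currently in the Data Portal to a new, much smaller file
Call DataFileSave(sOutFile, "TDMS")

' Or, with the free MATLAB DataPlugin from www.ni.com/dataplugins installed,
' export straight to a *.mat file instead. The "MATLAB" filter name below is
' an assumption; use whichever name the DataPlugin registers on your system.
' Call DataFileSave("C:\Export\Subset.mat", "MATLAB")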
Brad Turpin
DIAdem Product Support Engineer
National Instruments
01-05-2011 03:39 PM
Hi,
Thanks for the advice. The problem with that solution is that it has to keep reloading the data. I should probably have been clearer: the dataset is 180 seconds long and I need to split it into ~3 second chunks. Loading the data in DIAdem takes a good half an hour, so having to reload it each time I want to save a segment of data is rather time-consuming. Is there a way to use the method I posted earlier to load the large dataset once, store the wanted data in an array, and then save it, iterating through this until the entire dataset has been split up?
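Roughly, I was imagining something along these lines (an untested pseudo-sketch; the group/channel names, the 1 kHz sample rate, and the output path are all made up, and the save step is exactly the bit I can't work out):

' Assumes the full 180 s dataset is already loaded in the Data Portal as
' group 1, channel 1, sampled at 1 kHz (made-up figure)
Dim oSrc, oGrp, oChn, iChunk, j, iStart, ChunkRows
ChunkRows = 3 * 1000                              ' ~3 s of samples per chunk
Set oSrc = Data.Root.ChannelGroups(1).Channels(1)

For iChunk = 1 To 60                              ' 180 s / 3 s = 60 chunks
  iStart = (iChunk - 1) * ChunkRows
  Set oGrp = Data.Root.ChannelGroups.Add("Chunk_" & iChunk)
  Set oChn = oGrp.Channels.Add("Signal", DataTypeFloat64)
  For j = 1 To ChunkRows
    oChn.Values(j) = oSrc.Values(iStart + j)      ' copy one 3 s slice
  Next
  ' This is the part I'm stuck on: writing just "Chunk_<n>" to its own TDMS
  ' file without re-saving the big source group as well
  ' Call DataFileSave("C:\Chunks\Chunk_" & iChunk & ".tdms", "TDMS")
  Call Data.Root.ChannelGroups.Remove(Data.Root.ChannelGroups.Count)  ' tidy up
Next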
Thanks,
Matt
01-05-2011 05:53 PM
Brad,
I do apologise; I didn't look at this very carefully. You have shown how to read in a small amount of data. Thanks very much!
Regards,
Matt