Splitting a large TDMS file into smaller chunks

Solved!

Hi,

 

I am dealing with a large TDMS dataset (5.6 GB) that I want to analyse in MATLAB. Working with such a large dataset in one go is not practical in MATLAB, so I need to split it into smaller chunks.

 

I have started playing with DIAdem to achieve this. I would like to create an automated script, as I may need to do this quite a few times in the future. I am new to VBS, but I have managed to extract a small amount of data from the Data Portal and store it in an array. I was wondering how I can then write this data into a separate TDMS file. My amateur code is attached!
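A minimal sketch of that last step (not Matt's attached script), assuming the extracted values are already sitting in a VBS array: the Data Portal is cleared, a new group and channel are built from the array, and the portal is saved as a new TDMS file. The group name, channel name, and output path are placeholders.

Option Explicit

Dim aData(9), i, oGrp, oChn

' Dummy values standing in for the data extracted from the Data Portal.
For i = 0 To UBound(aData)
  aData(i) = i * 0.5
Next

' DataFileSave writes everything currently in the Data Portal,
' so start from an empty portal and rebuild only what should be exported.
Call Data.Root.Clear()
Set oGrp = Data.Root.ChannelGroups.Add("Subset")          ' placeholder group name
Set oChn = oGrp.Channels.Add("Signal", DataTypeFloat64)   ' placeholder channel name

' Channel values in DIAdem are 1-based; writing past the end extends the channel.
For i = 0 To UBound(aData)
  oChn.Values(i + 1) = aData(i)
Next

Call DataFileSave("C:\Data\Subset.tdms", "TDMS")          ' placeholder output path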

 

Thanks in advance for any help,

Matt

Message 1 of 4
Solution
Accepted by topic author Matt_EE

Hi Matt,

 

The best way to do this is to apply the row filter you want during the import, so that only the array elements you're interested in arrive as new channels in the Data Portal. You can do this with the DataFileLoadRed() command. Then you can save just those subset values (the entire contents of each channel) to a new export file, which could be a much smaller TDMS file, or even a *.mat file if you've downloaded and installed the MATLAB DataPlugin we post for free at www.ni.com/dataplugins.
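A hedged sketch of the export half of this, in DIAdem VBS. It assumes a reduced import (DataFileLoadRed, whose parameter list is best taken from the DIAdem help or from SCRIPT recording mode) has already put only the rows of interest into the Data Portal; the "MATLAB" export-filter name below is an assumption, so use whatever name the installed DataPlugin actually registers.

Option Explicit

' At this point the Data Portal should contain only the reduced subset
' (loaded, for example, with DataFileLoadRed; see the DIAdem help for its arguments).

' Save the portal contents as a much smaller TDMS file ...
Call DataFileSave("C:\Data\Subset.tdms", "TDMS")

' ... or, with the MATLAB DataPlugin from www.ni.com/dataplugins installed,
' as a *.mat file (the export-filter name here is an assumption):
'Call DataFileSave("C:\Data\Subset.mat", "MATLAB")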

 

Brad Turpin

DIAdem Product Support Engineer
National Instruments

Message 2 of 4

Hi,

 

Thanks for the advice. The problem with that solution is that it has to keep reloading the data. I should have been clearer: the dataset is 180 seconds long and I need to split it into ~3 second chunks. Loading the data in DIAdem takes a good half an hour, so having to reload it each time I want to save a segment is rather time consuming. Is there a way of using the method I posted earlier to load the large dataset, store the wanted data in an array, and then save it, iterating through this until the entire dataset has been split up?
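For reference, a deliberately naive sketch of that loop in DIAdem VBS, not the accepted approach: it re-loads the whole file for every chunk, which is exactly the slow part, and the marked full-load line is where a DataFileLoadRed call limited to the chunk's rows (see the DIAdem help for its parameters) would go instead. The paths, the "Chunk_" naming, the 300000 samples-per-chunk figure, and the use of the channel Size property are assumptions, and the sketch copies only the first channel of the first group.

Option Explicit

Dim sSrc, sOutDir, lChunkSize, lTotal, nChunks, lChunk, lStart, lCount, i
Dim oSrcChn, oChunkGrp, oChunkChn

sSrc       = "C:\Data\BigFile.tdms"   ' placeholder source file
sOutDir    = "C:\Data\Chunks\"        ' placeholder output folder
lChunkSize = 300000                   ' samples per ~3 s chunk; depends on the sample rate

' Load the source once to find out how many values there are.
Call Data.Root.Clear()
Call DataFileLoad(sSrc, "TDMS", "Load")
lTotal  = Data.Root.ChannelGroups(1).Channels(1).Size
nChunks = Int((lTotal + lChunkSize - 1) / lChunkSize)

For lChunk = 1 To nChunks
  lStart = (lChunk - 1) * lChunkSize + 1
  lCount = lChunkSize
  If lStart + lCount - 1 > lTotal Then lCount = lTotal - lStart + 1

  If lChunk > 1 Then
    ' DataFileSave below writes the whole Data Portal, so the source group is
    ' removed before each save and has to be brought back here. Replacing this
    ' full load with a DataFileLoadRed call restricted to the chunk's rows is
    ' what would make the loop fast.
    Call Data.Root.Clear()
    Call DataFileLoad(sSrc, "TDMS", "Load")
  End If
  Set oSrcChn = Data.Root.ChannelGroups(1).Channels(1)

  ' Copy the slice into its own group, drop the big source group, save the portal.
  Set oChunkGrp = Data.Root.ChannelGroups.Add("Chunk_" & lChunk)
  Set oChunkChn = oChunkGrp.Channels.Add(oSrcChn.Name, DataTypeFloat64)
  For i = 1 To lCount
    oChunkChn.Values(i) = oSrcChn.Values(lStart + i - 1)
  Next
  Call Data.Root.ChannelGroups.Remove(1)
  Call DataFileSave(sOutDir & "Chunk_" & lChunk & ".tdms", "TDMS")
Next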

 

Thanks,

Matt

Message 3 of 4

Brad,

 

I do apologise; I didn't look at this very carefully. You have shown how to read in a small amount of data. Thanks very much!

 

Regards,

Matt

Message 4 of 4