
Out of memory error while opening 1GB TDMS file, no splitting possible

Solved!

Hi all,

The file in question contains about 1.4 million channels, each with 295 integer data points. The channel name is the time stamp at which its measurement was taken.

 

DIAdem cannot load the file completely; it only displays the first 65,000 channels.

 

LabVIEW (TDMS Read) cannot even read small chunks of the data. I've tried splitting the file using Split TDMS File Memory Independent LV2009.vi (https://decibel.ni.com/content/docs/DOC-20054), but to no avail: the error I get is code 2 (memory is full).

 

I use Windows 7 and LabVIEW 2012 SP1 (32-bit).

 

Thanks in advance for any help.

 

Best regards, Markus

Message 1 of 9

Markus,

 

1.4 million CHANNELS? Are you kidding?

 

Well, if you look at the inputs TDMS Read offers, you can limit the number of data values per channel per read ("count") or restrict which channels you read ("channel name(s)").

With those options, you can load the file step by step: read a chunk, do your calculations, and proceed to the next chunk.
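
In text form (a LabVIEW block diagram won't paste here), the same idea looks roughly like this with the third-party npTDMS Python library. This is only a sketch: the file name, group name, and sizes are invented for the example, not taken from the original file.

```python
# Rough sketch of chunked TDMS reading with the third-party npTDMS library
# (pip install npTDMS). File name, group name, and sizes are invented.
from nptdms import TdmsFile

# TdmsFile.open() streams from disk instead of loading the whole file into RAM.
with TdmsFile.open("measurement.tdms") as tdms_file:
    group = tdms_file["Measurements"]    # analogous to the group name input
    for channel in group.channels():     # "channel name(s)": one channel at a time
        chunk = channel[0:295]           # "count"/offset: limit values per read
        # ... do your calculations on this chunk, then proceed to the next ...
```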

 

Hope this helps,

Norbert

Norbert
----------------------------------------------------------------------------------------------------
CEO: What exactly is stopping us from doing this?
Expert: Geometry
Marketing Manager: Just ignore it.
Message 2 of 9

Hi Norbert_B,

thanks for your quick answer. 1.4 million channels is quite a lot, I know. The issue is that I get measurements (295 integer values each) at about a 50 Hz rate, and I would rather save this raw data. Is there a better way to store it? More groups, fewer channels, or something like that?

 

Regarding your answer, I've tried addressing the channels separately; however, it takes about 0.3 seconds for each channel to load, probably due to the channel-addressing part of the read. If there is no better way, I'll try to do this.

 

Again, thanks,

best regards, Markus

Message 3 of 9

Markus,

 

the term "channel" refers commonly to data coming from a single transducer. So if you have e.g. 50 transducers in your system, you should have 50 channels.

Acquiring data continuously generates chunks of data per time unit (e.g., a 1 kHz acquisition rate generates 1000 samples per second). Each chunk should be saved to its appropriate channel in the file. TDMS Write gives you the option to save data repeatedly to the same channels (again, the group/channel name is the important parameter!).
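
As a rough text-only illustration of that append pattern (again with the third-party npTDMS Python library; acquire() and all names below are placeholders, not real APIs from the original post):

```python
# Rough sketch of the "append to the same channels" pattern with the
# third-party npTDMS library; acquire() and all names are placeholders.
import numpy as np
from nptdms import TdmsWriter, ChannelObject

with TdmsWriter("log.tdms") as writer:
    for _ in range(1000):                # one iteration per acquisition cycle
        data = np.asarray(acquire())     # hypothetical DAQ read -> 1-D array
        # Same group/channel name every cycle, so each segment is appended to
        # the existing channel rather than creating a new one.
        writer.write_segment([ChannelObject("Measurements", "Sensor1", data)])
```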

 

Nevertheless, the overall amount of data will not shrink, even if you distribute it over channels in a better way. Loading ALL the data at once will still not be possible in a 32-bit process.

 

BUT: if the channels are organized as recommended, loading a single channel at once will give you all the data from that transducer for the complete measurement time. You can then display/modify the data as needed and perhaps store it back to another file (or to the same file as a new channel).

 

Having fewer channels could also improve loading times, but maybe there is no difference at all... I don't have specific benchmarks on this.

 

Depending on the algorithms you are using, data reduction can improve overall times.

 

Norbert

Message 4 of 9

I don't see why you need to separate the reads into different channels.  Each channel you add to the TDMS file adds overhead.  That's overhead you don't need.  Just keep appending to the same channel.  You should just have the one channel if you are constantly acquiring the data.  You can then read that data in chunks easily enough.
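
To make "read that data in chunks" concrete, here is a rough sketch with the third-party npTDMS Python library; the names, chunk size, and analyze() are placeholders invented for the example:

```python
# Rough sketch of "one long channel, read back in fixed-size chunks" with
# the third-party npTDMS library; names, CHUNK, and analyze() are placeholders.
from nptdms import TdmsFile

CHUNK = 1_000_000  # samples per read; size this to fit comfortably in memory

with TdmsFile.open("log.tdms") as f:           # streaming mode, data stays on disk
    channel = f["Measurements"]["Signal"]
    for start in range(0, len(channel), CHUNK):
        chunk = channel[start:start + CHUNK]   # only this slice is held in RAM
        analyze(chunk)                         # placeholder for your processing
```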


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 5 of 9

Hi Crossrulz,

your suggestion certainly has merit. The question is whether it is "easier" to address the data that way, or whether LabVIEW handles that kind of structure better.

 

My data structure is as follows:

1. time stamp

2. array of 295 integer values, representing an exponentially decaying signal

And this array represents one measurement "point".

 

So storing the data the way I did (channel name = time stamp, channel values = array) is logical in that it resembles the actual structure of the data.

 

Do you know if LabVIEW can handle extremely large (> 1 GB) channel data?

 

Regards, Markus

 

Message 6 of 9
Solution
Accepted by topic author markhors

@markhors wrote:

[...] Do you know if LabVIEW can handle extremely large (> 1 GB) channel data?

 

Regards, Markus

 


64-bit LabVIEW, yes. But depending on the installed RAM, it might feel like the process is stuck.

It sounds like you set up the TDMS file exactly the opposite way from what is recommended.

It sounds like you have 295 transducers and read one value from each every cycle. Those values should be written consecutively to 295 channels, plus one additional channel for the time stamp.
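
A rough sketch of that layout, once more with the third-party npTDMS Python library; acquire_measurement() and all names are invented for the example:

```python
# Rough sketch of the recommended layout: one channel per transducer value
# plus a time-stamp channel, appended each cycle.
# acquire_measurement() and all names are invented for this example.
import time
import numpy as np
from nptdms import TdmsWriter, ChannelObject

with TdmsWriter("raw.tdms") as writer:
    for _ in range(1000):                       # acquisition loop (~50 Hz)
        values = acquire_measurement()          # hypothetical: 295 integers
        segment = [ChannelObject("Raw", "ch%03d" % i, np.array([v]))
                   for i, v in enumerate(values)]
        segment.append(ChannelObject("Raw", "timestamp", np.array([time.time()])))
        writer.write_segment(segment)
```

In practice you would buffer several cycles and write larger arrays per segment, since every segment carries its own metadata overhead (crossrulz's point above).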

 

Norbert

Message 7 of 9

I might try this!

However, it's not that I have 295 measurement devices; it's just that each measurement consists of 295 integer values. These are fitted afterwards (fitting takes too much time to do online), and one of the fit parameters is what I'm after.

Thanks for the suggestions and help!

Regards, Markus

Message 8 of 9

Just for those of you still working on this: I found that using queues for data storage instead of static arrays solves all the problems, and files > 1 GB can be opened really quickly (at least in chunks of ~200 MB). (markhors)

Message 9 of 9