LabVIEW

Reading large tdms files and displaying the data graphically

I have a LabVIEW program which reads in data at a rate of 1 kHz from several sensors using two DAQ Assistants. The data is saved to TDMS files using the "Write To Measurement File" Express VI.

 

I then have another program which reads the files back so that, in theory, I can view the entire data set from the chosen file graphically. It uses the "Read From Measurement File" Express VI to read the file 800,000 samples at a time (a value found by trial and error to be roughly as much as the program can handle in one go) and displays the data on a waveform chart. This works fine with smaller files, but with larger files (5 to 8 GB) it runs until it reaches a certain point (after about 29 chunks of 800,000 samples) and then crashes with an error saying there is "not enough memory to complete the operation".

 

Is there a way to make it read in and display the whole data set from the larger files?

 

I am using LabVIEW 2015 (32-bit) with Windows 7.

Message 1 of 5

You can read the data in chunks and then do your processing on each chunk. Handling all of that data at once most likely will not work.
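For reference, here is a rough sketch of the same chunked-read idea in Python using the npTDMS package (the file path, group/channel names, and the 100,000-sample chunk size are placeholders, and the exact npTDMS calls should be checked against its documentation). In LabVIEW the equivalent is the native TDMS Open/Read/Close functions, where TDMS Read takes a count and offset, instead of the Express VIs.

import numpy as np
from nptdms import TdmsFile

CHUNK = 100_000  # samples per read; tune to your memory budget

# Streaming mode: metadata is read up front, sample data only on demand
with TdmsFile.open("large_log.tdms") as tdms:
    channel = tdms["Group"]["Channel"]          # placeholder names
    total = len(channel)                        # length taken from the file's metadata
    for start in range(0, total, CHUNK):
        block = np.asarray(channel[start:start + CHUNK])
        # ... process this block only (statistics, decimation, plotting), then let it go ...
        print(start, float(block.mean()))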

 

As far as displaying the data goes, your screen is probably only a few thousand pixels wide, so there is no way to show that many samples at full resolution. I use min/max decimation to display the data. See https://forums.ni.com/t5/Example-Code/Managing-Large-Data-Sets-in-LabVIEW/ta-p/4100668 for how to implement it.
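To illustrate the min/max decimation idea (my own hypothetical sketch in Python/NumPy, not the code from the linked example): each bucket of samples that maps to roughly one pixel column is reduced to its minimum and maximum, so spikes survive even though the plotted array is tiny.

import numpy as np

def minmax_decimate(data, target_points=2000):
    # Reduce data to about 2*target_points values by keeping the min and max
    # of each bucket, preserving peaks that plain subsampling would miss.
    data = np.asarray(data)
    bucket = max(1, len(data) // target_points)
    usable = (len(data) // bucket) * bucket      # drop the ragged tail
    blocks = data[:usable].reshape(-1, bucket)
    out = np.empty(blocks.shape[0] * 2, dtype=data.dtype)
    out[0::2] = blocks.min(axis=1)
    out[1::2] = blocks.max(axis=1)
    return out

# e.g. 8 million samples shrink to ~4,000 values, plenty for a graph a few thousand pixels wide
display = minmax_decimate(np.random.randn(8_000_000))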

 

mcduff

 

Message 2 of 5

I run into this issue as well. I collect many GB of data and want to display it to the user, but it makes no sense to display everything in one shot since there are only so many pixels available. 32-bit LabVIEW has fundamental memory limitations, and no ordinary computer will work with that amount of data in any reasonably rapid fashion. So I create display channels from the data, which are a combination of decimation and resampling of the original data (the channels are captured at different data rates in many cases).
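Since the channels are logged at different rates, building display channels usually also means bringing them onto a common time base. A minimal sketch of that resampling step in Python/NumPy (the rates and lengths are made up, and this is only an illustration, not the poster's actual implementation):

import numpy as np

def resample_to_rate(data, src_rate_hz, dst_rate_hz):
    # Linearly interpolate a uniformly sampled channel onto a new uniform rate.
    data = np.asarray(data, dtype=float)
    duration = len(data) / src_rate_hz
    t_src = np.arange(len(data)) / src_rate_hz
    t_dst = np.arange(0.0, duration, 1.0 / dst_rate_hz)
    return np.interp(t_dst, t_src, data)

# e.g. bring a 1 kHz channel and a 100 Hz channel onto a common 10 Hz display rate
fast = resample_to_rate(np.random.randn(600_000), 1000.0, 10.0)  # 10 minutes at 1 kHz
slow = resample_to_rate(np.random.randn(60_000), 100.0, 10.0)    # 10 minutes at 100 Hz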

 

I have to process those display channels one at a time, using chunks read from spanned TDMS files (each file <= 2 GB), and I process them into a preview TDMS file that sits next to the original TDMS files. The preview file (I call it the post-process file) is what is used to look at the data and select time ranges for additional display. When a time span is selected, the original files are read to display (or export) the full-resolution data. I got that concept from the thumbnails image managers use to deal with very large image files, and from the way video editors use key frames.
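A rough Python sketch of that preview-file idea using npTDMS (file names, chunk size, and decimation factor are placeholders, and the npTDMS calls should be double-checked; the original workflow is done in LabVIEW against spanned TDMS files):

import numpy as np
from nptdms import TdmsFile, TdmsWriter, ChannelObject

SRC = ["log_0001.tdms", "log_0002.tdms"]   # spanned source files, <= 2 GB each
PREVIEW = "log_preview.tdms"               # the "post-process" file kept next to them
CHUNK = 500_000                            # samples read per pass
DECIMATE = 1000                            # keep 1 of every 1000 samples in the preview

with TdmsWriter(PREVIEW) as writer:
    for path in SRC:
        with TdmsFile.open(path) as tdms:                       # streaming read
            for group in tdms.groups():
                for channel in group.channels():
                    pieces = []
                    for start in range(0, len(channel), CHUNK):
                        block = np.asarray(channel[start:start + CHUNK])
                        pieces.append(block[::DECIMATE])        # crude decimation; min/max also works
                    writer.write_segment([
                        ChannelObject(group.name, channel.name, np.concatenate(pieces))
                    ])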

Message 3 of 5

The TDMS File Viewer is your friend here. Scout from Signal X is another very useful tool.


"Should be" isn't "Is" -Jay
Message 4 of 5

Another hint: I could actually downsample or decimate my data on my RT controller straight into the preview TDMS file (which would potentially save a lot of post-processing time), but I chose to generate the preview file after data collection so that I can choose the decimation rate based on the timespan of the data actually collected. If the data isn't big in the first place, I don't even create the preview file; if it's very big, a lot of decimation occurs for the preview. This dynamic behavior was preferred since the very large files are only generated for specific types of tests that don't occur very often.
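That decision logic is simple to sketch (the threshold and target point count here are arbitrary placeholders, not the poster's actual numbers):

PREVIEW_TARGET = 100_000        # points per channel we want in the preview file
PREVIEW_THRESHOLD = 1_000_000   # below this, skip the preview and read the raw file directly

def preview_decimation(total_samples):
    # Return None when no preview file is needed, otherwise a decimation factor
    # sized so the preview channel ends up near PREVIEW_TARGET points.
    if total_samples < PREVIEW_THRESHOLD:
        return None
    return max(1, total_samples // PREVIEW_TARGET)

print(preview_decimation(500_000))      # None -> small run, no preview file
print(preview_decimation(500_000_000))  # 5000 -> heavy decimation for a huge run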

Message 5 of 5