
Memory leak when converting from TDMS to CSV?

I have this LabVIEW 2017 program that manages log files. The log files are zipped TDMS files. One of the features of the program is to convert from TDMS to a flat CSV file. 

 

When this feature is run, the memory as reported by the Windows Task Manager increases by around 100 megabytes. If I run it again, I get another 10 megabytes on top of that. My concern is that if this keeps going, I will eventually run out of memory. This program is meant to run 24/7, managing log files created by another application. The conversion is not done all the time, which is probably why it hasn't been a problem yet.

 

I have already looked for leaked file references, but the code closes every reference it opens. I tried adding a TDMS Flush right before closing the TDMS file; it didn't help.

 

I have read that TDMS uses memory for advanced indexing, but it seems like that memory would be deallocated when the reference is closed.

 

I have attached the convert function. The calling code runs it in a loop, once for each file to be converted. The string output is used to update a status indicator on the program's front panel.
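
In rough text form, the conversion does something like this (a sketch in Python with the npTDMS package rather than LabVIEW, just to show the logic; the function name and the loop are illustrative, not the attached VI):

```python
# A sketch of the conversion logic in Python with the npTDMS package
# (pip install npTDMS; as_dataframe also needs pandas). Illustrative
# only; the actual code is the attached LabVIEW VI.
from nptdms import TdmsFile

def tdms_to_csv(tdms_path, csv_path):
    # TdmsFile.read loads the whole file into memory and closes the
    # underlying file handle before returning, so no reference stays open.
    tdms_file = TdmsFile.read(tdms_path)
    df = tdms_file.as_dataframe()      # one column per group/channel
    df.to_csv(csv_path, index=False)   # flat CSV output
    return f"Converted {tdms_path} -> {csv_path}"  # status string

# The caller loops over the unzipped files, one call per file:
# for f in files_to_convert:
#     status = tdms_to_csv(f, f.replace(".tdms", ".csv"))
```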

Message 1 of 5
  • Where does the unzipping happen?
  • Flush only makes sense when writing, and you are doing read-only operations.
  • You are talking about the memory of the LabVIEW process, right?
  • How big are these files?
  • Can you attach a typical TDMS file?
Message 2 of 5
  • Where does the unzipping happen?

The program uses a state machine design. The unzip happens in a prior state.
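
In pseudo-code, the flow looks something like this (a hypothetical Python sketch of the two states; the real program is a LabVIEW state machine):

```python
# Hypothetical sketch of the state flow; the real program is a
# LabVIEW state machine, not Python.
import zipfile

def handle_log(zip_path, work_dir):
    # Prior state: unzip the logged TDMS file
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(work_dir)
    # Later state: convert the extracted TDMS to CSV
    # (see the tdms_to_csv sketch in the first post)
```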

 

  • Flush only makes sense when writing, and you are doing read-only operations.
  • You are talking about the memory of the LabVIEW process, right?

No. I built this as a standalone application and am watching the memory usage of that process.

 

  • How big are these files?

A typical file is around 5 megabytes once uncompressed.

 

  • Can you attach a typical TDMS file?

The files I am working with contain proprietary data. I will have to generate a dummy file of similar size.
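
For anyone who wants to reproduce this without my data, a dummy file of roughly that size can be generated along these lines (a sketch assuming Python's npTDMS package; the group and channel names are made up):

```python
# A sketch of generating a dummy TDMS file of roughly 5 MB, assuming
# Python's npTDMS package; group/channel names are made up.
import numpy as np
from nptdms import TdmsWriter, ChannelObject

def make_dummy_tdms(path, n_samples=600_000):
    # 600k float64 samples is about 4.8 MB of raw channel data
    data = np.random.rand(n_samples)
    with TdmsWriter(path) as writer:
        writer.write_segment([ChannelObject("group", "channel", data)])

make_dummy_tdms("dummy.tdms")
```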

Message 3 of 5

Here is a dummy zipped TDMS file. 

 

I have confirmed that if I run this file through my application, I get the same bump in memory usage.

Message 4 of 5

@cgibson wrote:

I have this LabVIEW 2017 program that manages log files. The log files are zipped TDMS files. One of the features of the program is to convert from TDMS to a flat CSV file. 

 

When this feature is run, the memory as reported by the Windows Task Manager increases by around 100 megabytes. If I run it again, I get another 10 megabytes on top of that. My concern is that if this keeps going


You should keep in mind that LabVIEW has its own smart memory manager on top of the low-level Windows memory allocators. This means it will allocate memory in advance and keep it allocated for performance reasons, so you don't need to worry too much about additional allocations during single runs. To check for memory leaks, run the code continuously in a loop as a stress test over a long period and monitor it. After a certain number of iterations, the memory usage will stabilize and stop increasing. I don't see any memory leaks in the given code; it remains stable at around 450 MB.
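
In text form, the stress test amounts to something like this (a minimal sketch, assuming Python with psutil and the hypothetical tdms_to_csv from earlier in the thread; in LabVIEW the equivalent is putting the convert VI in a While Loop and watching the process in Task Manager):

```python
# Minimal sketch of the stress test, assuming Python with psutil and
# the hypothetical tdms_to_csv from earlier in the thread.
import psutil

proc = psutil.Process()  # the process we are monitoring (this one)

for i in range(1000):
    tdms_to_csv("dummy.tdms", "dummy.csv")
    rss_mb = proc.memory_info().rss / 1e6   # resident set size, MB
    print(f"iteration {i}: {rss_mb:.1f} MB")
    # A real leak grows without bound; allocator caching plateaus.
```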

[Screenshot: memory usage holding steady at around 450 MB over the stress-test run]

In rare cases, when working with very large data blocks, you may run into memory fragmentation issues. The simplest way to avoid this problem is to use a 64-bit LabVIEW environment.

Message 5 of 5