TDMS Opens Slowly

I am opening a large (300 MB) TDMS file using the LabVIEW "TDMS Open" VI.  Running just the Open VI takes almost 10 (!) seconds, before I even query channels, etc.

Any thoughts on why this is so slow, or how to speed it up?  This is in LV 8.2.  I'm trying to write a file browser for a series of TDMS files generated during a test.  There may be 20 of these 300 MB files, and 200 seconds just to open them all is unacceptable...

Message Edited by Joe Gerhardstein on 10-24-2007 01:51 PM

Joe Gerhardstein
Viasat
Certified LabVIEW Architect
Certified TestStand Developer
Certified Professional Instructor
http://www.viasat.com
Message 1 of 12
Sounds like single value acquisition / small data chunks. If so, you might want to try this:
http://forums.ni.com/ni/board/message?board.id=60&message.id=6719&requireLogin=False
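As a rough illustration of the effect outside LabVIEW, here is a sketch in Python using the third-party npTDMS library (the file names, chunk sizes, and sample counts are arbitrary demo values, not from the linked thread): the same data written in tiny chunks produces one TDMS segment per write and a file that is slow to open, while buffering into large blocks produces only a handful of segments.

```python
# Sketch: many tiny TDMS segments vs. a few large ones (Python, npTDMS).
# Assumes "pip install nptdms numpy"; paths and sizes are arbitrary demo values.
import os
import time
import numpy as np
from nptdms import TdmsWriter, ChannelObject, TdmsFile

data = np.random.rand(200_000)  # stand-in for streamed acquisition data

# 1) One write (and therefore one segment) per tiny 10-sample chunk.
with TdmsWriter("fragmented.tdms") as writer:
    for chunk in data.reshape(-1, 10):
        writer.write_segment([ChannelObject("audio", "ch0", chunk)])

# 2) Buffer 10,000 samples per write -- the idea behind NI_MinimumBufferSize.
with TdmsWriter("buffered.tdms") as writer:
    for block in data.reshape(-1, 10_000):
        writer.write_segment([ChannelObject("audio", "ch0", block)])

for path in ("fragmented.tdms", "buffered.tdms"):
    start = time.perf_counter()
    TdmsFile.read(path)  # has to walk every segment's metadata
    elapsed = time.perf_counter() - start
    print(f"{path}: {os.path.getsize(path)} bytes, opened in {elapsed:.3f} s")
```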

Herbert
Message 2 of 12
Yes, that is actually the problem, though in my case I am streaming audio to disk.  I ran the "TDMS Defragment" VI and was able to reduce the index file size from 25 MB to 12 kB.

Thanks!

Message Edited by Joe Gerhardstein on 10-25-2007 06:28 AM

Joe Gerhardstein
Viasat
Certified LabVIEW Architect
Certified TestStand Developer
Certified Professional Instructor
http://www.viasat.com
Message 3 of 12
Do you know if the TDMS Defragment VI can be run in a reentrant VI so I can defragment several files simultaneously?  I am trying to do this, but other parts of my code that access the TDMS file I/O VIs seem to be blocked while defragmenting.
Joe Gerhardstein
Viasat
Certified LabVIEW Architect
Certified TestStand Developer
Certified Professional Instructor
http://www.viasat.com
Message 4 of 12
You can't run it simultaneously in the same process. Of course you could start several processes; I would agree that's not a great solution, but it would work. Another way to achieve what you're after is to reduce the time defragmenting takes by using the NI_MinimumBufferSize property during acquisition (this is what the thread I pointed you to is about).
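If you do go the separate-process route, a rough sketch of the idea in Python with the third-party npTDMS library might look like the following. The consolidate function below is not the NI TDMS Defragment VI, just a crude stand-in that reads each file and rewrites all channels into a single segment, and the file names are hypothetical.

```python
# Sketch: consolidating several TDMS files in separate processes.
# NOT the NI "TDMS Defragment" VI -- just a crude read-and-rewrite stand-in.
# Assumes npTDMS is installed; file names are hypothetical.
from concurrent.futures import ProcessPoolExecutor
from nptdms import TdmsFile, TdmsWriter, ChannelObject

def consolidate(src: str, dst: str) -> str:
    tdms = TdmsFile.read(src)
    channels = [
        ChannelObject(group.name, ch.name, ch[:], properties=dict(ch.properties))
        for group in tdms.groups()
        for ch in group.channels()
    ]
    with TdmsWriter(dst) as writer:
        writer.write_segment(channels)  # everything lands in one segment
    return dst

if __name__ == "__main__":
    jobs = [(f"run_{i}.tdms", f"run_{i}_defragged.tdms") for i in range(4)]
    # Each file is handled in its own worker process, so one long
    # consolidation does not block the others or the launching process.
    with ProcessPoolExecutor(max_workers=4) as pool:
        for result in pool.map(consolidate, *zip(*jobs)):
            print("wrote", result)
```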

Hope that helps,
Herbert
Message 5 of 12
Well, I did set NI_MinimumBufferSize to 10,000 as indicated in the other thread, but an hour of streaming 32 channels of 16-bit audio to disk at 3 kSa/s still generates huge index files.

What are the rules for setting NI_MinimumBufferSize?  Is there a maximum value?  Are there hardware limitations (e.g., hard drive cache)?
Joe Gerhardstein
Viasat
Certified LabVIEW Architect
Certified TestStand Developer
Certified Professional Instructor
http://www.viasat.com
Message 6 of 12
There is no real maximum other than what your system can handle. However, if you buffer too much, the operation that flushes the data to disk will take correspondingly longer, and that might derail your measurement at some point.

Are you storing scaled floating-point numbers or raw integers? Storing the raw 16-bit data along with the scaling coefficients, instead of scaled doubles, will drive your file size down by a factor of 4. That also allows for larger buffers and faster writing. LabVIEW ships with examples that show how to write and read that kind of data along with the scaling.
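As a rough sketch of the idea (Python with the third-party npTDMS library, not the shipping LabVIEW example; the "gain" and "offset" property names are invented for the demo, not NI's built-in scaling properties):

```python
# Sketch: store raw int16 samples plus scaling coefficients instead of
# scaled 64-bit doubles (4x smaller on disk). The "gain"/"offset" property
# names are invented for this demo, not NI's built-in scaling properties.
import numpy as np
from nptdms import TdmsWriter, ChannelObject, TdmsFile

gain, offset = 3.0517578125e-4, 0.0          # e.g. +/-10 V over a 16-bit range
volts = np.random.uniform(-10, 10, 100_000)  # pretend scaled acquisition data

# Write: convert back to raw counts and remember the coefficients.
raw = np.clip(np.round((volts - offset) / gain), -32768, 32767).astype(np.int16)
with TdmsWriter("raw_audio.tdms") as writer:
    ch = ChannelObject("audio", "ch0", raw,
                       properties={"gain": gain, "offset": offset})
    writer.write_segment([ch])

# Read: apply the scaling on the way out.
tdms = TdmsFile.read("raw_audio.tdms")
ch = tdms["audio"]["ch0"]
scaled = ch[:] * ch.properties["gain"] + ch.properties["offset"]
```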

Herbert
Message 7 of 12
I'm storing scaled floating-point data at this time.  The disk writing (and analysis) are not my big bottlenecks.  As usual, displaying the data in real time in a GUI on ancient laptops is my primary limitation.

I had hoped to do the defragmentation in the background at the end of a run, but my application becomes unresponsive while the defrag is running, even if I set the VI priority to lowest and run it reentrantly in another thread via VI Server.  I don't know whether the problem is too many threads reading/writing to disk simultaneously, or whether the other TDMS VIs such as Open and Close are blocked during defragmentation.  I suspect the latter, since my application just sits at the TDMS Open until the defrag is complete, even though the defrag is running in another thread via VI Server.  The application needs to be ready for another data run immediately after completing the previous one.  The next run most likely won't start right away, so I probably have time to defrag, but I can't afford to sit around if the operator needs to start recording again.

I may need to rethink when to do the defragmenting...
Joe Gerhardstein
Viasat
Certified LabVIEW Architect
Certified TestStand Developer
Certified Professional Instructor
http://www.viasat.com
Message 8 of 12

I'd like to revisit this concept of a TDMS file that opens slowly.

No matter the size of a data file, I would expect that simply opening a reference to it should be quick.

For example, if I have a 10 GB text file, it takes essentially zero time to open a reference to it.  I just tested this.  And it is equally quick to read a single line from that file.

But for my large TDMS files, it still takes ages to open a reference.

I am curious: no matter how you configure your buffering and defragging, it appears to me that opening a reference to a large TDMS file is still quite slow compared to binary and text files... am I correct?  Can someone explain why?
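For what it's worth, here's a rough timing sketch of the comparison in Python, using the third-party npTDMS library (the file names are placeholders for whatever large files you have on hand): opening the text file touches almost nothing, while "opening" the TDMS file means parsing metadata for every segment, which is presumably what fragmentation makes expensive.

```python
# Rough timing sketch: "opening" a text file touches almost nothing, while
# opening a TDMS file means reading metadata for every segment in the file.
# File names are placeholders.
import time
from nptdms import TdmsFile

start = time.perf_counter()
with open("huge_log.txt") as f:                  # placeholder name
    f.readline()
print(f"text file, open + one line: {time.perf_counter() - start:.3f} s")

start = time.perf_counter()
TdmsFile.read_metadata("fragmented_run.tdms")    # placeholder name
print(f"fragmented TDMS, metadata only: {time.perf_counter() - start:.3f} s")

start = time.perf_counter()
TdmsFile.read_metadata("defragmented_run.tdms")  # placeholder name
print(f"defragmented TDMS, metadata only: {time.perf_counter() - start:.3f} s")
```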

http://www.medicollector.com
Message 9 of 12

Hmm.  Looks like I must eat my words.

I just ran a test on a very large defragmented TDMS file and it opened really fast.

http://www.medicollector.com
Message 10 of 12