"Write to binary file.vi" is creating fragmented files

Hi everyone,

I've got a high-speed, large-volume data application: an 8-channel ultrasonic system. I've got a state machine to handle things like setup and initialisation, and also visualisation. One of the states is acquisition, where I capture data and write it to disk. I do this by calling two DLLs (in two separate loops) to get two streams of data: the full RF waveform (approx. 2000 data points, a 1D array) and hardware measurements on the waveform (approx. a 48 × 15 2D array). Some constants are also appended to the streams. These streams of data are written to two separate files using 'Write to Binary File.vi'.

I'm currently writing data at about 16 MB/s, i.e. 1 GB/min (the RF data file grows at about 800 MB/min and the hardware measurements file at about 200 MB/min), and the tests I want to run are at least 30 minutes long. This produces a lot of data.

I am finding that the files (so far I have been stopping and restarting recording with a new file name) are fragmented. Why is this? Does it matter? I am able to read the files OK, but the disk ends up really fragmented and it then takes forever to defrag the drive. Are there better ways to do this? Any ways to maximise speed?
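To make the write pattern concrete (this is a rough Python sketch of what the block diagram does, not the actual LabVIEW code; the record sizes are taken from the numbers above), the two loops effectively interleave small appends to the two files:

```python
import struct

RF_POINTS = 2000              # approx. waveform length from the post
MEAS_ROWS, MEAS_COLS = 48, 15 # approx. hardware-measurement array

def write_interleaved(rf_path, meas_path, iterations):
    # Both files grow in small alternating appends, so the filesystem
    # allocates their blocks interleaved on disk and the two files
    # end up fragmenting each other.
    with open(rf_path, "wb") as rf, open(meas_path, "wb") as meas:
        for _ in range(iterations):
            # RF waveform: 2000 doubles ~ 16 kB per append
            rf.write(struct.pack("<%dd" % RF_POINTS, *([0.0] * RF_POINTS)))
            # hardware measurements: 48 x 15 doubles ~ 5.6 kB per append
            meas.write(struct.pack("<%dd" % (MEAS_ROWS * MEAS_COLS),
                                   *([0.0] * (MEAS_ROWS * MEAS_COLS))))

write_interleaved("rf.bin", "meas.bin", 10)
```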

Any suggestions appreciated.

Phil
Message 1 of 5
Thanks for the reply, Pnt. Unfortunately it doesn't really answer my question.

Write to Binary File.vi is creating fragmented files and I don't know why. Is it something that I am doing wrong, or is it just the way that the VI creates the file?

Phil
Message 3 of 5
Sounds like the two files are fragmenting each other while you're writing them.
Some ways to check:
1. Write both files to an empty partition (if they're fragmented, it must be with each other, since there aren't any other files).
2. Write only one of the files (if it's not fragmented, then the other file must be what's fragmenting it normally).

Assuming the above is the case (seems likely to me), it's caused by the OS not knowing how much space the files need, so it puts them right next to each other, allocating chunks as they're requested.

I don't see this being a real problem, though. But if you want to try to fix it anyway:

1. If you know the size beforehand (I've not tried this, so I may be missing something), you could open one file and allocate its size with the Set File Size function, then do the same for the other file, all before you start writing. Note this could slow down the effective hard drive speed, since it has to seek back and forth between the files (I doubt you'll notice, though).

2. Write each file to a separate partition. This has the same seeking issue as option 1, but you don't need to know the size beforehand.

3. Write each file to a separate hard drive. This needs a second hard drive, but it is the fastest and simplest solution.

4. Write both data streams into the same TDMS file. TDMS is nearly as fast as raw binary, has data structure, can handle two things writing to it at once, and you can defragment the single file (via TDMS Defragment) if you really care about its read speed (this is likely much faster than a Windows defrag). As far as Windows is concerned, it's not a fragmented file.
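To illustrate what option 1 is doing in non-LabVIEW terms (a Python sketch under my own assumptions, standing in for the Set File Size call; file name and size are hypothetical):

```python
import os

def preallocate(path, total_bytes):
    # Fix the file's final size before streaming into it -- the textual
    # analogue of what Set File Size does in LabVIEW.  Note f.truncate()
    # only sets the logical size (a sparse file on NTFS/ext4); on POSIX
    # systems os.posix_fallocate() actually reserves the blocks, which
    # is what helps the filesystem keep the file contiguous.
    with open(path, "wb") as f:
        if hasattr(os, "posix_fallocate"):
            os.posix_fallocate(f.fileno(), 0, total_bytes)
        else:
            f.truncate(total_bytes)

preallocate("prealloc_demo.bin", 16 * 1024 * 1024)  # 16 MB demo file
```

The same caveat applies inside LabVIEW: preallocation only helps if the filesystem actually commits the extent up front rather than just recording a bigger logical size.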


Message 4 of 5
Thanks Matt,

Alas, the problem is still there. I had ended up changing the program so that it only saves one file before I read your post. I was also saving to a new hard drive that had been formatted. The result? One big file that is fragmented.

The file size ends up being pretty big (i.e. 30 GB), and I'm wondering whether that is the problem. Also, I'm acquiring and saving data very fast (it appends to the file ~200 times per second). In the long term I could acquire more data in one go and save in larger chunks, but at the moment this is all I can do.
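For what it's worth, the "save in larger chunks" idea amounts to buffering the small records in memory and flushing them in big blocks. A Python sketch (the class, names and sizes are mine, not the actual program):

```python
class ChunkedWriter:
    """Accumulate small records in memory and write them to disk in
    large blocks, so the filesystem sees a few big writes instead of
    ~200 tiny appends per second."""

    def __init__(self, path, flush_bytes=8 * 1024 * 1024):
        self._f = open(path, "wb")
        self._buf = bytearray()
        self._flush_bytes = flush_bytes

    def append(self, record: bytes):
        self._buf += record
        if len(self._buf) >= self._flush_bytes:
            self.flush()

    def flush(self):
        if self._buf:
            self._f.write(self._buf)
            self._buf.clear()

    def close(self):
        self.flush()
        self._f.close()

w = ChunkedWriter("buffered.bin", flush_bytes=64 * 1024)
for _ in range(200):            # roughly one second's worth of appends
    w.append(b"\x00" * 16_000)  # ~16 kB RF record, size from the post
w.close()
```

Fewer, larger writes give the filesystem bigger extents to allocate at once, which should reduce fragmentation even without preallocating.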

I'll try to capture a screenshot of what I'm doing to see if anyone can explain.

Phil
Message 5 of 5