11-07-2012 08:46 AM
Hello,
I am currently working on a project where I am streaming data acquired with different cRIO modules to an external HDD connected to the cRIO (9024).
The measurement has to be continuous with up to 50 kS/s/ch. The connected HDD is formatted as FAT32 and works fine at the beginning of the measurement.
I save the data to binary files directly on the external HDD and create a new file every 5 minutes.
The code works fine for the first couple of hours, but at some point it crashes. I think this is related to the number of files on the external HDD.
While debugging I found out that the "Open/Create/Replace File" VI takes more and more time as the number of files in that folder grows.
I cannot create larger files (more than 5 minutes each) because then I run into another problem: the 4 GB limit on a single file under FAT32.
The crashes usually happen when there are around 1000 files.
Is there a way to work around this issue? Does anybody have similar experience with streaming to an external HDD on a cRIO?
Best Regards
Gerd
11-08-2012 05:06 AM
Hi,
when using FAT32, the maximum number of entries in a folder is 65,534.
But you have to be careful: depending on the length of each file name and subdirectory name, a single file can take several entries (from 2 to 13).
Reference:
I don't know how your folders are structured, but 1000 files shouldn't be enough to exceed the entry limit.
In your case I guess it's a performance issue: an index list is also maintained for every folder, so performance drops rapidly once a folder holds a large number of files.
You could switch to a new directory after a few hundred files.