@drjdpowell wrote:
So one could think of collecting a file, then periodically start a new file, then defragment the previous one and add it to a large main file.
Oh, that's a better idea than what I've been doing. Since my logging module is a separate running loop, I can invoke a defrag, and anything that should be logged in the meantime gets pushed into a queue; when the defrag is done, the loop processes the queue and appends it to the newly defragged file. The problem is that as a test goes on, the time needed to defrag obviously grows, so by the time I get around to processing the queue it has accumulated even more entries to log. Making the defrag its own parallel process inside the logging module is a great idea: we can keep logging to a new file, and when the defrag finishes, just merge the two files and keep going.
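To make the pattern concrete, here is a minimal sketch (in Python, with threads standing in for the parallel loops; the file names, the `defrag` stand-in, and the helper function are all hypothetical, not anything from the actual module): the defrag of the main file runs in its own thread while logging continues into a fresh side file, and once the defrag completes the side file is merged back into the main file.

```python
import os
import threading

def defrag(main_path):
    """Stand-in for a real defragment step: rewrite the file compactly
    (here, simply dropping blank lines that represent fragmented space)."""
    with open(main_path) as f:
        lines = [ln for ln in f if ln.strip()]
    with open(main_path, "w") as f:
        f.writelines(lines)

def log_with_parallel_defrag(main_path, side_path, records):
    # Start defragging the main file in its own parallel worker.
    worker = threading.Thread(target=defrag, args=(main_path,))
    worker.start()
    # Meanwhile, keep logging to a separate side file -- no queue
    # backlog builds up while the defrag runs.
    with open(side_path, "w") as f:
        for rec in records:
            f.write(rec + "\n")
    worker.join()
    # Defrag finished: merge the side file into the main file and drop it.
    with open(side_path) as src, open(main_path, "a") as dst:
        dst.write(src.read())
    os.remove(side_path)
```

The key point the sketch illustrates is that the two activities never touch the same file at the same time, so the logger never stalls waiting for the defrag to finish.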