05-20-2018 04:49 AM - edited 05-20-2018 04:53 AM
Hi sir,
By using your algorithms, the slowdown issue is gone. The program runs very fast, even in tests lasting more than 24 hours.
But when I stop the program to save the data, I get a "Not enough memory to complete this operation" warning.
Also, when I look at the row count of the text file, I see that data is missing. For example, if the test lasts 20 hours (20 x 60 x 60 = 72,000 seconds) and my logging rate is 100 ms, there should be 720,000 rows in the text file. But there are only about 120,000 rows.
As you said, my program uses the producer/consumer structure. I acquire data in the producer loop and transfer it with Enqueue. In the consumer loop, I open the file before the loop, take the data with Dequeue inside the loop, write it with the "Write to Text File" I/O function, and close the file after the loop.
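The structure described above can be sketched in Python for readers unfamiliar with the LabVIEW pattern: one thread enqueues samples, the other dequeues and writes them, with the file opened once before the loop and closed after it. The file name, the dummy reading, and the sentinel value are illustrative, not part of the original VI.

```python
# Producer/consumer logging sketch (Python stand-in for the LabVIEW
# queue-based pattern described in the post; names are placeholders).
import queue
import threading

def producer(q, n_samples):
    # Stands in for the DAQ loop: acquire one sample, then Enqueue it.
    for i in range(n_samples):
        q.put(f"{i * 0.1:.1f}\t{42.0}\n")  # timestamp, dummy reading
    q.put(None)  # sentinel: tells the consumer loop to finish

def consumer(q, path):
    # Open the file once before the loop, Dequeue inside it, and close
    # after the loop -- mirroring the structure described above.
    with open(path, "w") as f:
        while True:
            item = q.get()
            if item is None:
                break
            f.write(item)

q = queue.Queue()
t = threading.Thread(target=producer, args=(q, 1000))
t.start()
consumer(q, "log.txt")
t.join()
```

Because the queue buffers samples, the producer never waits on disk I/O; this is the property that keeps the acquisition loop fast.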
To solve this problem, I see two options:
1 - Open a new file every 2 hours and continue logging to it. For example, if the test lasts 20 hours, I will have 10 text files at the end. I will write a post-processing program to combine these 10 text files, so I end up with 1 text file.
2 - Use TDMS files. This is the simpler option: I would just replace the text-file I/O functions with their TDMS equivalents.
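The post-processing step in option 1 could be as simple as concatenating the per-interval files in order. This is only a sketch; the file-name pattern is an assumption, not taken from the original program.

```python
# Hypothetical post-process step for option 1: concatenate the
# per-interval log files into one text file. The naming pattern
# "test_part_*.txt" is illustrative.
import glob

def combine(pattern, out_path):
    parts = sorted(glob.glob(pattern))  # sorted so parts stay in order
    with open(out_path, "w") as out:
        for part in parts:
            with open(part) as f:
                out.write(f.read())  # append each part file verbatim

# Example: combine("test_part_*.txt", "test_full.txt")
```

Zero-padding the part index in the file names (e.g. `_001`, `_002`) keeps the lexicographic sort correct past ten files.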
My aim is not to lose data while logging to the file, and not to see the "Memory is full" error at the end of the test.
Which option is the best way to solve my problem?
06-08-2018 02:19 PM - edited 06-08-2018 02:21 PM
Do both. TDMS will be much faster and more efficient than writing text files. Append to a TDMS file in your loop, and add logic to automatically close the file and create a new one every few hours (e.g., every four hours) so that the individual files don't grow too large.
As far as the missing data is concerned, benchmark your loops to make sure they are actually running as fast as you think.
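One minimal way to do that benchmark: count iterations over a short wall-clock window and compare the measured rate against the intended 100 ms period. The loop body here is a placeholder for the real acquisition work.

```python
# Measure a loop's actual iteration rate, as suggested above.
import time

def measure_rate(body, duration_s):
    count = 0
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        body()
        count += 1
    return count / duration_s  # iterations per second

# If this comes out well under 10 iterations/s, a 100 ms loop is
# falling behind and rows will be missing from the log.
rate = measure_rate(lambda: time.sleep(0.01), 0.5)
```

A producer that keeps pace but a file with too few rows would instead point at the consumer, e.g. a queue being flushed or the file handle being reopened in the loop.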