08-03-2005 12:00 AM
08-04-2005 10:49 AM
gztek,
I don't have any great solutions for you, but here are a couple ideas:
1) Are you sure you really need to save the data on every iteration? Do you expect the machine to hang/crash/shut down, or is the data extremely critical? With the current design of the program, you can already lose as many as 40,000 data points if the program exits unexpectedly, and even more if you overflow the thread-safe queue. If you can call save every 10-60 iterations instead of every iteration, you should see a noticeable performance improvement (see the first sketch after this list).
2) You could save the data into multiple files, for example one file per iteration: File1.tdm, File2.tdm, ... FileXX.tdm. After all of the data has been saved, you could loop over the individual files and combine them into one large file. Combining the small files might be somewhat time consuming, but since it happens after all of the data has been acquired, any performance lag at that point is much less of a problem (see the second sketch after this list).
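Here is a minimal sketch of suggestion 1, assuming one fixed-size block of doubles per iteration. The buffer size, the file name, and the save_block() routine are made up for illustration; in the real program the flush would call whatever TDM/storage routine you already use.

```c
#include <stdio.h>
#include <stddef.h>

#define POINTS_PER_ITERATION 1000   /* assumed number of values per iteration */
#define SAVE_INTERVAL        50     /* write to disk only every 50 iterations */

static double buffer[SAVE_INTERVAL * POINTS_PER_ITERATION];
static size_t buffered = 0;         /* values currently held in memory */

/* Placeholder for the real save call; here it just appends raw doubles
 * to a binary file so the sketch is self-contained. */
static void save_block(const double *data, size_t count)
{
    FILE *fp = fopen("data.bin", "ab");
    if (fp != NULL) {
        fwrite(data, sizeof(double), count, fp);
        fclose(fp);
    }
}

/* Call once per acquisition iteration instead of saving immediately. */
void on_iteration(const double *data, size_t count)
{
    size_t capacity = sizeof buffer / sizeof buffer[0];
    size_t i;

    for (i = 0; i < count && buffered < capacity; i++)
        buffer[buffered++] = data[i];

    if (buffered == capacity) {     /* SAVE_INTERVAL iterations collected */
        save_block(buffer, buffered);
        buffered = 0;
    }
}

/* Flush the remainder on a clean shutdown so nothing buffered is lost. */
void on_shutdown(void)
{
    if (buffered > 0) {
        save_block(buffer, buffered);
        buffered = 0;
    }
}
```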
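And a minimal sketch of suggestion 2, assuming the per-iteration files are raw binary chunks (Chunk1.bin, Chunk2.bin, ...). Real .tdm files would have to be merged through the TDM/DIAdem API rather than byte-level concatenation; this only illustrates the "write many small files, combine once at the end" pattern with standard C I/O. The file names are made up.

```c
#include <stdio.h>

/* Append the contents of the file named src onto the open dst stream. */
static int append_file(FILE *dst, const char *src)
{
    char buf[64 * 1024];
    size_t n;
    FILE *in = fopen(src, "rb");
    if (in == NULL)
        return -1;
    while ((n = fread(buf, 1, sizeof buf, in)) > 0)
        fwrite(buf, 1, n, dst);
    fclose(in);
    return 0;
}

/* Combine Chunk1.bin .. ChunkN.bin into one large output file. */
int combine_chunks(const char *outname, int chunk_count)
{
    char name[64];
    int i;
    FILE *out = fopen(outname, "wb");
    if (out == NULL)
        return -1;
    for (i = 1; i <= chunk_count; i++) {
        sprintf(name, "Chunk%d.bin", i);
        if (append_file(out, name) != 0)
            break;              /* stop at the first missing chunk */
    }
    fclose(out);
    return 0;
}
```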
We are aware of this performance issue and are looking into it. We hope to improve performance in a future release.
-Jeff
08-04-2005 07:48 PM
08-04-2005 07:53 PM
Hi Nandini, Jeff,
Thank you for your suggestions, I will try them.
gztek
10-17-2005 10:50 AM
Hi gztek,
Just today we pushed live a new KB that outlines how to create a TDM header file for an existing binary file or files. This way you can stream the binary data with standard C functions and then write a quick ASCII XML file with the TDM header writer DLL afterwards.
The link may take a few hours to go live.
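A minimal sketch of the streaming half of this approach, using only standard C I/O: the raw channel data is appended to a binary file while the test runs, and the matching TDM header (an XML file describing the channels) would be produced afterwards with the TDM header writer DLL from the KB. The DLL's API is not reproduced here, and the file name and function are made up for illustration.

```c
#include <stdio.h>
#include <stddef.h>

/* Append a block of samples to the growing binary data file.
 * Returns 0 on success, -1 on failure. */
int stream_samples(const char *binpath, const double *samples, size_t count)
{
    FILE *fp = fopen(binpath, "ab");
    if (fp == NULL)
        return -1;
    size_t written = fwrite(samples, sizeof(double), count, fp);
    fclose(fp);
    return (written == count) ? 0 : -1;
}
```

Since the header is written only once at the end, the acquisition loop itself stays as fast as plain fwrite calls.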
Brad Turpin
DIAdem Product Support Engineer
National Instruments