10-28-2021 04:55 AM
Hello,
I am using a simple TSQ (thread-safe queue) to acquire frames from a camera and save them to my hard disk. Although I am able to achieve the full acquisition speed of the camera, I seem to be losing data while saving. Please let me know what I am doing wrong. Thank you.
10-30-2021 02:43 AM
If I understand your code correctly, you create a queue that can grow dynamically; during program execution you pass data to the queue and read it back in the consumer function, where you remove only 1000 elements at a time and open and close a file for each one of them. (And you possibly do this for up to 6 cameras at a time???) This operation is very time-consuming, and I suspect that your queue grows in the process without ever being fully emptied.
One possibility is to process the entire queue on each pass, but the real gain comes from rewriting the saving function to avoid that huge number of open/close operations. As far as I understand, your data comes in fixed-size blocks, so you could save everything into a single file while the program is running and add a post-processing function that splits this single repository into the appropriate number of files after the acquisition has finished.
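For reference, a minimal sketch of what such a consumer thread could look like in CVI, assuming the queue was created with CmtNewTSQ using 1-byte items (so items == bytes) and that gQueue and gAcquiring are globals set up elsewhere; the names, the chunk size and the 100 ms timeout are illustrative, not taken from the original code (check utility.h for the exact CmtReadTSQData prototype):

#include <stdio.h>
#include <utility.h>                         /* CVI Utility Library: Cmt* TSQ functions */

#define CHUNK_BYTES  (1024 * 1024)           /* how much we try to drain per pass */

extern volatile int gAcquiring;              /* cleared by the acquisition thread when done */
extern int          gQueue;                  /* TSQ handle returned by CmtNewTSQ            */

int CVICALLBACK SaveThread (void *functionData)
{
    static unsigned char buf[CHUNK_BYTES];
    int   n;
    FILE *fp = fopen ("acquisition.bin", "wb");   /* opened ONCE for the whole run */

    if (!fp)
        return -1;

    while (gAcquiring)
    {
        /* read whatever is available (up to CHUNK_BYTES items), waiting at most 100 ms */
        n = CmtReadTSQData (gQueue, buf, CHUNK_BYTES, 100, 0);
        if (n > 0)
            fwrite (buf, 1, (size_t)n, fp);       /* one write per pass, no open/close */
    }

    /* drain anything still queued after the acquisition stops */
    while ((n = CmtReadTSQData (gQueue, buf, CHUNK_BYTES, 0, 0)) > 0)
        fwrite (buf, 1, (size_t)n, fp);

    fclose (fp);                                  /* closed ONCE at the end */
    return 0;
}

Keeping the file open for the whole run means the only per-pass cost is the fwrite, which the OS buffers; this is essentially what the original poster ends up doing in the accepted solution below.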
10-30-2021 03:56 AM
Hello Roberto,
Thank you for the suggestion. I need to save the data continuously while it is being acquired; I don't want to be limited to a specific number of frames, but rather run a continuous acquisition, so the save routine needs to keep opening and closing files. Is there some other approach I should look into? Or am I missing something?
11-02-2021 05:03 AM
Well, I'm assuming that saving the images is the most time-consuming operation in your application, which is why I pointed out some possibilities in that area.
On one hand, you can try to understand how much time is spent in each part of your app by enabling profiling and tracking the image acquisition, data processing, and save operations (the Execution Profiler is included in the CVI full license).
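A lighter-weight alternative to the Execution Profiler is to bracket each stage with Timer() from the CVI Utility Library and print the elapsed time. A rough sketch, where AcquireFrame and SaveFrame are placeholders for the real calls in the application:

#include <stdio.h>
#include <utility.h>                 /* Timer() lives here */

static void AcquireFrame (void) { /* placeholder for the real camera read */ }
static void SaveFrame    (void) { /* placeholder for the real disk write  */ }

int main (void)
{
    double t0, t1, t2;

    t0 = Timer ();                   /* seconds, arbitrary reference point */
    AcquireFrame ();
    t1 = Timer ();
    SaveFrame ();
    t2 = Timer ();

    printf ("acquire: %.3f ms   save: %.3f ms\n",
            (t1 - t0) * 1000.0, (t2 - t1) * 1000.0);
    return 0;
}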
If my assumption is correct, then you need to redesign the data-saving part of your application: reducing the number of open/close and I/O operations is the key to making data saving more efficient. Since you save data in binary format, I assume there is a separate application that reads the data back and displays/processes/does something with it: maybe you can save all items in a single file per loop and modify the data-handling application accordingly.
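If the downstream tools really do need one file per frame, the split can be done offline once the acquisition is finished, along the lines Roberto describes. A sketch in plain C, assuming fixed-size frames; FRAME_BYTES, the repository path and the output file names are placeholders:

#include <stdio.h>
#include <stdlib.h>

#define FRAME_BYTES  (640 * 480 * 2)     /* bytes per frame, adjust to the camera */

int SplitRepository (const char *repoPath)
{
    unsigned char *frame = malloc (FRAME_BYTES);
    FILE *in = fopen (repoPath, "rb");
    char  name[64];
    int   i = 0;

    if (!in || !frame)
    {
        free (frame);
        if (in) fclose (in);
        return -1;
    }

    /* read one frame-sized block at a time and write it to its own file */
    while (fread (frame, 1, FRAME_BYTES, in) == FRAME_BYTES)
    {
        FILE *out;
        sprintf (name, "frame_%06d.bin", i++);
        out = fopen (name, "wb");
        if (!out)
            break;
        fwrite (frame, 1, FRAME_BYTES, out);
        fclose (out);
    }

    fclose (in);
    free (frame);
    return i;                            /* number of frames written */
}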
11-05-2021 04:53 AM
Thank you for this! I made it fast by saving the data to one file that stays open; reducing the open/close operations to just one sped things up enough to save the data in real time! Thank you again!