04-27-2025 10:03 PM
I'm post-processing a bunch of images: converting grayscale to binary, identifying a range of features, and reporting the results out to a spreadsheet.
As the number of images processed increases, so does the processing time.
Below is a graph with the number of images processed on the X axis and the time for each image in ms on the Y axis.
Note that the variation in processing times widens on top of the average time itself increasing.
Image complexity is not a factor; running the data set back to front yields the same result. The reporting module is turned off, so no results are being saved except the graph monitoring progress.
I'm open to suggestions on what to look for in the program to get a repeatable, predictable time outcome.
04-28-2025 08:49 AM
You have not mentioned the software you are using for "post processing". LabVIEW? Something else?
What is your definition of "spreadsheet"? Excel? csv? Something else?
04-28-2025 11:50 AM
I have a folder of images on the hard disk, monochrome 8-bit. My mistake for assuming that LabVIEW was a given. Yes, LabVIEW 21 32-bit Professional Development System; thresholding and other vision-related tasks are performed by the Vision Development Module. Images are thresholded, then blob analysis finds the elements of interest.
Spreadsheet is an XLSX file, although that bit of code is turned off. Results are formatted for reporting, but report generation is not performed.
Loop loads an image, processes the image, reports data, then starts over.
More questions welcome!
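The load → process → report loop described above can be instrumented to separate a rising average from a widening spread. Here is a minimal sketch in Python rather than LabVIEW (an assumption for illustration only; `process_image` is a dummy stand-in for the Vision Development Module threshold/blob step):

```python
import statistics
import time

def process_image(data):
    # Stand-in for the real threshold + blob-analysis step, which in the
    # actual application is done by LabVIEW's Vision Development Module.
    return sum(b > 128 for b in data)

def run_batch(images):
    """Time each load -> process -> report iteration independently."""
    times_ms = []
    for img in images:
        t0 = time.perf_counter()
        process_image(img)
        times_ms.append((time.perf_counter() - t0) * 1000.0)
    return times_ms

# Synthetic 8-bit "images" just to exercise the loop.
images = [bytes(range(256)) * 16 for _ in range(50)]
times = run_batch(images)

# Compare the first and second half of the run: a rising mean points to
# per-iteration state accumulating; a rising stdev points to contention
# (garbage collection, paging, other processes, ...).
half = len(times) // 2
print("mean:", statistics.mean(times[:half]), statistics.mean(times[half:]))
print("stdev:", statistics.stdev(times[:half]), statistics.stdev(times[half:]))
```

Plotting or comparing these two statistics per batch gives the same information as the graph, but numerically, so a trend is easy to confirm.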
04-28-2025 02:05 PM
Is Windows Performance Monitor also showing a steady increase in memory, CPU, or disk activity over time?
-AK2DM
04-28-2025 05:40 PM
I will run the test again, record performance stats at the beginning and end, and post them. Might be a while.
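As a sketch of the kind of before/after comparison planned here, the same idea can be shown in Python with the standard-library `tracemalloc` module (an illustrative assumption, not the LabVIEW tooling): snapshot memory at the start and end of a run, and a higher end value indicates something is being retained across iterations.

```python
import tracemalloc

def snapshot_mb():
    """Return (current, peak) traced allocation in MB."""
    current, peak = tracemalloc.get_traced_memory()
    return current / 1e6, peak / 1e6

tracemalloc.start()
start_current, _ = snapshot_mb()

# Simulated workload: a hypothetical leak where each iteration's buffer
# is accidentally kept alive instead of being released.
retained = []
for _ in range(1000):
    retained.append(bytes(1000))  # stand-in for a leaked image buffer

end_current, end_peak = snapshot_mb()
print(f"start: {start_current:.2f} MB, end: {end_current:.2f} MB, peak: {end_peak:.2f} MB")
tracemalloc.stop()
```

If the end snapshot grows in proportion to the number of images processed, that strongly suggests per-image resources are not being disposed of between iterations.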