11-03-2009 12:26 AM
I have a producer-consumer type of application in which one loop continuously reads data from the card and queues it, and another loop reads and saves the data. A third loop plots the data. The number of channels and the real-time analysis are defined by the user, so the time it takes to save, display, and analyze each block of data (one iteration of Analog/Digital Read) may vary for different settings.
Currently I set the "number of samples to get" on the Analog/Digital Read large enough at the start so I don't get a queue overflow. Now I would like to adjust it automatically while the program is running, so that the queue never overflows but the data is also not displayed in huge chunks at a time. Has anyone done this sort of thing before? Right now the only idea I have is to continuously read the number of elements in the queue and use some mathematical formula to increase or decrease the number of samples.
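The kind of adjustment I have in mind would look roughly like this (sketched here in Python, since LabVIEW code can't be pasted as text; the function name, thresholds, and scaling factors are only placeholders I would have to tune):

```python
# Illustrative sketch only - the real program is LabVIEW, and these
# thresholds/factors are assumptions, not values from the actual application.

def adjust_block_size(samples_per_read, queue_depth,
                      high_water=50, low_water=5,
                      grow=1.5, shrink=0.75,
                      min_samples=100, max_samples=100_000):
    """Return a new 'number of samples to get' based on queue backlog.

    If the consumers are falling behind (deep queue), read bigger blocks so
    there is less per-block overhead; if the queue is nearly empty, read
    smaller blocks so the display updates in finer chunks.
    """
    if queue_depth > high_water:
        samples_per_read = int(samples_per_read * grow)
    elif queue_depth < low_water:
        samples_per_read = int(samples_per_read * shrink)
    return max(min_samples, min(max_samples, samples_per_read))
```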
11-03-2009 05:41 PM
Abdel2,
Reading the number of elements in the queue continuously and using a mathematical formula to increase or decrease the number of samples would be a good method. You could also use the Decimate 1D Array function to display a smaller subset of your data if your concern is that you don't want to display huge chunks of data at a time.
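In text form, decimating a block before plotting amounts to something like the following (Python used only to illustrate the idea behind LabVIEW's Decimate 1D Array; the function name and factor are illustrative):

```python
import numpy as np

def decimate_for_display(block, factor):
    """Keep every 'factor'-th sample, similar in spirit to LabVIEW's
    Decimate 1D Array: trades plot detail for a lighter UI update."""
    return block[::factor]

# e.g. a 100k-sample block plotted as 1k points
display_block = decimate_for_display(np.arange(100_000), 100)
```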
11-19-2009 10:50 AM
Thanks Ben S. I think I'll pursue this then as soon as I get the time.
I am already decimating the data for display purposes, but not by using the "Decimate 1D Array" function. I need to decide the decimation factor programmatically, so I wrote a subroutine to do the job.
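As a rough text equivalent of that subroutine (mine is a LabVIEW VI; the names and the 2000-point cap below are only illustrative), the factor can be derived from the block length and a maximum number of points to plot:

```python
import math

def decimation_factor(block_length, max_plot_points=2000):
    """Pick the smallest integer stride that keeps the plotted block at or
    under max_plot_points samples (a factor of 1 means no decimation)."""
    return max(1, math.ceil(block_length / max_plot_points))
```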