12-15-2005 05:33 AM
I'm not using history buffers. The application acquires data, processes it, and saves one sample at a time while plotting in the chart. I tried buffering the writes, but I still get the delays in the file writing. Data is lost every 20 ms. If the data is not plotted, the file writing is OK. There are 7 chart plots; decreasing to 4 plots, the data loss drops to 10 ms.
When the points are plotted offline (reading the file back), they look like this:
......... - 10ms of "silence" - .............. - 10ms of "silence" - ...............
The dots represent data points, and "silence" means lost data.
In this application, I cannot rely on the sample rate of the board because of the processing, so I implemented a timebase counter (based on LabVIEW's internal Tick Count function).
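A minimal Python sketch of that idea (the names tick_count_ms and read_one_sample are stand-ins I made up; in LabVIEW this would be the Tick Count (ms) primitive wired into the acquisition loop):

import time

def tick_count_ms():
    # Stand-in for LabVIEW's Tick Count (ms): a monotonic millisecond counter.
    return int(time.monotonic() * 1000)

def read_one_sample():
    # Placeholder for the point-by-point DAQ read.
    return 0.0

start = tick_count_ms()
samples = []
for _ in range(100):
    value = read_one_sample()
    # Timestamp each sample against our own timebase instead of
    # trusting the board's nominal sample rate.
    samples.append((tick_count_ms() - start, value))

That way, gaps in the record show up as jumps in the stored timestamps rather than silently shifting the time axis.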
ricardo
12-16-2005 02:51 AM
Someone might correct me if I'm wrong:
As you wrote, you put the display and the save routines in different threads and pass the data with queues. However, if both routines have indicators/controls showing on a front panel, they will both run in the same UI thread.
Again, it's hard to tell without code.
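For reference, the pattern you describe would look roughly like this in Python (a sketch only, not your VI; in LabVIEW these are two independent while loops joined by a queue, and the point is that the file-writer loop has no front-panel objects, so chart updates cannot drag it into the UI thread):

import queue, threading

data_q = queue.Queue()

def producer():
    # Acquisition loop: acquire, process, enqueue; never waits on disk or display.
    for i in range(1000):
        data_q.put(float(i))      # placeholder for acquire + process
    data_q.put(None)              # sentinel tells the consumer to stop

def consumer():
    # File-writer loop: no display work here, so plotting cannot stall the writes.
    with open("data.log", "w") as f:
        while True:
            sample = data_q.get()
            if sample is None:
                break
            f.write(f"{sample}\n")

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()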
12-16-2005 02:57 AM
Next thing: do you use (double-)buffered acquisition? It sounds like point-by-point reading. Again: it would help to see the code.
If your acquisition fails when autoscaling is enabled, it might also fail when other system tasks run (e.g., moving windows with the mouse).
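To illustrate the difference (a rough Python sketch with a made-up driver; in LabVIEW terms this is a buffered AI Read returning N samples per call instead of one):

import queue, threading

BLOCK = 512
blocks = queue.Queue()

def acquire():
    # "Driver" side: samples accumulate in a buffer and are handed over
    # a whole block at a time, so a short stall on the reader's side
    # does not drop points.
    buf = []
    for i in range(10 * BLOCK):
        buf.append(float(i))      # placeholder sample
        if len(buf) == BLOCK:
            blocks.put(buf)       # completed block goes to the application
            buf = []              # start filling the next buffer
    blocks.put(None)              # done

def consume():
    out = []
    while True:
        block = blocks.get()      # waits until a full block is ready
        if block is None:
            break
        out.extend(block)         # e.g. write BLOCK samples to disk at once
    return out

t = threading.Thread(target=acquire)
t.start()
data = consume()
t.join()

With point-by-point reads, every hiccup in the loop costs samples; with a buffered read, the driver keeps collecting while the application catches up.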