Storing variably-decimated data in multiple circular buffers

Blindly and with a clear lack of imagination or adequate forethought, I have been happily struggling along to create a set of classes to manage some 'DataQueue' objects following the inspiration received here: Monster Panel V.

 

The idea there is to take your data and append it to a set of circular buffers at varying levels of decimation. This gives you the last few minutes at a high data rate, and longer history at progressively lower rates - perfect for plotting.
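As a language-agnostic sketch of that scheme (Python rather than LabVIEW, since G doesn't paste well into a post; the `DecimatedBuffers` class, tier factors, and capacities are all invented for illustration):

```python
from collections import deque

# Hypothetical tiers: (decimation factor, buffer capacity).
# Finer tiers cover less history; coarser tiers cover more.
TIERS = [
    (1,   10_000),   # full rate: last ~10 s at 1 kHz
    (10,  10_000),   # 100 Hz:    last ~100 s
    (100, 10_000),   # 10 Hz:     last ~1000 s
]

class DecimatedBuffers:
    def __init__(self, tiers=TIERS):
        self.tiers = tiers
        # deque(maxlen=...) behaves as a circular buffer:
        # appending to a full deque drops the oldest element.
        self.buffers = [deque(maxlen=cap) for _, cap in tiers]
        self.count = 0  # total samples seen so far

    def append(self, t, value):
        """Append one (timestamp, value) pair; each tier keeps
        every Nth sample, so coarser tiers span longer times."""
        for (factor, _), buf in zip(self.tiers, self.buffers):
            if self.count % factor == 0:
                buf.append((t, value))
        self.count += 1

dq = DecimatedBuffers()
for i in range(100_000):          # 100 s of 1 kHz data
    dq.append(i / 1000.0, float(i))
# Full-rate tier now holds only the newest 10 s;
# the 10 Hz tier still spans the whole run.
```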

 

Most of my difficulties have revolved around handling data publication at varying rates, some possibly with non-constant dt, and trying to appropriately manage the queues.

 

However, the problem I'm requesting advice with here is as follows:

This design works nicely so long as the graph you want to plot acts like a chart, showing the most recent data. However, it's not inconceivable that I will want to view a 30 s segment from an hour and a half ago - for that duration I'd want a relatively high data frequency, but the high-frequency buffers might well hold less than an hour and a half of history, in this made-up example.

 

Clearly, if I discard data, I can't conjure it from nowhere. Either I keep all the data in memory in large arrays (abandoning the whole idea), or I accept that older time periods can't have high-resolution data and search for a buffer covering the appropriate period rather than the best match on frequency/dt.
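That fallback could be sketched like this (Python for illustration; the `pick_buffer` helper and the tier numbers are invented): walk the tiers finest-first and take the first one whose history reaches back to the requested start time.

```python
# Hypothetical selection logic: given per-tier (dt_seconds, span_seconds)
# metadata, finest first, pick the finest buffer that still covers the
# requested start time, falling back to coarser tiers for older windows.
def pick_buffer(tiers, t_start, t_now):
    for idx, (dt, span) in enumerate(tiers):
        if t_now - t_start <= span:      # this tier reaches back far enough
            return idx, dt
    return len(tiers) - 1, tiers[-1][0]  # very old data: coarsest is the best we have

tiers = [(0.001, 60.0), (0.01, 600.0), (0.1, 36_000.0)]

# A 30 s window ending 90 min ago is only covered by the 10 Hz tier,
# so we accept 0.1 s resolution instead of the 1 ms we'd prefer:
idx, dt = pick_buffer(tiers, t_start=0.0, t_now=5400.0)
```

The trade this encodes is exactly the one described above: a recent window gets the finest buffer, an old window silently degrades to whatever resolution survives.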

 

Does anyone have guidance or tips on how best to approach this problem? Perhaps saving to and reading from disk is a possibility, but I imagine it would complicate the code quite significantly. If I do that, it might be easier to abandon this implementation, keep all data for the last N minutes/hours, and read anything older from disk as needed.

 

Data is taken from around 5 channels at perhaps 1 kHz, and another 3 at around 1 Hz.

With a total data rate of 5k samples/s × 8 B, I'm occupying 40 kB/s with no decimation - 80 kB/s including the time arrays - which comes out at around 288 MB/hour by my calculation, too high. However, a factor of 10 less is not unreasonable and would still feed a 2000 px graph for any time span longer than around 2000×2/100 = 40 seconds (20 seconds at 1 sample/px?).
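For what it's worth, the arithmetic can be re-derived in a few lines (Python, just restating the numbers above):

```python
# Back-of-envelope check of the data-rate figures from the post.
channels = 5
rate_hz = 1000
bytes_per_sample = 8          # double-precision float

values_bps = channels * rate_hz * bytes_per_sample  # 40 kB/s of values
with_time_bps = 2 * values_bps                      # 80 kB/s incl. time arrays
per_hour_mb = with_time_bps * 3600 / 1e6            # ~288 MB/hour

# Decimate by 10 (100 Hz): a 2000 px graph at 2 samples/px needs
# 4000 samples, so spans of 4000 / 100 = 40 s or more are fully fed.
decimated_hz = rate_hz // 10
min_span_s = 2000 * 2 / decimated_hz
```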

 

A typical timespan is probably less than 15 hours, but as soon as I start choosing these limits, I invite disaster when I want a 16-hour run...


Message 1 of 2

If you want to maintain and access higher-frequency data for a long period of time, then I think you will certainly have to write it out to disk. Consider using TDMS files to write it out, and search the shipping examples for a TDMS viewer and related examples.

 

I believe TDMS does a pretty good job of cataloging the data it has so you can go and find segments of data without a whole lot of problems.

Message 2 of 2