10-07-2008 02:58 PM
If Excel is having a hard time handling/graphing this much data, LabVIEW could do it.
You can break the file up into smaller files and/or graph everything in one VI.
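If you do want to split the file first, here is a minimal sketch of the idea (in Python only because a LabVIEW diagram can't be pasted into a post; the file name and chunk size are made-up assumptions):

```python
# Hypothetical sketch: split a large line-oriented log file into
# smaller files of lines_per_chunk lines each. Names are placeholders.
lines_per_chunk = 100_000

with open("big_log.txt") as src:
    chunk, part = [], 0
    for line in src:
        chunk.append(line)
        if len(chunk) == lines_per_chunk:
            with open(f"big_log_part{part:03d}.txt", "w") as dst:
                dst.writelines(chunk)
            chunk, part = [], part + 1
    if chunk:  # write any leftover lines at the end
        with open(f"big_log_part{part:03d}.txt", "w") as dst:
            dst.writelines(chunk)
```

The equivalent in LabVIEW is just a Read/Write Text File pair inside a loop.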
10-08-2008 08:44 AM
I'm with Cory on this one. LabVIEW is probably a better choice for analyzing the data (though, from what I understand, Excel '07 can handle more data).
One thing I would caution you about with the suggested decimation routine is the danger of producing aliases in the data if you are looking at periodic phenomena. I think you said you are looking at 1 Hz data; you may already be at risk of seeing aliases if there is anything above 0.5 Hz in your data and the signals weren't low-pass filtered before being sent to the ADC. 1 Hz is a typical sampling rate for things like thermocouples, which often have low-pass filters built into the signal conditioning hardware; maybe that's what you are looking at. If you then decimate without filtering, you may see frequencies in the data that don't really exist.
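To make that concrete, here is a small sketch (Python/NumPy/SciPy standing in for a LabVIEW diagram; the 1 Hz rate, the 0.33 Hz tone, and the 10x decimation factor are assumed values):

```python
import numpy as np
from scipy.signal import decimate

fs = 1.0                           # assumed 1 Hz sample rate
t = np.arange(0, 3600, 1.0 / fs)   # one hour of samples
# A 0.33 Hz component is fine at 1 Hz (Nyquist = 0.5 Hz), but after
# 10x decimation the new Nyquist is 0.05 Hz, so it aliases to 0.03 Hz.
x = np.sin(2 * np.pi * 0.33 * t)

naive = x[::10]         # no filtering: the 0.33 Hz tone shows up at 0.03 Hz
safe = decimate(x, 10)  # low-pass filters first, then keeps every 10th point
```

Plotting the spectrum of `naive` versus `safe` shows the phantom 0.03 Hz line in the unfiltered version.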
If you want some more help looking at the data, please feel free to post back with questions.
Chris
10-08-2008 10:30 AM
C. Minnella wrote: One thing I would caution you about with the suggested decimation routine is the danger of producing aliases in the data if you are looking at periodic phenomena.
....
If you then decimate without filtering, you may see frequencies in the data that don't really exist.
True. A very simple solution here would be to average N adjacent points instead of keeping every Nth point; it would require only a few trivial modifications to my code example.
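In textual form, the block-averaging idea looks something like this (a Python sketch standing in for the LabVIEW diagram; `block_average` and the example sizes are hypothetical):

```python
import numpy as np

def block_average(x, n):
    """Average each group of n adjacent samples, dropping any remainder."""
    m = len(x) - len(x) % n          # trim to a multiple of n
    return x[:m].reshape(-1, n).mean(axis=1)

# e.g. reduce 1-second data to 10-second data:
data_1s = np.random.rand(1000)
data_10s = block_average(data_1s, 10)   # 100 points, each a 10-sample mean
```

Averaging acts as a crude low-pass filter, so it reduces (though doesn't eliminate) the aliasing risk compared to simply keeping every Nth point.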
From the problem description:
"Essentially we collected data a 1 second intervals, but for analysis purposes we may only need to look at the data in 5 or 10 second intervals."
it seems this is not a problem, since he could have sampled at 10 second intervals to begin with.