Salutations,
I've worked with large data sets like the ones you've described. I once chewed through about 1.5 GB of data, and that was a pleasant experience.
The full analysis took roughly 45 minutes. Operations included resampling, order extraction, mean and standard deviation calculations, feature calculations (kurtosis, crest factor, mean, RMS, and so forth), binning, and so on.
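For reference, here's a rough sketch of what those per-block feature calculations can look like in Python (numpy and scipy assumed; this is just an illustration, not the exact code I used):

import numpy as np
from scipy.stats import kurtosis

def block_features(x):
    # Basic features for one block of samples.
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))           # root mean square
    return {
        "mean": x.mean(),
        "std": x.std(),
        "rms": rms,
        "crest": np.max(np.abs(x)) / rms,    # crest factor: peak over RMS
        "kurtosis": kurtosis(x),             # excess kurtosis (scipy default)
    }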
I don't know what your end goal is, but the best technique generally involves keeping as little data in memory as possible. For example, to maintain a running mean you only need three numbers:
the current mean, the new value to add in, and the number of samples seen so far.
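A minimal sketch of that running mean in Python (plain, no libraries needed):

def update_mean(current_mean, new_value, count):
    # count = number of samples already folded into current_mean
    count += 1
    current_mean += (new_value - current_mean) / count
    return current_mean, count

# usage: start from (0.0, 0) and feed samples one at a time
mean, n = 0.0, 0
for sample in (3.0, 5.0, 10.0):
    mean, n = update_mean(mean, sample, n)
# mean is now 6.0 and n is 3

The same incremental idea extends to standard deviation (Welford's algorithm), so you never need the whole series in memory for those stats either.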
Hence, the more your techniques avoid needing all the data in memory at once, the better off you are. The more data you hold at one time, the more memory you use; eventually the machine starts paging to disk and burns its CPU managing that. Once that happens you're boned for the most part, and it's usually best to kill the job and try again.
If you're acquiring data, you could have the program write out to files every so often and clear the in-memory data set each time it does so.
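Something along these lines, for example (the file name and chunk size here are just placeholders):

import csv

CHUNK_SIZE = 10_000      # flush to disk every N records; tune to your memory budget
buffer = []

def add_record(record, out_path="data_log.csv"):
    # Buffer one record in memory and flush the buffer to disk when it fills up.
    buffer.append(record)
    if len(buffer) >= CHUNK_SIZE:
        with open(out_path, "a", newline="") as f:
            csv.writer(f).writerows(buffer)
        buffer.clear()   # free the memory instead of letting the list grow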
Without knowing exactly what you're after, the best advice (as I repeat myself for the 252nd time, give or take): use as little data as possible.
Sincerely,
ElSmitho