What do you mean by "not accurate"?
A statistical analysis should not depend on the sample size; more data should only make the result more reliable. If you get statistically different results, your decimation is probably biased. This can happen, for example, if your signal contains a certain frequency component and your resampling interval is related to that frequency.
Example: Your data contains a strong component at the Nyquist frequency (half your sampling rate). If you throw away every other data point and then take the mean of the resampled data, you get an overestimate or underestimate, depending on the phase.
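Here is a minimal sketch of that effect on synthetic data (the 1000 Hz sampling rate, the offset, and the phase are all made-up assumptions, not taken from your description):

    import numpy as np

    fs = 1000                      # assumed sampling rate in Hz
    t = np.arange(0, 10, 1 / fs)   # 10 s of data
    phase = 0.3
    # constant offset plus a strong component at fs/2 (the Nyquist frequency)
    signal = 5.0 + 2.0 * np.cos(np.pi * fs * t + phase)

    full_mean = signal.mean()
    decimated_mean = signal[::2].mean()  # keep every other sample

    print(f"mean of full data:      {full_mean:.3f}")
    print(f"mean of 1:2 decimation: {decimated_mean:.3f}")
    # The decimated mean is shifted by about 2*cos(phase), because every kept
    # sample hits the Nyquist component at the same phase, while in the full
    # data the alternating signs cancel.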
Can you attach some example data? How do you decimate? How different are the averages as a function of sample size? Are the differences you see statistically significant?
Maybe you could add some randomization to your decimation, e.g. pick one element at random from each block of 10 consecutive samples for a 1:10 decimation. If you are only interested in averages, you could also average a fixed number of points to create one point in the resampled data. This significantly changes other statistics, e.g. the standard deviation, but you can still estimate the original standard deviation from the standard deviation of the resampled data and the decimation factor. Both approaches are sketched below.
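A rough sketch of both suggestions, assuming your data fits in a 1-D NumPy array and you decimate 1:10; the array `data` and its distribution here are invented purely for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=5.0, scale=2.0, size=100_000)  # assumed example data
    factor = 10
    # trim to a multiple of the factor and reshape into blocks of 10
    blocks = data[: len(data) // factor * factor].reshape(-1, factor)

    # 1) Randomized decimation: pick one random sample from each block of 10.
    picks = rng.integers(0, factor, size=len(blocks))
    random_decimated = blocks[np.arange(len(blocks)), picks]

    # 2) Block averaging: one output point per 10 input points.
    block_means = blocks.mean(axis=1)

    print("mean, full data:          ", data.mean())
    print("mean, random decimation:  ", random_decimated.mean())
    print("mean, block averages:     ", block_means.mean())

    # For uncorrelated samples the std of the block means shrinks by ~sqrt(factor),
    # so the original std can be estimated back from the resampled data:
    print("std, full data:           ", data.std())
    print("std estimated from blocks:", block_means.std() * np.sqrt(factor))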