I take the mean of every 2 consecutive data points, which gives me 7,500 means. Their variance is 1.4444 -- about 1/2 of 2.7950, which seems reasonable.

I take the mean of every 5 consecutive data points, which gives me 3,000 means. Their variance is 0.5889 -- about 1/5 of 2.7950, which also seems reasonable.

Similarly, for means of every 10 points: variance 0.2757 -- about 1/10 of 2.7950 -- still reasonable.

Now, for means of every 20 points: variance 0.1010 -- ONLY about 1/28 of 2.7950.

For means of every 60 points: variance 0.0133 -- ONLY about 1/210 of 2.7950.
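For reference, here is how I'm computing the block means, demonstrated on synthetic i.i.d. Gaussian data (the seed and the variance of 2.7950 are just stand-ins for my real sensor readings). On i.i.d. data like this, the 1/n scaling does hold at every block size:

```python
import random
import statistics

# Synthetic i.i.d. stand-in for the 15,000 sensor readings
# (hypothetical data -- my real data come from the experiment).
random.seed(0)
data = [random.gauss(0, 2.7950 ** 0.5) for _ in range(15000)]

pop_var = statistics.pvariance(data)

for n in (2, 5, 10, 20, 60):
    # Non-overlapping block means: 15000 / n means for block size n.
    means = [sum(data[i:i + n]) / n for i in range(0, len(data), n)]
    ratio = pop_var / statistics.pvariance(means)
    print(f"n={n:2d}: var(means)={statistics.pvariance(means):.4f}, "
          f"var(data)/var(means) ~ {ratio:.1f}")
```

For independent data, var(data)/var(means) comes out close to n at every block size, which is exactly what I see for n = 2, 5, 10 -- but not for n = 20 or 60.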

I got the data from sensor measurements in an experiment, so there has been no deliberate tampering with the data. Why, when means are taken over larger blocks, does the variance of the resulting means shrink much faster than the 1/n rate I would expect? What could be causing this?