Difference between estimating variance and standard deviation

#1
In a simulation study, is there any difference between

\(\bullet\) estimating the variance \(\sigma^2\) \(1000\) times and taking the average,
and

\(\bullet\) estimating the standard deviation \(\sigma\) \(1000\) times and taking the average?

Can I do either of these? Is there any reason to prefer one over the other?
 

Dragan

Super Moderator
#5
In a simulation study, is there any difference between

\(\bullet\) estimating the variance \(\sigma^2\) \(1000\) times and taking the average,
and

\(\bullet\) estimating the standard deviation \(\sigma\) \(1000\) times and taking the average?

Can I do either of these? Is there any reason to prefer one over the other?
Just consider the fact that the sample variance is an unbiased estimate of the population variance, whereas the sample standard deviation is not an unbiased estimate of the population standard deviation. There's your answer.
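
To see this concretely, here is a minimal simulation sketch in Python (the normal population, \(\sigma = 2\), and \(n = 10\) are just illustrative choices, not anything from the original question):

```python
import numpy as np

rng = np.random.default_rng(0)

sigma = 2.0   # assumed population standard deviation (illustrative)
n = 10        # sample size per replication (illustrative)
reps = 1000   # number of replications, as in the question

var_estimates = np.empty(reps)
sd_estimates = np.empty(reps)

for i in range(reps):
    x = rng.normal(loc=0.0, scale=sigma, size=n)
    var_estimates[i] = x.var(ddof=1)  # unbiased sample variance s^2
    sd_estimates[i] = x.std(ddof=1)   # sample standard deviation s

print("average s^2:", var_estimates.mean(), " target sigma^2:", sigma**2)
print("average s:  ", sd_estimates.mean(),  " target sigma:  ", sigma)
```

The averaged \(s^2\) lands close to \(\sigma^2 = 4\), while the averaged \(s\) falls systematically short of \(\sigma = 2\); the shortfall grows as \(n\) shrinks.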
 

Dragan

Super Moderator
#7
The purpose is to discuss what Hillary and Roger are talking about at around the 40-minute mark.

The sample variance is unbiased, but the sample standard deviation is not unbiased, and I find that so strange.
Well, it's really not strange. In short, taking the square root of the variance is a non-linear transformation.
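
To see why the non-linearity matters: the square root is strictly concave, so Jensen's inequality gives

\[
E[s] \;=\; E\!\left[\sqrt{s^2}\,\right] \;\le\; \sqrt{E\!\left[s^2\right]} \;=\; \sqrt{\sigma^2} \;=\; \sigma,
\]

with strict inequality whenever \(s^2\) is not constant, so \(s\) systematically underestimates \(\sigma\).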

That said, if we assume that the population is normally distributed, then the expected value of the sample standard deviation \(s\) is related to the population standard deviation \(\sigma\) by the close approximation

\( E[s] \approx \left[ \dfrac{4n - 4}{4n - 3} \right] \sigma. \)

Thus, we have (approximately):

\( E\!\left\{ \left[ 1 + \dfrac{1}{4(n - 1)} \right] s \right\} \approx \sigma, \)

since \( 1 + \dfrac{1}{4(n - 1)} = \dfrac{4n - 3}{4n - 4}. \)
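
A quick numerical check of that correction factor (again just a sketch; the normal population with \(\sigma = 2\) and \(n = 10\) are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

sigma, n, reps = 2.0, 10, 100_000   # illustrative values, not from the thread

# sample standard deviation s for each replication
s = np.array([rng.normal(0.0, sigma, n).std(ddof=1) for _ in range(reps)])

correction = 1.0 + 1.0 / (4.0 * (n - 1))   # the factor from the post

print("mean of s:           ", s.mean())                 # near (4n-4)/(4n-3) * sigma
print("mean of corrected s: ", (correction * s).mean())  # near sigma
print("target sigma:        ", sigma)
```

With \(n = 10\), the raw mean of \(s\) should come out near \(1.95\), and multiplying by \(1 + 1/36 \approx 1.028\) pulls the average back to essentially \(2\).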