# Standard deviation of sampling distribution when n=1

#### Frodo/Sociology

##### New Member
Hi everyone,

I'm reading this when it comes to the theory around the sampling distribution: "The standard deviation of the sample means (known as the standard error of the mean) will be smaller than the population standard deviation and will be equal to the standard deviation of the population divided by the square root of the sample size."

But what if we have a sample size of 1? Then the standard deviation of the sampling distribution (the standard error of the mean) would be the same as the population standard deviation. (In other words, it can't always be smaller than the population standard deviation if it's equal to it when n = 1.)

Or is this illogical in this case, because a sampling distribution of all possible samples of size 1 is basically the population itself? Can you even have a sample size of 1? Is this considered a "sample"?

Frodo

#### obh

##### Well-Known Member
Hi Frodo,

Clearly, the standard deviation statistic has practical meaning when you use a sample that is greater than 1.
So why are you interested in this question?

If the entire population contains only one subject then the standard deviation is 0, as there is no deviation from the average.

If your sample size is 1, then the sample standard deviation is not defined ... (clearly not 0)
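A quick way to see this in code (a Python sketch, not from the thread; the value 42.0 is made up): Python's `statistics.stdev` uses the n − 1 denominator, so a single observation makes the denominator zero and the sample standard deviation undefined.

```python
import statistics

sample = [42.0]  # a lone observation (made-up value)

# stdev divides by n - 1, so with n = 1 the denominator is zero
# and the sample standard deviation is simply undefined.
try:
    statistics.stdev(sample)
except statistics.StatisticsError as err:
    print("undefined:", err)
```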

#### Frodo/Sociology

##### New Member
> Hi Frodo,
>
> Clearly, the standard deviation statistic has practical meaning when you use a sample that is greater than 1.
> So why are you interested in this question?
>
> If the entire population contains only one subject then the standard deviation is 0, as there is no deviation from the average.
>
> If your sample size is 1, then the sample standard deviation is not defined ... (clearly not 0)
Hi obh,

Thank you for your response. This makes sense to me. However, the question I have is about the standard deviation of the sampling distribution.

Even if the standard deviation of the sample is not defined with a sample size of 1, wouldn't the standard deviation of the sampling distribution end up being the same as (equal to) the standard deviation of the population, when the sample size is 1?

In the quote I provided above, it says that the standard deviation of the sampling distribution is always smaller than the standard deviation of the population, because the standard deviation of the sampling distribution is a function of n (it equals the standard deviation of the population divided by the square root of the sample size). But how can that be if our sample size is 1? Since the square root of 1 is 1, the standard deviation of the sampling distribution would then be the same as the standard deviation of the population, not less than it.
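The n = 1 case can be checked directly by enumerating every possible sample from a tiny made-up population (a Python sketch; the population values are arbitrary):

```python
import numpy as np

# Toy population (values are made up for illustration).
population = np.array([2.0, 4.0, 6.0, 8.0])
sigma = population.std(ddof=0)  # population standard deviation

# With n = 1, every possible sample is a single value, and the mean of
# that sample is the value itself -- so the sampling distribution of the
# mean is just the population itself.
means_of_all_size1_samples = population
sem = means_of_all_size1_samples.std(ddof=0)

print(sigma == sem)                          # True: SEM equals sigma when n = 1
print(bool(np.isclose(sigma / np.sqrt(1), sem)))  # True: sigma/sqrt(1) is sigma
```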

I hope you can understand what I'm saying. Thanks again for your help.

Frodo


#### katxt

##### Well-Known Member
You are right. They should say "the standard deviation of the population is always greater than the standard deviation of the sampling distribution for any sensible sample."

#### spunky

##### Can't make spagetti
> I'm reading this when it comes to the theory around the sampling distribution: "The standard deviation of the sample means (known as the standard error of the mean) will be smaller than the population standard deviation and will be equal to the standard deviation of the population divided by the square root of the sample size."
Was this quote taken from a textbook in intro stats/methodology aimed at social scientists?

#### katxt

##### Well-Known Member
Also, σ/√n describes the sampling distribution only on average; the SEM calculated from any one sample is just an estimate. With samples of n=2, the calculated SEM comes out greater than the population SD about 15% of the time.
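That 15% figure can be checked by simulation. A minimal Python sketch, assuming a standard normal population (the seed and trial count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0          # population SD (standard normal population, an assumption)
n = 2                # sample size
trials = 200_000

# Draw many samples of size 2 and compute each sample's estimated SEM.
samples = rng.standard_normal((trials, n)) * sigma
s = samples.std(axis=1, ddof=1)   # sample standard deviation of each pair
sem_hat = s / np.sqrt(n)          # estimated standard error of the mean

frac = (sem_hat > sigma).mean()
print(frac)  # roughly 0.15-0.16 for a normal population
```

For a normal population this works out analytically to 2(1 − Φ(√2)) ≈ 0.157, consistent with the roughly 15% quoted above.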