Does the standard error of the regression not need to be corrected by n?

The standard error of an estimator is defined as the square root of the estimator's variance (or of its mean squared error (MSE), which equals the variance for unbiased estimators). More specifically, to get the standard error of the sample mean \(\bar{X}\), we would divide the variance of the sample that \(\bar{X}\) was computed from by \(n\) and then take the square root:

MSE (\(\bar{X}\)) = \(\frac{s^2}{n}\), where \(s^2\) is the sample variance for the sample of \(X\)s.

S.E. (\(\bar{X}\)) = \(\sqrt\frac{s^2}{n}\) = \(\frac{s}{\sqrt{n}}\).
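Just to make that formula concrete, here is a quick NumPy sketch (the sample itself is made up, drawn from an arbitrary normal distribution):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=2.0, size=50)  # one sample of n = 50

n = x.size
s2 = x.var(ddof=1)           # sample variance s^2
se_mean = np.sqrt(s2 / n)    # S.E.(x-bar) = sqrt(s^2 / n) = s / sqrt(n)

print(se_mean)
```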

However, in OLS regressions, the standard error of the regression is defined as

\(\sqrt\frac{\text{SSR}}{n - K}\), where SSR is the sum of squared residuals.

My question is: does this not need to be corrected again by dividing by the sample size \(n\)? Surely \(\frac{\text{SSR}}{n - K}\) is just the sample estimate of the population variance, and, by the formula above, it needs to be corrected. I understand that in both cases the standard error is the square root of an MSE, but in the first case the MSE is the sample variance divided by \(n\), while in the second case the MSE is just the sample variance. Does anyone have an explanation?
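One way to see why no further division by \(n\) is wanted: \(\frac{\text{SSR}}{n - K}\) estimates \(\sigma^2\), the variance of the individual errors \(e_i\), not the variance of any estimator. A small simulation sketch (all the numbers and names here are illustrative, not from any textbook):

```python
import numpy as np

# Simulate many datasets from a known model and check that
# SSR / (n - K) averages out to the true error variance sigma^2,
# with no extra division by n.
rng = np.random.default_rng(1)
n, K = 200, 2
sigma2 = 4.0                                           # true error variance
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta = np.array([1.0, 2.0])

estimates = []
for _ in range(2000):
    y = X @ beta + rng.normal(scale=np.sqrt(sigma2), size=n)
    b_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b_hat
    estimates.append(resid @ resid / (n - K))          # SSR / (n - K)

print(np.mean(estimates))  # should be close to sigma2 = 4.0
```

Dividing by \(n\) again would instead estimate \(\sigma^2 / n\), which is not what the "standard error of the regression" is meant to describe.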

Thanks in advance,

This has been bugging me for a while.


Ambassador to the humans
That is just the estimate for \(\sigma^2\) in the model. To get the standard error for something else we use this estimate but you would need to specify what quantity you're interested in getting the standard error for.
Hi Dason,

Thanks for helping me with this. Yeah, \(\frac{\text{SSR}}{n - K}\) is the estimate of \(\sigma^2\) in the model, and if we wanted the standard error of one of the \(\beta\)s we would combine it with the \((X'X)^{-1}\) matrix. But if you google/wiki/look up in a textbook the standard error of the regression, it just gives you \(\sqrt\frac{\text{SSR}}{n - K}\), which is the square root of the sample variance of the errors. So I thought sample variances were corrected by \(n\) when computing standard errors, as in the first example?
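For what it's worth, the "combine it with \((X'X)^{-1}\)" step looks like this in NumPy (a sketch with made-up data; \(\widehat{\sigma}^2\) is only an ingredient, and the \(1/n\)-type shrinkage you're looking for is hiding inside \((X'X)^{-1}\), whose entries shrink as \(n\) grows):

```python
import numpy as np

# Standard errors of the OLS coefficients:
# SE(beta_hat_j) = sqrt( sigma2_hat * [(X'X)^{-1}]_jj )
rng = np.random.default_rng(2)
n, K = 100, 2
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

b_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ b_hat
sigma2_hat = resid @ resid / (n - K)        # SSR / (n - K)

XtX_inv = np.linalg.inv(X.T @ X)
se_beta = np.sqrt(sigma2_hat * np.diag(XtX_inv))
print(se_beta)  # standard errors of the two coefficients
```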

Considering that there is a standard error of the regression, what is the mean of the regression? The regression line? That would be the point where the residuals equal zero, \(E(e_i) = 0\)?