# Cov in multiple linear regression

#### hanelliot

##### New Member
For multiple linear regression model Y = Xβ + e, assume X is non-random.
Assume Gauss-Markov assumptions hold. Show that variance-covariance matrix of b, the least square estimates of β is σ²(X'X)^-1.

Cov(b) = Cov[(X'X)^-1 * X'Y]
= (X'X)^-1 * X'Cov(Y)X(X'X)^-1
= ... I know the rest of the steps

My questions are:
1) The notation for the variance-covariance matrix of b can be either Var(b) or Cov(b), correct?
2) I don't understand how to go from Cov[(X'X)^-1 * X'Y] to (X'X)^-1 * X'Cov(Y)X(X'X)^-1. Can anyone clarify? Thanks

#### vinux

##### Dark Knight
1. Right.
2. Use this property:
Cov(AY) = A Cov(Y) A'

Here A = (X'X)^-1 * X'
so A' = X * (X'X)^-1
(since (AB)' = B'A', and (X'X)^-1 is symmetric, so ((X'X)^-1)' = (X'X)^-1)
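As a quick numerical sanity check of the property Cov(AY) = A Cov(Y) A' (a sketch with numpy; the particular matrices A and Cov(Y) below are arbitrary choices, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary covariance matrix for Y (symmetric positive definite by construction)
M = rng.standard_normal((3, 3))
cov_Y = M @ M.T

# Arbitrary linear map A -- it plays the role of (X'X)^-1 X' in the thread
A = rng.standard_normal((2, 3))

# The property: Cov(AY) = A Cov(Y) A'
cov_AY = A @ cov_Y @ A.T

# Compare against the sample covariance of AY over many draws of Y
Y = rng.multivariate_normal(np.zeros(3), cov_Y, size=200_000)
sample_cov = np.cov((Y @ A.T).T)

rel_err = np.linalg.norm(sample_cov - cov_AY) / np.linalg.norm(cov_AY)
print(rel_err)  # small -- the sample covariance matches A Cov(Y) A'
```

With 200,000 draws the sample covariance of AY agrees with A Cov(Y) A' to within sampling noise.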

#### Dragan

##### Super Moderator
Here's a sketch of what you're trying to get at:

Bhat = (X'X)^-1 X'y

Substituting y = XB + e in this expression gives

Bhat = (X'X)^-1 X'(XB + e)
= (X'X)^-1 X'XB + (X'X)^-1 X'e
= B + (X'X)^-1 X'e

Thus,

Bhat – B = (X'X)^-1 X'e

Now by definition:

var-cov(Bhat) = E[(Bhat – B)(Bhat – B)']
= E{[(X'X)^-1 X'e][(X'X)^-1 X'e]'}
= E[(X'X)^-1 X'ee'X(X'X)^-1]

where the last step is made by the fact that (AB)' = B'A'.

Since X is non-random, it can be pulled outside the expectation:

var-cov(Bhat) = (X'X)^-1 X' E[ee'] X(X'X)^-1
= (X'X)^-1 X' (Sigma^2 I) X(X'X)^-1
= Sigma^2 (X'X)^-1 X'X (X'X)^-1
= Sigma^2 (X'X)^-1
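The whole result can also be checked by simulation (a sketch with numpy; the design matrix X, true coefficients, and sigma below are arbitrary choices): hold X fixed, redraw e many times, recompute the least squares estimate each time, and compare the sample covariance of Bhat against Sigma^2 (X'X)^-1.

```python
import numpy as np

rng = np.random.default_rng(1)

n, p = 50, 3
sigma = 2.0

# Fixed (non-random) design matrix and true coefficients -- arbitrary choices
X = rng.standard_normal((n, p))
beta = np.array([1.0, -0.5, 2.0])

XtX_inv = np.linalg.inv(X.T @ X)
theoretical = sigma**2 * XtX_inv      # Sigma^2 (X'X)^-1

# Monte Carlo: each row of Y is one realization y' = (XB + e)', with X fixed
reps = 100_000
Y = X @ beta + sigma * rng.standard_normal((reps, n))

# Row i of B is Bhat' for replication i, since Bhat = (X'X)^-1 X'y
B = Y @ X @ XtX_inv

empirical = np.cov(B.T)
rel_err = np.linalg.norm(empirical - theoretical) / np.linalg.norm(theoretical)
print(rel_err)  # small -- empirical covariance matches Sigma^2 (X'X)^-1
```

The empirical mean of the replications also lands on the true B, illustrating unbiasedness, which is the Bhat – B = (X'X)^-1 X'e step above with E[e] = 0.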