Linear regression model: LSE, Var, Cov, etc.

#1
Studying for a test... can anyone check my solutions for (a)-(c) and help me with (d)? Any help would be appreciated.

Consider the model Yi = β1Xi + εi, i = 1, 2, ..., n, with Xi non-random and the εi satisfying the usual assumptions, i.e., E(εi) = 0 and Var(εi) = σ^2.

a) Show that the least squares estimate of β1 for this model is given by:
b1 = ∑XiYi / ∑Xi^2.
=> We need the value of β1 that minimizes S(β1) = ∑(Yi - β1Xi)^2. Differentiating, dS/dβ1 = -2∑Xi(Yi - β1Xi). Setting this equal to zero gives ∑XiYi = β1∑Xi^2, so b1 = ∑XiYi / ∑Xi^2. Since d^2S/dβ1^2 = 2∑Xi^2 > 0, this critical point is indeed a minimum.
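
As a quick numerical sanity check of the closed form in (a), it can be compared against a generic least-squares solver on made-up data; a minimal sketch, assuming NumPy and arbitrary x, y values (not from the problem):

```python
import numpy as np

# Made-up data purely for illustration (not from the problem statement).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form estimate from (a): b1 = sum(Xi*Yi) / sum(Xi^2)
b1_closed_form = np.sum(x * y) / np.sum(x**2)

# Generic least squares for the no-intercept model Y = beta1*X + error
b1_lstsq, *_ = np.linalg.lstsq(x.reshape(-1, 1), y, rcond=None)

print(b1_closed_form)   # the two estimates should agree
print(b1_lstsq[0])
```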

b) Show that the estimate b1 = ∑XiYi / ∑Xi^2 is an unbiased estimator of β1.
=> Since E(Yi) = β1Xi, E(b1) = ∑Xi*E(Yi) / ∑Xi^2 = β1∑Xi^2 / ∑Xi^2 = β1.

c) Show that Var(b1) = σ^2 / ∑Xi^2.
=> Since the Yi are uncorrelated (usual assumptions) with Var(Yi) = σ^2, Var(b1) = ∑Xi^2*Var(Yi) / (∑Xi^2)^2 = σ^2∑Xi^2 / (∑Xi^2)^2 = σ^2 / ∑Xi^2.
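
Parts (b) and (c) can likewise be checked by simulation: over many simulated datasets the average of b1 should come out close to β1 and its variance close to σ^2/∑Xi^2. A minimal Monte Carlo sketch, again assuming NumPy and arbitrary values for Xi, β1, and σ:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative values (not from the problem statement).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
beta1, sigma = 2.0, 1.5
n_sims = 200_000

# Simulate Yi = beta1*Xi + eps_i many times; recompute b1 for each dataset.
eps = rng.normal(0.0, sigma, size=(n_sims, x.size))
y = beta1 * x + eps              # each row is one simulated dataset
b1 = (y @ x) / np.sum(x**2)      # b1 = sum(Xi*Yi) / sum(Xi^2), vectorized

print(b1.mean())                 # should be close to beta1 = 2.0        (part b)
print(b1.var())                  # should be close to sigma^2/sum(Xi^2)  (part c)
print(sigma**2 / np.sum(x**2))   # theoretical value, 2.25/55 ≈ 0.0409
```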

d) Show that Cov(Ybar, b1) = Xbar*σ^2 / ∑Xi^2.
=> I don't know how to do this one. Help?
 

Dragan

Super Moderator
#2
d) Show that Cov(Ybar, b1) = Xbar*σ^2 / ∑Xi^2.
=> I don't know how to do this one. Help?
I’ll sketch the proof of part (d) for you.

Both Ybar and b1 are linear combinations of the Yi: Ybar = (1/n)∑Yi and b1 = ∑XjYj / ∑Xj^2. Under the usual assumptions the Yi are uncorrelated with Var(Yi) = σ^2, so by the bilinearity of covariance,

Cov(Ybar, b1) = Cov( (1/n)∑i Yi , ∑j XjYj / ∑Xj^2 )
= (1/(n∑Xj^2)) * ∑i ∑j Xj*Cov(Yi, Yj)
= (1/(n∑Xj^2)) * σ^2*∑i Xi     [only the i = j terms are nonzero]
= σ^2*Xbar / ∑Xi^2,

since ∑Xi = n*Xbar.
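
If you want to convince yourself numerically, here is a quick Monte Carlo sketch (assuming NumPy and arbitrary values for Xi, β1, and σ, not taken from the problem) showing the empirical covariance of Ybar and b1 landing near Xbar*σ^2/∑Xi^2:

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary illustrative values (not from the problem statement).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
beta1, sigma = 2.0, 1.5
n_sims = 200_000

eps = rng.normal(0.0, sigma, size=(n_sims, x.size))
y = beta1 * x + eps
ybar = y.mean(axis=1)                       # Ybar for each simulated dataset
b1 = (y @ x) / np.sum(x**2)                 # b1 for each simulated dataset

print(np.cov(ybar, b1)[0, 1])               # empirical Cov(Ybar, b1)
print(x.mean() * sigma**2 / np.sum(x**2))   # Xbar*sigma^2/sum(Xi^2) ≈ 0.1227
```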