Sum of the residuals in multiple linear regression

#1
In my textbook, the following results are proved in the context of SIMPLE linear regression:
∑ e_i = 0
∑ e_i Ŷ_i = 0

I tried to adapt the proofs to MULTIPLE linear regression, but I am unable to make them work, so I am puzzled...

Are these results also true in MULTIPLE linear regression?

Thanks!

vinux

Dark Knight
#2
Yes. In the Gauss-Markov setup, the least squares estimate and the MLE are the same (assuming normally distributed errors). (http://en.wikipedia.org/wiki/Gauss-Markov_theorem)


∑ e_i = 0
∑ e_i X_ij = 0 for every predictor j (assuming the model Y = a_0 + a_1 X_1 + a_2 X_2 + ...)

These are the normal equations.

Since Ŷ_i = a_0 + a_1 X_i1 + a_2 X_i2 + ..., it follows that
∑ e_i Ŷ_i = a_0 ∑ e_i + a_1 ∑ e_i X_i1 + a_2 ∑ e_i X_i2 + ... = 0
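The same argument is compact in matrix form. A sketch, writing X for the design matrix whose first column is all ones (so the intercept model above fits this form):

```latex
\[
  \hat\beta = (X^\top X)^{-1} X^\top y , \qquad
  e = y - X\hat\beta
  \quad\Longrightarrow\quad
  X^\top e = X^\top y - X^\top X \hat\beta = 0
  \quad \text{(normal equations)} .
\]
\[
  \mathbf{1}^\top e = 0 \ \text{(the row for the intercept column)} , \qquad
  \hat y^\top e = (X\hat\beta)^\top e = \hat\beta^\top X^\top e = 0 .
\]
```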

See this for more information: http://en.wikipedia.org/wiki/Linear_least_squares#Derivation_of_the_normal_equations
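If it helps, here is a quick numerical sanity check in Python (a sketch using numpy; the simulated data and coefficients are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for an illustrative regression with two predictors.
n = 100
X = np.column_stack([np.ones(n),            # intercept column of ones
                     rng.normal(size=n),    # X1
                     rng.normal(size=n)])   # X2
y = 1.0 + 2.0 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(size=n)

# Ordinary least squares fit; the solution satisfies the normal
# equations X'e = 0, which is what the identities above rely on.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat
e = y - y_hat

print(np.sum(e))          # ~0: sum of residuals
print(np.sum(e * y_hat))  # ~0: residuals orthogonal to fitted values
print(X.T @ e)            # ~0 vector: the normal equations X'e = 0
```

All three printed quantities are zero up to floating-point error, precisely because the fitted coefficients satisfy the normal equations.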