# variance in regression

#### PeterVincent

##### New Member
Hi,

Can someone help me figure this out regarding simple linear regression?

Firstly, B is beta and A is alpha.

We are told that y = A + Bx + e, where e is a random variable representing the error.

Assumption 1. e has a mean of 0, i.e. E(e) = 0,

and

Assumption 2. The variance of e, var(e) = v, is constant for all x.

If either of these is not met, i.e. E(e) ≠ 0 or v is not constant, then simple linear regression does not apply.

Here is where I have my problem:

Hence E(y), the expected value of y for a given x, is A + Bx, and the variance var(y) = v, the same variance as e. I cannot see, given assumptions 1 and 2 above, how var(y) = v for all x.
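For reference, the textbook step being questioned is usually justified by the constant-shift property of variance. A sketch of that argument, using the thread's A, B and v, and reading var(y) as the variance of y at a fixed x:

```latex
\operatorname{Var}(y) = \operatorname{Var}(A + Bx + e) = \operatorname{Var}(e) = v
```

since, once x is fixed, A + Bx is a single constant c, and Var(c + e) = Var(e) for any constant c.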

I feel that it has something to do with the fact that if you take a list of numbers and calculate the variance, then add a constant to all the numbers and recalculate, the variance will not change.

Using this with, say, the example of age of mother (x) and weight of baby (y): in the population there will be many 30-year-old mothers and many different weights of babies, so there will be a non-zero variance in y and in e. Plugging 30 into A + Bx gives a constant, so y = a constant + e, and therefore var(y) = var(e).

I have no problem with this for a single given x, but what if x = 35 or 25? Then the constant will be different for different x, so the variance over all the babies seems to be dependent on x.
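The scenario above is easy to simulate. A minimal sketch, with made-up values for A, B and v (none of these numbers come from the thread), drawing many babies' weights at each of a few mothers' ages and checking the variance of y at each x:

```python
import random
import statistics

random.seed(0)

# Hypothetical parameters for the mother's-age (x) vs baby-weight (y) model;
# the values are arbitrary, chosen only for illustration.
A, B = 2.0, 0.05   # intercept and slope
v = 0.25           # var(e), assumed constant for all x

def simulate_y(x, n=100_000):
    """Draw n values of y = A + B*x + e, with e ~ Normal(0, sqrt(v))."""
    return [A + B * x + random.gauss(0, v ** 0.5) for _ in range(n)]

for x in (25, 30, 35):
    ys = simulate_y(x)
    # The mean shifts with x (it tracks A + B*x), but the sample
    # variance stays close to v = 0.25 at every x: changing x only
    # adds a different constant to e, which leaves the spread unchanged.
    print(x, round(statistics.mean(ys), 3), round(statistics.variance(ys), 3))
```

The different constants at x = 25, 30, 35 move the centre of the weight distribution, but the spread around that centre is the same v each time; var(y) here means that per-x (conditional) spread, not the variance of weights pooled across all ages.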