# Help Please: Linear Regression Model

##### New Member
Which of the following is NOT assumed to contribute to uncertainty in predicting y from x in a standard Model I linear regression?

a. Uncertainty due to variation of y around the model.
b. Uncertainty over identifying the true model.
c. Uncertainty in measuring X.
d. Uncertainty in measuring Y.

##### New Member
Hello, thank you for your response. This question is actually part of my lecture materials, as I am trying to teach myself statistics before starting a PhD in the fall. I struggled to figure this question out, and this is what I have come up with so far.

I initially thought the answer would be D (uncertainty in measuring Y), because we are trying to predict y from x, so the assumption is that we are uncertain about y to begin with; thus it does NOT contribute to the uncertainty. That was just a logical guess, and I am not sure my reasoning makes sense.

Any help is greatly appreciated.

#### Miner

##### TS Contributor
Most resources list only four assumptions for Ordinary Least Squares (OLS) regression, but there are actually five. This question pertains to that fifth assumption.

Research the differences between least squares regression and Deming regression. That difference pertains to the fifth assumption and will lead you to the correct answer.

#### hlsmith

##### Less is more. Stay pure. Stay poor.
What type of program are you going into?

##### New Member
Thank you, Miner, for guiding me in the right direction. I followed your advice and read up on the assumptions, and based on what I understand, the fifth assumption is that there is no autocorrelation.

Given my extremely limited background in statistics (hence why I am taking this course), I am still a bit confused as to how that helps me answer my question.

Initially I thought that if the fifth assumption is no autocorrelation, then the correct answer should be A (uncertainty due to variation of y around the model). But the more I read about the fifth assumption, the more confused and unsure about my answer I became.

I would kindly appreciate your help.

#### Miner

##### TS Contributor
OLS regression assumes that there is no variation in the independent variable(s) (i.e., no measurement error), so they can be treated as if they were fixed. OLS then minimizes the vertical residuals, i.e., the distances from each point to the regression line measured parallel to the y-axis.

That is the distinction between OLS and Deming regression. Deming regression assumes that there is uncertainty in the IVs as well as in the DV, and it minimizes the residuals measured perpendicular to the regression line.
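To make the contrast concrete, here is a small sketch (hypothetical simulated data, not from the original question) that fits the same noisy data with both methods. The OLS slope is the usual s_xy / s_xx; the Deming slope uses the closed-form solution with the error-variance ratio λ set to 1, which is the special case of orthogonal regression. When x is measured with error, the OLS slope is attenuated toward zero, while the Deming fit accounts for that error:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a true line y = 2 + 3x, then add measurement error to BOTH
# variables (this is the situation Deming regression is designed for).
x_true = np.linspace(0, 10, 50)
x = x_true + rng.normal(0, 1.0, x_true.size)  # error in the IV
y = 2 + 3 * x_true + rng.normal(0, 1.0, x_true.size)  # error in the DV

xbar, ybar = x.mean(), y.mean()
sxx = np.sum((x - xbar) ** 2)
syy = np.sum((y - ybar) ** 2)
sxy = np.sum((x - xbar) * (y - ybar))

# OLS: minimizes vertical residuals; treats x as fixed (error-free).
b_ols = sxy / sxx
a_ols = ybar - b_ols * xbar

# Deming with lambda = 1 (orthogonal regression): minimizes
# perpendicular residuals; allows error in both x and y.
lam = 1.0
b_dem = (syy - lam * sxx
         + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
a_dem = ybar - b_dem * xbar

print(f"OLS:    y = {a_ols:.2f} + {b_ols:.2f} x")
print(f"Deming: y = {a_dem:.2f} + {b_dem:.2f} x")
```

For positively correlated data, the Deming (λ = 1) slope always lies between the OLS slope of y on x and the steeper inverse regression of x on y, which is why the OLS estimate here comes out flatter than the Deming one.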