Let y(x) = a + bx be a regression line with intercept a and slope b, where b ≠ 0. Then:
I can prove: y(x) is an optimal regression line ==> SST = SSR + SSE (*).
Sketch of proof: show that SST = SSR + SSE under the condition that y(x) is optimal, i.e. y(x) is optimal <=> dR²/da = 0 (1) and dR²/db = 0 (2) for y(x), where R² := 1 - SSE/SST ("R-squared"). Conditions (1) and (2) then lead to an equation that proves (*)...
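For completeness, here is a minimal sketch of the algebraic step behind (*), assuming the usual definitions (not stated in the question): fitted values ŷ_i = a + b·x_i, SST = Σ(y_i − ȳ)², SSR = Σ(ŷ_i − ȳ)², SSE = Σ(y_i − ŷ_i)².

```latex
% Expand SST around the fitted values \hat y_i = a + b x_i:
\begin{aligned}
\mathrm{SST} &= \sum_i \bigl((y_i - \hat y_i) + (\hat y_i - \bar y)\bigr)^2 \\
             &= \mathrm{SSE} + \mathrm{SSR}
                + 2\sum_i (y_i - \hat y_i)(\hat y_i - \bar y).
\end{aligned}
% With residuals r_i = y_i - \hat y_i, the cross term equals
% 2\bigl[(a - \bar y)\sum_i r_i + b \sum_i r_i x_i\bigr];
% it vanishes whenever the normal equations \sum_i r_i = 0 and
% \sum_i r_i x_i = 0 hold, i.e. under conditions (1) and (2), giving (*).
```

Note that the cross term could in principle also vanish for a non-optimal line if the two sums cancel each other; that is exactly where a counterexample to the converse would have to come from.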
My idea: the proof of the above conjecture is "not easy" and may be a little hard to follow. It would be easier to construct a counterexample.
To construct a counterexample: define a training set TS = {observation points} and a simple-linear-regression (sLR) line that satisfies condition (*) but is not the optimal sLR line.
My question: does an easy counterexample exist? Or are there other ideas?
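If it helps, checking condition (*) for a candidate line can be scripted. A minimal sketch in Python/NumPy follows; the training set and the perturbed candidate line are arbitrary illustrative values, not taken from the question:

```python
import numpy as np

def decomposition_gap(x, y, a, b):
    """SST - (SSR + SSE) for the line y(x) = a + b*x on the data (x, y).

    The gap is zero exactly when condition (*) holds for this line."""
    y_hat = a + b * x
    y_bar = y.mean()
    sst = np.sum((y - y_bar) ** 2)
    ssr = np.sum((y_hat - y_bar) ** 2)
    sse = np.sum((y - y_hat) ** 2)
    return sst - (ssr + sse)

# Arbitrary small training set TS (illustrative values only)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8])

# Optimal least-squares line: the gap should be ~0, confirming (*)
b_opt, a_opt = np.polyfit(x, y, 1)
print("optimal line gap:  ", decomposition_gap(x, y, a_opt, b_opt))

# A perturbed (non-optimal) candidate line: a counterexample to the
# converse would be such a line whose gap is also exactly zero.
print("candidate line gap:", decomposition_gap(x, y, a_opt + 0.5, b_opt - 0.2))
```

Searching over (a, b) with such a gap function (for example on a grid, or by solving gap = 0 for b at some fixed a different from the optimal intercept) would be one way to look for an easy counterexample.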