Change regression model ($x^*_i = x_i -10$)

valentina89

New Member
Hi there. :wave:
I'm working through an exercise on multiple linear regression. Near the end, it asks me, for the same data as the previous model, to find the maximum likelihood estimates.

The previous model:
I have the matrix $$(X'X)^{-1}$$ and the matrix $$X'y$$ and the model is:

$$Y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + e_i$$

$$i= 1,...,10$$

Now I have:

$$Y_i = \gamma_0 + \gamma_1 x_i^* + \gamma_2 (x_i^*)^2 + e_i$$

$$x^*_i = x_i -10$$

$$i=1,...,10$$
To transform the model, can I just subtract 10 from the data in the matrix?

Can I do the same thing for the matrix $$X'y$$?
I have many doubts.
I'm posting the full text of the exercise so it's more understandable.

Consider the linear regression model:

$$Y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + e_i$$

with $$e_1, \ldots, e_n$$ independent and identically distributed random variables and $$x_i$$, $$i = 1, \ldots, n$$, fixed constants.

The only data I have are these:

https://s32.postimg.org/o20r1i5et/Immagine.png

I have to rewrite the whole thing... Here is what I tried:

$$X = \begin{bmatrix} 1 & x_1 & x_1^2\\ \vdots & \vdots & \vdots \\ 1 & x_{10} & x_{10}^2\\ \end{bmatrix}$$

$$X^* = \begin{bmatrix} 1 & x_1-10 & (x_{1}-10)^2\\ \vdots & \vdots & \vdots \\ 1 & x_{10}-10 & (x_{10}-10)^2\\ \end{bmatrix}$$

$$X^{*\prime}X^* = \begin{bmatrix} 10 & \sum_{i=1}^{10} (x_i-10) & \sum_{i=1}^{10}(x_{i}-10)^2\\ \sum_{i=1}^{10} (x_i-10) & \sum_{i=1}^{10} (x_{i}-10)^2 & \sum_{i=1}^{10}(x_{i}-10)^3\\ \sum_{i=1}^{10} (x_{i}-10)^2 & \sum_{i=1}^{10}(x_{i}-10)^3 & \sum_{i=1}^{10}(x_{i}-10)^4\\ \end{bmatrix}$$

Do I now have to compute the inverse? Am I following the proper approach? Excuse me, I don't write English well.
Thanks.
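One way to sanity-check this setup: each column of the shifted design matrix is a linear combination of the original columns, since $(x_i - 10) = x_i - 10 \cdot 1$ and $(x_i - 10)^2 = x_i^2 - 20x_i + 100 \cdot 1$. So $X^* = XA$ for an upper-triangular matrix $A$, which gives $X^{*\prime}X^* = A'X'XA$ without recomputing anything from scratch. A minimal numerical sketch (with made-up $x$ values, since the real data is only in the linked image):

```python
import numpy as np

# Made-up x values just to check the algebra; the thread's real data
# is only in the linked image.
x = np.arange(1.0, 11.0)
X = np.column_stack([np.ones(10), x, x**2])

# Change-of-basis matrix A: its columns express the shifted columns
# 1, (x-10), (x-10)^2 in terms of the original columns 1, x, x^2.
A = np.array([[1.0, -10.0, 100.0],
              [0.0,   1.0, -20.0],
              [0.0,   0.0,   1.0]])

xs = x - 10
Xs = np.column_stack([np.ones(10), xs, xs**2])

assert np.allclose(Xs, X @ A)                        # X* = X A
assert np.allclose(Xs.T @ Xs, A.T @ X.T @ X @ A)     # X*'X* = A' (X'X) A
print("shift is a linear change of basis")
```

Since $A$ is invertible, the same relation transfers to the inverse: $(X^{*\prime}X^*)^{-1} = A^{-1}(X'X)^{-1}(A')^{-1}$, so the given $(X'X)^{-1}$ can be reused directly.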

Dason

Ambassador to the humans
Are you just trying to find the estimates of the parameters for the model using the transformed data? I would just figure out what the new parameters are in terms of the old parameters and then use the invariance property of MLEs to get the estimates.

valentina89

New Member
The question is:
Consider, for the same data, the model

$$Y_i = \gamma_0 + \gamma_1 x_i^* + \gamma_2 (x_i^*)^2 + e_i$$ with $$x_i^* = x_i - 10$$, $$i = 1, \ldots, 10$$, and $$e_i$$ independent and identically distributed $$N(0, \sigma^2)$$ random variables. Find the maximum likelihood estimates of $$\gamma_0, \gamma_1, \gamma_2$$.

Last edited:

valentina89

New Member
and then use the invariance property of MLEs to get the estimates.

In this case, do the estimates stay the same?

$$\beta_0 = \gamma_0$$ etc.
I don't have a real transformation of the initial parameters, only a transformation of the data. Is what I did right?

Dason

Ambassador to the humans
Consider just a simple linear regression and the transformation you had. Our original model would be
$$E[y] = \beta_0 + \beta_1x_i$$

With $$x_i^* = x_i - 10$$ our model is

$$E[y] = \beta_0^* + \beta_1^*x_i^*$$

Replace with our definition of $$x_i^*$$ to get

$$E[y] = \beta_0^* + \beta_1^*(x_i - 10)$$

Expand and rearrange and we have

$$E[y] = \beta_0^* - 10\beta_1^* + \beta_1^*x_i$$

Now since we already know $$E[y] = \beta_0 + \beta_1x_i$$

we can now say that

$$\beta_0 = \beta_0^* - 10\beta_1^*$$
and
$$\beta_1 = \beta_1^*$$

So using this (since we're assuming we know the values for $$\beta_0$$ and $$\beta_1$$) we can solve for the new coefficients. It's the same process with your problem, but since you have a quadratic the equations take a little more work.
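For the quadratic case the same expand-and-match step gives, after collecting powers of $x_i$: $\beta_0 = \gamma_0 - 10\gamma_1 + 100\gamma_2$, $\beta_1 = \gamma_1 - 20\gamma_2$, $\beta_2 = \gamma_2$, and solving for the new parameters, $\gamma_2 = \beta_2$, $\gamma_1 = \beta_1 + 20\beta_2$, $\gamma_0 = \beta_0 + 10\beta_1 + 100\beta_2$. A quick numerical sketch of this (with made-up data, since the thread's actual data is only in the linked image; least-squares fits are the MLEs under normal errors):

```python
import numpy as np

# Hypothetical data for illustration only.
rng = np.random.default_rng(0)
x = np.arange(1.0, 11.0)
y = 2.0 + 0.5 * x - 0.3 * x**2 + rng.normal(0, 0.1, size=10)

# Design matrices for the original and shifted models.
X  = np.column_stack([np.ones(10), x, x**2])
xs = x - 10
Xs = np.column_stack([np.ones(10), xs, xs**2])

beta,  *_ = np.linalg.lstsq(X,  y, rcond=None)   # (beta0, beta1, beta2)
gamma, *_ = np.linalg.lstsq(Xs, y, rcond=None)   # (gamma0, gamma1, gamma2)

# Relations from expanding (x - 10)^2 and matching coefficients:
assert np.allclose(gamma[2], beta[2])
assert np.allclose(gamma[1], beta[1] + 20 * beta[2])
assert np.allclose(gamma[0], beta[0] + 10 * beta[1] + 100 * beta[2])
print("relations verified")
```

So by the invariance property, the new MLEs follow directly from the old ones with no refitting.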

valentina89

New Member
Wow!

Thank you so much.
Now I'll try to do the exercise right!