Suppose \(x_{t} = w_{t} + \theta w_{t-1}\), where \(w_t\) is white noise with variance \(\sigma_{w}^2\).
Derive the minimum mean square error one-step forecast based on the infinite past and determine the mean square error of this forecast.
Let \(\tilde{x}_n^{n+1}\) be the truncated one-step-ahead forecast based on the \(n\) previous observations. Show that \(E[(x_{n+1}-\tilde{x}_n^{n+1})^2]=\sigma_w^2(1+\theta^{2n+2})\).
The solution for the first part states that:
\(x_{n+1} = -\sum_{j=1}^{\infty}(-\theta)^{j}x_{n+1-j} + w_{n+1}\) and \(\tilde{x}_{n+1}=-\sum_{j=1}^{\infty}(-\theta)^{j}x_{n+1-j}\), so the \(MSE = E[(x_{n+1}-\tilde{x}_{n+1})^2] = E[w_{n+1}^2] = \sigma_{w}^2\). I am not sure how the expression for \(x_{n+1}\) or \(\tilde{x}_{n}^{n+1}\) was arrived at. Can someone explain how this was derived?
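For reference, my understanding is that the expression comes from inverting the MA(1), which requires \(|\theta|<1\) (invertibility):

```latex
% x_t = (1 + \theta B) w_t, with B the backshift operator, so
%   w_t = (1 + \theta B)^{-1} x_t = \sum_{j=0}^{\infty} (-\theta)^j x_{t-j}.
% Evaluating at t = n+1 and isolating x_{n+1}:
\begin{align*}
w_{n+1} &= \sum_{j=0}^{\infty} (-\theta)^j x_{n+1-j}
         = x_{n+1} + \sum_{j=1}^{\infty} (-\theta)^j x_{n+1-j},\\
x_{n+1} &= -\sum_{j=1}^{\infty} (-\theta)^j x_{n+1-j} + w_{n+1}.
\end{align*}
% Conditioning on the infinite past x_n, x_{n-1}, \dots leaves the sum
% unchanged and sends w_{n+1} to its mean 0, so the forecast keeps only
% the sum, the forecast error is exactly w_{n+1}, and the MSE is \sigma_w^2.
```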
For the second part, the solution states \(\tilde{x}_{n}^{n+1}= -\sum_{j=1}^{n}(-\theta)^{j}x_{n+1-j}\) and \(MSE = E[(x_{n+1} - \tilde{x}_{n}^{n+1})^2] = E\left[\left(-\sum_{j=n+1}^{\infty}(-\theta)^{j}x_{n+1-j} + w_{n+1}\right)^2\right]\). I am not sure how this was arrived at.
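If it helps, the way I read it, the error expression is the full representation minus the truncated sum, and the tail of the sum collapses to a single noise term:

```latex
% Subtract the truncated forecast from the representation of x_{n+1};
% only the tail j > n of the infinite sum survives, plus w_{n+1}.
% Re-index the tail with j = n+1+k and use the inversion
% w_0 = \sum_{k=0}^{\infty} (-\theta)^k x_{-k}:
\begin{align*}
x_{n+1}-\tilde{x}_{n}^{n+1}
  &= w_{n+1} - \sum_{j=n+1}^{\infty} (-\theta)^j x_{n+1-j}
   = w_{n+1} - (-\theta)^{n+1}\sum_{k=0}^{\infty} (-\theta)^k x_{-k}\\
  &= w_{n+1} - (-\theta)^{n+1} w_{0}.
\end{align*}
% w_{n+1} and w_0 are uncorrelated, each with variance \sigma_w^2, hence
%   E[(x_{n+1}-\tilde{x}_n^{n+1})^2] = \sigma_w^2\,(1+\theta^{2n+2}).
```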
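As a numerical sanity check (a sketch; the choices \(\theta = 0.6\), \(n = 8\), the seed, and the pre-sample length are mine), one can simulate the MA(1) with a long pre-sample standing in for the infinite past and confirm that the truncated forecast error equals \(w_{n+1}-(-\theta)^{n+1}w_0\):

```python
import numpy as np

theta, n, past = 0.6, 8, 400          # past = length of pre-sample before t = 0
rng = np.random.default_rng(1)

# White noise w_{-past}, ..., w_{n+1}; array index i corresponds to time i - past.
w = rng.normal(size=past + n + 2)
x = w.copy()
x[1:] += theta * w[:-1]               # x_t = w_t + theta * w_{t-1}

def t2i(t):                           # map time t to array index
    return t + past

# Truncated one-step forecast from the n observations x_1, ..., x_n
xhat = -sum((-theta) ** j * x[t2i(n + 1 - j)] for j in range(1, n + 1))

err = x[t2i(n + 1)] - xhat
# Claimed identity: err = w_{n+1} - (-theta)^{n+1} * w_0, whose second
# moment is sigma_w^2 * (1 + theta^(2n+2)).
pred = w[t2i(n + 1)] - (-theta) ** (n + 1) * w[t2i(0)]
print(abs(err - pred))                # should be at floating-point level
```

The pre-sample makes the inversion \(w_0 = \sum_{k\ge 0}(-\theta)^k x_{-k}\) hold essentially exactly in the simulation, so the error identity checks to machine precision.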