
Linear predictors from noise

Starting from an ARMA process $y(t) = W(z)e(t) = \frac{C(z)}{A(z)}e(t)$, where $e(t) \sim WN(0,\lambda^2)$, we know that $y(t)$ is a steady-state solution and $y(t) \sim MA(\infty)$, so:

$$y(t) = w_0e(t) + w_1e(t-1) + \ldots + w_ie(t-i) + \ldots = \sum_{i=0}^{+\infty} w_ie(t-i)$$

Where $w_i = f(\text{parameters of } C(z) \text{ and } A(z))$.
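The coefficients $w_i$ can be obtained by long division of $C(z)$ by $A(z)$ in powers of $z^{-1}$. As a minimal sketch (the function name and the example ARMA(1,1) polynomials are illustrative, not from the notes):

```python
def impulse_response(c_coeffs, a_coeffs, n):
    """Return the first n coefficients w_0..w_{n-1} of C(z)/A(z).

    c_coeffs, a_coeffs: coefficients of C(z) and A(z) in increasing
    powers of z^{-1}, e.g. [1.0, 0.5] means 1 + 0.5 z^{-1}.
    """
    w = []
    # working remainder, padded so the division can run n steps
    rem = list(c_coeffs) + [0.0] * n
    for i in range(n):
        wi = rem[i] / a_coeffs[0]      # next quotient coefficient
        w.append(wi)
        # subtract wi * A(z) * z^{-i} from the remainder
        for j, aj in enumerate(a_coeffs):
            if i + j < len(rem):
                rem[i + j] -= wi * aj
    return w

# Example: C(z) = 1 + 0.5 z^{-1}, A(z) = 1 - 0.9 z^{-1}
# gives w_0 = 1 and w_i = 0.9^{i-1} * 1.4 for i >= 1.
w = impulse_response([1.0, 0.5], [1.0, -0.9], 5)
```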

The predictor $\hat{y}(t+k|t) = a_0y(t) + a_1y(t-1) + \ldots + a_iy(t-i) + \ldots$ can be written as:

$$\hat{y}(t+k|t) = a_0\left[\sum_{i=0}^{+\infty} w_ie(t-i)\right] + a_1\left[\sum_{i=0}^{+\infty} w_ie(t-1-i)\right] + \ldots$$

$$= a_0[w_0e(t) + w_1e(t-1) + \ldots] + a_1[w_0e(t-1) + w_1e(t-2) + \ldots] + \ldots$$

Arranging the terms:

$$\hat{y}(t+k|t) = \beta_0e(t) + \beta_1e(t-1) + \ldots = \sum_{i=0}^{+\infty} \beta_ie(t-i)$$

Where:

$$\begin{align} \beta_0 &= a_0w_0 \\ \beta_1 &= a_0w_1 + a_1w_0 \\ &\ldots \end{align}$$
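In general $\beta_k = \sum_{j=0}^{k} a_j w_{k-j}$, i.e. the sequence $\{\beta_i\}$ is the discrete convolution of $\{a_i\}$ and $\{w_i\}$. A small sketch with hypothetical numbers (the values of $a_i$ and $w_i$ are made up for illustration):

```python
a = [0.5, 0.25]        # hypothetical predictor weights a_0, a_1
w = [1.0, 1.4, 1.26]   # hypothetical impulse-response weights w_0, w_1, w_2

# beta_k = sum_j a_j * w_{k-j}: discrete convolution of the two sequences
beta = [sum(a[j] * w[k - j] for j in range(len(a)) if 0 <= k - j < len(w))
        for k in range(len(a) + len(w) - 1)]

# beta[0] = a_0*w_0 and beta[1] = a_0*w_1 + a_1*w_0, matching the
# rearrangement above.
```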