EC3062 ECONOMETRICS
IDENTIFICATION OF ARMA MODELS
A stationary stochastic process can be characterised, equivalently, by its
autocovariance function or its partial autocovariance function.
It can also be characterised by its spectral density function, which is
the Fourier transform of the autocovariances {γτ; τ = 0, ±1, ±2, . . .}:

f(ω) = Σ_{τ=−∞}^{∞} γτ cos(ωτ) = γ0 + 2 Σ_{τ=1}^{∞} γτ cos(ωτ).
Here, ω ∈ [0, π] is an angular velocity, or frequency value, in radians per
period.
The empirical counterpart of the spectral density function is the
periodogram I(ωj), which may be defined by

(1/2) I(ωj) = Σ_{τ=1−T}^{T−1} cτ cos(ωjτ) = c0 + 2 Σ_{τ=1}^{T−1} cτ cos(ωjτ),

where ωj = 2πj/T; j = 0, 1, . . . , [T/2] are the Fourier frequencies and
{cτ; τ = 0, ±1, . . . , ±(T − 1)}, with cτ = T⁻¹ Σ_{t=τ}^{T−1} (yt − ȳ)(yt−τ − ȳ),
are the empirical autocovariances.
The Periodogram and the Autocovariances
We need to show that this definition of the periodogram is equivalent to
the previous definition, which was based on the following frequency
decomposition of the sample variance:
(1/T) Σ_{t=0}^{T−1} (yt − ȳ)² = (1/2) Σ_{j=0}^{[T/2]} (αj² + βj²),
where

αj = (2/T) Σ_t yt cos(ωjt) = (2/T) Σ_t (yt − ȳ) cos(ωjt),
βj = (2/T) Σ_t yt sin(ωjt) = (2/T) Σ_t (yt − ȳ) sin(ωjt).
Substituting these into the term T(αj² + βj²)/2 gives the periodogram

I(ωj) = (2/T){ [Σ_{t=0}^{T−1} cos(ωjt)(yt − ȳ)]² + [Σ_{t=0}^{T−1} sin(ωjt)(yt − ȳ)]² }.
The quadratic terms may be expanded to give

I(ωj) = (2/T) Σ_t Σ_s cos(ωjt) cos(ωjs)(yt − ȳ)(ys − ȳ)
      + (2/T) Σ_t Σ_s sin(ωjt) sin(ωjs)(yt − ȳ)(ys − ȳ).
Since cos(A) cos(B) + sin(A) sin(B) = cos(A − B), this can be written as

I(ωj) = (2/T) Σ_t Σ_s cos(ωj[t − s])(yt − ȳ)(ys − ȳ).
On defining τ = t − s and writing cτ = T⁻¹ Σ_t (yt − ȳ)(yt−τ − ȳ), we can
reduce the latter expression to

I(ωj) = 2 Σ_{τ=1−T}^{T−1} cos(ωjτ) cτ,
which is a Fourier transform of the empirical autocovariances.
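As an illustration (not part of the original notes, and with invented function names), the equivalence just derived can be checked numerically: the sketch below computes the periodogram once via the cosine transform of the empirical autocovariances and once via T(αj² + βj²)/2, and the two agree to machine precision.

```python
import numpy as np

def periodogram_from_autocovariances(y):
    """I(w_j) = 2*[c_0 + 2*sum_{tau=1}^{T-1} c_tau*cos(w_j*tau)]
    at the Fourier frequencies w_j = 2*pi*j/T, j = 0, ..., [T/2]."""
    T = len(y)
    d = y - y.mean()
    # empirical autocovariances c_tau = (1/T) sum_t (y_t - ybar)(y_{t-tau} - ybar)
    c = np.array([np.dot(d[tau:], d[:T - tau]) / T for tau in range(T)])
    freqs = 2 * np.pi * np.arange(T // 2 + 1) / T
    taus = np.arange(1, T)
    I = np.array([2 * (c[0] + 2 * np.sum(c[1:] * np.cos(w * taus))) for w in freqs])
    return freqs, I

def periodogram_from_fourier_coeffs(y):
    """I(w_j) = (T/2)*(alpha_j**2 + beta_j**2), with alpha_j, beta_j the
    cosine and sine coefficients of the mean deviations."""
    T = len(y)
    d = y - y.mean()
    t = np.arange(T)
    freqs = 2 * np.pi * np.arange(T // 2 + 1) / T
    a = np.array([(2 / T) * np.sum(d * np.cos(w * t)) for w in freqs])
    b = np.array([(2 / T) * np.sum(d * np.sin(w * t)) for w in freqs])
    return freqs, (T / 2) * (a ** 2 + b ** 2)
```

Because the data are expressed as deviations from the mean, both versions give I(ω0) = 0 at the zero frequency.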
Figure 1. The spectral density function of an MA(2) process
y(t) = (1 + 1.250L + 0.800L²)ε(t).
Figure 2. The graph of a periodogram calculated from 160
observations on a simulated series generated by an MA(2) process
y(t) = (1 + 1.250L + 0.800L²)ε(t).
Figure 3. The spectral density function of an AR(2) process
(1 − 0.273L + 0.810L²)y(t) = ε(t).
Figure 4. The graph of a periodogram calculated from 160
observations on a simulated series generated by an AR(2) process
(1 − 0.273L + 0.810L²)y(t) = ε(t).
Figure 5. The spectral density function of an ARMA(2, 1)
process (1 − 0.273L + 0.810L²)y(t) = (1 + 0.900L)ε(t).
Figure 6. The graph of a periodogram calculated from 160
observations on a simulated series generated by an ARMA(2, 1)
process (1 − 0.273L + 0.810L²)y(t) = (1 + 0.900L)ε(t).
The Methodology of Box and Jenkins
Box and Jenkins proposed to use the autocorrelation and partial autocor-
relation functions for identifying the orders of ARMA models. They paid
little attention to the periodogram.
Autocorrelation function (ACF). Given a sample y0, y1, . . . , yT−1 of
T observations, the sample autocorrelation function {rτ} is the sequence

rτ = cτ/c0, τ = 0, 1, . . . ,

where cτ = T⁻¹ Σ_t (yt − ȳ)(yt−τ − ȳ) is the empirical autocovariance at
lag τ and c0 is the sample variance.
As the lag increases, the number of observations from which the
empirical autocovariances are computed diminishes.
Partial autocorrelation function (PACF). The sample partial
autocorrelation function {pτ} gives the correlation between the two sets
of residuals obtained from regressing the elements yt and yt−τ on the set
of intervening values yt−1, yt−2, . . . , yt−τ+1. The partial autocorrelation
measures the dependence between yt and yt−τ after the effect of the
intervening values has been removed.
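Both estimators can be written down directly from these definitions. The following Python sketch (an illustration of mine, not from the notes) computes the sample ACF from the empirical autocovariances, and the sample PACF as the last coefficient in a least-squares autoregression of order τ, which is equivalent to correlating the two sets of residuals described above.

```python
import numpy as np

def acf(y, maxlag):
    """Sample autocorrelations r_tau = c_tau / c_0."""
    d = np.asarray(y, float) - np.mean(y)
    T = len(d)
    c = [np.dot(d[k:], d[:T - k]) / T for k in range(maxlag + 1)]
    return np.array(c) / c[0]

def pacf(y, maxlag):
    """Sample partial autocorrelations.  The lag-k value is the
    coefficient on y_{t-k} in a regression of y_t on y_{t-1}, ..., y_{t-k}."""
    d = np.asarray(y, float) - np.mean(y)
    T = len(d)
    out = [1.0]
    for k in range(1, maxlag + 1):
        # columns hold the lagged values d_{t-1}, ..., d_{t-k}
        X = np.column_stack([d[k - j:T - j] for j in range(1, k + 1)])
        beta, *_ = np.linalg.lstsq(X, d[k:], rcond=None)
        out.append(beta[-1])
    return np.array(out)
```

For a simulated AR(1) series with coefficient 0.7, the ACF decays geometrically while the PACF is close to 0.7 at lag 1 and close to zero thereafter, as the theory of the next sections predicts.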
Reduction to Stationarity.
The first step is to examine the plot of the data to judge whether or not
the process is stationary. A trend can be removed by fitting a parametric
curve or a spline function to create a stationary sequence of residuals to
which an ARMA model can be applied.
Box and Jenkins believed that many empirical series can be modelled
by taking a sufficient number of differences to make them stationary.
Thus, the process might be modelled by the ARIMA(p, d, q) equation

α(L)∇^d y(t) = µ(L)ε(t),

where ∇^d = (I − L)^d is the dth power of the difference operator.
Then, z(t) = ∇^d y(t) will be described by a stationary ARMA(p, q)
model. The inverse operator ∇⁻¹ is the summing or integrating operator,
which is why the model is described as an autoregressive integrated
moving-average model.
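A minimal sketch of the differencing step (my own illustration, under the assumption of noise-free trends): repeated application of ∇ = I − L annihilates a polynomial trend of degree d − 1, which is what makes the differenced series a candidate for a stationary ARMA model.

```python
import numpy as np

def difference(y, d=1):
    """Apply z = (I - L)^d y, i.e. take d successive differences.
    Each pass shortens the series by one observation."""
    z = np.asarray(y, float)
    for _ in range(d):
        z = z[1:] - z[:-1]
    return z
```

Two differences reduce a linear trend to an identically zero series, and three differences do the same to a quadratic trend.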
Figure 7. The plot of 197 concentration readings from a chemical
process taken at 2-hour intervals.
Figure 8. The autocorrelation function of the concentration readings
from a chemical process.
Figure 9. The autocorrelation function of the differences of the
concentration readings from the chemical process.
When stationarity has been achieved, the autocorrelation sequence of the
resulting series should converge rapidly to zero as the value of the lag
increases. (See Figure 9.)
The characteristics of pure autoregressive and pure moving-average
processes are easily spotted. Those of a mixed autoregressive
moving-average model are not so easily unravelled.
Moving-average processes. The theoretical autocorrelation function
{ρτ} of an MA(q) process has ρτ = 0 for all τ > q. The partial
autocorrelation function {πτ} is liable to decay towards zero gradually.
To determine whether the parent autocorrelations are zero after lag
q, we may use a result of Bartlett [1946], which shows that, for a sample
of size T, the standard deviation of rτ is approximately

(4) (1/√T){1 + 2(r1² + r2² + · · · + rq²)}^{1/2} for τ > q.
A measure of the scale of the autocorrelations is provided by the limits
of ±1.96/√T, which are the approximate 95% confidence bounds for the
autocorrelations of a white-noise sequence. These bounds are represented
by the dashed horizontal lines on the accompanying graphs.
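Bartlett's formula and the white-noise bounds are simple to compute. A sketch (the function name is my own invention):

```python
import numpy as np

def bartlett_se(r, q, T):
    """Approximate standard deviation of r_tau for tau > q, from
    Bartlett's formula sqrt((1 + 2*(r_1**2 + ... + r_q**2)) / T).
    `r` holds the sample autocorrelations with r[0] = 1."""
    return np.sqrt((1.0 + 2.0 * np.sum(np.asarray(r[1:q + 1], float) ** 2)) / T)

# With q = 0 the formula collapses to 1/sqrt(T), so the white-noise
# 95% confidence bounds are approximately +/- 1.96/sqrt(T).
```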
Figure 10. The graph of 120 observations on a simulated series
generated by the MA(2) process y(t) = (1 + 0.90L + 0.81L²)ε(t).
Figure 11. The theoretical autocorrelation function (ACF) of the
MA(2) process y(t) = (1 + 0.90L + 0.81L²)ε(t) (the solid bars)
together with its empirical counterpart, calculated from a simulated
series of 120 observations.
Figure 12. The theoretical partial autocorrelation function (PACF)
of the MA(2) process y(t) = (1 + 0.90L + 0.81L²)ε(t) (the solid bars)
together with its empirical counterpart, calculated from a simulated
series of 120 observations.
Autoregressive processes. The theoretical autocorrelation function
{ρτ} of an AR(p) process obeys a homogeneous difference equation based
upon the autoregressive operator α(L) = 1 + α1L + · · · + αpL^p:

(5) ρτ = −(α1ρτ−1 + · · · + αpρτ−p) for all τ ≥ p.
The autocorrelation sequence will be a mixture of damped exponential
and sinusoidal functions. If the sequence is of a sinusoidal nature, then
the presence of complex roots in the operator α(L) is indicated.
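The difference equation (5) makes the theoretical ACF of a pure AR process easy to generate. The sketch below treats the AR(2) case; the starting value ρ1 = −α1/(1 + α2) is my own addition, derived from the standard Yule-Walker equations rather than stated in the notes.

```python
import numpy as np

def ar2_acf(a1, a2, maxlag):
    """Theoretical ACF of (1 + a1*L + a2*L**2) y(t) = eps(t), iterating
    rho_tau = -(a1*rho_{tau-1} + a2*rho_{tau-2}) from rho_0 = 1 and the
    Yule-Walker starting value rho_1 = -a1/(1 + a2)."""
    rho = np.empty(maxlag + 1)
    rho[0] = 1.0
    rho[1] = -a1 / (1.0 + a2)
    for tau in range(2, maxlag + 1):
        rho[tau] = -(a1 * rho[tau - 1] + a2 * rho[tau - 2])
    return rho
```

For the process (1 − 1.69L + 0.81L²)y(t) = ε(t) of Figures 13-15, taking a1 = −1.69 and a2 = 0.81 gives complex roots, and the sequence traced out is a damped sinusoid.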
The partial autocorrelation function {πτ} serves most clearly to
identify a pure AR process. An AR(p) process has πτ = 0 for all τ > p.
The significance of the values of the empirical partial autocorrelations
is judged by the fact that, for a pth-order process, their standard
deviations for all lags greater than p are approximated by 1/√T. The
bounds of ±1.96/√T are plotted on the graph of the partial
autocorrelation function.
Figure 13. The graph of 120 observations on a simulated series
generated by the AR(2) process (1 − 1.69L + 0.81L²)y(t) = ε(t).
Figure 14. The theoretical autocorrelation function (ACF) of the
AR(2) process (1 − 1.69L + 0.81L²)y(t) = ε(t) (the solid bars)
together with its empirical counterpart, calculated from a simulated
series of 120 observations.
Figure 15. The theoretical partial autocorrelation function (PACF)
of the AR(2) process (1 − 1.69L + 0.81L²)y(t) = ε(t) (the solid bars)
together with its empirical counterpart, calculated from a simulated
series of 120 observations.
Mixed processes. Neither the theoretical autocorrelation function nor
the partial autocorrelation function of an ARMA(p, q) process has an
abrupt cutoff. The autocovariances of an ARMA(p, q) process satisfy the
same difference equation as those of a pure AR model for all values of
τ > max(p, q).
A rational transfer function is more effective in approximating an
arbitrary impulse response than is an AR or an MA transfer function.
The sum of any two mutually independent AR processes gives rise to
an ARMA process. Let y(t) and z(t) be AR processes of orders p and r
respectively described by α(L)y(t) = ε(t) and ρ(L)z(t) = η(t), wherein
ε(t) and η(t) are mutually independent white-noise processes. Then their
sum will be
(6) y(t) + z(t) = ε(t)/α(L) + η(t)/ρ(L) = {ρ(L)ε(t) + α(L)η(t)}/{α(L)ρ(L)} = µ(L)ζ(t)/{α(L)ρ(L)},

where µ(L)ζ(t) = ρ(L)ε(t) + α(L)η(t) constitutes a moving-average process
of order max(p, r).
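The orders can be checked by multiplying out the operators. In the sketch below (my own illustration, with invented coefficient values), operator coefficient sequences in powers of L are convolved to form α(L)ρ(L); with p = 1 and r = 2, the sum follows an ARMA(p + r, max(p, r)) = ARMA(3, 2) model.

```python
import numpy as np

# AR operators stored as coefficient vectors in powers of L, leading 1.
alpha = np.array([1.0, -0.5])        # alpha(L) = 1 - 0.5L        (p = 1)
rho = np.array([1.0, 0.3, -0.2])     # rho(L) = 1 + 0.3L - 0.2L^2 (r = 2)

# The sum obeys alpha(L)rho(L){y(t) + z(t)} = rho(L)eps(t) + alpha(L)eta(t);
# polynomial multiplication is convolution of the coefficient sequences.
combined_ar = np.convolve(alpha, rho)    # degree p + r = 3
```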
Figure 16. The graph of 120 observations on a simulated series generated by
the ARMA(2, 2) process (1 − 1.69L + 0.81L²)y(t) = (1 + 0.90L + 0.81L²)ε(t).
Figure 17. The theoretical autocorrelation function (ACF) of the ARMA(2, 2)
process (1 − 1.69L + 0.81L²)y(t) = (1 + 0.90L + 0.81L²)ε(t) (the solid
bars) together with its empirical counterpart, calculated from a simulated
series of 120 observations.
Figure 18. The theoretical partial autocorrelation function (PACF) of the
ARMA(2, 2) process (1 − 1.69L + 0.81L²)y(t) = (1 + 0.90L + 0.81L²)ε(t)
(the solid bars) together with its empirical counterpart, calculated from a
simulated series of 120 observations.
FORECASTING WITH ARMA MODELS
The Coefficients of the Moving-Average Expansion
The ARMA model α(L)y(t) = µ(L)ε(t) can be cast in the form
y(t) = {µ(L)/α(L)}ε(t) = ψ(L)ε(t), where

ψ(L) = ψ0 + ψ1L + ψ2L² + · · ·

comes from the series expansion of the rational function.
The method of finding the coefficients of the series expansion can be
illustrated by the second-order case:

(µ0 + µ1z)/(α0 + α1z + α2z²) = ψ0 + ψ1z + ψ2z² + · · · .

We rewrite this equation as

µ0 + µ1z = (α0 + α1z + α2z²)(ψ0 + ψ1z + ψ2z² + · · ·).
The following table assists us in multiplying together the two polynomials:

           ψ0         ψ1z         ψ2z²        · · ·
α0         α0ψ0       α0ψ1z       α0ψ2z²      · · ·
α1z        α1ψ0z      α1ψ1z²      α1ψ2z³      · · ·
α2z²       α2ψ0z²     α2ψ1z³      α2ψ2z⁴      · · ·
Performing the multiplication on the RHS of the equation and equating
the coefficients of the same powers of z on the two sides, we find that

µ0 = α0ψ0,                           ψ0 = µ0/α0,
µ1 = α0ψ1 + α1ψ0,                    ψ1 = (µ1 − α1ψ0)/α0,
0 = α0ψ2 + α1ψ1 + α2ψ0,              ψ2 = −(α1ψ1 + α2ψ0)/α0,
. . .                                . . .
0 = α0ψn + α1ψn−1 + α2ψn−2,          ψn = −(α1ψn−1 + α2ψn−2)/α0.
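The recursion in the right-hand column translates directly into code. A sketch (the function name is mine; the sign convention α(L) = α0 + α1L + · · · follows the notes, and µk is taken as zero beyond the MA order):

```python
import numpy as np

def psi_coefficients(alpha, mu, n):
    """psi_0, ..., psi_n in psi(z) = mu(z)/alpha(z), found by equating
    coefficients:  psi_k = (mu_k - sum_{i>=1} alpha_i*psi_{k-i}) / alpha_0."""
    psi = np.zeros(n + 1)
    for k in range(n + 1):
        m = mu[k] if k < len(mu) else 0.0
        s = sum(alpha[i] * psi[k - i]
                for i in range(1, min(k, len(alpha) - 1) + 1))
        psi[k] = (m - s) / alpha[0]
    return psi
```

For α(z) = 1 − 0.5z and µ(z) = 1, the recursion reproduces the geometric weights ψk = 0.5^k of an AR(1) process.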
The optimal (minimum mean-square error) forecast of yt+h is the
conditional expectation of yt+h given the information set It comprising
the values of {εt, εt−1, εt−2, . . .} or equally the values of
{yt, yt−1, yt−2, . . .}.
On taking expectations of y(t) and ε(t) conditional on It, we find that

(21) E(yt+k|It) = ŷt+k if k > 0,
     E(yt−j|It) = yt−j if j ≥ 0,
     E(εt+k|It) = 0 if k > 0,
     E(εt−j|It) = εt−j = yt−j − ŷt−j if j ≥ 0.
In this notation, the forecast h periods ahead is

(22) E(yt+h|It) = Σ_{k=1}^{h} ψh−k E(εt+k|It) + Σ_{j=0}^{∞} ψh+j E(εt−j|It)
               = Σ_{j=0}^{∞} ψh+j εt−j.
In practice, the forecasts are generated recursively via the equation

(23) y(t) = −{α1y(t − 1) + α2y(t − 2) + · · · + αpy(t − p)}
          + µ0ε(t) + µ1ε(t − 1) + · · · + µqε(t − q).
By taking the conditional expectation of this function, we get

(24) ŷt+h = −{α1ŷt+h−1 + · · · + αpyt+h−p}
          + µhεt + · · · + µqεt+h−q when 0 < h ≤ p, q,

(25) ŷt+h = −{α1ŷt+h−1 + · · · + αpyt+h−p} if q < h ≤ p,

(26) ŷt+h = −{α1ŷt+h−1 + · · · + αpŷt+h−p}
          + µhεt + · · · + µqεt+h−q if p < h ≤ q,

and

(27) ŷt+h = −{α1ŷt+h−1 + · · · + αpŷt+h−p} when p, q < h.
Equation (27) shows that, when h > p, q, the forecasting function becomes
a pth-order homogeneous difference equation in y. The p values of y(t)
from t = r = max(p, q) to t = r − p + 1 serve as the starting values for
the equation.
The behaviour of the forecast function beyond the reach of the starting
values is determined by the roots of the autoregressive operator α(L).
If all of the roots of α(z) = 0 are less than unity, then ŷt+h will
converge to zero as h increases.
If one of the roots is unity, then the forecast function will converge
to a nonzero constant.
If there are two unit roots, then the forecast function will converge to
a linear trend.
In general, if d of the roots are unity, then the general solution will
comprise a polynomial in t of order d − 1.
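For horizons beyond both p and q, the homogeneous recursion of equation (27) can be iterated directly, and the convergence behaviour just described can be observed. A sketch (my own illustration, storing α as [α0, α1, . . . , αp] with α0 = 1):

```python
def forecast_ar(alpha, history, h):
    """Iterate y_hat(t+h) = -(alpha_1*y_hat(t+h-1) + ... + alpha_p*y_hat(t+h-p)),
    using the last p values in `history` (most recent last) as starting
    values.  Returns the forecasts for horizons 1, ..., h."""
    p = len(alpha) - 1
    vals = list(history[-p:])
    for _ in range(h):
        vals.append(-sum(alpha[i] * vals[-i] for i in range(1, p + 1)))
    return vals[p:]
```

With α(L) = 1 − 0.9L the forecasts decay geometrically towards zero, whereas with the unit-root operator α(L) = 1 − L they remain at the last observed value, the nonzero constant of the text.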
The forecasts can be updated easily once the coefficients in the expansion
of ψ(L) = µ(L)/α(L) have been obtained. Consider

(28) ŷt+h|t+1 = {ψh−1εt+1 + ψhεt + ψh+1εt−1 + · · ·} and
     ŷt+h|t = {ψhεt + ψh+1εt−1 + ψh+2εt−2 + · · ·}.
The first of these is the forecast for h−1 periods ahead made at time t+1
whilst the second is the forecast for h periods ahead made at time t. It
can be seen that
(29) ŷt+h|t+1 = ŷt+h|t + ψh−1εt+1,
where εt+1 = yt+1 − ŷt+1 is the current disturbance at time t + 1. The
latter is also the prediction error of the one-step-ahead forecast made
at time t.
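The updating identity (29) can be verified numerically with an arbitrary set of ψ-weights and shocks; every value below is invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
psi = 0.8 ** np.arange(50)        # psi-weights of a hypothetical process
eps = rng.standard_normal(60)     # white-noise disturbances up to time t+1
t, h = 58, 3                      # forecast origin t and horizon h

# Forecast of y_{t+h} made at time t:       sum_j psi_{h+j} * eps_{t-j}
f_t = sum(psi[h + j] * eps[t - j] for j in range(len(psi) - h))
# Forecast of the same y_{t+h} at time t+1: sum_j psi_{h-1+j} * eps_{t+1-j}
f_t1 = sum(psi[h - 1 + j] * eps[t + 1 - j] for j in range(len(psi) - h + 1))
```

The two forecasts differ by exactly ψh−1 εt+1, as equation (29) asserts.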