Copyright (© MTS-2002GG): You are free to use and modify these slides.
Introduction to Time Series Analysis
Gloria González-Rivera
University of California, Riverside
and
Jesús Gonzalo U. Carlos III de Madrid
Spring 2002
Brief Review of Probability

• Sample Space: Ω, the set of possible outcomes of some random experiment
• Outcome: ω ∈ Ω, a single element of the Sample Space
• Event: E ⊂ Ω, a subset of the Sample Space
• Field: F = {E : E ⊂ Ω}, the collection of Events we will be considering
• Random Variable: Z : Ω → S, a function from the Sample Space Ω to a State Space S
• State Space: S, a space containing the possible values of a random variable; common choices are the integers N, the reals R, k-vectors R^k, the complex numbers C, the positive reals R+, etc.
• Probability: P : F → [0, 1], obeying the three rules that you must know very well
• Distribution: µ : B → [0, 1], where B ⊂ {A : A ⊂ R} is the Borel sets (intervals, etc.)
Brief Review (cont)

• Random Vectors: Z = (Z_1, Z_2, ..., Z_n) is an n-dimensional random vector if its components Z_1, ..., Z_n are one-dimensional real-valued random variables.
If we interpret t = 1, ..., n as equidistant instants of time, Z_t can stand for the outcome of an experiment at time t. Such a time series may, for example, consist of Toyota share prices Z_t on n succeeding days. The new aspect, compared to a one-dimensional random variable, is that we can now talk about the dependence structure of the random vector.
• Distribution function F_Z of Z: it is the collection of the probabilities

$$F_Z(z) = P(Z_1 \le z_1, \ldots, Z_n \le z_n) = P(\{\omega : Z_1(\omega) \le z_1, \ldots, Z_n(\omega) \le z_n\})$$
Stochastic Processes

We suppose that the exchange rate €/$ at every fixed instant t between 5 p.m. and 6 p.m. this afternoon is random. Therefore we can interpret it as a realization Z_t(ω) of the random variable Z_t, and so we observe Z_t(ω) for 17 < t < 18 (on a 24-hour clock). In order to make a guess at 6 p.m. about the exchange rate Z_19(ω) at 7 p.m., it is reasonable to look at the whole evolution of Z_t(ω) between 5 and 6 p.m. A mathematical model describing this evolution is called a stochastic process.
Stochastic Processes (cont)

Consider Z(t, ω) = Z_t(ω) : Ω → R, and suppose that:

(1) For a fixed t: Z_t(·) is just a random variable. Changing the time index, we can generate several random variables:
$$Z_{t_1}(\omega), Z_{t_2}(\omega), \ldots, Z_{t_n}(\omega)$$
This collection of random variables is called a STOCHASTIC PROCESS.

(2) For a fixed ω: Z^\omega(·) : T → R is a realization or sample function, from which a realization is
$$z_{t_1}, z_{t_2}, \ldots, z_{t_n}$$
A realization of the stochastic process is called a TIME SERIES.

A stochastic process is a collection of time-indexed random variables defined on some space Ω:
$$(Z_t, t \in T) = (Z(t, \omega), t \in T, \omega \in \Omega)$$
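To make the two views concrete, here is a minimal NumPy sketch (not part of the original slides; the Gaussian random walk and all sizes are arbitrary choices of mine). The process is stored as a matrix whose rows index outcomes ω and whose columns index time t, so fixing t picks out a random variable and fixing ω picks out a realization.

```python
import numpy as np

rng = np.random.default_rng(0)

# A stochastic process stored as a matrix: rows index outcomes omega,
# columns index time t. (A Gaussian random walk, chosen only for illustration.)
n_omega, n_time = 5, 100
Z = rng.normal(size=(n_omega, n_time)).cumsum(axis=1)  # Z[i, t] = Z_t(omega_i)

# (1) Fix t: a column of Z is one random variable, sampled at 5 outcomes.
fixed_t = Z[:, 10]

# (2) Fix omega: a row of Z is one realization (sample path), i.e. a time series.
fixed_omega = Z[2, :]
```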
Examples of stochastic processes

E1: Let the index set be T = {1, 2, 3} and let the space of outcomes (Ω) be the possible outcomes associated with tossing one die:
Ω = {1, 2, 3, 4, 5, 6}
Define
$$Z(t, \omega) = t + [\text{value on die}]^2 \cdot t$$
Therefore, for a particular ω, say ω_3 = {3}, the realization or path would be (10, 20, 30).
Q1: Draw all the different realizations (six) of this stochastic process.
Q2: Think of an economically relevant variable as a stochastic process and write down an example similar to E1 with it. Specify very clearly the sample space and the “rule” that generates the stochastic process.

E2: A Brownian Motion B = (B_t, t ∈ [0, ∞)):
• It starts at zero: B_0 = 0
• It has stationary, independent increments
• For every t > 0, B_t has a normal N(0, t) distribution
• It has continuous sample paths: “no jumps”.
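A short Python sketch of both examples (an illustration of mine, not from the slides): the first loop enumerates the six paths of E1, and the second block builds a standard discrete approximation of the Brownian motion in E2, where the grid size n = 1000 is an arbitrary choice.

```python
import numpy as np

# E1: all six realizations of Z(t, omega) = t + t * (die value)^2, t = 1, 2, 3.
t = np.arange(1, 4)
for die in range(1, 7):
    print(f"die = {die}: path = {t + t * die**2}")   # die = 3 gives [10 20 30]

# E2: approximate a Brownian path on [0, 1] by cumulative sums of independent
# N(0, dt) increments (stationary, independent increments; B_0 = 0).
n = 1000
dt = 1.0 / n
rng = np.random.default_rng(1)
B = np.concatenate([[0.0], rng.normal(0.0, np.sqrt(dt), n).cumsum()])
```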
Distribution of a Stochastic Process

In analogy to random variables and random vectors, we want to introduce non-random characteristics of a stochastic process, such as its distribution, expectation, etc., and describe its dependence structure. This is a task much more complicated than the description of a random vector. Indeed, a non-trivial stochastic process Z = (Z_t, t ∈ T) with infinite index set T is an infinite-dimensional object; it can be understood as the infinite collection of the random variables Z_t, t ∈ T. Since the values of Z are functions on T, the distribution of Z should be defined on subsets of a certain “function space”, i.e.
P(Z ∈ A), A ∈ F,
where F is a collection of suitable subsets of this space of functions. This approach is possible, but requires advanced mathematics, and so we will try something simpler.

The finite-dimensional distributions (fidis) of the stochastic process Z are the distributions of the finite-dimensional vectors
(Z_{t_1}, ..., Z_{t_n}), t_1, ..., t_n ∈ T,
for all possible choices of times t_1, ..., t_n ∈ T and every n ≥ 1.
Stationarity

Consider the joint probability distribution of the collection of random variables
$$F_{t_1, t_2, \ldots, t_n}(z_1, z_2, \ldots, z_n) = P(Z_{t_1} \le z_1, Z_{t_2} \le z_2, \ldots, Z_{t_n} \le z_n)$$

1st-order stationary process if
$$F_{t_1}(z_1) = F_{t_1+k}(z_1) \quad \text{for any } t_1, k$$

2nd-order stationary process if
$$F_{t_1, t_2}(z_1, z_2) = F_{t_1+k,\, t_2+k}(z_1, z_2) \quad \text{for any } t_1, t_2, k$$

n-order stationary process if
$$F_{t_1, \ldots, t_n}(z_1, \ldots, z_n) = F_{t_1+k, \ldots, t_n+k}(z_1, \ldots, z_n) \quad \text{for any } t_1, \ldots, t_n, k$$

Definition. A process is strongly (strictly) stationary if it is an n-order stationary process for any n.
Moments

$$\mu_t = E(Z_t) = \int z_t \, f(z_t) \, dz_t$$
$$\sigma_t^2 = Var(Z_t) = E(Z_t - \mu_t)^2 = \int (z_t - \mu_t)^2 f(z_t) \, dz_t$$
$$Cov(Z_{t_1}, Z_{t_2}) = E[(Z_{t_1} - \mu_{t_1})(Z_{t_2} - \mu_{t_2})]$$
$$\rho(t_1, t_2) = \frac{Cov(Z_{t_1}, Z_{t_2})}{\sigma_{t_1} \sigma_{t_2}}$$
Moments (cont)

For a strictly stationary process:
$$\mu_t = \mu, \qquad \sigma_t^2 = \sigma^2$$
because
$$F_{t_1}(z_1) = F_{t_1+k}(z_1) \;\rightarrow\; \mu_{t_1} = \mu_{t_1+k} = \mu,$$
provided that $E(Z_t) < \infty$ and $E(Z_t^2) < \infty$. Moreover,
$$F_{t_1, t_2}(z_1, z_2) = F_{t_1+k,\, t_2+k}(z_1, z_2)$$
$$\Rightarrow\; cov(Z_{t_1}, Z_{t_2}) = cov(Z_{t_1+k}, Z_{t_2+k})$$
$$\Rightarrow\; \rho(t_1, t_2) = \rho(t_1+k, t_2+k);$$
let $t_1 = t - k$ and $t_2 = t$; then
$$\rho(t_1, t_2) = \rho(t-k, t) = \rho(t, t+k) = \rho_k.$$
The correlation between any two random variables depends only on the time difference.
Weak Stationarity

A process is said to be n-order weakly stationary if all its joint moments up to order n exist and are time invariant.

Covariance stationary process (2nd-order weakly stationary):
• constant mean
• constant variance
• covariance function depends only on the time difference between the R.V.s
Autocovariance and Autocorrelation Functions

For a covariance stationary process:
$$E(Z_t) = \mu$$
$$Var(Z_t) = \sigma^2 = \gamma_0$$
$$Cov(Z_t, Z_s) = \gamma_{|t-s|}$$
$$\gamma_k = Cov(Z_t, Z_{t+k})$$
$$\rho_k = \frac{Cov(Z_t, Z_{t+k})}{\sqrt{Var(Z_t)\,Var(Z_{t+k})}} = \frac{\gamma_k}{\gamma_0}$$

γ : k → γ_k ∈ R is the autocovariance function
ρ : k → ρ_k ∈ [−1, 1] is the autocorrelation function (ACF)
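Given one observed realization z_1, ..., z_n, the ACF is estimated by its sample analogue. The estimator below is a common textbook choice, not one the slides spell out, sketched in Python:

```python
import numpy as np

def sample_acf(z, max_lag):
    """Sample ACF: rho_hat_k = gamma_hat_k / gamma_hat_0, where
    gamma_hat_k = (1/n) * sum_{t=1}^{n-k} (z_t - zbar)(z_{t+k} - zbar)."""
    z = np.asarray(z, dtype=float)
    n, zbar = len(z), np.mean(z)
    d = z - zbar
    gamma = [np.dot(d[: n - k], d[k:]) / n for k in range(max_lag + 1)]
    return np.array(gamma) / gamma[0]

# Example: for an uncorrelated series the sample ACF should be near 0 at k >= 1.
print(sample_acf(np.random.default_rng(0).normal(size=500), 10))
```

Dividing by n rather than n − k is the conventional choice: it keeps the estimated autocovariance sequence positive semi-definite.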
Properties of the autocorrelation function

1. If γ_0 = Var(Z_t), then ρ_0 = 1.
2. Since ρ_k is a correlation coefficient, |ρ_k| ≤ 1 ⇒ |γ_k| ≤ γ_0.
3. γ_k = γ_{−k} and ρ_k = ρ_{−k}, since
$$\gamma_{-k} = E[(Z_t - \mu)(Z_{t-k} - \mu)] = E[(Z_{t+k} - \mu)(Z_t - \mu)] = \gamma_k.$$
Partial Autocorrelation Function (conditional correlation)

This function gives the correlation between two random variables that are k periods apart when the in-between linear dependence (between t and t+k) is removed.

Let Z_t and Z_{t+k} be two random variables; the PACF is given by
$$\rho(Z_t, Z_{t+k} \mid Z_{t+1}, \ldots, Z_{t+k-1})$$

Motivation: think about a regression model (without loss of generality, assume that E(Z) = 0):
$$Z_{t+k} = \phi_{k1} Z_{t+k-1} + \phi_{k2} Z_{t+k-2} + \cdots + \phi_{kk} Z_t + e_{t+k} \qquad (1)$$
where e_{t+k} is uncorrelated with Z_{t+k-j}, j ≥ 1.

Multiply (1) by Z_{t+k-j}:
$$Z_{t+k} Z_{t+k-j} = \phi_{k1} Z_{t+k-1} Z_{t+k-j} + \phi_{k2} Z_{t+k-2} Z_{t+k-j} + \cdots + \phi_{kk} Z_t Z_{t+k-j} + e_{t+k} Z_{t+k-j}$$

(2) Take expectations:
$$\gamma_j = \phi_{k1} \gamma_{j-1} + \phi_{k2} \gamma_{j-2} + \cdots + \phi_{kk} \gamma_{j-k}$$
Dividing by the variance of the process:
$$\rho_j = \phi_{k1} \rho_{j-1} + \phi_{k2} \rho_{j-2} + \cdots + \phi_{kk} \rho_{j-k}, \qquad j = 1, 2, \ldots, k$$

Yule-Walker equations:
$$\rho_1 = \phi_{k1} \rho_0 + \phi_{k2} \rho_1 + \cdots + \phi_{kk} \rho_{k-1}$$
$$\rho_2 = \phi_{k1} \rho_1 + \phi_{k2} \rho_0 + \cdots + \phi_{kk} \rho_{k-2}$$
$$\vdots$$
$$\rho_k = \phi_{k1} \rho_{k-1} + \phi_{k2} \rho_{k-2} + \cdots + \phi_{kk} \rho_0$$

k = 1:
$$\rho_1 = \phi_{11} \rho_0 \;\Rightarrow\; \phi_{11} = \rho_1$$

k = 2:
$$\rho_1 = \phi_{21} \rho_0 + \phi_{22} \rho_1, \qquad \rho_2 = \phi_{21} \rho_1 + \phi_{22} \rho_0$$
$$\Rightarrow\; \phi_{22} = \frac{\begin{vmatrix} 1 & \rho_1 \\ \rho_1 & \rho_2 \end{vmatrix}}{\begin{vmatrix} 1 & \rho_1 \\ \rho_1 & 1 \end{vmatrix}}$$

k = 3:
$$\rho_1 = \phi_{31} \rho_0 + \phi_{32} \rho_1 + \phi_{33} \rho_2$$
$$\rho_2 = \phi_{31} \rho_1 + \phi_{32} \rho_0 + \phi_{33} \rho_1$$
$$\rho_3 = \phi_{31} \rho_2 + \phi_{32} \rho_1 + \phi_{33} \rho_0$$
$$\Rightarrow\; \phi_{33} = \frac{\begin{vmatrix} 1 & \rho_1 & \rho_1 \\ \rho_1 & 1 & \rho_2 \\ \rho_2 & \rho_1 & \rho_3 \end{vmatrix}}{\begin{vmatrix} 1 & \rho_1 & \rho_2 \\ \rho_1 & 1 & \rho_1 \\ \rho_2 & \rho_1 & 1 \end{vmatrix}}$$
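For a general lag k the determinant formulas are cumbersome; solving the Yule-Walker system numerically gives the same φ_kk. A minimal sketch (the function name and the AR(1) check below are my own, hypothetical choices, not from the slides):

```python
import numpy as np

def pacf_yule_walker(rho, kmax):
    """phi_kk for k = 1..kmax, from the Yule-Walker system R phi = r,
    with R[i, j] = rho_{|i-j|} and r = (rho_1, ..., rho_k)'.
    `rho` holds (rho_0, rho_1, ..., rho_kmax), rho_0 = 1."""
    out = []
    for k in range(1, kmax + 1):
        R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
        r = np.array(rho[1 : k + 1])
        phi = np.linalg.solve(R, r)   # (phi_k1, ..., phi_kk)
        out.append(phi[-1])           # the PACF at lag k is phi_kk
    return np.array(out)

# Check against the closed forms above, using the AR(1) ACF rho_k = phi^k:
phi = 0.6
rho = [phi**k for k in range(6)]
print(pacf_yule_walker(rho, 5))       # approx [0.6, 0, 0, 0, 0]
```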
Examples of stochastic processes

E4: Let
$$Z_t = \begin{cases} Y_t & \text{if } t \text{ is even} \\ Y_{t+1} & \text{if } t \text{ is odd} \end{cases}$$
where Y_t is a stationary time series. Is Z_t weakly stationary?

E5: Define the process
$$S_t = X_1 + \cdots + X_t,$$
where the X_i are iid (0, σ²). Show that for h > 0
$$Cov(S_{t+h}, S_t) = t\,\sigma^2,$$
and therefore S_t is not weakly stationary.
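E5 can be checked by simulation. The sketch below is mine (sample sizes arbitrary); it uses the fact that the S_t have zero mean, so the covariance is just E[S_{t+h} S_t]:

```python
import numpy as np

# Monte Carlo check of E5: with S_t = X_1 + ... + X_t and X_i iid (0, sigma^2),
# Cov(S_{t+h}, S_t) = t * sigma^2 grows with t: S_t is not weakly stationary.
rng = np.random.default_rng(2)
n_rep, t, h, sigma = 100_000, 50, 10, 1.0
X = rng.normal(0.0, sigma, size=(n_rep, t + h))
S = X.cumsum(axis=1)                            # S[:, j] = S_{j+1}
cov = np.mean(S[:, t + h - 1] * S[:, t - 1])    # both means are zero
print(cov, "vs theory:", t * sigma**2)          # approx 50
```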
Examples of stochastic processes (cont)

E6: White Noise Process
A sequence of uncorrelated random variables is called a white noise process:
$$\{a_t\}: \quad E(a_t) = \mu_a \;\; (\text{normally } \mu_a = 0)$$
$$Var(a_t) = \sigma_a^2$$
$$Cov(a_t, a_{t+k}) = 0 \;\text{ for } k \ne 0$$

Autocovariance and autocorrelation:
$$\gamma_k = \begin{cases} \sigma_a^2 & k = 0 \\ 0 & k \ne 0 \end{cases} \qquad \rho_k = \begin{cases} 1 & k = 0 \\ 0 & k \ne 0 \end{cases} \qquad \phi_{kk} = \begin{cases} 1 & k = 0 \\ 0 & k \ne 0 \end{cases}$$

[Figure: the ACF ρ_k of white noise is zero at all lags k = 1, 2, 3, 4, ...]
Dependence: Ergodicity

• See Reading 1 from Leo Breiman (1969), “Probability and Stochastic Processes: With a View Toward Applications”
• We want to allow as much dependence as the Law of Large Numbers (LLN) lets us
• Stationarity is not enough, as the following example shows:

E7: Let {U_t} be a sequence of iid random variables uniformly distributed on [0, 1] and let Z be N(0, 1), independent of {U_t}. Define Y_t = Z + U_t. Then Y_t is stationary (why?), but
$$E(Y_t) = \tfrac{1}{2}, \qquad \bar{Y}_n = \frac{1}{n}\sum_{t=1}^{n} Y_t \;\rightarrow\; Z + \tfrac{1}{2} \;\ne\; \tfrac{1}{2},$$
so the time average does not converge to the mean. The problem is that there is too much dependence in the sequence {Y_t}. In fact, the correlation between Y_1 and Y_t is always positive for any value of t.
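A quick simulation of E7 (mine, with an arbitrary seed and path length) makes the failure visible: the time average settles near Z + 1/2, which changes from realization to realization, not near E(Y_t) = 1/2.

```python
import numpy as np

# E7: Y_t = Z + U_t, with Z ~ N(0, 1) drawn once per realization and U_t iid U[0, 1].
rng = np.random.default_rng(3)
n = 100_000
Z = rng.normal()                       # one draw, shared by the whole path
Y = Z + rng.uniform(0.0, 1.0, n)
print(Y.mean(), "-> Z + 1/2 =", Z + 0.5, "but E(Y_t) = 0.5")
```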
Ergodicity for the mean

Objective: estimate the mean μ = E(Z_t) of the process {Z_t}.

Need to distinguish between:
1. Ensemble average (across m realizations at a fixed t):
$$\bar{z} = \frac{\sum_{i=1}^{m} Z_i}{m}$$
2. Time average (along one realization):
$$\bar{z} = \frac{\sum_{t=1}^{n} Z_t}{n}$$

Which estimator is the most appropriate? The ensemble average. Problem: it is impossible to calculate, since in practice we observe only a single realization. Under which circumstances can we use the time average? Is the time average an unbiased and consistent estimator of the mean?
Ergodicity for the mean (cont)

Reminder. Sufficient conditions for consistency of an estimator:
$$\lim_{T\to\infty} E(\hat{\theta}_T) = \theta \quad \text{and} \quad \lim_{T\to\infty} var(\hat{\theta}_T) = 0$$

1. The time average is asymptotically unbiased:
$$E(\bar{z}) = \frac{1}{n}\sum_t E(Z_t) = \frac{1}{n}\sum_t \mu = \mu$$

2. The time average is consistent for the mean:
$$var(\bar{z}) = \frac{1}{n^2}\sum_{t=1}^{n}\sum_{s=1}^{n} cov(Z_t, Z_s) = \frac{\gamma_0}{n^2}\sum_{t=1}^{n}\sum_{s=1}^{n} \rho_{t-s} = \frac{\gamma_0}{n^2}\sum_{t=1}^{n}\big(\rho_{t-1} + \rho_{t-2} + \cdots + \rho_{t-n}\big)$$
$$= \frac{\gamma_0}{n^2}\Big[(\rho_0 + \rho_1 + \cdots + \rho_{n-1}) + (\rho_1 + \rho_0 + \cdots + \rho_{n-2}) + \cdots + (\rho_{n-1} + \rho_{n-2} + \cdots + \rho_0)\Big]$$
Ergodicity for the mean (cont)

Collecting terms by lag k:
$$var(\bar{z}) = \frac{\gamma_0}{n^2}\sum_{k=-(n-1)}^{n-1} (n - |k|)\,\rho_k = \frac{\gamma_0}{n}\sum_{k=-(n-1)}^{n-1} \Big(1 - \frac{|k|}{n}\Big)\rho_k$$
$$\lim_{n\to\infty} var(\bar{z}) = \lim_{n\to\infty} \frac{\gamma_0}{n}\sum_k \Big(1 - \frac{|k|}{n}\Big)\rho_k = 0$$

A covariance-stationary process is ergodic for the mean if
$$\text{plim}\; \bar{z} = E(Z_t) = \mu$$

A sufficient condition for ergodicity for the mean is
$$\sum_{k=0}^{\infty} |\gamma_k| < \infty \quad \text{or} \quad \sum_{k=0}^{\infty} |\rho_k| < \infty,$$
which in particular requires ρ_k → 0 as k → ∞.
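As an illustration of the sufficient condition (a sketch of mine, not from the slides): a stationary AR(1) with |φ| < 1 has ρ_k = φ^k, which is absolutely summable, so the process is ergodic for the mean and the variance of the time average should shrink roughly like 1/n.

```python
import numpy as np

# AR(1): Z_t = phi * Z_{t-1} + a_t with |phi| < 1, so rho_k = phi^k and
# sum_k |rho_k| < infinity. The variance of the time average shrinks with n.
rng = np.random.default_rng(4)
phi, n_rep, n = 0.8, 1000, 4000
Z = np.zeros((n_rep, n))
for t in range(1, n):                      # started at 0; early burn-in is mild
    Z[:, t] = phi * Z[:, t - 1] + rng.normal(size=n_rep)
for m in (250, 1000, 4000):
    print(m, Z[:, :m].mean(axis=1).var())  # roughly proportional to 1/m
```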
Ergodicity under Gaussianity

If {Z_t} is a stationary Gaussian process, then
$$\sum_k |\rho_k| < \infty$$
is sufficient to ensure ergodicity for all moments.
Where are We?

The Prediction Problem as a Motivating Problem: predict Z_{t+1} given some information set I_t at time t.
$$\min\; E[Z_{t+1} - \hat{Z}_{t+1}]^2 \qquad \text{Solution: } \hat{Z}_{t+1} = E[Z_{t+1} \mid I_t]$$

The conditional expectation can be modeled in a parametric way or in a non-parametric way. In this course we will choose the former. Parametric models can be linear or non-linear; again we will choose the former. Summarizing, the models we are going to study and use in this course will be parametric and linear models.
Some Problems

P1: Let {Z_t} be a sequence of uncorrelated real-valued variables with zero means and unit variances, and define the “moving average”
$$Y_t = \sum_{i=0}^{r} \alpha_i Z_{t-i}$$
for constants α_0, α_1, ..., α_r. Show that Y is weakly stationary and find its autocovariance function.

P2: Show that a Gaussian process is strongly stationary if and only if it is weakly stationary.

P3: Let X be a stationary Gaussian process with zero mean, unit variance, and autocovariance function c. Find the autocovariance functions of the processes
$$X2 = \{X(t)^2 : -\infty < t < \infty\} \quad \text{and} \quad X3 = \{X(t)^3 : -\infty < t < \infty\}$$
Appendix: Transformations

• Goal: to lead to a more manageable process.
• The log transformation reduces certain types of heteroskedasticity. If we assume μ_t = E(Z_t) and Var(Z_t) = k μ_t², the delta method shows that the variance of the log is roughly constant:
$$Var(f(Z_t)) \approx [f'(\mu_t)]^2\, Var(Z_t) \;\Rightarrow\; Var(\log(Z_t)) \approx (1/\mu_t^2)\, Var(Z_t) = k$$
• Differencing eliminates the trend (though it is not very informative about the nature of the trend).
• Differencing + Log = Relative Change:
$$\log(Z_t) - \log(Z_{t-1}) = \log\frac{Z_t}{Z_{t-1}} = \log\Big(1 + \frac{Z_t - Z_{t-1}}{Z_{t-1}}\Big) \approx \frac{Z_t - Z_{t-1}}{Z_{t-1}}$$
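A tiny numeric check of the last identity (mine; the sample series is made up):

```python
import numpy as np

# Log-differencing as relative change: for small changes,
# log(Z_t) - log(Z_{t-1}) is close to (Z_t - Z_{t-1}) / Z_{t-1}.
Z = np.array([100.0, 102.0, 101.0, 104.0])
log_diff = np.diff(np.log(Z))
rel_change = np.diff(Z) / Z[:-1]
print(log_diff)     # roughly [ 0.0198, -0.0099,  0.0293]
print(rel_change)   # roughly [ 0.02,   -0.0098,  0.0297]
```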
