Time Series Econometrics
1. Stochastic Process
A random/stochastic process {x(t)} is a family of random variables indexed by the symbol t, where t belongs to some given index set T.
If the index set T takes a continuous range of values, {x(t)} is said to be a continuous-parameter process; if it takes a discrete set of values, it is said to be a discrete-parameter process.
A random or stochastic process is a collection of random variables ordered in time. An example of a continuous stochastic process is an electrocardiogram, and an example of a discrete stochastic process is GDP. Here GDP is a stochastic process, and the actual values we observe in reality are particular realizations of that process. Thus, the distinction between a stochastic process and its realization is just like the distinction between population and sample in cross-sectional data. Just as we use sample data to draw inferences about a population, in time series we use the realization to draw inferences about the underlying stochastic process.
2. Stationarity
Strict Stationarity:
The process {x(t)} is said to be strictly stationary if its joint probability distribution at times t1, t2, ..., tn is identical to the distribution at t1+k, t2+k, ..., tn+k. In other words, its probability distribution does not change under a shift of the time origin. This is a very strict concept of stationarity.
Weak Stationarity/Second Order Stationarity/Covariance Stationarity
A process is said to be weakly stationary if its first two moments do not depend on the time index.
Properties of Second Order Stationary Process
Let x(t) be a second order stationary process. Then:
A. E[x(t)] = μ (a constant)
B. E[x²(t)] = μ2′ (a constant)
C. E[x(t)x(s)] is a function of t − s only, i.e., the expected value of the product of the process at two distinct time points depends only on the time interval between them.
Implications:
A and B imply that var[x(t)] = μ2′ − μ² = σ² does not depend on t.
A and C imply that the covariance function does not depend on the time variable t:
cov[x(t), x(s)] = E[x(t)x(s)] − μ² is a function of t − s only.
To summarize, a stochastic process is said to be stationary if its mean and variance are constant over time and the value of the covariance between the two time periods depends only on the distance or gap or lag between the two time periods and not the actual time at which the covariance is computed.
In the time series literature, such a stochastic process is known as a weakly stationary, or covariance stationary, or second-order stationary, or wide sense, stochastic process.
If a time series is stationary, its mean, variance, and autocovariance (at various lags) remain the same no matter at what point we measure them; that is, they are time invariant. Such a time series will tend to return to its mean, and fluctuations around this mean (measured by its variance) will have a broadly constant amplitude.
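A minimal simulation sketch (not from the source) can illustrate this mean-reverting behaviour: we generate a stationary AR(1) process and compare the sample mean and variance of its first and second halves; both should be roughly the same. The coefficient phi, the noise scale, and the sample size are illustrative choices.

```python
# Illustrative sketch: a stationary AR(1) has time-invariant mean and variance.
import numpy as np

rng = np.random.default_rng(0)
phi, sigma, n = 0.5, 1.0, 10_000     # illustrative parameters

x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)

# mean and variance computed over two disjoint windows
first, second = x[: n // 2], x[n // 2:]
print(first.mean(), second.mean())   # both close to 0
print(first.var(), second.var())     # both close to sigma^2/(1-phi^2) = 4/3
```

Because the process is covariance stationary, the window over which we measure the moments does not matter (up to sampling noise).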
Figure 1 below illustrates what a stationary process looks like: its mean and variance appear constant over time. Figure 2 depicts an example of a non-stationary process, where both the mean and the variance appear to increase over time.
[Figure] Fig. 1 : Stationary Process
[Figure] Fig. 2 : Non-stationary Process
Consider a pure random walk: Xt = Xt-1 + ut, where ut is an i.i.d. error term with zero mean and constant variance σ². Upon recursive substitution, we get:
Xt = Xt-2 + ut + ut-1
Xt = Xt-3 + ut + ut-1 + ut-2
Xt = Xt-4 + ut + ut-1 + ut-2 + ut-3
Xt = X0 + ut + ut-1 + ut-2 + ... + u1
Taking expectation,
E(Xt) = E(X0) + E(∑ut)
E(Xt) = X0, since E(ut) = 0.
The mean is constant.
var(Xt) = σ² + σ² + ... + σ² (t terms)
var(Xt) = tσ²
The variance increases with t. Thus a pure random walk is a non-stationary stochastic process.
If we consider a random walk with drift,
Xt=δ+Xt-1+ut
we can show that
E(Xt) =X0+tδ
Var(Xt)=tσ²
In this case, both the mean and the variance of the process vary with time. Thus, it is a non-stationary stochastic process.
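These two moments can be checked with a small Monte Carlo sketch (not from the source): simulate many realizations of the random walk with drift and verify that the cross-sectional mean at time t is near X0 + tδ and the variance near tσ². The values of δ, σ, t, and the number of replications are illustrative.

```python
# Illustrative sketch: verify E(Xt) = X0 + t*delta and var(Xt) = t*sigma^2
# for the random walk with drift Xt = delta + Xt-1 + ut.
import numpy as np

rng = np.random.default_rng(1)
delta, sigma, t, reps, x0 = 0.1, 1.0, 200, 5000, 0.0   # illustrative parameters

# each row is one realization; Xt is the cumulative sum of (drift + shock)
shocks = delta + rng.normal(0.0, sigma, size=(reps, t))
x_t = x0 + shocks.sum(axis=1)          # value of each walk at time t

print(x_t.mean())   # close to x0 + t*delta = 20
print(x_t.var())    # close to t*sigma^2   = 200
```

Both moments grow linearly in t, confirming non-stationarity.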
Relation between Stationarity and Strict Stationarity
In general, strict stationarity does not imply weak stationarity, and vice versa.
But if the moments of Xt exist up to order 2, strict stationarity implies weak stationarity. Conversely, if Xt is a weakly stationary Gaussian process, then it is strictly stationary, as the Gaussian distribution is completely characterized by its mean and variance.
Elimination of Trend to Achieve Stationarity
A classical decomposition is yt = mt + st + xt
where
mt = trend component
st = seasonal component
xt = cyclical component that is weakly stationary.
We want to remove mt and st so that what remains is stationary.
A. Least Square Estimation of mt
Here we estimate a function of the type :
mt = a0 + a1t (simplest case; one can also fit a quadratic function in t)
and estimate it by OLS; the residuals are then the stationary part of the series.
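A minimal sketch of this step (not from the source): fit a linear trend by ordinary least squares and keep the residuals as the detrended series. The trend coefficients and noise level below are illustrative.

```python
# Illustrative sketch: OLS estimation of a linear trend mt = a0 + a1*t,
# with the residuals serving as the (approximately) stationary part.
import numpy as np

rng = np.random.default_rng(2)
n = 500
t = np.arange(n)
y = 2.0 + 0.05 * t + rng.normal(0.0, 1.0, n)   # linear trend + stationary noise

a1, a0 = np.polyfit(t, y, deg=1)               # least-squares fit of y on t
residuals = y - (a0 + a1 * t)                  # detrended series

print(a0, a1)            # close to 2.0 and 0.05
print(residuals.mean())  # essentially zero by construction of OLS
```

Because the regression includes an intercept, the residuals sum to zero exactly, so any remaining trend-free variation is centred on zero.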
B. Method of Differencing
Let L denote the lag operator (Lyt = yt-1) and Δ the first-difference operator:
Δyt = (1-L)yt = yt - yt-1
The method consists of applying an order d difference operation on yt i.e.
Δ^d yt = (1-L)^d yt = (1-L)^d mt + (1-L)^d xt
We pick d such that (1-L)^d mt is constant and (1-L)^d xt is stationary.
In economics, d = 1 in general, and in a few cases d = 2.
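A minimal sketch (not from the source) of first differencing (d = 1): applying Δ once to a series with a linear trend leaves a stationary series whose mean equals the (constant) slope of the trend. The series below is illustrative.

```python
# Illustrative sketch: first differencing removes a linear trend.
import numpy as np

rng = np.random.default_rng(3)
n = 1000
t = np.arange(n)
y = 5.0 + 0.3 * t + rng.normal(0.0, 1.0, n)  # linear trend + stationary noise

dy = np.diff(y)     # Δyt = yt - yt-1, i.e. (1-L) applied once

print(dy.mean())    # close to 0.3, the slope of the removed trend
```

For a quadratic trend one would difference twice (d = 2), i.e. apply `np.diff(y, n=2)`.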
Autocovariance and Autocorrelation Function
Suppose xt is a stationary process with mean μ and variance σ². Then, the autocovariance function is defined by:
R(τ) = E[(xt - μ)(xt-τ - μ)]
Special case : when τ=0
R(0) = var(xt) = σ²
The autocorrelation function at lag τ is given by
ρ(τ)=R(τ)/R(0)
Properties of R(τ) and ρ(τ)
1. R(0) = σ²
2. |R(τ)| ≤ R(0), since |ρ(τ)| ≤ 1.
3. If xt is real valued, R(-τ) = R(τ), i.e., R is an even function.
4. ρ(0) = 1
5. |ρ(τ)| ≤ 1
6. If xt is real valued, ρ(-τ) = ρ(τ).
Remark: The autocorrelation function ρ(τ) has all the same properties as the autocovariance function, in addition to the fact that ρ(0) = 1.
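The definitions above translate directly into sample estimators. The sketch below (not from the source; function names are my own) computes the sample autocovariance and autocorrelation of a series and checks the properties ρ(0) = 1 and |ρ(τ)| ≤ 1 on white noise.

```python
# Illustrative sketch: sample autocovariance R(tau) and autocorrelation rho(tau).
import numpy as np

def autocovariance(x, tau):
    """Sample R(tau) = average of (x_t - mu)(x_{t-tau} - mu)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.mean((x[tau:] - mu) * (x[: len(x) - tau] - mu))

def autocorrelation(x, tau):
    """rho(tau) = R(tau) / R(0)."""
    return autocovariance(x, tau) / autocovariance(x, 0)

rng = np.random.default_rng(4)
white = rng.normal(size=5000)
print(autocorrelation(white, 0))   # exactly 1, since rho(0) = R(0)/R(0)
print(autocorrelation(white, 1))   # near 0: white noise is serially uncorrelated
```

For white noise every ρ(τ) with τ ≠ 0 is close to zero, while a persistent series would show slowly decaying autocorrelations.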
Wold's Theorem
This theorem states that any covariance stationary stochastic process can be represented as the sum of a deterministic component and a linear combination of uncorrelated (white noise) random variables with mean zero.
Formally, it can be expressed as :
Any stationary stochastic process can be expressed in the form
xt=dt+vt
where
dt and vt are uncorrelated processes
vt has a regular one-sided representation of the form vt = ∑u=0…∞ gu et-u, with g0 = 1 and ∑u=0…∞ gu² < ∞, where et is a white noise process uncorrelated with dt.
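A concrete sketch (not from the source) of such a representation: a stationary AR(1) process xt = φxt-1 + et can be written as the one-sided moving average xt = ∑ gu et-u with gu = φ^u, so g0 = 1 and ∑ gu² = 1/(1-φ²) < ∞. The code below builds the same series both ways and confirms they agree; φ and the truncation length are illustrative.

```python
# Illustrative sketch: Wold-type MA(inf) representation of a stationary AR(1).
import numpy as np

rng = np.random.default_rng(5)
phi, n = 0.6, 300
e = rng.normal(size=n)                 # white noise shocks

# AR(1) by recursion, started at x_0 = e_0
x_ar = np.zeros(n)
x_ar[0] = e[0]
for t in range(1, n):
    x_ar[t] = phi * x_ar[t - 1] + e[t]

# same process via truncated MA(inf) weights g_u = phi**u
K = 50                                 # truncation; phi**50 is negligible
g = phi ** np.arange(K)
x_ma = np.array([sum(g[u] * e[t - u] for u in range(min(K, t + 1)))
                 for t in range(n)])

print(np.max(np.abs(x_ar - x_ma)))     # tiny: the two representations coincide
```

Here the deterministic component dt is zero; the whole process is the regular part vt.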
Short Memory and Long Memory Processes

