(1) Introduction to Time Series Analysis. Lecture 19.
1. Review: Spectral density estimation, sample autocovariance. The periodogram and sample autocovariance.
(2) Estimating the Spectrum: Outline
• We have seen that the spectral density gives an alternative view of stationary time series.
• Given a realization x1, ..., xn of a time series, how can we estimate the spectral density?
• One approach: replace γ(·) in the definition

    f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}

  with the sample autocovariance γ̂(·).
(3) Estimating the Spectrum: Outline
• Another approach: compute the periodogram I(ν) directly from the data.
• These two approaches are identical at the Fourier frequencies ν = k/n.
• The asymptotic expectation of the periodogram I(ν) is f(ν). We can derive some asymptotic properties, and hence carry out hypothesis tests.
(4) Review: Spectral density estimation
If a time series {Xt} has autocovariance γ satisfying Σ_{h=−∞}^{∞} |γ(h)| < ∞, then we define its spectral density as

    f(ν) = Σ_{h=−∞}^{∞} γ(h) e^{−2πiνh}.
(5) Review: Sample autocovariance
Idea: use the sample autocovariance γ̂(·), defined by

    γ̂(h) = (1/n) Σ_{t=1}^{n−|h|} (x_{t+|h|} − x̄)(x_t − x̄),   for −n < h < n,

as an estimate of the autocovariance γ(·), and then use

    f̂(ν) = Σ_{h=−n+1}^{n−1} γ̂(h) e^{−2πiνh}.
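This estimator is easy to compute directly. The sketch below (not part of the lecture; it assumes NumPy and the hypothetical names `sample_acvf` and `spectral_estimate`) implements γ̂(h) and f̂(ν) exactly as defined above:

```python
import numpy as np

def sample_acvf(x, h):
    # gamma_hat(h) = (1/n) * sum_{t=1}^{n-|h|} (x_{t+|h|} - xbar)(x_t - xbar)
    x = np.asarray(x, dtype=float)
    n, h = len(x), abs(h)
    xbar = x.mean()
    return np.sum((x[h:] - xbar) * (x[:n - h] - xbar)) / n

def spectral_estimate(x, nu):
    # f_hat(nu) = sum_{h=-(n-1)}^{n-1} gamma_hat(h) e^{-2 pi i nu h}
    # (real-valued, since gamma_hat is even in h)
    n = len(x)
    hs = np.arange(-(n - 1), n)
    gam = np.array([sample_acvf(x, h) for h in hs])
    return float(np.real(np.sum(gam * np.exp(-2j * np.pi * nu * hs))))
```

Note the divisor is n, not n − |h|, matching the definition on this slide; this choice keeps the estimated autocovariance sequence nonnegative definite.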
(6) Discrete Fourier transform
For a sequence (x1, ..., xn), define the discrete Fourier transform (DFT) as (X(ν0), X(ν1), ..., X(ν_{n−1})), where

    X(νk) = (1/√n) Σ_{t=1}^{n} x_t e^{−2πiνk t},

and νk = k/n (for k = 0, 1, ..., n − 1) are called the Fourier frequencies. (Think of {νk : k = 0, ..., n − 1} as the discrete version of the frequency range ν ∈ [0, 1].)
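A direct transcription of this definition (a sketch, not from the lecture; NumPy assumed, function name `dft` is ours):

```python
import numpy as np

def dft(x):
    # X(nu_k) = (1/sqrt(n)) * sum_{t=1}^{n} x_t e^{-2 pi i nu_k t}, nu_k = k/n
    x = np.asarray(x, dtype=float)
    n = len(x)
    t = np.arange(1, n + 1)          # t runs from 1 to n, as in the slides
    nus = np.arange(n) / n           # Fourier frequencies k/n
    return np.array([np.sum(x * np.exp(-2j * np.pi * f * t))
                     for f in nus]) / np.sqrt(n)
```

Because the slides index t from 1 while `np.fft.fft` indexes from 0, the two differ by a unit-modulus phase factor e^{−2πik/n}; the moduli |X(νk)| agree, which is all the periodogram uses.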
(7) Discrete Fourier transform
Consider the space C^n of vectors of n complex numbers, with inner product ⟨a, b⟩ = a*b, where a* is the complex conjugate transpose of the vector a ∈ C^n.
Suppose that a set {φj : j = 0, 1, ..., n − 1} of n vectors in C^n are orthonormal:

    ⟨φj, φk⟩ = 1 if j = k,  0 otherwise.

Then these {φj} span the vector space C^n, and so for any vector x, we can write x in terms of this new orthonormal basis:

    x = Σ_{j=0}^{n−1} ⟨φj, x⟩ φj.
(8) Discrete Fourier transform
Consider the following set of n vectors in C^n:

    { ej = (1/√n) (e^{2πiνj}, e^{2πi2νj}, ..., e^{2πinνj})′ : j = 0, ..., n − 1 }.

It is easy to check that these vectors are orthonormal:

    ⟨ej, ek⟩ = (1/n) Σ_{t=1}^{n} e^{2πit(νk−νj)} = (1/n) Σ_{t=1}^{n} (e^{2πi(k−j)/n})^t
             = 1                                                              if j = k,
             = (1/n) e^{2πi(k−j)/n} (1 − (e^{2πi(k−j)/n})^n) / (1 − e^{2πi(k−j)/n}) = 0   otherwise,
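This orthonormality is easy to confirm numerically (a sketch, not part of the lecture; NumPy assumed): build the n basis vectors ej and check that their Gram matrix of inner products is the identity.

```python
import numpy as np

n = 8
t = np.arange(1, n + 1)
# Rows are the basis vectors e_j = (1/sqrt(n)) (e^{2 pi i nu_j t})_{t=1..n}, nu_j = j/n
E = np.array([np.exp(2j * np.pi * (j / n) * t) for j in range(n)]) / np.sqrt(n)
# Gram matrix: G[j, k] = <e_j, e_k> = e_j^* e_k
G = E.conj() @ E.T
```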
(9) Discrete Fourier transform
where we have used the fact that Sn = Σ_{t=1}^{n} α^t satisfies αSn = Sn + α^{n+1} − α, and so Sn = α(1 − α^n)/(1 − α) for α ≠ 1.
So we can represent the real vector x = (x1, ..., xn)′ ∈ C^n in terms of this orthonormal basis:

    x = Σ_{j=0}^{n−1} ⟨ej, x⟩ ej = Σ_{j=0}^{n−1} X(νj) ej.

That is, the vector of discrete Fourier transform coefficients (X(ν0), ..., X(ν_{n−1})) gives the coordinates of x with respect to the orthonormal basis {ej}.
(10) Discrete Fourier transform
An alternative way to represent the DFT is by separately considering the real and imaginary parts:

    X(νj) = ⟨ej, x⟩ = (1/√n) Σ_{t=1}^{n} e^{−2πitνj} x_t
          = (1/√n) Σ_{t=1}^{n} cos(2πtνj) x_t − i (1/√n) Σ_{t=1}^{n} sin(2πtνj) x_t
          = Xc(νj) − i Xs(νj),

where Xc(νj) is the cosine transform and Xs(νj) is the sine transform of x.
(11) Periodogram
The periodogram is defined as

    I(νj) = |X(νj)|² = (1/n) |Σ_{t=1}^{n} e^{−2πitνj} x_t|²
          = Xc²(νj) + Xs²(νj),

where

    Xc(νj) = (1/√n) Σ_{t=1}^{n} cos(2πtνj) x_t,
    Xs(νj) = (1/√n) Σ_{t=1}^{n} sin(2πtνj) x_t.
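The cosine/sine form can be computed directly (a sketch, not part of the lecture; NumPy and the name `periodogram` are our assumptions), and it matches the usual FFT-based computation |X(νj)|² = |fft(x)j|²/n:

```python
import numpy as np

def periodogram(x):
    # I(nu_j) = Xc(nu_j)^2 + Xs(nu_j)^2 at the Fourier frequencies nu_j = j/n
    x = np.asarray(x, dtype=float)
    n = len(x)
    t = np.arange(1, n + 1)
    nus = np.arange(n) / n
    Xc = np.array([np.sum(np.cos(2 * np.pi * t * f) * x) for f in nus]) / np.sqrt(n)
    Xs = np.array([np.sum(np.sin(2 * np.pi * t * f) * x) for f in nus]) / np.sqrt(n)
    return Xc ** 2 + Xs ** 2
```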
(12) Periodogram
Since I(νj) = |X(νj)|² for one of the Fourier frequencies νj = j/n (for j = 0, 1, ..., n − 1), the orthonormality of the ej implies that we can write

    x*x = (Σ_{j=0}^{n−1} X(νj) ej)* (Σ_{j=0}^{n−1} X(νj) ej)
        = Σ_{j=0}^{n−1} |X(νj)|² = Σ_{j=0}^{n−1} I(νj).

For x̄ = 0, we can write this as

    σ̂x² = (1/n) Σ_{t=1}^{n} x_t² = (1/n) Σ_{j=0}^{n−1} I(νj).
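This identity is exact (it is Parseval's identity for the DFT), and can be verified numerically on simulated data (a sketch, not from the lecture; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=64)
x = x - x.mean()                        # center so that xbar = 0
n = len(x)
I = np.abs(np.fft.fft(x)) ** 2 / n      # periodogram I(nu_j) at nu_j = j/n
sigma2_hat = np.sum(x ** 2) / n         # (1/n) sum_t x_t^2
mean_I = np.sum(I) / n                  # (1/n) sum_j I(nu_j)
```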
(13) Periodogram
This is the discrete analog of the identity

    σx² = γx(0) = ∫_{−1/2}^{1/2} fx(ν) dν.

(Think of I(νj) as the discrete version of f(ν) at the frequency νj = j/n, and think of (1/n) Σ_{νj} · as the discrete version of ∫ · dν.)
(14) Estimating the spectrum: Periodogram
Why is the periodogram at a Fourier frequency (that is, ν = νj) the same as computing f̂(ν) from the sample autocovariance?
Almost the same: they are not the same at ν0 = 0 when x̄ ≠ 0. But if either x̄ = 0, or we consider a Fourier frequency νj with j ≠ 0, then the two coincide:
(15) Estimating the spectrum: Periodogram

    I(νj) = (1/n) |Σ_{t=1}^{n} e^{−2πitνj} x_t|²
          = (1/n) |Σ_{t=1}^{n} e^{−2πitνj} (x_t − x̄)|²
          = (1/n) (Σ_{t=1}^{n} e^{−2πitνj} (x_t − x̄)) (Σ_{t=1}^{n} e^{2πitνj} (x_t − x̄))
          = (1/n) Σ_{s,t} e^{−2πi(s−t)νj} (x_s − x̄)(x_t − x̄)
          = Σ_{h=−n+1}^{n−1} γ̂(h) e^{−2πihνj},

where the fact that νj ≠ 0 implies Σ_{t=1}^{n} e^{−2πitνj} = 0 (we showed this when verifying the orthonormality of the ej) justifies subtracting x̄ in the second line.
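The identity above can be checked numerically at a nonzero Fourier frequency (a sketch, not part of the lecture; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=32)
n = len(x)
xbar = x.mean()
k = 1                                   # a nonzero Fourier frequency, nu_k = k/n

# Periodogram at nu_k via the DFT (the t=0 vs t=1 indexing phase drops out of the modulus)
I_k = np.abs(np.fft.fft(x)[k]) ** 2 / n

# Autocovariance-based estimate: f_hat(nu_k) = sum_h gamma_hat(h) e^{-2 pi i h nu_k}
hs = np.arange(-(n - 1), n)
gam = np.array([np.sum((x[abs(h):] - xbar) * (x[:n - abs(h)] - xbar)) / n
                for h in hs])
f_k = np.real(np.sum(gam * np.exp(-2j * np.pi * hs * k / n)))
```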
(16) Asymptotic properties of the periodogram
We want to understand the asymptotic behavior of the periodogram I(ν) at a particular frequency ν, as n increases. We'll see that its expectation converges to f(ν).
We'll start with a simple example: Suppose that X1, ..., Xn are i.i.d. N(0, σ²) (Gaussian white noise). From the definitions

    Xc(νj) = (1/√n) Σ_{t=1}^{n} cos(2πtνj) xt,   Xs(νj) = (1/√n) Σ_{t=1}^{n} sin(2πtνj) xt,

we have that Xc(νj) and Xs(νj) are normal, with mean zero.
(17) Asymptotic properties of the periodogram
Also,

    Var(Xc(νj)) = (σ²/n) Σ_{t=1}^{n} cos²(2πtνj)
                = (σ²/2n) Σ_{t=1}^{n} (cos(4πtνj) + 1)
                = σ²/2,

and similarly Var(Xs(νj)) = σ²/2.
(18) Asymptotic properties of the periodogram
Also,

    Cov(Xc(νj), Xs(νj)) = (σ²/n) Σ_{t=1}^{n} cos(2πtνj) sin(2πtνj)
                        = (σ²/2n) Σ_{t=1}^{n} sin(4πtνj) = 0,

and, for j ≠ k,

    Cov(Xc(νj), Xc(νk)) = 0,   Cov(Xs(νj), Xs(νk)) = 0,   Cov(Xc(νj), Xs(νk)) = 0.
(19) Asymptotic properties of the periodogram
That is, if X1, ..., Xn are i.i.d. N(0, σ²) (Gaussian white noise; f(ν) = σ²), then the Xc(νj) and Xs(νj) are all i.i.d. N(0, σ²/2). Thus,

    (2/σ²) I(νj) = (2/σ²) (Xc²(νj) + Xs²(νj)) ∼ χ²₂.

So for the case of Gaussian white noise, the periodogram has a chi-squared distribution that depends on the variance σ² (which, in this case, is the spectral density).
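This χ²₂ behavior is easy to see in simulation (a sketch, not part of the lecture; NumPy assumed): generate many white-noise series, compute 2 I(νk)/σ² at a fixed nonzero Fourier frequency, and compare with the χ²₂ mean (2) and variance (4).

```python
import numpy as np

rng = np.random.default_rng(2)
sigma2 = 4.0                            # white-noise variance (= the spectral density)
n, reps, k = 128, 2000, 5               # sample size, replications, frequency index
vals = np.empty(reps)
for r in range(reps):
    x = rng.normal(scale=np.sqrt(sigma2), size=n)
    I_k = np.abs(np.fft.fft(x)[k]) ** 2 / n
    vals[r] = 2.0 * I_k / sigma2        # should behave like a chi-squared(2) draw
```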
(20) Asymptotic properties of the periodogram
Under more general conditions (e.g., normal {Xt}, or linear process {Xt} with rapidly decaying ACF), the Xc(νj), Xs(νj) are all asymptotically independent and N(0, f(νj)/2).
Consider a frequency ν. For a given value of n, let ν̂(n) be the closest Fourier frequency (that is, ν̂(n) = j/n for a value of j that minimizes |ν − j/n|). As n increases, ν̂(n) → ν, and (under the same conditions that ensure the asymptotic normality and independence of the sine/cosine transforms) f(ν̂(n)) → f(ν). (picture)
In that case, we have that

    (2/f(ν)) I(ν̂(n)) = (2/f(ν)) (Xc²(ν̂(n)) + Xs²(ν̂(n)))

is asymptotically distributed as χ²₂.
(21) Asymptotic properties of the periodogram
Thus,

    E I(ν̂(n)) = (f(ν)/2) E[(2/f(ν)) (Xc²(ν̂(n)) + Xs²(ν̂(n)))]
              → (f(ν)/2) E(Z1² + Z2²) = f(ν),

where Z1, Z2 are independent N(0, 1).
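To illustrate this convergence beyond white noise, here is a simulation sketch (not part of the lecture; NumPy assumed) using an MA(1) process Xt = Wt + θW_{t−1}, whose spectral density is f(ν) = 1 + θ² + 2θ cos(2πν) for unit-variance noise. The average periodogram over many replications should approximate f(ν):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 0.6, 256, 4000
nu = 0.25
k = round(nu * n)                       # Fourier frequency closest to nu
# Spectral density of MA(1) with unit-variance noise: 1 + theta^2 + 2 theta cos(2 pi nu)
f_true = 1 + theta ** 2 + 2 * theta * np.cos(2 * np.pi * nu)
acc = 0.0
for r in range(reps):
    w = rng.normal(size=n + 1)
    x = w[1:] + theta * w[:-1]          # MA(1): X_t = W_t + theta W_{t-1}
    acc += np.abs(np.fft.fft(x)[k]) ** 2 / n
mean_I = acc / reps                     # averages to approximately f(nu)
```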
(22) Introduction to Time Series Analysis. Lecture 19.
1. Review: Spectral density estimation, sample autocovariance. The periodogram and sample autocovariance.