FOURIER TRANSFORMS OF STATIONARY PROCESSES

WEI BIAO WU

September 8, 2003

Abstract. We consider the asymptotic behavior of Fourier transforms of stationary and ergodic sequences. Under sufficiently mild conditions, central limit theorems are established for almost all frequencies as well as for a given frequency. Applications to the widely used linear processes and iterated random functions are discussed. Our results shed new light on the foundation of spectral analysis in that the asymptotic distribution of the periodogram, the fundamental quantity in frequency-domain analysis, is obtained.

Mathematics Subject Classification (1991): Primary 60F05, 60F17; secondary 60G35.
Key words and phrases: spectral analysis, linear process, martingale central limit theorem, periodogram, Fourier transformation, nonlinear time series.

1. Introduction

Let $(X_n)_{n \in \mathbb{Z}}$ be a stationary and ergodic Markov chain on the state space $\mathcal{X}$; let $g$ be a real-valued function on $\mathcal{X}$ such that $\mathbb{E}[g(X_0)] = 0$ and $\mathbb{E}[g^2(X_0)] < \infty$. Define the Fourier transform
$$S_n(\theta) = S_n(\theta; g) = \sum_{k=1}^{n} g(X_k) e^{ik\theta}, \qquad \theta \in \mathbb{R}, \qquad (1)$$
where $i = \sqrt{-1}$ is the imaginary unit. The setup (1) is general enough to allow one to consider $S_n^Y(\theta) = \sum_{k=1}^{n} Y_k e^{ik\theta}$ for any stationary and ergodic process $(Y_n)_{n \in \mathbb{Z}}$, by constructing the Markov chain $X_n = (\ldots, Y_{n-1}, Y_n)$ and taking $g(X_n) = Y_n$. The quantity (1) is of fundamental importance in the spectral analysis of stationary processes. This paper considers the asymptotic normality of (1). The problem has a substantial history. Rosenblatt (1985, Theorem 5.3) considers mixing processes; Brockwell and Davis (1991, Theorem 10.3.2, p. 347), Walker (1965) and Terrin and Hurvich (1994) discuss linear processes. Other contributions can be found in Olshen (1967), Rootzén (1976), Yajima (1989) and Walker (2000), among others.
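Remark (numerical illustration; not part of the original text). The quantity (1) is straightforward to compute, and at the Fourier frequencies $\theta_j = 2\pi j/n$ it can be obtained from a single fast Fourier transform. The following minimal Python sketch uses numpy and an AR(1) chain purely as a convenient stationary ergodic example (our choice, not the paper's); it evaluates $S_n(\theta)$ from the definition at one frequency and the periodogram ordinates $|S_n(\theta_j)|^2/(2\pi n)$ at all Fourier frequencies (normalizations of the periodogram differ by constant factors across references).

import numpy as np

rng = np.random.default_rng(0)

# A convenient stationary ergodic Markov chain: AR(1), X_k = a*X_{k-1} + eps_k,
# with g(x) = x, so that E[g(X_0)] = 0 and E[g(X_0)^2] < infinity.
n, a, burn = 4096, 0.5, 500
eps = rng.normal(size=n + burn)
x = np.empty(n + burn)
x[0] = 0.0
for k in range(1, n + burn):
    x[k] = a * x[k - 1] + eps[k]
g = x[burn:]                      # g(X_1), ..., g(X_n) after discarding a burn-in

# S_n(theta) = sum_{k=1}^n g(X_k) * exp(i*k*theta), directly from definition (1).
theta = 1.0
k = np.arange(1, n + 1)
S_theta = np.sum(g * np.exp(1j * k * theta))

# Periodogram |S_n(theta_j)|^2 / (2*pi*n) at theta_j = 2*pi*j/n; for real data
# |S_n(theta_j)| coincides with the modulus of the discrete Fourier transform of g.
I = np.abs(np.fft.fft(g)) ** 2 / (2 * np.pi * n)

print(S_theta / np.sqrt(n))       # normalized Fourier transform at theta = 1
print(I[1:4])                     # a few low-frequency periodogram ordinates

This only illustrates the definitions; the distributional statements for $S_n(\theta)/\sqrt{n}$ and for the periodogram are the subject of Theorems 1 and 2 below.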

Under surprisingly weak conditions (cf. (2)), we show that the real and imaginary parts of $S_n(\theta)/\sqrt{n}$ are asymptotically iid normal for almost all frequencies $\theta \in \mathbb{R}$. Thus the periodogram $|S_n(\theta)|^2/n$ is asymptotically distributed as a multiple of a $\chi^2(2)$ random variable. For a fixed $\theta$, we present sufficient conditions for the asymptotic normality of $S_n(\theta)/\sqrt{n}$. Our general results go beyond earlier ones by providing mild and verifiable conditions. In our proof, we apply the celebrated theorem of Carleson in Fourier analysis to construct approximating martingales. Applications to special sequences such as linear processes and iterated random functions, which are widely used in time series modelling, are discussed.

2. Main Results

Let $(\Omega, \mathcal{B}(\Omega), \mathbb{P})$ be the probability space on which the sequence $X_n$ is defined, where $\mathcal{B}(\Omega)$ is a sigma algebra on $\Omega$. For a random variable $\xi$ on the space $(\Omega, \mathcal{B}(\Omega), \mathbb{P})$, we denote its $L^2(\mathbb{P})$ norm by $\|\xi\| = [\mathbb{E}(|\xi|^2)]^{1/2}$. Define the projection operator
$$\mathcal{P}_k \xi = \mathbb{E}(\xi \mid \ldots, X_{k-1}, X_k) - \mathbb{E}(\xi \mid \ldots, X_{k-2}, X_{k-1}).$$
Then by the Markovian property, $\mathcal{P}_1 g(X_n) = \mathbb{E}[g(X_n) \mid X_1] - \mathbb{E}[g(X_n) \mid X_0]$ for $n \in \mathbb{N}$. Let $\Re z$ and $\Im z$ be the real and the imaginary parts of the complex number $z$, namely $z = \Re z + i \Im z$, and let $|z|^2 = (\Re z)^2 + (\Im z)^2$. Denote by $N(\mu, \Sigma)$ the multivariate normal distribution with mean vector $\mu$ and covariance matrix $\Sigma$; denote by $\mathrm{Id}_p$ the $p \times p$ identity matrix. Theorems 1 and 2 concern the asymptotic normality for almost all $\theta$ and for a given $\theta$, respectively.

Theorem 1. Assume that
$$\sum_{n=1}^{\infty} \frac{1}{n} \big\| \mathbb{E}[g(X_n) \mid X_0] \big\|^2 < \infty. \qquad (2)$$
Then (i) for almost all $\theta \in \mathbb{R}$ (Lebesgue), there exists $0 \le \sigma(\theta) < \infty$ such that
$$\frac{1}{\sqrt{n}} \begin{pmatrix} \Re S_n(\theta) \\ \Im S_n(\theta) \end{pmatrix} \Rightarrow N[0, \sigma^2(\theta)\, \mathrm{Id}_2]. \qquad (3)$$

(ii) Moreover, for almost all pairs $(\theta, \varphi)$ (Lebesgue), $S_n(\theta)/\sqrt{n}$ and $S_n(\varphi)/\sqrt{n}$ are asymptotically independent.

Theorem 2. For a given $\theta \in [0, 2\pi)$, suppose there exists $\xi(\theta, X_0, X_1) \in L^2(\mathbb{P})$ such that, as $N \to \infty$,
$$\sum_{n=1}^{N} e^{in\theta}\, \mathcal{P}_1 g(X_n) \to \xi(\theta, X_0, X_1) \quad (L^2(\mathbb{P})), \qquad (4)$$
and
$$\big\| \mathbb{E}[S_n(\theta) \mid X_0] \big\|^2 = o(n). \qquad (5)$$
Then (i) if $\theta \ne 0, \pi$,
$$\frac{1}{\sqrt{n}} \begin{pmatrix} \Re S_n(\theta) \\ \Im S_n(\theta) \end{pmatrix} \Rightarrow N[0, \sigma^2(\theta)\, \mathrm{Id}_2], \qquad (6)$$
where $\sigma^2(\theta) = \|\xi(\theta, X_0, X_1)\|^2 / 2$, and (ii) $S_n(\theta)/\sqrt{n} \Rightarrow N(0, \sigma^2(\theta))$ if $\theta = 0$ or $\pi$, where $\sigma^2(\theta) = \|\xi(\theta, X_0, X_1)\|^2$.

Proposition 1. Assume that
$$\sum_{n=1}^{\infty} \big\| \mathcal{P}_1 g(X_n) - \mathcal{P}_1 g(X_{n+1}) \big\| < \infty. \qquad (7)$$
Then (4) and (5), and consequently (6), hold for all $0 < \theta < 2\pi$.

Condition (2) is fairly mild and imposes only a very weak decay rate on $\|\mathbb{E}[g(X_n) \mid X_0]\|$. Rootzén (1976) obtained a central limit theorem under conditions which are not easily verifiable. In comparison, our conditions (4), (5) and (7) are tractable in many cases. It is easily seen that (2) is equivalent to
$$\sum_{k=1}^{\infty} \frac{1}{k^2} \sum_{i=1}^{k} \big\| \mathbb{E}[g(X_i) \mid X_0] \big\|^2 < \infty \qquad (8)$$
by exchanging the order of summation. Notice that $\|\mathbb{E}[g(X_n) \mid X_0]\|$ is non-increasing in $n$ in view of $\mathcal{P}_{-n} g(X_0) = \mathbb{E}[g(X_0) \mid X_{-n}] - \mathbb{E}[g(X_0) \mid X_{-n-1}]$. So another equivalent condition for (2) is $\sum_{k=1}^{\infty} \|\mathbb{E}[g(X_{2^k}) \mid X_0]\|^2 < \infty$. Inequality (8) is needed to establish a connection between $\sigma(\theta)$ in (3) and spectral densities (cf. Proposition 2).
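For completeness, here is a short verification of the exchange-of-summation step behind the equivalence of (2) and (8); it is not in the original text. Write $a_i = \|\mathbb{E}[g(X_i) \mid X_0]\|^2 \ge 0$. Since all terms are nonnegative,
$$\sum_{k=1}^{\infty} \frac{1}{k^2} \sum_{i=1}^{k} a_i = \sum_{i=1}^{\infty} a_i \sum_{k=i}^{\infty} \frac{1}{k^2}, \qquad \frac{1}{i} = \int_i^{\infty} \frac{dx}{x^2} \le \sum_{k=i}^{\infty} \frac{1}{k^2} \le \frac{1}{i^2} + \int_i^{\infty} \frac{dx}{x^2} \le \frac{2}{i},$$
so the double sum in (8) is finite if and only if $\sum_{i=1}^{\infty} a_i / i < \infty$, which is condition (2).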

Let $r_k = \mathbb{E}[g(X_0) g(X_k)]$ be the covariance function and introduce the spectral distribution function $F(\theta)$, $0 \le \theta \le 2\pi$, via Herglotz's theorem: $r_k = \int_0^{2\pi} \exp(ik\theta)\, dF(\theta)$. Assume that $F$ is absolutely continuous with spectral density function $f$, namely $F(\theta) = \int_0^{\theta} f(u)\, du$.

Proposition 2. Assume (2). Then for almost all $\theta \in [0, 2\pi]$ (Lebesgue), $f(\theta) = \sigma^2(\theta)/\pi$.

Example 1. Iterated random functions. Let $(\mathcal{X}, \rho)$ be a complete and separable metric space and let $X_n = F_{\varepsilon_n}(X_{n-1})$, where $F_\varepsilon(\cdot) = F(\cdot, \varepsilon)$ is the $\varepsilon$-section of a jointly measurable function $F : \mathcal{X} \times \Upsilon \to \mathcal{X}$, and $\varepsilon, \varepsilon_n$, $n \in \mathbb{Z}$, are iid random variables which take values in a second measurable space $\Upsilon$. Define
$$L_\varepsilon = \sup_{x \ne x'} \frac{\rho[F_\varepsilon(x), F_\varepsilon(x')]}{\rho(x, x')}.$$
Diaconis and Freedman (1999) prove that $X_n$ admits a unique stationary distribution (say $\Pi$) if
$$\mathbb{E}(L_\varepsilon^\alpha) < \infty, \quad \mathbb{E}(\log L_\varepsilon) < 0, \quad \text{and} \quad \mathbb{E}[\rho^\alpha(x_0, F_\varepsilon(x_0))] < \infty \qquad (9)$$
for some $\alpha > 0$ and $x_0 \in \mathcal{X}$. Let $g_2(\delta) = \sup\{ \|[g(X) - g(X')] \mathbf{1}_{\rho(X, X') \le \delta}\| : X, X' \sim \Pi \}$.

Corollary 1. Assume (9), $\mathbb{E}[g(X_1)] = 0$ and $\mathbb{E}[|g(X_1)|^l] < \infty$ for some $l > 2$. (a) If
$$\int_0^{1/2} \frac{g_2(t)}{t\, |\log t|}\, dt < \infty, \qquad (10)$$
then (2) holds. (b) If
$$\int_0^{1/2} \frac{g_2(t)}{t}\, dt < \infty, \qquad (11)$$
then $\sum_{n=1}^{\infty} \|\mathbb{E}[g(X_n) \mid X_1]\| < \infty$ and hence (7) holds.

Proof. Let $X_0' \sim \Pi$ be independent of $X_0$ and $(\varepsilon_k)_{k \in \mathbb{Z}}$. For $n \ge 1$ let $X_n' = F_{\varepsilon_n} \circ F_{\varepsilon_{n-1}} \circ \cdots \circ F_{\varepsilon_1}(X_0')$. Then (9) implies that there exist $\beta > 0$, $C > 0$ and $0 < r < 1$ such that $\mathbb{E}[\rho^\beta(X_n, X_n')] \le C r^n$ holds for all $n \ge 0$ (cf. Lemma 3 in Wu and Woodroofe (2000)). Let $p = l/2$, $q = p/(p-1)$ and $\delta_n = (C r^n)^{1/(q + \beta)}$. Since $\mathbb{E}[g(X_n') \mid X_0] = 0$,
$$\big\| \mathbb{E}[g(X_n) \mid X_0] \big\| \le \big\| [g(X_n) - g(X_n')] \mathbf{1}_{\rho(X_n, X_n') < \delta_n} \big\| + \big\| [g(X_n) - g(X_n')] \mathbf{1}_{\rho(X_n, X_n') \ge \delta_n} \big\|$$
$$\le g_2(\delta_n) + \big\| g(X_n) - g(X_n') \big\|_{2p}\, \big[ \mathbb{P}(\rho(X_n, X_n') \ge \delta_n) \big]^{1/(2q)} \le g_2(\delta_n) + C_1 \delta_n^{1/2}.$$

So (2) follows since (10) entails $\sum_{n=1}^{\infty} g_2(\delta_n)/n < \infty$. Part (b) follows similarly.

Example 2. Linear processes. Let $\varepsilon_k$, $k \in \mathbb{Z}$, be iid random variables with mean $0$ and finite variance; let $X_n = (\ldots, \varepsilon_{n-1}, \varepsilon_n)$ and $g(X_n) = \sum_{i=0}^{\infty} a_i \varepsilon_{n-i}$, where the $a_i$ are real numbers such that $\sum_{i=0}^{\infty} a_i^2 < \infty$. Then (7) reduces to $\sum_{i=1}^{\infty} |a_i - a_{i-1}| < \infty$, and the central limit theorem (6) holds for all $0 < \theta < 2\pi$. Notice that this cannot be extended to $\theta = 0$ if $(a_n)$ is not summable, for example $a_n = n^{-\beta}$, $1/2 < \beta < 1$, for $n \ge 1$. (A numerical sketch of this example is given after the proof of Lemma 1 below.)

3. Proofs

Let $U$ be the uniform probability measure on $\Theta = [0, 2\pi)$ and $\mathcal{B}(\Theta)$ be the class of the Borel sets of $\Theta$. Denote by $(\Theta \times \Omega, \mathcal{B}(\Theta) \otimes \mathcal{B}(\Omega), U \otimes \mathbb{P})$ the product probability space of $(\Theta, \mathcal{B}(\Theta), U)$ and $(\Omega, \mathcal{B}(\Omega), \mathbb{P})$.

Lemma 1. Assume (2). Then for almost all $\theta \in \Theta$ ($U$), $n^{-1/2}\, \mathbb{E}[S_n(\theta) \mid X_0] \to 0$ almost surely ($\mathbb{P}$).

Proof of Lemma 1. Let $A \in \mathcal{B}(\Theta) \otimes \mathcal{B}(\Omega)$ be the set on which $\sum_{n=1}^{N} n^{-1/2} e^{in\theta}\, \mathbb{E}[g(X_n) \mid X_0]$ does not converge as $N \to \infty$. Let $A_\omega$ and $A^\theta$ be the $\omega$-section and the $\theta$-section of $A$, respectively. Define
$$\Omega_0 = \Big\{ \omega \in \Omega : \sum_{n=1}^{\infty} \frac{1}{n} \big| \mathbb{E}[g(X_n) \mid X_0] \big|^2 < \infty \Big\}. \qquad (12)$$
By (2), $\mathbb{P}(\Omega_0) = 1$. For $\omega \in \Omega_0$, by Carleson's theorem, $U(A_\omega) = 0$. Thus, by Fubini's theorem, for almost all $\theta \in \Theta$ ($U$), $\mathbb{P}(A^\theta) = 0$, i.e., $\sum_{n=1}^{\infty} e^{in\theta}\, \mathbb{E}[g(X_n) \mid X_0]/\sqrt{n}$ converges almost surely ($\mathbb{P}$). Hence by Kronecker's lemma (cf. Chow and Teicher 1988), as $N \to \infty$,
$$\frac{1}{\sqrt{N}} \sum_{n=1}^{N} e^{in\theta}\, \mathbb{E}[g(X_n) \mid X_0] \to 0 \quad \text{a.s. } (\mathbb{P}).$$
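Remark (numerical sketch for Example 2; our addition, not part of the original text). The following minimal Python/numpy simulation uses a truncated moving-average filter with $a_i = (i+1)^{-0.7}$, which is square-summable but not absolutely summable. Across replications, the real and imaginary parts of $S_n(\theta)/\sqrt{n}$ at a fixed $0 < \theta < 2\pi$ have comparable variances and are nearly uncorrelated, in line with (6), while the variance of $S_n(0)/\sqrt{n}$ is much larger, reflecting the failure at $\theta = 0$ noted in Example 2. The truncation lag, sample size and number of replications are arbitrary choices for illustration only.

import numpy as np

rng = np.random.default_rng(1)
beta, m, n, reps = 0.7, 1000, 1000, 200   # m = truncation lag of the MA(infinity) filter
a = (np.arange(m) + 1.0) ** (-beta)       # a_0, ..., a_{m-1}: square-summable, not summable

theta = 1.0
k = np.arange(1, n + 1)
e_theta = np.exp(1j * k * theta)

S_theta, S_zero = [], []
for _ in range(reps):
    eps = rng.normal(size=n + m)
    # X_k = sum_{i=0}^{m-1} a_i * eps_{k-i}, computed by a (truncated) convolution
    x = np.convolve(eps, a)[m - 1 : m - 1 + n]
    S_theta.append(np.sum(x * e_theta) / np.sqrt(n))
    S_zero.append(np.sum(x) / np.sqrt(n))

S_theta = np.array(S_theta)
print("var(Re), var(Im) at theta = 1:", np.var(S_theta.real), np.var(S_theta.imag))
print("corr(Re, Im) at theta = 1:   ", np.corrcoef(S_theta.real, S_theta.imag)[0, 1])
print("var of S_n(0)/sqrt(n):       ", np.var(S_zero))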

Lemma 2. For almost all $\theta \in \Theta$ ($U$), there exists $\xi(\theta, X_0, X_1) \in L^2$ such that
$$\mathcal{P}_1 S_n(\theta) \to \xi(\theta, X_0, X_1) \quad \text{almost surely } (\mathbb{P}). \qquad (13)$$

Proof of Lemma 2. Since the sequence $\mathbb{E}[g(X_0) \mid X_{-n}] - \mathbb{E}[g(X_0) \mid X_{-n-1}]$, $n = 0, 1, \ldots$, forms martingale differences,
$$\sum_{n=1}^{\infty} \big\| \mathcal{P}_1 g(X_n) \big\|^2 = \sum_{n=1}^{\infty} \big\| \mathcal{P}_{1-n} g(X_0) \big\|^2 \le \big\| g(X_0) \big\|^2 < \infty.$$
Hence for almost all $\omega \in \Omega$ ($\mathbb{P}$), $\sum_{n=1}^{\infty} |\mathcal{P}_1 g(X_n)|^2 < \infty$, which yields (13) by Carleson's theorem and the same argument as in Lemma 1. Next we show that $\xi(\theta, X_0, X_1) \in L^2$. To this end, since the trigonometric bases $e^{in\cdot}$, $n \in \mathbb{Z}$, are orthogonal,
$$\int_{\Theta} \mathbb{E}\big| \mathcal{P}_1 S_n(\theta) \big|^2\, U(d\theta) = \sum_{k=1}^{n} \big\| \mathcal{P}_1 g(X_k) \big\|^2 \le \big\| g(X_0) \big\|^2.$$
By Fatou's lemma,
$$\big\| g(X_0) \big\|^2 \ge \liminf_{n} \int_{\Theta} \mathbb{E}\big| \mathcal{P}_1 S_n(\theta) \big|^2\, U(d\theta) \ge \int_{\Theta} \mathbb{E} \liminf_{n} \big| \mathcal{P}_1 S_n(\theta) \big|^2\, U(d\theta) = \int_{\Theta} \mathbb{E}\big[ |\xi(\theta, X_0, X_1)|^2 \big]\, U(d\theta).$$
Hence for almost all $\theta \in \Theta$ ($U$), $\|\xi(\theta, X_0, X_1)\| < \infty$.

Lemma 3. Assume (2). Then for almost all $\theta \in \Theta$ ($U$),
$$\mathbb{E}[\xi(\theta, X_0, X_1) \mid X_0] = 0 \quad \text{a.s. } (\mathbb{P}). \qquad (14)$$

Proof of Lemma 3. The almost sure convergence (13) in Lemma 2 alone does not guarantee $L^2$ convergence. To prove (14), we shall show below that, under (2), the Cesàro average $N^{-1} \sum_{n=1}^{N} \mathcal{P}_1 S_n(\theta)$ converges to $\xi(\theta, X_0, X_1)$ in $L^2$. The relation (14) reveals the martingale structure, and we can then apply the martingale central limit theorem. Let the tail
$$R_N(\theta, X_0, X_1) = \sum_{n=N+1}^{\infty} e^{in\theta} \big\{ \mathbb{E}[g(X_n) \mid X_1] - \mathbb{E}[g(X_n) \mid X_0] \big\}.$$

Then $R_0(\theta, X_0, X_1) = \xi(\theta, X_0, X_1)$. Again by the orthogonality of the bases $e^{in\cdot}$,
$$\int_{\Theta} \big\| R_N(\theta, X_0, X_1) \big\|^2\, U(d\theta) = \sum_{n=N+1}^{\infty} \big\| \mathcal{P}_1 g(X_n) \big\|^2 < \infty.$$
Hence by (2),
$$\sum_{N=1}^{\infty} \frac{1}{N} \int_{\Theta} \mathbb{E}\big| R_N(\theta, X_0, X_1) \big|^2\, U(d\theta) = \sum_{N=1}^{\infty} \frac{1}{N} \sum_{n=N+1}^{\infty} \big\| \mathcal{P}_1 g(X_n) \big\|^2 = \sum_{N=1}^{\infty} \frac{1}{N} \big\| \mathbb{E}[g(X_0) \mid X_{-N}] \big\|^2 < \infty.$$
So for almost all $\theta \in \Theta$ ($U$), $\sum_{N=1}^{\infty} N^{-1}\, \mathbb{E}|R_N(\theta, X_0, X_1)|^2 < \infty$. It follows from Kronecker's lemma that, as $N \to \infty$,
$$\frac{1}{N} \sum_{j=1}^{N} \mathbb{E}\big| R_j(\theta, X_0, X_1) \big|^2 \to 0, \qquad (15)$$
which yields
$$\mathbb{E}\Big| \frac{1}{N} \sum_{j=1}^{N} R_j(\theta, X_0, X_1) \Big|^2 \to 0$$
by Cauchy's inequality. In other words,
$$s_N(\theta, X_0, X_1) := \frac{1}{N} \sum_{n=1}^{N} \mathcal{P}_1 S_n(\theta) \to \xi(\theta, X_0, X_1) \quad \text{in } L^2.$$
Clearly, $\mathbb{E}[s_N(\theta, X_0, X_1) \mid X_0] = 0$ a.s. ($\mathbb{P}$). Hence, as $N \to \infty$,
$$\mathbb{E}\big| \mathbb{E}[\xi(\theta, X_0, X_1) \mid X_0] \big| = \mathbb{E}\big| \mathbb{E}[\xi(\theta, X_0, X_1) - s_N(\theta, X_0, X_1) \mid X_0] \big| \le \mathbb{E}\big| \xi(\theta, X_0, X_1) - s_N(\theta, X_0, X_1) \big| \le \big\| \xi(\theta, X_0, X_1) - s_N(\theta, X_0, X_1) \big\| \to 0,$$
which proves (14).
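Remark (a one-line bridge, added here and not in the original text). Relation (14) is what makes $\{\xi(\theta, X_{j-1}, X_j)\}_{j \in \mathbb{Z}}$ a martingale difference sequence with respect to $\mathcal{F}_j = \sigma(\ldots, X_{j-1}, X_j)$, as used in Lemma 4 below:
$$\mathbb{E}\big[ \xi(\theta, X_{j-1}, X_j) \mid \mathcal{F}_{j-1} \big] = \mathbb{E}\big[ \xi(\theta, X_{j-1}, X_j) \mid X_{j-1} \big] = 0 \quad \text{a.s. } (\mathbb{P}),$$
where the first equality uses the Markov property and the second uses (14) together with stationarity.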

Lemma 4. Suppose (15) holds for a $\theta$ for which $\xi(\theta, X_0, X_1) \in L^2(\mathbb{P})$. Then
$$\frac{1}{\sqrt{n}} \begin{pmatrix} \Re\{S_n(\theta) - \mathbb{E}[S_n(\theta) \mid X_0]\} \\ \Im\{S_n(\theta) - \mathbb{E}[S_n(\theta) \mid X_0]\} \end{pmatrix} \Rightarrow N[0, \Sigma(\theta)], \qquad (16)$$
where $\Sigma(\theta) = \tfrac{1}{2} \|\xi(\theta, X_0, X_1)\|^2\, \mathrm{Id}_2$ if $\theta \ne 0, \pi$, and
$$\Sigma(\theta) = \|\xi(\theta, X_0, X_1)\|^2 \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} \quad \text{if } \theta = 0, \pi.$$

Proof of Lemma 4. We generalize the arguments in Woodroofe (1992), where the asymptotic normality for the case $\theta = 0$ was obtained. Let $\xi_n(\theta, x_0, x_1) = \mathbb{E}[S_n(\theta) \mid X_1 = x_1] - \mathbb{E}[S_n(\theta) \mid X_0 = x_0]$ and write
$$S_n(\theta) - \mathbb{E}[S_n(\theta) \mid X_0] = \sum_{j=1}^{n} e^{i\theta(j-1)} \xi(\theta, X_{j-1}, X_j) + \sum_{j=1}^{n} e^{i\theta(j-1)} \big[ \xi_{n-j+1}(\theta, X_{j-1}, X_j) - \xi(\theta, X_{j-1}, X_j) \big]. \qquad (17)$$
Now we claim that (15) implies that the second sum in the preceding display is $o_{\mathbb{P}}(\sqrt{n})$. Indeed, by Lemma 3, $\mathbb{E}\big[ \xi_{n-j+1}(\theta, X_{j-1}, X_j) - \xi(\theta, X_{j-1}, X_j) \mid X_{j-1} \big] = 0$ almost surely ($\mathbb{P}$). Then it suffices to establish
$$\frac{1}{n} \sum_{j=1}^{n} \big\| \xi_{n-j+1}(\theta, X_{j-1}, X_j) - \xi(\theta, X_{j-1}, X_j) \big\|^2 \to 0, \qquad (18)$$
which follows from (15) via the stationarity of $X_j$. By Lemma 3, $\xi(\theta, X_{j-1}, X_j)$, $j \in \mathbb{Z}$, is a martingale difference sequence; hence we can use the martingale central limit theorem to deduce (16) from (17). To this end, let
$$\xi(\theta, X_{j-1}, X_j) = A_j(\theta) + i B_j(\theta),$$
$$\alpha_j(\theta) = A_j(\theta) \cos[(j-1)\theta] - B_j(\theta) \sin[(j-1)\theta],$$
$$\beta_j(\theta) = B_j(\theta) \cos[(j-1)\theta] + A_j(\theta) \sin[(j-1)\theta].$$

Then $A_j(\theta)$, $B_j(\theta)$, $\alpha_j(\theta)$ and $\beta_j(\theta)$ form martingale difference sequences, and
$$\sum_{j=1}^{n} e^{i\theta(j-1)} \xi(\theta, X_{j-1}, X_j) = \sum_{j=1}^{n} \big[ \alpha_j(\theta) + i \beta_j(\theta) \big].$$
For a fixed $0 \le \theta_0 < 2\pi$, let $D_j = \alpha_j(\theta) \cos\theta_0 + \beta_j(\theta) \sin\theta_0$. Then the $D_j$ are again martingale differences. By the Cramér-Wold device, it suffices to show that
$$\frac{1}{\sqrt{n}} \sum_{j=1}^{n} D_j \Rightarrow N[0, \sigma^2(\theta, \theta_0)], \quad \text{where} \quad \sigma^2(\theta, \theta_0) = (\cos\theta_0, \sin\theta_0)\, \Sigma(\theta) \begin{pmatrix} \cos\theta_0 \\ \sin\theta_0 \end{pmatrix}.$$
We now apply the martingale central limit theorem to $\sum_{j=1}^{n} D_j/\sqrt{n}$. The Lindeberg condition is automatically satisfied. Since $\mathbb{E}[\xi^2(\theta, X_{j-1}, X_j) \mid \mathcal{F}_{j-1}]$, $j \in \mathbb{Z}$, forms a stationary and ergodic sequence in $L^1(\mathbb{P})$, by Lemma 5 the limit
$$V(\theta) = \lim_{n \to \infty} \frac{1}{n} \sum_{j=1}^{n} \mathbb{E}\big[ \xi^2(\theta, X_{j-1}, X_j) \mid \mathcal{F}_{j-1} \big] e^{2(j-1)i\theta} \qquad (19)$$
exists and equals $0$ almost surely ($\mathbb{P}$) if $\theta \ne 0, \pi$, and equals $\|\xi(\theta, X_0, X_1)\|^2$ if $\theta = 0$ or $\pi$. After some algebra,
$$\Sigma_{11}(\theta) = \lim_{n \to \infty} \frac{1}{n} \sum_{j=1}^{n} \mathbb{E}[\alpha_j^2(\theta) \mid \mathcal{F}_{j-1}] = \frac{\|\xi(\theta, X_0, X_1)\|^2}{2} + \frac{\Re V(\theta)}{2},$$
$$\Sigma_{22}(\theta) = \lim_{n \to \infty} \frac{1}{n} \sum_{j=1}^{n} \mathbb{E}[\beta_j^2(\theta) \mid \mathcal{F}_{j-1}] = \frac{\|\xi(\theta, X_0, X_1)\|^2}{2} - \frac{\Re V(\theta)}{2},$$
$$\Sigma_{12}(\theta) = \lim_{n \to \infty} \frac{1}{n} \sum_{j=1}^{n} \mathbb{E}[\alpha_j(\theta) \beta_j(\theta) \mid \mathcal{F}_{j-1}] = \frac{\Im V(\theta)}{2},$$
almost surely ($\mathbb{P}$), which completes the proof.

Lemma 5. Let $\eta_n$, $n \in \mathbb{Z}$, be a real-valued, stationary and ergodic sequence with $\mathbb{E}|\eta_1| < \infty$. Then for all $0 < \alpha < 2\pi$,
$$\frac{1}{n} \sum_{t=1}^{n} e^{it\alpha} \eta_t \to 0 \qquad (20)$$
almost surely and in $L^1$.

Proof of Lemma 5. This result seems to be implied but not explicitly stated in the literature (see, for example, Doob (1953, pp. 469-470), Rozanov (1967, p. 60), Wiener and Wintner (1941)). Since $\sum_{t=1}^{n} e^{it\alpha} \mathbb{E}(\eta_t)/n \to 0$, we can assume without loss of generality that $\mathbb{E}(\eta_t) = 0$. If $\alpha/(2\pi)$ is a rational number, then (20) clearly follows from the classical ergodic theorem. In the case that $\alpha/(2\pi)$ is not a rational number, define the rotation $\rho_\alpha : \Theta \to \Theta$ by $\rho_\alpha(\theta) = \theta + \alpha \mod 2\pi$. Then $\rho_\alpha$ is measure-preserving and ergodic. Let the random variable $\vartheta \sim \mathrm{Uniform}(\Theta)$ be independent of $\eta_n$, $n \in \mathbb{Z}$. Since $\eta_t$ is stationary and ergodic, the sequence $\varphi_t = e^{i(t\alpha + \vartheta)} \eta_t$, $t \in \mathbb{Z}$, is then stationary (cf. Rozanov 1967) as well as ergodic on the product space $(\Theta \times \Omega, \mathcal{B}(\Theta) \otimes \mathcal{B}(\Omega), U \otimes \mathbb{P})$. Therefore the classical ergodic theorem asserts that $n^{-1} \sum_{t=1}^{n} \varphi_t \to 0$ almost surely ($U \otimes \mathbb{P}$) and in $L^1$, which clearly entails (20) since $U$ is a uniform distribution.

Proof of Theorem 1. (i) It follows from Lemmas 1 and 4. (ii) By Lemmas 1, 3 and 4, for almost all $\theta, \varphi \in \Theta$ ($U$), martingale differences $\xi(\tau, X_{t-1}, X_t) = A_t(\tau) + i B_t(\tau)$, $\tau = \theta, \varphi$, can be constructed, and it suffices to show that $n^{-1/2} \sum_{t=1}^{n} v_t \Rightarrow N[0, \Sigma(\theta, \varphi)]$, where the random vectors $v_t = (\alpha_t(\theta), \beta_t(\theta), \alpha_t(\varphi), \beta_t(\varphi))$ and the covariance matrix $\Sigma(\theta, \varphi) = \mathrm{Diag}(w(\theta), w(\theta), w(\varphi), w(\varphi))$, $w(\tau) = \tfrac{1}{2}\|\xi(\tau, X_0, X_1)\|^2$. To this end, by (i) and the Cramér-Wold device, it follows from
$$\frac{1}{n} \sum_{t=1}^{n} \big( \alpha_t(\theta)\alpha_t(\varphi),\ \beta_t(\theta)\alpha_t(\varphi),\ \alpha_t(\theta)\beta_t(\varphi),\ \beta_t(\theta)\beta_t(\varphi) \big) \to 0 \quad \text{a.s. } (\mathbb{P}),$$
which in view of Lemma 5 is valid provided that $\varphi \ne \theta$ and $\varphi + \theta \ne 2\pi$.

Proof of Theorem 2. Observe that (16) in Lemma 4 holds for $\theta$ for which (4) is satisfied. Therefore (6) follows in view of (5).

Proof of Proposition 1. Let $h_n = \sum_{t=1}^{n} e^{it\theta}$. Since $0 < \theta < 2\pi$, $h := \sup_n |h_n| < \infty$. By (7), $\sum_{n=1}^{\infty} h_n \big[ \mathcal{P}_1 g(X_n) - \mathcal{P}_1 g(X_{n+1}) \big]$ converges in $L^2(\mathbb{P})$. Thus (4) holds since $\|\mathcal{P}_1 g(X_n)\| \to 0$. To prove (5), let $\phi_t = \|\mathcal{P}_1 g(X_t) - \mathcal{P}_1 g(X_{t+1})\|$ and $\Phi_k = \sum_{t=k}^{\infty} \phi_t$. Then

(7) entails
$$\sum_{j=0}^{\infty} \Big[ \sum_{t=1}^{n} \phi_{t+j} \Big]^2 \le \Phi_1 \sum_{j=0}^{\infty} \sum_{t=1}^{n} \phi_{t+j} = o(n).$$
So (5) follows from
$$\big\| \mathcal{P}_{-j} S_n(\theta) \big\| \le h \sum_{t=1}^{n} \big\| \mathcal{P}_{-j} g(X_t) - \mathcal{P}_{-j} g(X_{t+1}) \big\| + h \big\| \mathcal{P}_{-j} g(X_{n+1}) \big\|,$$
$\mathbb{E}[S_n(\theta) \mid X_0] = \sum_{j=0}^{\infty} \mathcal{P}_{-j} S_n(\theta)$ and $\sum_{j=0}^{\infty} \|\mathcal{P}_{-j} g(X_{n+1})\|^2 \le \|g(X_0)\|^2$, by the orthogonality of the $\mathcal{P}_k$.

Proof of Proposition 2. Let $K_n(\lambda) = \sin^2(n\lambda/2)/[2\pi n \sin^2(\lambda/2)]$ be the Fejér kernel. It is well known that $\|S_n(\theta)\|^2/(2\pi n) = \int_0^{2\pi} K_n(\lambda - \theta) f(\lambda)\, d\lambda$ (cf. Anderson (1971), p. 454), which by the Fejér-Lebesgue theorem (cf. Bary (1964)) converges almost everywhere to $f(\theta)$. Therefore, by (17) and (18), for almost all $\theta$ (Lebesgue),
$$\lim_{n} \frac{1}{n} \big\| \mathbb{E}[S_n(\theta) \mid X_0] \big\|^2 = \lim_{n} \frac{1}{n} \big\| S_n(\theta) \big\|^2 - \lim_{n} \frac{1}{n} \big\| S_n(\theta) - \mathbb{E}[S_n(\theta) \mid X_0] \big\|^2 = 2\pi f(\theta) - \|\xi(\theta, X_0, X_1)\|^2. \qquad (21)$$
Notice that by (8),
$$\int_0^{2\pi} \sum_{k=1}^{\infty} \frac{1}{k^2} \big\| \mathbb{E}[S_k(\theta) \mid X_0] \big\|^2\, U(d\theta) = \sum_{k=1}^{\infty} \frac{1}{k^2} \sum_{i=1}^{k} \big\| \mathbb{E}[g(X_i) \mid X_0] \big\|^2 < \infty.$$
Hence by the Borel-Cantelli lemma, for almost all $\theta$ (Lebesgue), $k^{-1} \|\mathbb{E}[S_k(\theta) \mid X_0]\|^2 \to 0$ as $k \to \infty$, which by (21) entails $2\pi f(\theta) = \|\xi(\theta, X_0, X_1)\|^2 = 2\sigma^2(\theta)$.

Acknowledgment. The author is grateful to the referee for many helpful suggestions.

REFERENCES

Anderson, T. W. (1971). The Statistical Analysis of Time Series. Wiley, New York.
Bary, N. K. (1964). A Treatise on Trigonometric Series. Macmillan, New York.
Brockwell, P. J. and Davis, R. A. (1991). Time Series: Theory and Methods. Springer, New York.

Carleson, L. (1966). On convergence and growth of partial sums of Fourier series. Acta Math. 116, 135-157.
Chow, Y. S. and Teicher, H. (1988). Probability Theory. 2nd ed. Springer, New York.
Diaconis, P. and Freedman, D. (1999). Iterated random functions. SIAM Rev. 41, 45-76.
Doob, J. (1953). Stochastic Processes. Wiley, New York.
Gray, H. L., Zhang, N.-F. and Woodward, W. A. (1989). On generalized fractional processes. J. Time Ser. Anal. 10, 233-257.
Ibragimov, I. A. and Linnik, Yu. V. (1971). Independent and Stationary Sequences of Random Variables. Wolters-Noordhoff, Groningen.
Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer, London.
Olshen, R. A. (1967). Asymptotic properties of the periodogram of a discrete stationary process. J. Appl. Probab. 4, 508-528.
Rootzén, H. (1976). Gordin's theorem and the periodogram. J. Appl. Probab. 13, 365-370.
Rosenblatt, M. (1981). Limit theorems for Fourier transforms of functionals of Gaussian sequences. Z. Wahrsch. Verw. Gebiete 55, 123-132.
Rosenblatt, M. (1985). Stationary Sequences and Random Fields. Birkhäuser, Boston.
Rozanov, Yu. A. (1967). Stationary Random Processes. Holden-Day, San Francisco.
Terrin, N. and Hurvich, C. M. (1994). An asymptotic Wiener-Itô representation for the low frequency ordinates of the periodogram of a long memory time series. Stochastic Process. Appl. 54, 297-307.
Tong, H. (1990). Non-linear Time Series: A Dynamical System Approach. Oxford University Press, Oxford.
Walker, A. M. (1965). Some asymptotic results for the periodogram of a stationary time series. J. Austral. Math. Soc. 5, 107-128.
Walker, A. M. (2000). Some results concerning the asymptotic distribution of sample Fourier transforms and periodograms for a discrete-time stationary process with a continuous spectrum. J. Time Ser. Anal. 21, 95-109.

Woodroofe, M. (1992). A central limit theorem for functions of a Markov chain with applications to shifts. Stochastic Process. Appl. 41, 33-44.
Wu, W. B. and Woodroofe, M. (2000). A central limit theorem for iterated random functions. J. Appl. Probab. 37, 748-755.
Yajima, Y. (1989). A central limit theorem of Fourier transforms of strongly dependent stationary processes. J. Time Ser. Anal. 10, 375-383.

Department of Statistics, The University of Chicago, 5734 S. University Avenue, Chicago, IL 60637, USA
wbwu@galton.uchicago.edu