Central limit theorems for ergodic continuous-time Markov chains with applications to single birth processes


Front. Math. China 2015, 10(4): 933-947
DOI 10.1007/s11464-015-0488-5

Central limit theorems for ergodic continuous-time Markov chains with applications to single birth processes

Yuanyuan LIU 1, Yuhui ZHANG 2

1 School of Mathematics and Statistics, New Campus, Central South University, Changsha 410083, China
2 School of Mathematical Sciences, Beijing Normal University, Laboratory of Mathematics and Complex Systems, Ministry of Education, Beijing 100875, China

© Higher Education Press and Springer-Verlag Berlin Heidelberg 2015

Abstract  We obtain sufficient criteria for central limit theorems (CLTs) for ergodic continuous-time Markov chains (CTMCs). We apply the results to establish CLTs for continuous-time single birth processes. Moreover, we present an explicit expression of the time average variance constant for a single birth process whenever a CLT exists. Several examples are given to illustrate these results.

Keywords  Markov process, single birth processes, central limit theorem (CLT), ergodicity
MSC  60J27, 60F05

1 Introduction

Let $\{X_t, t \in \mathbb{R}_+\}$ be a continuous-time Markov chain (CTMC) on a countable state space $E$ with the $Q$-matrix $Q = (q_{ij})$ and the transition function $P_t(i,j)$. Throughout this paper, we assume that $Q$ is irreducible, totally stable, and regular, which implies that $Q$ is conservative and the $Q$-process is unique. We further assume that $X_t$ is positive recurrent with the unique invariant distribution $\pi$. Then we have
$$\|P_t(i,\cdot) - \pi\| := \sum_{j \in E} |P_t(i,j) - \pi_j| \to 0, \quad t \to +\infty,$$
for any $i \in E$. Define
$$\tau_i = \inf\{t > 0 : X_t = i\}$$

Received January 26, 2015; accepted April 10, 2015
Corresponding author: Yuanyuan LIU, E-mail: liuyy@csu.edu.cn

to be the first hitting time on the state $i$. Let $J_1$ be the first jump time of the process $X_t$, and define
$$\delta_i = \inf\{t > J_1 : X_t = i\}$$
to be the first return time on the state $i$. Define $\pi(g) = \sum_{i \in E} \pi_i g_i$ for a function $g$ on $E$. When $\pi(|f|) < +\infty$, by [19, Proposition 4.2 (ii)], we know that the sample mean $S(t)$ is well defined by
$$S(t) = \frac{1}{t} \int_0^t f(X_s)\,ds, \quad t \ge 0.$$
Moreover, it is known that the strong law of large numbers holds, i.e., $S(t) \to \pi(f)$ with probability 1 as $t \to +\infty$. We say that a CLT holds if there exists a scaling constant $\sigma^2(f) < +\infty$ such that
$$t^{1/2}\,[S(t) - \pi(f)] \Rightarrow \sigma(f)\,N(0,1), \quad t \to +\infty,$$
for any initial distribution, where $N(0,1)$ denotes the standard normal distribution, and $\Rightarrow$ means convergence in distribution. The (deterministic) constant $\sigma^2(f)$ is called the time average variance constant. It is a fundamental issue to find conditions on $X_t$ and $f$ under which a CLT holds.

Let $\bar f := f - \pi(f)$. From [8], we immediately know that if $\pi(|f|) < +\infty$, then a CLT holds if and only if
$$E_i\Big[\Big(\int_0^{\delta_i} \bar f(X_s)\,ds\Big)^2\Big] < +\infty$$
for any $i \in E$. Obviously, this condition holds whenever
$$E_i\Big[\Big(\int_0^{\delta_i} |\bar f(X_s)|\,ds\Big)^2\Big] < +\infty. \tag{1.1}$$
It is hard to check the above two conditions directly to verify a CLT for Markov models, since both expressions involve the function $f$ and the first return time $\delta_i$ simultaneously. It is well known that the moments of the first return time are closely related to ergodicity (e.g., [6]). We will establish three easily checked criteria in Theorem 3.1 for (1.1) in terms of conditions on the function $f$ and ergodicity separately; the first of these is new, and the other two are analogous to those for discrete-time Markov chains (DTMCs) (see [13,20]).
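The objects above are easy to explore numerically. The following sketch is an illustration only and not part of the paper: the 3-state $Q$-matrix and the test function $f$ are hypothetical choices. It simulates one long path of a CTMC, computes the sample mean $S(t)$, and compares it with $\pi(f)$, where $\pi$ is obtained from $\pi Q = 0$.

```python
import numpy as np

# Hypothetical irreducible conservative Q-matrix on E = {0, 1, 2}
# (rows sum to zero); any such finite Q-matrix would do here.
Q = np.array([[-2.0, 1.0, 1.0],
              [ 1.0, -3.0, 2.0],
              [ 0.5, 0.5, -1.0]])
f = np.array([0.0, 1.0, 2.0])  # an illustrative function f on E

# Invariant distribution: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]
pi_f = pi @ f

def sample_mean(T, rng):
    """Simulate the chain up to time T; return S(T) = (1/T) int_0^T f(X_s) ds."""
    x, t, integral = 0, 0.0, 0.0
    while t < T:
        rate = -Q[x, x]
        hold = rng.exponential(1.0 / rate)   # exponential holding time at x
        integral += f[x] * min(hold, T - t)  # accumulate int f(X_s) ds up to T
        t += hold
        if t < T:                            # jump according to off-diagonal row
            p = Q[x].clip(min=0.0) / rate
            x = rng.choice(3, p=p)
    return integral / T

rng = np.random.default_rng(0)
S = sample_mean(10000.0, rng)
print(pi_f, S)  # by the strong law of large numbers, S should be close to pi(f)
```

With $t = 10^4$ the fluctuation of $S(t)$ around $\pi(f)$ is of order $t^{-1/2}$, which is exactly the scaling in the CLT above.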

This research is also motivated by investigating CLTs for a continuous-time single birth process (see [6]) on the state space $E = \mathbb{Z}_+$, which is a CTMC with the following $Q$-matrix:
$$Q = \begin{pmatrix} -q_0 & q_{01} & & & \\ q_{10} & -q_1 & q_{12} & & \\ q_{20} & q_{21} & -q_2 & q_{23} & \\ \vdots & & & & \ddots \end{pmatrix}.$$
The single birth processes, also called continuous-time Markov processes skip-free to the right ([1]), are an important class of Markov processes, which cover many interesting models such as the birth-death processes, the population processes, and the level-dependent queues. It has been shown by [18] that single birth processes can be used to investigate generalized Markov branching processes and multidimensional CTMCs. Single birth processes have been applied to modeling real problems in queues, biology, and so on. However, to model a real-world phenomenon, we have to judge whether the model is appropriate; that is, we need to test the model against the sample path, which raises a problem of statistical hypothesis testing. To attack this problem, we apply Theorem 3.1 and the known ergodicity results to obtain sufficient criteria for a CLT for a single birth process. Moreover, we obtain an explicit expression for the variance constant, which is extremely important since it is a key parameter for determining the confidence interval in statistical hypothesis testing. In the literature, the variance constant has been investigated for the continuous-time birth-death process on a finite state space (see, e.g., [21]), the continuous-time queues driven by a Markovian marked point process ([3]), the discrete-time waiting time of the M/G/1 queue [7] and the PH/PH/1 queue [4], the discrete-time birth-death process ([14]) and single birth process ([12]), the continuous-time matrix-analytical models ([15]), and so on.

2 Preliminaries on ergodicity

In this section, we review the definitions and known criteria of several types of ergodicity.
We also state the known ergodicity results for continuous-time single birth processes.

A positive recurrent CTMC $X_t$ is said to be strongly ergodic if
$$\sup_{i \in E} \|P_t(i,\cdot) - \pi\| \to 0, \quad t \to +\infty.$$
From [6], we know that $X_t$ is strongly ergodic if and only if
$$\sup_{k \in E} E_k[\tau_i] < +\infty$$

for some $i \in E$. A positive recurrent CTMC $X_t$ is said to be exponentially ergodic if
$$e^{\gamma t}\,\|P_t(i,\cdot) - \pi\| \to 0, \quad t \to +\infty,$$
for some $\gamma > 0$ and for any $i \in E$. From [6] again, we know that the chain $X_t$ is exponentially ergodic if and only if $E_i[e^{r\delta_i}] < +\infty$ for some $r > 0$ and some $i \in E$. For a positive real number $\ell$ such that $\ell \ge 1$, a positive recurrent CTMC $X_t$ is said to be $\ell$-ergodic (see [11,17]) if
$$t^{\ell-1}\,\|P_t(i,\cdot) - \pi\| \to 0, \quad t \to +\infty,$$
for any $i \in E$. It follows from [16] that the chain $X_t$ is $\ell$-ergodic if and only if $E_i[\delta_i^\ell] < +\infty$ for some $i \in E$.

The equivalent criteria of the three types of ergodicity can be characterized in terms of different drift functions on the intensity matrix $Q$ (see, e.g., [6,16]). Sufficient and necessary criteria for strong ergodicity, exponential ergodicity, and $\ell$-ergodicity have been established for single birth processes by [22], [18], and [24], respectively. The invariant distribution is obtained by [23]. We do not present those results here in order to avoid introducing too many notations. We note that
$$\text{strong ergodicity} \implies \text{exponential ergodicity} \implies \ell\text{-ergodicity}$$
for any $\ell \in \mathbb{R}_+$ with $\ell \ge 1$. Generally speaking, among the three types of ergodicity, strong ergodicity is the easiest to investigate for a CTMC. Strong ergodicity is a restrictive condition for DTMCs, since a DTMC on an infinite countable state space fails to be strongly ergodic whenever its transition matrix $P = (P_{ij})$ is a Feller transition matrix (see [10, Section 2] for details), i.e.,
$$\lim_{i \to +\infty} P_{ij} = 0$$
for any fixed $j \in E$. However, the situation is rather different for CTMCs with unbounded $Q$-matrices, for which strong ergodicity may hold under feasible conditions.

3 CLTs for general CTMCs

In the following, we present sufficient criteria for a CLT for polynomially, exponentially, or strongly ergodic CTMCs. The proofs of these results are postponed to Section 5.
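The return-time characterization of strong ergodicity quoted in Section 2 can be probed numerically before turning to the criteria themselves. The sketch below is an illustration only, not part of the paper: it uses the birth-death rates $b_0 = 1$, $b_i = a_i = i^\gamma$ of Example 3.3 below with $\gamma = 3$, and the finite truncation level is a computational artifact. It computes the mean hitting times $E_k[\tau_0]$ by first-step analysis, i.e., by solving $\sum_j q_{kj} u_j = -1$ for $k \ne 0$ with $u_0 = 0$.

```python
import numpy as np

# Truncated birth-death chain from Example 3.3: b_0 = 1, b_i = a_i = i**gamma.
# Truncating at level N (the top state has no birth rate) is an assumption made
# purely for illustration; the paper works on the infinite state space.
def mean_hitting_times(gamma, N):
    """Solve sum_j q_kj u_j = -1 for k != 0, u_0 = 0, so that u_k = E_k[tau_0]."""
    Q = np.zeros((N + 1, N + 1))
    Q[0, 1] = 1.0
    for i in range(1, N + 1):
        Q[i, i - 1] = float(i) ** gamma          # death rate a_i
        if i < N:
            Q[i, i + 1] = float(i) ** gamma      # birth rate b_i
    np.fill_diagonal(Q, -Q.sum(axis=1))
    # Drop the row and column of the target state 0 and solve the linear system.
    u = np.linalg.solve(Q[1:, 1:], -np.ones(N))
    return np.concatenate([[0.0], u])

# For gamma > 2 the chain is strongly ergodic: sup_k E_k[tau_0] remains bounded
# as the truncation level grows (for gamma < 2 it would blow up instead).
u_small = mean_hitting_times(3.0, 200)
u_large = mean_hitting_times(3.0, 400)
print(u_small.max(), u_large.max())
```

Boundedness of $u_k = E_k[\tau_0]$ in $k$ is exactly the criterion $\sup_{k \in E} E_k[\tau_i] < +\infty$ for strong ergodicity stated in Section 2; doubling the truncation level barely changes $\max_k u_k$ here, consistent with $\gamma = 3 > 2$.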

Theorem 3.1  Let $X_t$ be a positive recurrent CTMC with the invariant distribution $\pi$. Let $f$ be a function on $E$ such that $\pi(|f|) < +\infty$, and let $i$ be any fixed state in $E$. Then any one of the following conditions is sufficient for a CLT to hold:
(i) $X_t$ is $\ell$-ergodic and $\pi(|f|^{2+\eta}) < +\infty$ for some $\eta > 0$ and $\ell > 2 + 4/\eta$;
(ii) $X_t$ is exponentially ergodic and $\pi(|f|^{2+\eta}) < +\infty$ for some $\eta > 0$;
(iii) $X_t$ is strongly ergodic and $\pi(|f|^2) < +\infty$.
Moreover, if any of the above conditions holds, then the variance constant is given by
$$\sigma^2(f) = 2\pi(\bar f \hat f) = 2\sum_{i \in E} \pi_i \bar f_i \hat f_i, \tag{3.1}$$
where the function $\hat f$, given by
$$\hat f_k = E_k\Big[\int_0^{\delta_i} \bar f(X_s)\,ds\Big], \quad k \in E,$$
is a solution of the Poisson equation $Qx = -\bar f$. Note that $\hat f_i = 0$.

Remark 3.2  (1) The discrete-time analogues of (ii) and (iii) can be found in the survey papers [13] and [20]. However, the arguments in the proof are new.
(2) As we note in Section 2, strong ergodicity rarely holds for discrete-time Markov chains, while it is the easiest ergodicity condition to check for CTMCs. Hence, assertion (iii) has a wider application potential than the corresponding discrete-time result.
(3) Result (i) is new and is useful for polynomially but not exponentially ergodic Markov processes. The proof of (i) is modified from the proof of [20, Lemma 34], by extending it to the continuous-time case and weakening the geometric ergodicity condition there to a polynomial ergodicity condition.

We now give an example to illustrate Theorem 3.1 under various ergodic situations.

Example 3.3  Let $X_t$ be a birth-death process with birth coefficients $b_i$ and death coefficients $a_i$ given by $b_0 = 1$ and $b_i = a_i = i^\gamma$, $i \ge 1$. From [5,17], we know that $X_t$ is $\ell$-ergodic if and only if $\gamma \in (1,2)$ and $\gamma > 2 - 1/\ell$; $X_t$ is exponentially ergodic if and only if $\gamma = 2$; and $X_t$ is strongly ergodic if and only if $\gamma > 2$. Now, take $f_i = i^\alpha$, where $\alpha$ is a positive constant to be determined. Let $\eta$ be an arbitrarily given positive constant.
It is easy to obtain the following results.
(1) Let $\ell$, $\gamma$, and $\alpha$ be such that
$$\ell > 2 + \frac{4}{\eta}, \quad 2 - \frac{1}{\ell} < \gamma < 2, \quad \alpha < \frac{\gamma - 1}{2 + \eta}.$$
Then Theorem 3.1 (i) holds.

(2) Let $\gamma$ and $\alpha$ be such that $\gamma = 2$ and $\alpha < 1/(2+\eta)$. Then Theorem 3.1 (ii) holds.
(3) Let $\gamma$ and $\alpha$ be such that $\gamma > 2$ and $\alpha < (\gamma - 1)/2$. Then Theorem 3.1 (iii) holds.

4 Application to single birth processes

In this section, we apply Theorem 3.1 to single birth processes. Define
$$q_n^{(k)} = \sum_{j=0}^{k} q_{nj}, \quad 0 \le k < n,$$
$$F_i^{(i)} = 1, \quad F_n^{(i)} = \frac{1}{q_{n,n+1}} \sum_{k=i}^{n-1} q_n^{(k)} F_k^{(i)}, \quad n > i.$$

Theorem 4.1  Let $X_t$ be a positive recurrent single birth process with the invariant distribution $\pi$. Let $f$ be a function on $E$ such that $\pi(|f|) < +\infty$. Suppose that any one of the conditions in Theorem 3.1 (i)-(iii) is satisfied. Then a CLT holds and the variance constant is given by
$$\sigma^2(f) = 2\sum_{n=0}^{+\infty} \Big(\sum_{k=0}^{n} \frac{F_n^{(k)} \bar f_k}{q_{k,k+1}}\Big) \Big(\sum_{k=0}^{n} \pi_k \bar f_k\Big). \tag{4.1}$$

Remark 4.2  (i) [12, Theorem 3.1] derives the explicit expression of the variance constant for discrete-time single birth processes. Note that there is a difference between the two expressions; specifically, one more term, $\sum_{n \in E} \pi_n \bar f_n^2$, appears in the discrete-time expression.
(ii) The proof of Theorem 4.1 is a little different from that of [12], and it can be modified to give a shorter and more complete proof of [12, Theorem 2.1].

Remark 4.3  When $X_t$ is a birth-death process under the condition of Theorem 4.1, we obtain the variance constant
$$\sigma^2(f) = 2\sum_{n=0}^{+\infty} \frac{1}{b_n \mu_n} \Big(\sum_{k=0}^{n} \mu_k \bar f_k\Big) \Big(\sum_{k=0}^{n} \pi_k \bar f_k\Big),$$
where
$$\mu_0 = 1, \quad \mu_i = \frac{b_0 b_1 \cdots b_{i-1}}{a_1 a_2 \cdots a_i}, \quad i \ge 1.$$

Example 4.4  Let $q_{n,n+1} = 1$ for all $n \ge 0$, $q_{10} = 1$, $q_{n,n-2} = 1$ for all $n \ge 2$, and $q_{ij} = 0$ for other $i \ne j$. The single birth process is exponentially ergodic but not strongly ergodic (refer to [18]). By computation, we know that $\{F_n^{(k)}\}_{n \ge k}$ are Fibonacci numbers for every $k \ge 0$:
$$F_{k+n}^{(k)} = \frac{1}{\sqrt{5}}\big(A^{n+1} - (-B)^{n+1}\big), \quad n, k \ge 0,$$

where
$$A = \frac{\sqrt{5}+1}{2}, \quad B = \frac{\sqrt{5}-1}{2}.$$
The stationary distribution is
$$\pi_n = (1-B)B^n, \quad n \ge 0,$$
which is obtained in [23]. Let $f_i = i$. Then we have
$$\pi(f) = \pi(|f|) = A, \quad \pi(|f|^3) = 6A^4 + A.$$
So by Theorem 4.1, (4.1) holds. Now,
$$\sum_{k=0}^{n} \frac{F_n^{(k)} \bar f_k}{q_{k,k+1}} = -n + (-B)^{n+3} + A - 3, \quad \sum_{k=0}^{n} \pi_k \bar f_k = -(n+1)B^{n+1}, \quad n \ge 0.$$
Hence, from (4.1), one gets
$$\sigma^2(f) = \frac{44\sqrt{5} + 98}{5}.$$
Example 4.4 is the so-called uniform catastrophes model in population models.

Example 4.5  Let $q_{n,n+1} = n+1$ for all $n \ge 0$, $q_{nj} = 1$ for all $0 \le j < n$, and $q_{ij} = 0$ for other $i \ne j$. The single birth process is strongly ergodic (see [25]). By computation, we know that
$$F_k^{(k)} = 1, \quad F_n^{(k)} = \frac{2^{n-k-1}(k+1)}{n+1}, \quad n > k.$$
The stationary distribution is
$$\pi_n = 2^{-n-1}, \quad n \ge 0.$$
Let $f_i = i$. Then we have
$$\pi(f) = \pi(|f|) = 1, \quad \pi(|f|^2) = 3.$$
So by Theorem 4.1, (4.1) holds. Now, we have
$$\sum_{k=0}^{n} \frac{F_n^{(k)} \bar f_k}{q_{k,k+1}} = -\frac{1}{n+1}, \quad \sum_{k=0}^{n} \pi_k \bar f_k = -(n+1)\,2^{-(n+1)}, \quad n \ge 0.$$
Hence, we derive that
$$\sigma^2(f) = 2.$$
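Formula (4.1) lends itself to direct numerical verification. The sketch below is an illustration, not part of the paper: it builds the quantities $q_n^{(k)}$ and $F_n^{(i)}$ from their recursive definitions in Section 4 for the $Q$-matrix of Example 4.5, uses the stationary distribution $\pi_n = 2^{-n-1}$ and $\bar f_n = n - 1$ computed above, and evaluates a finite truncation of (4.1) in exact rational arithmetic (the cutoff $N$ is an assumption of the sketch; the omitted tail is geometrically small).

```python
from fractions import Fraction

# Example 4.5: q_{n,n+1} = n+1, q_{nj} = 1 for j < n. We evaluate the
# variance-constant formula (4.1) on a finite truncation at level N.
N = 40

def q(n, j):
    if j == n + 1:
        return Fraction(n + 1)
    if j < n:
        return Fraction(1)
    return Fraction(0)

# F_i^{(i)} = 1,  F_n^{(i)} = (1/q_{n,n+1}) * sum_{k=i}^{n-1} q_n^{(k)} F_k^{(i)}
F = {}
for i in range(N + 1):
    F[i, i] = Fraction(1)
    for n in range(i + 1, N + 1):
        qnk = Fraction(0)          # running partial sum q_n^{(k)} = sum_{j<=k} q_{nj}
        s = Fraction(0)
        for k in range(n):         # accumulate sum_{k=i}^{n-1} q_n^{(k)} F_k^{(i)}
            qnk += q(n, k)
            if k >= i:
                s += qnk * F[k, i]
        F[n, i] = s / q(n, n + 1)

pi = [Fraction(1, 2 ** (n + 1)) for n in range(N + 1)]   # stationary distribution
fbar = [Fraction(n - 1) for n in range(N + 1)]           # f_n = n, pi(f) = 1

# Truncation of (4.1)
sigma2 = 2 * sum(
    sum(F[n, k] * fbar[k] / q(k, k + 1) for k in range(n + 1))
    * sum(pi[k] * fbar[k] for k in range(n + 1))
    for n in range(N + 1)
)
print(float(sigma2))  # approaches sigma^2(f) = 2 as N grows
```

Here the recursion reproduces the closed form $F_n^{(k)} = 2^{n-k-1}(k+1)/(n+1)$ stated above, and the truncated sum agrees with $\sigma^2(f) = 2$ up to a tail of order $2^{-N}$.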

5 Proof of Theorem 3.1

Note that assertion (ii) follows from (i) directly by observing that if $E_i[e^{r\delta_i}] < +\infty$ for some $r > 0$, then $E_i[\delta_i^\ell] < +\infty$ for any $\ell > 0$. Hence, we only need to prove (i), (iii), and the last part of Theorem 3.1. We prove Theorem 3.1 (i) and (iii) by verifying condition (1.1). To proceed with our arguments, we need the following lemma, which presents a functional Cauchy-Schwarz inequality for CTMCs.

Lemma 5.1  Let $V$ be a nonnegative function. If
$$\int_0^{+\infty} V(t, X_t)\,dt, \quad \int_0^{+\infty} \sqrt{E_x[V^2(t, X_t)]}\,dt$$
exist, then we have
$$E_x\Big[\Big(\int_0^{+\infty} V(t, X_t)\,dt\Big)^2\Big] \le \Big(\int_0^{+\infty} \sqrt{E_x[V^2(t, X_t)]}\,dt\Big)^2.$$

Proof  Fix any $b > 0$. Divide the interval $[0, b]$ into $n$ parts: $[s_0, s_1], [s_1, s_2], \ldots, [s_{n-1}, s_n]$, where $s_0 = 0$ and $s_n = b$. Let
$$\Delta s_i = s_i - s_{i-1}, \quad 1 \le i \le n, \quad d = \max_{1 \le i \le n} \Delta s_i.$$
We simply write $V_i = V(t_i, X_{t_i})$ for some tag $t_i \in [s_{i-1}, s_i]$. Since both
$$\int_0^{+\infty} V(t, X_t)\,dt, \quad \int_0^{+\infty} \sqrt{E_x[V^2(t, X_t)]}\,dt$$
exist, we have
$$E_x\Big[\Big(\int_0^b V(t, X_t)\,dt\Big)^2\Big] = E_x\Big[\lim_{d \to 0}\Big(\sum_{i=1}^{n} V_i \Delta s_i\Big)^2\Big] \le \liminf_{d \to 0} E_x\Big[\Big(\sum_{i=1}^{n} V_i \Delta s_i\Big)^2\Big] \le \liminf_{d \to 0} \Big(\sum_{i=1}^{n} \sqrt{E_x[V_i^2]}\,\Delta s_i\Big)^2 = \Big(\int_0^b \sqrt{E_x[V^2(t, X_t)]}\,dt\Big)^2, \tag{5.1}$$
where the first inequality follows from the Fatou lemma, and the second inequality follows from the Cauchy-Schwarz inequality, i.e.,
$$E\Big[\Big(\sum_{n=0}^{+\infty} X_n\Big)^2\Big] \le \Big(\sum_{n=0}^{+\infty} \sqrt{E[X_n^2]}\Big)^2.$$

We obtain the assertion by letting $b \to +\infty$ in (5.1).

5.1 Proof of Theorem 3.1 (i)

Let $V(t, X_t) = I_{\{t < \delta_i\}}\,|\bar f(X_t)|$. By Lemma 5.1, we have
$$E_i\Big[\Big(\int_0^{\delta_i} |\bar f(X_s)|\,ds\Big)^2\Big] \le \Big(\int_0^{+\infty} \sqrt{E_i\big[I_{\{t < \delta_i\}}\,|\bar f(X_t)|^2\big]}\,dt\Big)^2. \tag{5.2}$$
Let
$$p = 1 + \frac{2}{\eta}, \quad q = 1 + \frac{\eta}{2}.$$
It follows from the Hölder inequality that
$$E_i\big[I_{\{t < \delta_i\}}\,|\bar f(X_t)|^2\big] \le \big(E_i[I_{\{t < \delta_i\}}]\big)^{1/p}\,\big(E_i[|\bar f(X_t)|^{2q}]\big)^{1/q}. \tag{5.3}$$
Since $\pi(|\bar f|^{2+\eta}) < +\infty$, we have
$$\big(E_i[|\bar f(X_t)|^{2q}]\big)^{1/q} \le \Big(\frac{1}{\pi_i}\,E_\pi[|\bar f(X_t)|^{2q}]\Big)^{1/q} = \Big(\frac{1}{\pi_i}\,\pi(|\bar f|^{2q})\Big)^{1/q} < +\infty. \tag{5.4}$$
It follows from the Markov inequality that
$$E_i[I_{\{t < \delta_i\}}] = P_i[t < \delta_i] \le \frac{E_i[\delta_i^\ell]}{t^\ell}. \tag{5.5}$$
From (5.2)-(5.5), we obtain (1.1): the integrand on the right-hand side of (5.2) decays like $t^{-\ell/(2p)}$, and $\ell/(2p) > 1$ by the assumption $\ell > 2 + 4/\eta = 2p$, so the integral is finite.

5.2 Proof of Theorem 3.1 (iii)

To bound (1.1), we let the process start at the state $i$. Recall that $J_1$ is the first jump time. Define
$$T_0 = 0, \quad T_k := \inf\{t \ge T_{k-1} + J_1 : X_t = i\}, \quad k \ge 1.$$
Then $T_k$ denotes the $k$-th return time to $i$. Obviously, $X_t$ is an undelayed regenerative process with $T_i$, $i \ge 0$, as the regeneration time points, i.e., $\{X_t, T_i \le t < T_{i+1}\}$, $i \ge 0$, are independent and identically distributed. We divide the proof of this assertion into three steps.

(a) Since
$$E_x[\tau_i] = \int_0^{+\infty} t\,dP_x[\tau_i \le t] \ge s\,P_x[\tau_i > s], \quad s > 0,$$
and since
$$\sup_{x \in E} E_x[\tau_i] < +\infty,$$

then there exist constants $h < +\infty$ and $\beta < 1$ such that
$$\sup_{x \in E} P_x[\tau_i \ge h] \le \beta.$$
(b) Now, we can use the time length $h$ given by (a) to divide the interval $[J_1, \delta_i]$. Let $t_n = J_1 + nh$, $n \ge 0$. Let us define a Bernoulli random sequence $\{Z_n, n \ge 0\}$ as follows:
$$Z_0 = 0, \quad Z_n = \begin{cases} 1, & X_t = i \ \text{for some } t \in [t_{n-1}, t_n], \\ 0, & \text{otherwise}. \end{cases}$$
Let
$$N = \inf\{n \ge 1 : Z_n = 1\}$$
be the first time that the sequence $\{Z_n\}$ hits 1. Then we have
$$P[N = n] = \prod_{m=1}^{n-1} P[Z_m = 0] \cdot P[Z_n = 1] \le \beta^{n-1} \cdot 1 = \beta^{n-1}, \quad n \ge 1,$$
where we make the convention that
$$\prod_{m=1}^{0} P[Z_m = 0] = 1.$$
Thus, we have
$$J_1 + (N-1)h = t_{N-1} \le \delta_i \le t_N = J_1 + Nh. \tag{5.6}$$
(c) Now, we use (5.6) to bound $E_i[(\int_0^{\delta_i} |\bar f(X_s)|\,ds)^2]$. Writing $\lambda = q_i$ for the jump rate at $i$, so that $\int_0^{t_0} |\bar f(X_t)|\,dt = J_1 |\bar f(i)|$, we have
$$E_i\Big[\Big(\int_0^{\delta_i} |\bar f(X_s)|\,ds\Big)^2\Big] = E_i\Big[\Big(\int_0^{t_0} |\bar f(X_t)|\,dt + \int_{t_0}^{\delta_i} |\bar f(X_t)|\,dt\Big)^2\Big] \le E_i\Big[\Big(\int_0^{t_0} |\bar f(X_t)|\,dt\Big)^2\Big] + 2\,E_i\Big[\int_0^{t_0} |\bar f(X_t)|\,dt \int_{t_0}^{\delta_i} |\bar f(X_t)|\,dt\Big] + E_i\Big[\Big(\int_{t_0}^{\delta_i} |\bar f(X_t)|\,dt\Big)^2\Big] \le \frac{2}{\lambda^2}\,\bar f^2(i) + \frac{2|\bar f(i)|}{\lambda} \cdot \frac{\pi(|\bar f|)}{\lambda \pi_i} + D, \tag{5.7}$$
where
$$D = E_i\Big[\Big(\int_{t_0}^{\delta_i} |\bar f(X_t)|\,dt\Big)^2\Big].$$

Now, we focus on bounding $D$. Let
$$S_k = \int_{t_{k-1}}^{t_k} |\bar f(X_t)|\,dt, \quad k \ge 1.$$
Then we have
$$D \le E_i\Big[\Big(\int_{t_0}^{t_N} |\bar f(X_t)|\,dt\Big)^2\Big] = E_i\Big[\sum_{k=1}^{N} S_k^2\Big] + 2\,E_i\Big[\sum_{k=1}^{N} \sum_{m=k+1}^{N} S_k S_m\Big].$$
Conditioning on $J_1$ and applying Lemma 5.1 gives that for any $k \ge 1$,
$$E_\pi[S_k^2] = \int_0^{+\infty} E_\pi[S_k^2 \mid J_1 = s]\,dP[J_1 \le s] \le h^2\,\pi(\bar f^2),$$
which implies
$$E_i[S_k^2] \le \frac{1}{\pi_i}\,E_\pi[S_k^2] \le \frac{h^2\,\pi(\bar f^2)}{\pi_i} < +\infty.$$
By conditioning on $N$, we have
$$E_i\Big[\sum_{k=1}^{N} S_k^2\Big] = \sum_{n=1}^{+\infty} E_i\Big[\sum_{k=1}^{n} S_k^2 \,\Big|\, N = n\Big]\,P[N = n] \le \sum_{n=1}^{+\infty} \frac{n\,h^2\,\pi(\bar f^2)}{\pi_i}\,\beta^{n-1} < +\infty.$$
Similarly, we have
$$E_i\Big[\sum_{k=1}^{N} \sum_{m=k+1}^{N} S_k S_m\Big] \le \sum_{n=1}^{+\infty} \sum_{k=1}^{n} \sum_{m=k+1}^{n} \sqrt{E_i[S_k^2]}\,\sqrt{E_i[S_m^2]}\,\beta^{n-1} \le \sum_{n=1}^{+\infty} \frac{n^2}{2} \cdot \frac{h^2\,\pi(\bar f^2)}{\pi_i}\,\beta^{n-1} < +\infty, \tag{5.8}$$
where we use the Hölder inequality with $p = q = 2$ in the second inequality. The assertion follows immediately from (5.7) and (5.8).

5.3 Proof of the last part of Theorem 3.1

It follows from [8] and arguments similar to those in the proof of [3, Lemma 3.1] that
$$\sigma^2(f) = \frac{2}{E_i[\delta_i]}\,E_i\Big[\int_0^{\delta_i} \bar f(X_t)\,\hat f(X_t)\,dt\Big] = 2\pi(\bar f \hat f).$$
[9, Theorem 9.5.2] shows that the sequence $\{x_k, k \in E\}$, given by
$$x_k = E_k\Big[\int_0^{\delta_i} g(X_t)\,dt\Big],$$

is a solution of the following equations:
$$\sum_{k \in E} q_{jk}\,x_k = -g_j, \quad j \ne i, \tag{5.9}$$
where $g$ is a nonnegative function. [2, Theorem VI.1.2] shows that
$$E_i\Big[\int_0^{\delta_i} f(X_t)\,dt\Big] = \pi(f)\,E_i[\delta_i],$$
which implies that $\hat f_i = 0$. For a real number $a$, define
$$a^+ = \max\{a, 0\}, \quad a^- = \max\{-a, 0\}.$$
Obviously, $a = a^+ - a^-$. By considering
$$x_k^+ = E_k\Big[\int_0^{\delta_i} (\bar f(X_t))^+\,dt\Big], \quad x_k^- = E_k\Big[\int_0^{\delta_i} (\bar f(X_t))^-\,dt\Big],$$
we know that the sequence $\{\hat f_k = x_k^+ - x_k^-, k \in E\}$ satisfies the Poisson equation $Qx = -\bar f$. The proof of the theorem is finished.

6 Proof of Theorem 4.1

The first part of Theorem 4.1 follows from Theorem 3.1 directly. We only need to prove the expression (4.1) of the variance constant.

Lemma 6.1  Given a function $f$ on $E$, the sequence of functions $\{h_n, n \ge i\}$, defined by
$$h_i = \frac{f_i}{q_{i,i+1}}, \quad h_n = \frac{1}{q_{n,n+1}}\Big[f_n + \sum_{k=i}^{n-1} q_n^{(k)} h_k\Big], \quad n \ge i+1, \tag{6.1}$$
has the following equivalent representation:
$$h_n = \sum_{k=i}^{n} \frac{F_n^{(k)} f_k}{q_{k,k+1}}, \quad n \ge i. \tag{6.2}$$

Proof  We use induction. For $n = i$,
$$h_i = \frac{f_i}{q_{i,i+1}} = \frac{F_i^{(i)} f_i}{q_{i,i+1}} = \sum_{k=i}^{i} \frac{F_i^{(k)} f_k}{q_{k,k+1}}.$$

Assume that (6.2) holds for all $n \le m$. When $n = m+1$, from (6.1), we see that
$$h_{m+1} = \frac{f_{m+1}}{q_{m+1,m+2}} + \frac{1}{q_{m+1,m+2}} \sum_{k=i}^{m} q_{m+1}^{(k)} \sum_{l=i}^{k} \frac{F_k^{(l)} f_l}{q_{l,l+1}} = \frac{f_{m+1}}{q_{m+1,m+2}} + \sum_{l=i}^{m} \frac{f_l}{q_{l,l+1}}\Big(\frac{1}{q_{m+1,m+2}} \sum_{k=l}^{m} q_{m+1}^{(k)} F_k^{(l)}\Big) = \frac{f_{m+1}}{q_{m+1,m+2}} + \sum_{l=i}^{m} \frac{F_{m+1}^{(l)} f_l}{q_{l,l+1}} = \sum_{l=i}^{m+1} \frac{F_{m+1}^{(l)} f_l}{q_{l,l+1}}.$$
Hence, (6.2) holds for $n = m+1$. By induction, we know that (6.2) holds for all $n \ge i$.

Lemma 6.2  For a single birth $Q$-matrix $Q = (q_{ij})$, a finite solution $x = (x_i, i \in E)$ of Poisson's equation $Qx = -g$ with $x_j = 0$ is given by
$$x_i = \begin{cases} \displaystyle \sum_{n=i}^{j-1} \Big(\sum_{k=0}^{n} \frac{F_n^{(k)} g_k}{q_{k,k+1}}\Big), & i < j, \\[2mm] 0, & i = j, \\[2mm] \displaystyle -\sum_{n=j}^{i-1} \Big(\sum_{k=0}^{n} \frac{F_n^{(k)} g_k}{q_{k,k+1}}\Big), & i > j. \end{cases} \tag{6.3}$$

Proof  From Poisson's equation $Qx = -g$, one easily sees that
$$x_0 - x_1 = \frac{g_0}{q_{01}}, \quad x_n - x_{n+1} = \frac{1}{q_{n,n+1}}\Big[g_n + \sum_{k=0}^{n-1} q_n^{(k)}(x_k - x_{k+1})\Big], \quad n \ge 1.$$
So, by Lemma 6.1, we know that
$$x_n - x_{n+1} = \sum_{k=0}^{n} \frac{F_n^{(k)} g_k}{q_{k,k+1}}, \quad n \ge 0.$$
Note that
$$x_i = -\sum_{n=0}^{i-1} (x_n - x_{n+1}) \cdot (-1) + x_0 = x_0 - \sum_{n=0}^{i-1} \Big(\sum_{k=0}^{n} \frac{F_n^{(k)} g_k}{q_{k,k+1}}\Big), \quad i \ge 1.$$
Since $x_j = 0$, we know that
$$x_0 = \sum_{n=0}^{j-1} \Big(\sum_{k=0}^{n} \frac{F_n^{(k)} g_k}{q_{k,k+1}}\Big).$$

Then we obtain the assertion.

Proof of Theorem 4.1  It is known that the sequence $\tilde x = (\tilde x_i, i \in E)$, defined by
$$\tilde x_i = E_i\Big[\int_0^{\delta_j} \bar f(X_t)\,dt\Big],$$
is a solution of Poisson's equation with $\tilde x_j = 0$. Due to the special structure of single birth processes, we know that Poisson's equation has a unique finite solution. Hence, $\tilde x = x$, where $x$ is given by (6.3) with $g = \bar f$. By (3.1), we obtain
$$\sigma^2(f) = -2\sum_{n=1}^{+\infty} \pi_n \bar f_n \sum_{k=0}^{n-1} \Big(\sum_{m=0}^{k} \frac{F_k^{(m)} \bar f_m}{q_{m,m+1}}\Big). \tag{6.4}$$
The assertion (4.1) follows by exchanging the order of summation on the right-hand side of (6.4) and using the fact that $\sum_{k=0}^{+\infty} \pi_k \bar f_k = 0$.

References

1. Abate J, Whitt W. Spectral theory for skip-free Markov chains. Probab Engrg Inform Sci, 1989, 3(1): 77-88
2. Asmussen S. Applied Probability and Queues. 2nd ed. New York: Springer-Verlag, 2003
3. Asmussen S, Bladt M. Poisson's equation for queues driven by a Markovian marked point process. Queueing Syst, 1994, 17: 235-274
4. Bladt M. The variance constant for the actual waiting time of the PH/PH/1 queue. Ann Appl Probab, 1996, 6(3): 766-777
5. Chen M F. Equivalence of exponential ergodicity and L²-exponential convergence for Markov chains. Stochastic Process Appl, 2000, 87: 281-297
6. Chen M F. From Markov Chains to Non-equilibrium Particle Systems. 2nd ed. Singapore: World Scientific, 2003
7. Glynn P W. Poisson's equation for the recurrent M/G/1 queue. Adv Appl Probab, 1994, 26: 1044-1062
8. Glynn P W, Whitt W. Necessary conditions for limit theorems in cumulative processes. Stochastic Process Appl, 2002, 98: 199-209
9. Hou Z, Guo Q. Homogeneous Denumerable Markov Processes. New York/Beijing: Springer-Verlag/Science Press, 1988
10. Hou Z, Liu Y. Explicit criteria for several types of ergodicity of the embedded M/G/1 and GI/M/n queues. J Appl Probab, 2004, 41(3): 778-790
11. Hou Z, Liu Y, Zhang H. Subgeometric rates of convergence for a class of continuous-time Markov processes. J Appl Probab, 2005, 42(3): 698-712
12. Jiang S, Liu Y, Yao S.
Poisson's equation for discrete-time single-birth processes. Statist Probab Lett, 2014, 85: 78-83
13. Jones G L. On the Markov chain central limit theorem. Probab Surv, 2004, 1: 299-320
14. Liu Y. Additive functionals for discrete-time Markov chains with applications to birth-death processes. J Appl Probab, 2011, 48(4): 925-937

15. Liu Y, Wang P, Xie Y. Deviation matrix and asymptotic variance for GI/M/1-type Markov chains. Front Math China, 2014, 9(4): 863-880
16. Liu Y, Zhang H, Zhao Y. Subgeometric ergodicity for continuous-time Markov chains. J Math Anal Appl, 2010, 368: 178-189
17. Mao Y H. Ergodic degrees for continuous-time Markov chains. Sci China Ser A, 2004, 47: 161-174
18. Mao Y H, Zhang Y H. Exponential ergodicity for single birth processes. J Appl Probab, 2004, 41(4): 1022-1032
19. Meyn S P, Tweedie R L. Generalized resolvents and Harris recurrence of Markov processes. In: Cohn H, ed. Doeblin and Modern Probability: Proceedings of the Doeblin Conference. 50 Years After Doeblin: Development in the Theory of Markov Chains, Markov Processes, and Sums of Random Variables. Contemp Math, 149. Providence: Amer Math Soc, 1993, 227-250
20. Roberts G O, Rosenthal J S. General state space Markov chains and MCMC algorithms. Probab Surv, 2004, 1: 20-71
21. Whitt W. Asymptotic formulas for Markov processes with applications to simulations. Oper Res, 1992, 40: 279-291
22. Zhang Y H. Strong ergodicity for single birth processes. J Appl Probab, 2001, 38(1): 270-277
23. Zhang Y H. The hitting time and stationary distribution for single birth processes. J Beijing Normal Univ, 2004, 40(2): 157-161 (in Chinese)
24. Zhang Y H. A note on exponential ergodicity and l-ergodicity of single birth processes. J Beijing Normal Univ, 2010, 46(1): 1-5 (in Chinese)
25. Zhang Y H, Zhao Q Q. Some single birth Q-matrices (I). J Beijing Normal Univ, 2006, 42(2): 111-115 (in Chinese)