On Existence of Limiting Distribution for Time-Nonhomogeneous Countable Markov Process


Queueing Systems 46, 353–361, 2004
© 2004 Kluwer Academic Publishers. Manufactured in The Netherlands.

On Existence of Limiting Distribution for Time-Nonhomogeneous Countable Markov Process

V. ABRAMOV (vyachesl@zahav.net.il)
Department of Mathematics, The Faculty of Exact Sciences, Tel Aviv University, 69978 Tel Aviv, and College of Judea and Samaria, 44837 Ariel, Israel

R. LIPTSER (liptser@eng.tau.ac.il)
Department of Electrical Engineering-Systems, Tel Aviv University, 69978 Tel Aviv, Israel

Received 15 October 2002; Revised 1 March 2003

Abstract. In this paper, sufficient conditions are given for the existence of a limiting distribution of a nonhomogeneous countable Markov chain with a time-dependent transition intensity matrix. The method of proof exploits the fact that if the distribution of the random process $Q=(Q_t)_{t\ge 0}$ is absolutely continuous with respect to the distribution of an ergodic random process $Q^\circ=(Q^\circ_t)_{t\ge 0}$, then $Q_t\xrightarrow{\mathrm{law}}\pi$, where $\pi$ is the invariant measure of $Q^\circ$. We apply this result to the asymptotic analysis, as $t\to\infty$, of a nonhomogeneous countable Markov chain which shares its limiting distribution with an ergodic birth-and-death process.

Keywords: countable Markov process, existence of the limiting distribution, birth-and-death process

1. Introduction

There is a large number of papers in the queueing literature devoted to the analysis of state-dependent and time-dependent queueing systems such as $M_t/M_t/1$ and $M_t/M_t/c$ and of the Markovian queueing networks associated with them (e.g., see [23–27]). The analysis of such queueing systems is motivated by a wide spectrum of practical problems well known in the literature. For instance, a simple example corresponding to the police dispatching problem is given in [26, p. 60] (see also [15] for further discussion). The $M_t/M_t/\infty$ queue, used as a model of emergency ambulances and intensive care units, is considered in [6]. Other applications are known for client/server computer networks, where the arrival and service rates of nodes depend on the amount of unfinished work and the number of available tasks on the server (see, e.g., [2,3,19,20]). Typically, an asymptotic analysis of $M_t/M_t/1$ and $M_t/M_t/c$ uses differential equations for the transition probabilities and an asymptotic analysis, as $t\to\infty$, of their solutions. This type of analysis is similar to an investigation of stability for a Markov chain and is associated with a verification of stationarity (quasi-stationarity) (see [1,4,5,7–10,21,29] and many others).

We now mention results related to time-nonhomogeneous stochastic models converging, as $t\to\infty$, to time-homogeneous ones. Although the first results were published more than 30 years ago by Gnedenko and Soloviev [12] and Gnedenko [11], remarkable progress was achieved only recently (see [13,14,16,23,30–36]). In particular, Zeifman developed a number of effective tools permitting one to successfully investigate ergodicity conditions for special classes of time-nonhomogeneous birth-and-death processes, including the $M_t/M_t/1$, $M_t/M_t/S$ and $M_t/M_t/S/0$ queues (for further discussion, see [14]).

In the present paper, we give sufficient conditions under which a time-nonhomogeneous countable Markov chain with transition intensity matrix $\Lambda(t)$ shares its limiting distribution with a time-homogeneous ergodic Markov chain with transition intensity matrix $\Lambda$. Our setting is closely related to the above-mentioned ones, and the result obtained supplements the results from [14,31,32,35,36] (a more detailed comparison is given in section 6). The main difference from known approaches to this problem is that the convergence $\Lambda(t)\to\Lambda$, $t\to\infty$, is not required. Instead, we assume the existence of nonnegative $\lambda_{ij}$, $i\ne j$, such that (here the $\lambda_{ij}(t)$ are the entries of $\Lambda(t)$)
$$\int_0^\infty\bigl(\lambda_{ij}(t)-\lambda_{ij}\bigr)^2\,dt<\infty,\qquad \int_0^t\lambda_{ij}(s)\,ds=\int_0^t\lambda_{ij}(s)\,I(\lambda_{ij}>0)\,ds,\ \ t>0,\qquad \sum_{j\ne i}\lambda_{ij}<\infty,\tag{1.1}$$
and create the matrix $\Lambda$ with entries $\lambda_{ij}$, $i\ne j$, and $\lambda_{ii}=-\sum_{j\ne i}\lambda_{ij}$. We assume that the Markov chain with this transition intensity matrix is ergodic.

To explain our method in more detail, notice that (1.1) guarantees the absolute continuity of the distribution of the $\Lambda(t)$-Markov chain with respect to the distribution of the $\Lambda$-Markov chain. It is also assumed that the $\Lambda$-Markov chain is ergodic, but geometric ergodicity is not required. We show in theorem 2.1 (section 2) that the above-mentioned absolute continuity of distributions provides the limiting distribution, as $t\to\infty$, for the $\Lambda(t)$-Markov chain, and that it coincides with the invariant measure of the $\Lambda$-Markov chain. In section 3, we give the proof of theorem 2.1. In section 4, we show that not only the limiting distribution but also other limiting functionals are the same as for the $\Lambda$-Markov chain. In section 5, an asymptotic equivalence of the $\Lambda(t)$-Markov chain to an ergodic birth-and-death process is established.
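To make the square-integrability part of (1.1) concrete, here is a minimal numerical sanity check in Python (not from the paper; the two perturbation shapes and the limiting value $\lambda_{ij}=2$ are illustrative assumptions). The first intensity deviates from $\lambda_{ij}$ like $(1+t)^{-3/4}$, so the integral of the squared deviation stays bounded and (1.1) can hold; the second converges to $\lambda_{ij}$ as well, but only like $(1+t)^{-1/4}$, so the integral keeps growing and (1.1) fails.

import numpy as np

LAM = 2.0  # assumed limiting intensity lambda_ij (illustrative value)

def lam_bounded(t, a=1.0):
    # lambda_ij(t) whose squared deviation from LAM decays like t**-1.5 (integrable)
    return LAM + a * (1.0 + t) ** -0.75

def lam_growing(t, a=1.0):
    # lambda_ij(t) -> LAM, but the squared deviation decays like t**-0.5 (not integrable)
    return LAM + a * (1.0 + t) ** -0.25

def integral_11(lam_t, horizon, n=200_000):
    # trapezoidal value of int_0^T (lambda_ij(t) - lambda_ij)^2 dt, with T = horizon
    t = np.linspace(0.0, horizon, n)
    f = (lam_t(t) - LAM) ** 2
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(t)))

for T in (1e2, 1e4, 1e6):
    print(f"T = {T:8.0e}   bounded case: {integral_11(lam_bounded, T):8.3f}"
          f"   growing case: {integral_11(lam_growing, T):10.1f}")

The bounded case settles near 2 (its exact limit, $\int_0^\infty(1+t)^{-3/2}dt=2$), while the growing case increases roughly like $2\sqrt{T}$, so pointwise convergence of $\lambda_{ij}(t)$ alone is not enough for (1.1).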

2. The main result

We consider a nonhomogeneous Markov chain $Q=(Q_t)_{t\ge0}$ with the countable set of states $\mathcal{S}=\{0,1,\ldots\}$ and the transition intensity matrix $\Lambda(t)$ with entries $\lambda_{ij}(t)$. Suppose that for any pair $(i,j)$ with $i\ne j$ there is a nonnegative constant $\lambda_{ij}$ such that (1.1) holds true, and introduce the Markov chain $Q^\circ=(Q^\circ_t)_{t\ge0}$ with the same set of states $\mathcal{S}$, with $Q^\circ_0=Q_0$, and with the transition intensity matrix $\Lambda$ (see section 1). Our main assumption is that $Q^\circ$ is ergodic, i.e. there is a unique probability measure $\pi$ on $\mathcal{S}$ such that $\pi\Lambda=0$ and
$$\lim_{t\to\infty}P(Q^\circ_t=j\mid Q^\circ_s=i)=\pi_j,\qquad s\ge0,\ i,j\in\mathcal{S},\tag{2.1}$$
where the $\pi_j$ are the entries of $\pi$.

Theorem 2.1. Under (1.1) and (2.1),
$$\lim_{t\to\infty}P(Q_t=j\mid Q_s=i)=\pi_j,\qquad s\ge0,\ i,j\in\mathcal{S}.\tag{2.2}$$

3. The proof of theorem 2.1

3.1. Preliminaries

Without loss of generality one may assume that the Markov chains $Q$ and $Q^\circ$ have paths in the Skorokhod space $D=D_{[0,\infty)}$ of right-continuous functions having limits to the left, $x=(x_t)_{t\ge0}$. Let $\nu$, $\nu^\circ$ be the distributions of $Q$ and $Q^\circ$, respectively; that is, $\nu$, $\nu^\circ$ are probability measures on $(D,\mathcal{G})$, where $\mathcal{G}$ is the Borel $\sigma$-algebra on $D$. Without loss of generality we may assume that $\mathcal{G}$ is completed with respect to the measure $(\nu+\nu^\circ)/2$.

We shall use in the sequel that $\nu\ll\nu^\circ$. Recall (see, e.g., [28]) that $\nu\ll\nu^\circ$ provides that $\nu(A)=0$ for any $A\in\mathcal{G}$ with $\nu^\circ(A)=0$. For the verification of $\nu\ll\nu^\circ$, we apply [17, theorem 2.4]. Following this theorem, $\nu\ll\nu^\circ$ if for all $i,j\in\mathcal{S}$:

(a) $\nu^\circ(x_0=i)=0$ implies $\nu(x_0=i)=0$;

(b) $\int_0^t I(x_s=j)\,\lambda_{ij}(s)\,ds=\int_0^t I(x_s=j)\,\lambda_{ij}(s)\,I(\lambda_{ij}>0)\,ds$, $\nu$-a.s.;

(c) with the convention $0/0=0$,
$$\int_0^\infty\sum_{j\ne i}\Bigl[\sqrt{\frac{\lambda_{ij}(t)}{\lambda_{ij}}}-1\Bigr]^2\lambda_{ij}\,I(\lambda_{ij}>0)\,I(x_t=i)\,dt<\infty,\quad\nu\text{-a.s.}$$

Notice that for any $j\ne i$, (c) is provided by the condition
$$\int_0^\infty\Bigl[\frac{\lambda_{ij}(t)}{\lambda_{ij}}-1\Bigr]^2 I(\lambda_{ij}>0)\,dt<\infty,$$
which is equivalent to the first part of (1.1).

Introduce a stochastic basis $(D,\mathcal{G},(\mathcal{G}_t)_{t\ge0},\nu^\circ)$ satisfying the general conditions (see, e.g., [22]), where $(\mathcal{G}_t)_{t\ge0}$ is the filtration generated by $x$. Henceforth, $E^{\nu}$ and $E^{\nu^\circ}$ denote the expectations with respect to $\nu$ and $\nu^\circ$, respectively. Set $Z(x)=\frac{d\nu}{d\nu^\circ}(x)$ and $Z_t(x)=E^{\nu^\circ}(Z\mid\mathcal{G}_t)(x)$. We shall use the fact that $(Z_t(x))_{t\ge0}$ is a positive uniformly integrable martingale with respect to $\nu^\circ$.

Throughout the paper we use the notation $\wedge$ ($\vee$) for the minimum (maximum) of two numbers.

3.2. Auxiliary lemma

Lemma 3.1. Under the assumptions of theorem 2.1, for any $s\ge0$ and $j\in\mathcal{S}$,
$$P(Q_t=j\mid Q_s)\xrightarrow[t\to\infty]{\text{prob}}\pi_j.$$

Proof. With $s<s'<t$, using the Markov property, write
$$P(Q_t=j\mid Q_s)=\nu(x_t=j\mid\mathcal{G}_s)=E^{\nu}\bigl(\nu(x_t=j\mid\mathcal{G}_{s'})\mid x_s\bigr).$$
According to the well-known formula for the conditional expectation under an absolutely continuous change of measure (for any integrable random variable $\alpha$, $E^{\nu}(\alpha\mid\mathcal{G}_{s'})=E^{\nu^\circ}\bigl((Z/Z_{s'})\,\alpha\mid\mathcal{G}_{s'}\bigr)$), we find
$$\nu(x_t=j\mid\mathcal{G}_{s'})=E^{\nu^\circ}\Bigl(\frac{Z(x)}{Z_{s'}(x)}\,I(x_t=j)\Bigm|\mathcal{G}_{s'}\Bigr)=\nu^\circ(x_t=j\mid x_{s'})+E^{\nu^\circ}\Bigl(\Bigl[\frac{Z(x)}{Z_{s'}(x)}-1\Bigr]I(x_t=j)\Bigm|\mathcal{G}_{s'}\Bigr).$$
By (2.1), $\nu^\circ(x_t=j\mid x_{s'})\to\pi_j$ as $t\to\infty$, $\nu^\circ$-a.s., and by $\nu\ll\nu^\circ$ the same convergence holds $\nu$-a.s. as well. So, it remains to show that
$$E^{\nu^\circ}\Bigl(\Bigl[\frac{Z(x)}{Z_{s'}(x)}-1\Bigr]I(x_t=j)\Bigm|\mathcal{G}_{s'}\Bigr)\xrightarrow[s'\to\infty]{\nu\text{-prob}}0.\tag{3.1}$$
Notice that
$$E^{\nu^\circ}\Bigl|E^{\nu^\circ}\Bigl(\Bigl[\frac{Z(x)}{Z_{s'}(x)}-1\Bigr]I(x_t=j)\Bigm|\mathcal{G}_{s'}\Bigr)\Bigr|\le E^{\nu^\circ}\Bigl|\frac{Z(x)}{Z_{s'}(x)}-1\Bigr|\le E^{\nu^\circ}\Bigl(\Bigl|\frac{Z(x)}{Z_{s'}(x)}-1\Bigr|\wedge3\Bigr)+E^{\nu^\circ}\Bigl(\Bigl(\frac{Z(x)}{Z_{s'}(x)}+1\Bigr)I\Bigl(\frac{Z(x)}{Z_{s'}(x)}>2\Bigr)\Bigr).$$

As was mentioned above, $Z_t(x)$ is a positive uniformly integrable $\nu^\circ$-martingale. Hence $\lim_{s'\to\infty}Z_{s'}(x)=Z(x)$, $\nu^\circ$-a.s. Consequently, by the Lebesgue dominated convergence theorem,
$$\lim_{s'\to\infty}E^{\nu^\circ}\Bigl(\Bigl|\frac{Z(x)}{Z_{s'}(x)}-1\Bigr|\wedge3\Bigr)=0.\tag{3.2}$$
Now, it remains to show that
$$\lim_{s'\to\infty}E^{\nu^\circ}\Bigl(\frac{Z(x)}{Z_{s'}(x)}+1\Bigr)=\lim_{s'\to\infty}E^{\nu^\circ}\Bigl(\Bigl(\frac{Z(x)}{Z_{s'}(x)}+1\Bigr)I\Bigl(\frac{Z(x)}{Z_{s'}(x)}\le2\Bigr)\Bigr).$$
Since $Z_{s'}(x)\to Z(x)$, $s'\to\infty$, by the Lebesgue dominated convergence theorem the right-hand side of the above equality is equal to 2. At the same time, for any $s'$ we have $E^{\nu^\circ}\bigl(Z(x)/Z_{s'}(x)\mid\mathcal{G}_{s'}\bigr)=1$, and so for any $s'$ it holds that $E^{\nu^\circ}\bigl(Z(x)/Z_{s'}(x)+1\bigr)=2$. Thus,
$$E^{\nu^\circ}\Bigl(\Bigl[\frac{Z(x)}{Z_{s'}(x)}-1\Bigr]I(x_t=j)\Bigm|\mathcal{G}_{s'}\Bigr)\xrightarrow[s'\to\infty]{\nu\text{-prob}}0.$$

3.3. Final part of the proof

By lemma 3.1 we have
$$\sum_{i=0}^{\infty}P(Q_t=j\mid Q_s=i)\,I(Q_s=i)\xrightarrow[t\to\infty]{\text{prob}}\pi_j.$$
Hence, for any $i\in\mathcal{S}$,
$$P(Q_t=j\mid Q_s=i)\,I(Q_s=i)\xrightarrow[t\to\infty]{\text{prob}}\pi_j\,I(Q_s=i),\tag{3.3}$$
and the statement of theorem 2.1 follows.

4. Asymptotic equivalence for other functionals

Denote $h(x_t)=I(x_t=j)$. Theorem 2.1 guarantees the asymptotic equivalence
$$\lim_{t\to\infty}\Bigl[E^{\nu}\bigl(h(x_t)\mid\mathcal{G}_s\bigr)-E^{\nu^\circ}\bigl(h(x_t)\mid\mathcal{G}_s\bigr)\Bigr]=0.$$
An analysis of the proof of theorem 2.1 shows that the same type of asymptotic equivalence holds for any bounded functional $h(x_{[t,\infty)})$ of the argument $x_{[t,\infty)}=\{x_u,\ u\ge t\}$, provided that $E^{\nu^\circ}\bigl(h(x_{[t,\infty)})\mid\mathcal{G}_s\bigr)$ exists; that is, under the assumptions of theorem 2.1,
$$\lim_{t\to\infty}\Bigl[E^{\nu}\bigl(h(x_{[t,\infty)})\mid\mathcal{G}_s\bigr)-E^{\nu^\circ}\bigl(h(x_{[t,\infty)})\mid\mathcal{G}_s\bigr)\Bigr]=0.\tag{4.1}$$

5. Asymptotic equivalence to a birth-and-death process

Let
$$\Lambda(t)=\begin{pmatrix}
-\lambda_0(t) & \lambda_0(t) & & & \\
\mu_1(t) & -(\lambda_1(t)+\mu_1(t)) & \lambda_1(t) & & \\
 & \mu_2(t) & -(\lambda_2(t)+\mu_2(t)) & \lambda_2(t) & \\
 & & \mu_3(t) & -(\lambda_3(t)+\mu_3(t)) & \ddots \\
 & & & \ddots & \ddots
\end{pmatrix}$$
and assume that there exist positive numbers $\lambda_j$, $\mu_j$ such that
$$\sum_{n=1}^{\infty}\prod_{j=1}^{n}\frac{\lambda_{j-1}}{\mu_j}<\infty,\tag{5.1}$$
$$\int_0^\infty\bigl[(\lambda_j(t)-\lambda_j)^2+(\mu_j(t)-\mu_j)^2\bigr]dt<\infty\quad\text{for all }j.\tag{5.2}$$
We mention here that (5.2) does not provide $\lambda_j(t)\to\lambda_j$, $\mu_j(t)\to\mu_j$ as $t\to\infty$.

Introduce the matrix
$$\Lambda=\begin{pmatrix}
-\lambda_0 & \lambda_0 & & & \\
\mu_1 & -(\lambda_1+\mu_1) & \lambda_1 & & \\
 & \mu_2 & -(\lambda_2+\mu_2) & \lambda_2 & \\
 & & \mu_3 & -(\lambda_3+\mu_3) & \ddots \\
 & & & \ddots & \ddots
\end{pmatrix}$$
and notice that the Markov chain with the transition intensity matrix $\Lambda$ generates a birth-and-death process. It is well known (see, e.g., [18, chapter 7, section 5]) that under (5.1) the birth-and-death process is ergodic with the unique stationary distribution on $\mathcal{S}$:
$$\pi_0=\Bigl(1+\sum_{n=1}^{\infty}\prod_{j=1}^{n}\frac{\lambda_{j-1}}{\mu_j}\Bigr)^{-1},\qquad \pi_n=\pi_0\prod_{j=1}^{n}\frac{\lambda_{j-1}}{\mu_j},\quad n=1,2,\ldots\tag{5.3}$$
By theorem 2.1, the Markov chain with the transition intensity matrix $\Lambda(t)$ possesses the same limiting distribution.

It is known that the sojourn time $T_i$ in state $i$ of the birth-and-death process is exponentially distributed: $P(T_i\le x)=1-\mathrm{e}^{-(\lambda_i+\mu_i)x}$. Under (5.1) and (5.2), for the Markov chain with the transition intensity matrix $\Lambda(t)$ we have the following. Let $T_i(t)=v_i(t)-t$, where $v_i(t)=\inf\{s\ge t:\ Q_s\ne i,\ Q_t=i\}$. Then, applying the result of section 4, we obtain
$$\lim_{t\to\infty}P\bigl(T_i(t)\le x\bigr)=1-\mathrm{e}^{-(\lambda_i+\mu_i)x}.$$
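The limiting distribution (5.3) is straightforward to evaluate numerically once limiting rates $\lambda_j$, $\mu_j$ satisfying (5.1) are fixed. The short Python sketch below is an illustration, not part of the paper; the truncation rule and the constant-rate example are assumptions. It accumulates the products $\prod_{j=1}^n\lambda_{j-1}/\mu_j$ until they become negligible and then normalizes, which is exactly (5.3); for constant rates it reproduces the geometric M/M/1 law used in section 6.

import numpy as np

def bd_stationary(lam, mu, tol=1e-12, max_states=10_000):
    # Stationary distribution (5.3) of the limiting birth-and-death chain.
    # lam(j): birth rate lambda_j, j = 0, 1, ...; mu(j): death rate mu_j, j = 1, 2, ...
    # The series in (5.1) is truncated once the running product falls below `tol`;
    # this assumes the rates are such that (5.1) converges.
    weights = [1.0]            # rho_0 = 1, rho_n = prod_{j=1}^n lambda_{j-1}/mu_j
    rho = 1.0
    for n in range(1, max_states):
        rho *= lam(n - 1) / mu(n)
        weights.append(rho)
        if rho < tol:
            break
    w = np.array(weights)
    return w / w.sum()         # pi_n = rho_n / sum_m rho_m, i.e. (5.3)

# Example: constant rates lambda_j = 2, mu_j = 5 (the M/M/1 case of section 6);
# (5.3) then reduces to the geometric law pi_n = (1 - rho) rho^n with rho = 2/5.
pi = bd_stationary(lam=lambda j: 2.0, mu=lambda j: 5.0)
rho = 2.0 / 5.0
print(pi[:5])
print((1 - rho) * rho ** np.arange(5))   # matches the line above

For state-dependent rates one only has to change the two callables, provided (5.1) still holds.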

6. Discussion

We consider here the $M_t/M_t/1$ model. Let $A_t$ and $D_t$ be independent time-nonhomogeneous Poisson processes with positive rates $\lambda(t)$ and $\mu(t)$, respectively. Let $Q_0$ be a random variable, independent of $A_t$ and $D_t$, taking values in $\mathcal{S}=\{0,1,\ldots\}$. We define the queue-length process $Q=(Q_t)_{t\ge0}$ in the $M_t/M_t/1$ queue as follows (notice that the jumps of $A_t$ and $D_t$ are disjoint, and so $Q_t\in\mathcal{S}$):
$$Q_t=Q_0+A_t-\int_0^t I(Q_{s-}>0)\,dD_s.$$
Let $A^\circ_t$ and $D^\circ_t$ be independent homogeneous Poisson processes with positive rates $\lambda$ and $\mu$, $\lambda<\mu$, let $Q^\circ_0=Q_0$ be independent of $A^\circ_t$ and $D^\circ_t$, and let $Q^\circ=(Q^\circ_t)_{t\ge0}$ be the queue-length process in the $M/M/1$ queue with parameters $\lambda$ and $\mu$ for the arrival and service of customers, respectively. The queue-length process $Q^\circ_t$ is defined analogously by
$$Q^\circ_t=Q^\circ_0+A^\circ_t-\int_0^t I(Q^\circ_{s-}>0)\,dD^\circ_s.$$
By theorem 2.1, the existence of the limiting distribution for the $M_t/M_t/1$ queue is provided by
$$\int_0^\infty\bigl[(\lambda-\lambda(t))^2+(\mu-\mu(t))^2\bigr]dt<\infty.\tag{6.1}$$
On the other hand, it is known from [14,31,32,35] that the existence of the limiting distribution is provided by
$$\lim_{t\to\infty}\bigl[|\lambda-\lambda(t)|+|\mu-\mu(t)|\bigr]=0.\tag{6.2}$$
Generally, (6.1) does not imply (6.2), and so (6.1) and (6.2) supplement each other. If $\lambda(t)$ and $\mu(t)$ are uniformly continuous functions on $[0,\infty)$, then (6.1) implies (6.2); that is, (6.2) is weaker than (6.1). Notice also that (6.2) implies (6.1), say, under the additional condition that for small positive $\varepsilon$ and $t$ large enough,
$$|\lambda-\lambda(t)|+|\mu-\mu(t)|=\mathrm{O}\bigl(t^{-(1/2+\varepsilon)}\bigr).$$
It should be noted that in [36] an ergodicity problem is studied, in the uniform operator topology, for time-nonhomogeneous Markov chains with transition intensity matrices possessing summable perturbations. These results also have some connection with ours.
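As a numerical companion to this comparison, here is a simulation sketch in Python (not from the paper; the perturbations $\lambda(t)=\lambda+(1+t)^{-3/4}$ and $\mu(t)=\mu+2(1+t)^{-3/4}$, which satisfy (6.1), and all parameter values are illustrative assumptions). It builds the $M_t/M_t/1$ queue length exactly as defined above, by thinning the arrival stream $A_t$ and the potential-departure stream $D_t$ against constant upper bounds, and compares the empirical law of $Q_T$ at a moderately large $T$ with the geometric limiting law $(1-\rho)\rho^n$, $\rho=\lambda/\mu$, of the $M/M/1$ queue.

import numpy as np

rng = np.random.default_rng(0)

LAM, MU = 2.0, 5.0                      # limiting M/M/1 rates (lambda < mu)

def lam_t(t):                           # assumed perturbed arrival rate, satisfies (6.1)
    return LAM + 1.0 / (1.0 + t) ** 0.75

def mu_t(t):                            # assumed perturbed service rate, satisfies (6.1)
    return MU + 2.0 / (1.0 + t) ** 0.75

LAM_MAX, MU_MAX = LAM + 1.0, MU + 2.0   # upper bounds used for thinning

def simulate_Q(t_end, q0=0):
    # One path of the M_t/M_t/1 queue length at time t_end, built by thinning
    # the arrival stream A_t (rate lam_t) and the potential-departure stream D_t
    # (rate mu_t) against a homogeneous Poisson process of rate LAM_MAX + MU_MAX.
    q, t, total = q0, 0.0, LAM_MAX + MU_MAX
    while True:
        t += rng.exponential(1.0 / total)       # next candidate event
        if t > t_end:
            return q
        u = rng.uniform(0.0, total)
        if u < lam_t(t):                        # accepted arrival (jump of A_t)
            q += 1
        elif u < lam_t(t) + mu_t(t) and q > 0:  # accepted departure (jump of D_t),
            q -= 1                              # effective only if the queue is busy

T, N = 100.0, 5_000
samples = np.array([simulate_Q(T) for _ in range(N)])

rho = LAM / MU
for n in range(5):
    print(f"P(Q_T = {n}): empirical {np.mean(samples == n):.4f}   "
          f"geometric limit {(1 - rho) * rho ** n:.4f}")

With these parameters the two columns agree to within Monte Carlo error, which is what theorem 2.1 predicts; replacing the exponents 3/4 by 1/4 breaks (6.1), so theorem 2.1 no longer applies (although the convergence condition (6.2) still does).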

Acknowledgement

The authors are indebted to Prof. Granovsky and to the anonymous referee for pointing out related topics and for drawing their attention to some flaws.

References

[1] V.M. Abramov, On the asymptotic distribution of the maximum number of infectives in epidemic models with immigration, J. Appl. Probab. 31 (1994) 606–613.
[2] V.M. Abramov, A large closed queueing network with autonomous service and bottleneck, Queueing Systems 35(1–3) (2000) 23–54.
[3] V.M. Abramov, Some results for large closed queueing networks with and without bottleneck: Up- and down-crossings approach, Queueing Systems 38(2) (2001) 149–184.
[4] S. Asmussen and H. Thorisson, A Markov chain approach to periodic queues, J. Appl. Probab. 24 (1987) 215–225.
[5] N. Bambos and J. Walrand, On stability of state-dependent queues and acyclic queueing networks, Adv. in Appl. Probab. 21 (1989) 681–701.
[6] T. Collings and C. Stoneman, The M/M/∞ queue with varying arrival and departure rates, Oper. Res. 24 (1976) 760–773.
[7] E. Gelenbe and D. Finkel, Stationary deterministic flows: The single server queue, Theoret. Comput. Sci. 52 (1987) 269–280.
[8] I.I. Gerontidis, On certain aspects of homogeneous Markov systems in continuous time, J. Appl. Probab. 27 (1990) 530–544.
[9] I.I. Gerontidis, Periodic strong ergodicity in non-homogeneous Markov systems, J. Appl. Probab. 28 (1991) 58–73.
[10] I.I. Gerontidis, Cyclic strong ergodicity in non-homogeneous Markov systems, SIAM J. Matrix Anal. Appl. 13 (1992) 550–566.
[11] D.B. Gnedenko, On a generalization of Erlang formulae, Zastos. Matem. 12 (1971) 239–242.
[12] B.V. Gnedenko and A.D. Soloviev, On the conditions of the existence of final probabilities for a Markov process, Math. Operationsforsch. Statist. 4 (1973) 379–390.
[13] B.L. Granovsky and A.I. Zeifman, The decay function of nonhomogeneous birth-and-death process, with application to mean-field model, Stochastic Process. Appl. 72 (1997) 105–120.
[14] B.L. Granovsky and A.I. Zeifman, Nonstationary Markovian queues, J. Math. Sci. 99(4) (2000) 1415–1438.
[15] L. Green and P. Kolesar, Testing the validity of the queueing model of a police patrol, Management Sci. 35 (1989) 127–148.
[16] J. Johnson and D. Isaacson, Conditions for strong ergodicity using intensity matrices, J. Appl. Probab. 25 (1988) 34–42.
[17] Yu.M. Kabanov, R.Sh. Liptser and A.N. Shiryaev, Absolute continuity and singularity of locally absolutely continuous probability distributions. II, Math. USSR Sbornik 36(1) (1980) 31–58.
[18] S. Karlin, A First Course in Stochastic Processes (Academic Press, New York/London, 1968).
[19] C. Knessl, B. Matkowsky, Z. Schuss and C. Tier, Asymptotic analysis of a state-dependent M/G/1 queueing system, SIAM J. Appl. Math. 46(3) (1986) 483–505.
[20] Y. Kogan and R.Sh. Liptser, Limit non-stationary behavior of large closed queueing networks with bottlenecks, Queueing Systems 14 (1993) 33–55.

[21] A.J. Lemoine, On queues with periodic Poisson input, J. Appl. Probab. 18 (1981) 889–900.
[22] R.Sh. Liptser and A.N. Shiryayev, Theory of Martingales (Kluwer Academic, Dordrecht, 1989).
[23] A. Mandelbaum and W. Massey, Strong approximations for time dependent queues, Math. Oper. Res. 20 (1995) 33–64.
[24] A. Mandelbaum and G. Pats, State-dependent queues: Approximations and applications, in: IMA Volumes in Mathematics and Its Applications, eds. F. Kelly and R.J. Williams, Vol. 71 (Springer, Berlin, 1995) pp. 239–282.
[25] A. Mandelbaum and G. Pats, State-dependent stochastic networks. Part I: Approximations and applications with continuous diffusion limits, Ann. Appl. Probab. 8(2) (1998) 569–646.
[26] B.H. Margolius, A sample path analysis of the $M_t/M_t/c$ queue, Queueing Systems 31(1) (1999) 59–93.
[27] W. Massey, Asymptotic analysis of the time-dependent M/M/1 queues, Math. Oper. Res. 10(2) (1985) 305–327.
[28] A.N. Shiryayev, Probability (Springer, Berlin, 1984).
[29] H. Thorisson, Periodic regeneration, Stochastic Process. Appl. 20 (1985) 85–104.
[30] A. Zeifman, Some estimates of the rate of convergence for birth-and-death processes, J. Appl. Probab. 28 (1991) 268–277.
[31] A. Zeifman, On stability of continuous-time nonhomogeneous Markov chains, Soviet Math. 35(7) (1991) 29–34.
[32] A. Zeifman, On the ergodicity of nonhomogeneous birth and death processes, J. Math. Sci. 72(1) (1994) 2893–2899.
[33] A. Zeifman, On the estimation of probabilities for birth-and-death processes, J. Appl. Probab. 32 (1995) 623–634.
[34] A. Zeifman, Upper and lower bounds on the rate of convergence for nonhomogeneous birth-and-death processes, Stochastic Process. Appl. 59 (1995) 159–173.
[35] A. Zeifman, Stability of birth and death processes, J. Math. Sci. 91(3) (1998) 3023–3031.
[36] A. Zeifman and D. Isaacson, On strong ergodicity for nonhomogeneous continuous-time Markov chains, Stochastic Process. Appl. 50 (1994) 263–273.