Linear-fractional branching processes with countably many types
1. Linear-fractional branching processes with countably many types

Branching processes and their applications, Badajoz, April 11-13, 2012

Serik Sagitov, Chalmers University and University of Gothenburg

Supported by the Swedish Research Council grant

Outline:
- General linear-fractional generating functions
- LF branching processes with countably many types
- Classification of branching processes with countably many types
- Main result
- Associated branching renewal system
- Example
- R-positively recurrent case
- References
2. General linear-fractional generating functions

Notation for vectors of possibly infinite dimension:

s = (s_1, s_2, ...), h = (h_1, h_2, ...), g = (g_1, g_2, ...), 0 = (0, 0, ...)

Definition 1. A random vector Z has a linear-fractional distribution LF(h, g, m) if
- (h_0, h_1, h_2, ...) is a probability distribution on {0, 1, 2, ...},
- (g_1, g_2, ...) is a probability distribution on {1, 2, ...},
- m is a positive constant,
so that

E(s^Z) = h_0 + (Σ_{i≥1} h_i s_i) / (1 + m − m Σ_{i≥1} g_i s_i),   (1)

in particular P(Z = 0) = h_0.
3. General linear-fractional generating functions

E(s^Z) = h_0 + (h_1 s_1 + h_2 s_2 + ⋯) / (1 + m − m(g_1 s_1 + g_2 s_2 + ⋯))

Such a multivariate linear-fractional distribution can be viewed as a multivariate geometric distribution modified at zero:

Z = X + (Y_1 + ⋯ + Y_N) 1_{X ≠ 0}   (2)

in terms of mutually independent random entities (X, N, Y_1, Y_2, ...). Here the vectors X and Y_j have multivariate Bernoulli distributions

P(X = e_i) = h_i, i ≥ 1,  P(X = 0) = h_0,
P(Y_j = e_i) = g_i, i ≥ 1, j ≥ 1,

and N is a geometric random variable with mean m.
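The representation (2) is straightforward to simulate. Below is a minimal Python sketch with a made-up 3-type parameter set (the values of h, g, m are illustrative, not from the slides): it draws Z as X plus a geometric number of multivariate Bernoulli vectors and checks that the empirical frequency of Z = 0 is close to h_0.

```python
import random

random.seed(1)

# Hypothetical 3-type example: h is one row of H, g and m as in Definition 1.
h = [0.2, 0.3, 0.1]   # P(X = e_i) = h_i
g = [0.5, 0.3, 0.2]   # P(Y_j = e_i) = g_i
m = 0.8               # N is geometric on {0, 1, 2, ...} with mean m
h0 = 1.0 - sum(h)     # P(X = 0) = h_0

def sample_Z():
    """Draw Z = X + (Y_1 + ... + Y_N) 1{X != 0}, the representation (2)."""
    u, x, acc = random.random(), 0, h0
    while u >= acc and x < len(h):   # choose X: x = 0 means X = 0, else X = e_x
        x += 1
        acc += h[x - 1]
    z = [0] * len(h)
    if x == 0:
        return z                     # X = 0 forces Z = 0
    z[x - 1] += 1
    # geometric N with mean m: continuation probability m/(1+m) per trial
    while random.random() < m / (1 + m):
        y = random.choices(range(len(g)), weights=g)[0]
        z[y] += 1
    return z

n = 50_000
zero_freq = sum(1 for _ in range(n) if sum(sample_Z()) == 0) / n
print(zero_freq)   # should be close to h0 = 0.4
```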
4. LF branching processes with countably many types

Definition 2. An LF branching process with parameters (H, g, m), where
- H is a sub-stochastic matrix with non-negative rows h_1, h_2, ...,
- g is a probability distribution on {1, 2, ...},
- m is a positive constant,
is a branching process in which particles of type i reproduce according to an LF(h_i, g, m) distribution, i = 1, 2, ....

Remark. This is a strong limitation on the reproduction law: the parameters (g, m) do not depend on the mother's type. It is needed for the iterations of the corresponding generating functions to be linear-fractional as well.
5. LF branching processes with countably many types

The matrix of mean offspring numbers for an LF branching process with parameters (H, g, m) is

M = H + m H 1^t g,   (3)

where 1^t is the column version of 1 = (1, 1, ...). It follows that

H = M − (m / (1 + m)) M 1^t g.   (4)

Proposition 1. Consider a linear-fractional branching process with parameters (H, g, m) starting from a type i particle. The vector of its n-th generation sizes has an LF(h_i^(n), g^(n), m^(n)) distribution with
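Equations (3) and (4) are mutually inverse parametrizations. A small Python sketch with a hypothetical finite-type (H, g, m) (the two parametrizations also hold for finitely many types) verifies the round trip H → M → H:

```python
# Hypothetical finite-type example for the round trip given by (3) and (4).
H = [[0.2, 0.3, 0.0],
     [0.1, 0.2, 0.3],
     [0.0, 0.4, 0.1]]   # sub-stochastic rows
g = [0.5, 0.3, 0.2]
m = 0.8
d = len(g)

def M_from_H(H, g, m):
    """M = H + m (H 1^t) g, equation (3)."""
    row_sums = [sum(row) for row in H]          # H 1^t
    return [[H[i][j] + m * row_sums[i] * g[j] for j in range(d)] for i in range(d)]

def H_from_M(M, g, m):
    """H = M - m/(1+m) (M 1^t) g, equation (4)."""
    row_sums = [sum(row) for row in M]          # M 1^t
    c = m / (1 + m)
    return [[M[i][j] - c * row_sums[i] * g[j] for j in range(d)] for i in range(d)]

M = M_from_H(H, g, m)
H_back = H_from_M(M, g, m)
ok = all(abs(H[i][j] - H_back[i][j]) < 1e-12 for i in range(d) for j in range(d))
print(ok)  # prints True: the two parametrizations are equivalent
```

The round trip works because M 1^t = (1 + m) H 1^t, so the correction term in (4) exactly cancels the one added in (3).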
6. LF branching processes with countably many types

m^(n) g^(n) = m g (I + M + ⋯ + M^{n−1}),   (5)

m^(n) = m Σ_{j=0}^{n−1} g M^j 1^t,   (6)

H^(n) = M^n − (m^(n) / (1 + m^(n))) M^n 1^t g^(n),   (7)

where H^(n) is the matrix with rows h_1^(n), h_2^(n), .... In particular,

P(Z^(n) ≠ 0) = (1 + m^(n))^{−1} M^n 1^t,   (8)

where P(Z^(n) ≠ 0) is the vector with elements P(Z^(n) ≠ 0 | Z^(0) = e_i).
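Formula (8) can be checked numerically against direct iteration of the offspring generating functions. A sketch with a hypothetical finite-type (H, g, m): the non-extinction probabilities computed from (6) and (8) agree with those obtained by iterating the pgf (1) at zero.

```python
# Check of (8) for a hypothetical finite-type example.
H = [[0.2, 0.3, 0.0],
     [0.1, 0.2, 0.3],
     [0.0, 0.4, 0.1]]
g = [0.5, 0.3, 0.2]
m = 0.8
d = len(g)

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(d)) for i in range(d)]

def F(s):
    """Vector of offspring pgfs, evaluated at s (formula (1))."""
    denom = 1 + m - m * sum(g[j] * s[j] for j in range(d))
    return [(1 - sum(H[i])) + sum(H[i][j] * s[j] for j in range(d)) / denom
            for i in range(d)]

n = 6
# direct: iterate the pgf at 0 to get extinction probability by generation n
q = [0.0] * d
for _ in range(n):
    q = F(q)
direct = [1 - qi for qi in q]

# via (8): (1 + m^(n))^{-1} M^n 1^t, with m^(n) from (6)
M = [[H[i][j] + m * sum(H[i]) * g[j] for j in range(d)] for i in range(d)]
Mj1 = [1.0] * d                    # holds M^j 1^t, starting at j = 0
mn = 0.0
for _ in range(n):                 # accumulate m^(n) = m sum_{j<n} g M^j 1^t
    mn += m * sum(g[j] * Mj1[j] for j in range(d))
    Mj1 = matvec(M, Mj1)
formula = [Mj1[i] / (1 + mn) for i in range(d)]   # Mj1 now holds M^n 1^t

print(all(abs(a - b) < 1e-10 for a, b in zip(direct, formula)))  # prints True
```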
7. Classification of BP with countably many types

Given that M is irreducible and aperiodic and all powers M^n = (m_ij^(n)) are element-wise finite, which is always true in the linear-fractional case, the asymptotic behavior of these powers is described by the Perron-Frobenius theory for countable matrices. By Theorem 6.1 in [Seneta], all elements of the matrix power series

M(s) = Σ_{n=0}^∞ s^n M^n   (9)

have a common convergence radius 0 ≤ R < ∞. Furthermore, one of two alternatives holds:

R-transient case: Σ_{n=0}^∞ m_ii^(n) R^n < ∞ for all i,
R-recurrent case: Σ_{n=0}^∞ m_ii^(n) R^n = ∞ for all i.
8. Classification of BP with countably many types

In the R-recurrent case there exist, unique up to constant multipliers, positive vectors u and v such that

R M u^t = u^t,  R v M = v.

The renormalization R v_j m_{ji} / v_i transforms M into a stochastic matrix. The R-recurrent case is further divided into two sub-cases:

R-null case, when v u^t = ∞,
R-positive case, when v u^t < ∞.

In the R-null case (and clearly also in the R-transient case)

R^n m_ij^(n) → 0,  i, j = 1, 2, ....
9. Classification of BP with countably many types

In the R-positive case one can scale the eigenvectors so that v u^t = 1 and

R^n m_ij^(n) → u_i v_j,  i, j = 1, 2, ....

This suggests a double classification of BP with a mean matrix M:
- the usual classification based on the value of the growth rate ρ = 1/R: subcritical ρ < 1, critical ρ = 1, or supercritical ρ > 1;
- an additional classification due to the recurrence property of the branching process with respect to the infinite type space.

Definition 3. A branching process will be called R-transient, R-recurrent, R-null recurrent, or R-positively recurrent depending on the respective property of its matrix of mean offspring numbers M.
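For a finite irreducible aperiodic M, the growth rate ρ = 1/R and the Perron-Frobenius vectors u, v can be obtained by power iteration. A pure-Python sketch with a made-up positive matrix (illustration only), using the slide's normalization v 1^t = v u^t = 1:

```python
# Power iteration for rho = 1/R and the eigenvectors u, v; hypothetical 3x3 example.
M = [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.1, 0.4, 0.4]]
d = len(M)

def power_iter(mult, iters=2000):
    """Iterate x -> mult(x), renormalizing to sum 1; return (eigenvalue, eigenvector)."""
    x = [1.0] * d
    for _ in range(iters):
        x = mult(x)
        s = sum(x)
        x = [xi / s for xi in x]
    return sum(mult(x)), x

# right eigenvector: M u^t = rho u^t; left eigenvector: v M = rho v
rho_r, u = power_iter(lambda x: [sum(M[i][j] * x[j] for j in range(d)) for i in range(d)])
rho_l, v = power_iter(lambda x: [sum(x[i] * M[i][j] for i in range(d)) for j in range(d)])
R = 1 / rho_r

# normalize as on the slide: v 1^t = 1 (already holds) and v u^t = 1
dot = sum(v[i] * u[i] for i in range(d))
u = [ui / dot for ui in u]

# check the eigenvalue equation R M u^t = u^t
lhs = [R * sum(M[i][j] * u[j] for j in range(d)) for i in range(d)]
print(max(abs(lhs[i] - u[i]) for i in range(d)) < 1e-9)  # prints True
```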
10. Main result

Consider an LF branching process with parameters (H, g, m). The monotone function

f(s) = Σ_{n=1}^∞ s^n g H^n 1^t   (10)

grows from zero to infinity as s goes from zero to infinity. Therefore the equation

m f(R) = 1   (11)

always has a unique positive solution R. Observe that m f(Rs) is the generating function of a probability distribution with mean

β = m Σ_{n=1}^∞ n R^n g H^n 1^t.   (12)
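Since m f(s) is increasing, equation (11) can be solved by bisection. A sketch for a hypothetical finite-type (H, g, m), truncating the series (10); by Theorem 1 the root should satisfy R = 1/ρ, where ρ is the Perron root of M, which the code checks by power iteration:

```python
# Solving m f(R) = 1 by bisection for a hypothetical finite-type (H, g, m).
H = [[0.2, 0.3, 0.0],
     [0.1, 0.2, 0.3],
     [0.0, 0.4, 0.1]]
g = [0.5, 0.3, 0.2]
m = 0.8
d = len(g)

def f(s, terms=400):
    """f(s) = sum_{n>=1} s^n g H^n 1^t, truncated (formula (10))."""
    total, w = 0.0, g[:]                # w holds g H^{n-1}
    for n in range(1, terms + 1):
        w = [sum(w[i] * H[i][j] for i in range(d)) for j in range(d)]  # g H^n
        total += s ** n * sum(w)
    return total

lo, hi = 0.0, 2.0
for _ in range(200):                    # bisection on the increasing function m f(s)
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if m * f(mid) < 1 else (lo, mid)
R = (lo + hi) / 2

# compare with the growth rate rho of M = H + m H 1^t g, via power iteration
M = [[H[i][j] + m * sum(H[i]) * g[j] for j in range(d)] for i in range(d)]
x = [1.0] * d
for _ in range(2000):
    x = [sum(M[i][j] * x[j] for j in range(d)) for i in range(d)]
    s0 = sum(x); x = [xi / s0 for xi in x]
rho = sum(sum(M[i][j] * x[j] for j in range(d)) for i in range(d))
print(abs(R * rho - 1) < 1e-6)  # prints True: R = 1/rho
```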
11. Main result

Theorem 1. Consider an LF branching process with parameters (H, g, m) and assume that its mean matrix M is irreducible and aperiodic. Then
- the convergence parameter R of M is defined by (11), and M is always R-recurrent;
- if β = ∞, then M is R-null recurrent;
- if β < ∞, then M is R-positively recurrent and

R^n M^n → u^t v,  n → ∞.   (13)
12. Main result

Here the positive vectors

u^t = (1 + m) β^{−1} Σ_{k=1}^∞ R^k H^k 1^t,   (14)

v = m (1 + m)^{−1} Σ_{k=0}^∞ R^k g H^k   (15)

are such that v u^t = v 1^t = 1 and g u^t = (1 + m)/(m β).

Remark. Formulae (14) and (15) are new even in the case of finitely many types.
13. Main result

The proof is based on

M(s) = H(s) + (m / (1 − m f(s))) H(s) 1^t g (H(s) + I),   (16)

where M(s) = Σ_{n≥0} s^n M^n and H(s) = Σ_{n≥0} s^n H^n. Apply the renewal theorem taken from Chapter XIII.4 in [Feller-2]:

Proposition 2. Let A(s) = Σ_{n≥0} a_n s^n be a probability generating function and let B(s) = Σ_{n≥0} b_n s^n be the generating function of a non-negative sequence, so that A(1) = 1 while B(1) ∈ (0, ∞). Then the non-negative sequence defined by

Σ_{n≥0} t_n s^n = B(s) / (1 − A(s))

is such that t_n → B(1)/A′(1) as n → ∞.
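Proposition 2 is easy to illustrate numerically: the coefficients t_n satisfy the renewal recursion t_n = b_n + Σ_k a_k t_{n−k}. A sketch with a made-up aperiodic distribution (a_n) and sequence (b_n):

```python
# Renewal theorem illustration: t_n -> B(1)/A'(1); hypothetical example.
a = [0.0, 0.5, 0.3, 0.2]        # pgf A(s): probability distribution on {1, 2, 3}
b = [1.0, 2.0, 0.5]             # non-negative sequence with B(1) in (0, inf)

N = 400
t = []
for n in range(N):
    bn = b[n] if n < len(b) else 0.0
    conv = sum(a[k] * t[n - k] for k in range(1, min(n, len(a) - 1) + 1))
    t.append(bn + conv)          # t_n = b_n + sum_k a_k t_{n-k}, since a_0 = 0

B1 = sum(b)                                    # B(1) = 3.5
A_prime = sum(k * ak for k, ak in enumerate(a))  # A'(1) = 1.7
print(abs(t[-1] - B1 / A_prime) < 1e-8)  # prints True: t_n approaches 3.5/1.7
```

Since the support {1, 2, 3} of (a_n) has gcd 1 (aperiodicity), convergence is geometric, so 400 terms are far more than enough.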
14. Associated branching renewal system

If the initial particle of the LF branching process with parameters (H, g, m) has distribution g,

P(Z^(0) = e_i) = g_i,  i = 1, 2, ...,   (17)

then we get a branching renewal property implying an alternative description of this reproduction system in terms of individuals, defined as sequences of first-born particles.
15. Associated branching renewal system

The individual's life length L has a discrete phase-type distribution governed by the pair (H, g). Its tail probabilities are

P(L > n) = g H^n 1^t,  n ≥ 0,   (18)

allowing us to express the generating function (10) in the form

f(s) = Σ_{n=1}^∞ s^n P(L > n).   (19)

Clearly,

f(s) = (1 − E(s^L)) / (1 − s) − 1,

and R is found from

E(R^L) = R (1 + m)/m − 1/m.
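The identity relating (19) to the pgf of L can be checked numerically. A sketch for a hypothetical (H, g), computing the phase-type tails (18) and both sides of f(s) = (1 − E(s^L))/(1 − s) − 1 at a test point:

```python
# Phase-type life length check for a hypothetical (H, g).
H = [[0.2, 0.3, 0.0],
     [0.1, 0.2, 0.3],
     [0.0, 0.4, 0.1]]
g = [0.5, 0.3, 0.2]
d, terms, s = len(g), 400, 0.9

tails = []                       # tails[n] = P(L > n) = g H^n 1^t
w = g[:]
for n in range(terms):
    tails.append(sum(w))
    w = [sum(w[i] * H[i][j] for i in range(d)) for j in range(d)]   # g H^{n+1}

f_s = sum(s ** n * tails[n] for n in range(1, terms))               # formula (19)
EsL = sum(s ** n * (tails[n - 1] - tails[n]) for n in range(1, terms))  # E(s^L)
print(abs(f_s - ((1 - EsL) / (1 - s) - 1)) < 1e-10)  # prints True
```

Note tails[0] = g 1^t = 1, consistent with L ≥ 1, and P(L = n) = P(L > n−1) − P(L > n).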
16. Associated branching renewal system

The life length always has a positive mean (finite or infinite)

λ := E(L) = 1 + f(1).   (20)

Every year during its life (except the last year) the individual produces a geometric number of offspring with mean m. Thus the mean total offspring number per individual is

μ = m(λ − 1) = m f(1).   (21)

Depending on μ < 1, μ = 1, or μ > 1, the branching process should be called subcritical, critical, or supercritical, resulting in exactly the same classification as the one defined in terms of the convergence parameter R.
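Computing λ and μ from (20) and (21) is a one-pass summation over the phase-type tails. A sketch with a hypothetical (H, g, m), which turns out to be subcritical:

```python
# lambda = E(L) and mu = m f(1) for a hypothetical (H, g, m).
H = [[0.2, 0.3, 0.0],
     [0.1, 0.2, 0.3],
     [0.0, 0.4, 0.1]]
g = [0.5, 0.3, 0.2]
m = 0.8
d = len(g)

f1, w = 0.0, g[:]
for _ in range(2000):                  # f(1) = sum_{n>=1} g H^n 1^t
    w = [sum(w[i] * H[i][j] for i in range(d)) for j in range(d)]
    f1 += sum(w)

lam = 1 + f1                           # mean life length, (20)
mu = m * f1                            # mean total offspring per individual, (21)
label = "subcritical" if mu < 1 else ("critical" if mu == 1 else "supercritical")
print(lam, mu, label)
```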
17. Associated branching renewal system

The total population sizes Z̄^(n) = Z^(n) 1^t form a discrete time CMJ process. There are many triplets (H, g, m) resulting in the same CMJ process with the same values of R and β. Here β is the mean age at childbearing:

β = Σ_{k=1}^∞ k m R^k P(L > k).

The difference between such related branching processes lies in the labeling rules for particles and will be seen in their Perron-Frobenius eigenvectors.

Remark. Making births very rare events, we can approximate the linear-fractional CMJ processes by continuous time CMJ processes with Poisson reproduction, recently studied by Lambert.
18. Example

There is a particular labeling system allowing for more explicit formulae for the Perron-Frobenius eigenvectors. We relabel individuals by looking into the future and assigning label i to individuals whose remaining life length equals i, for i = 1, 2, .... For example, an individual with label 1 must die in the next year.

The relabeled process is again a linear-fractional branching process with a modified triplet of parameters (H̃, g̃, m), where H̃ is the shift matrix with entries h̃_{i,i−1} = 1 for i ≥ 2 and zeros elsewhere, and g̃_n = P(L = n).
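The relabeled pair (H̃, g̃) should reproduce the same life-length law, and hence the same f and R. A sketch under this reading of the slide (the shift structure of H̃ is inferred from the relabeling rule), using a made-up finite life-length distribution:

```python
# Relabeling sketch: labels are remaining life lengths, H~ is the shift matrix,
# g~_n = P(L = n). Check that g~ H~^n 1^t recovers the tails P(L > n).
pL = {1: 0.4, 2: 0.35, 3: 0.25}          # hypothetical P(L = n)
K = max(pL)

gt = [pL.get(i + 1, 0.0) for i in range(K)]                              # g~
Ht = [[1.0 if j == i - 1 else 0.0 for j in range(K)] for i in range(K)]  # shift

w = gt[:]
tails = []
for n in range(K + 1):
    tails.append(sum(w))                  # g~ H~^n 1^t
    w = [sum(w[i] * Ht[i][j] for i in range(K)) for j in range(K)]

expected = [sum(p for k, p in pL.items() if k > n) for n in range(K + 1)]
print(all(abs(a - b) < 1e-12 for a, b in zip(tails, expected)))  # prints True
```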
19. Example

Observe that the corresponding mean matrix M̃ is not fully irreducible, since type 1 is a final type. However, the Perron-Frobenius theorem can be extended to this case as well, and the corresponding Perron-Frobenius eigenvectors can be computed using (14) and (15) as

ů_i = (1 + m) β^{−1} Σ_{k=1}^{i−1} R^k,  i = 1, 2, ...,
v̊_j = m (1 + m)^{−1} E(R^{L−j}; L ≥ j),  j = 1, 2, ....

It follows that

m̃_ij^(n) ~ ρ^n ů_i v̊_j,

and thus in the critical case, using m = (λ − 1)^{−1},

m̃_ij^(n) → m (i − 1) β^{−1} P(L ≥ j).
20. R-positively recurrent case

Proposition 3. In the subcritical case, when ρ < 1 or equivalently μ < 1, we have

P(Z^(n) ≠ 0) ~ ρ^n (1 + m)^{−1} (1 − μ) u,

and the convergence in distribution

[Z^(n) | Z^(n) ≠ 0, Z^(0) = e_i] →d Y.

The limit Y has an LF(ĥ, ĝ, m) distribution with parameters

ĥ = (1 + m)(1 − μ)^{−1} v − (μ + m)(1 − μ)^{−1} g,   (22)

ĝ = λ^{−1} (1 − μ) g (I − M)^{−1},   (23)

independent of the initial type. In particular, for Ȳ = Y 1^t,

P(Ȳ = k) = m^{k−1} (1 + m)^{−k},  k = 1, 2, ....   (24)
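A consistency check on (22) and (23): since Y is supported on non-zero vectors, ĥ must sum to one, and ĝ must be a probability distribution. A sketch for a hypothetical subcritical finite-type example, computing v by power iteration and g(I − M)^{−1} as a Neumann series:

```python
# Consistency check for (22) and (23) in a hypothetical subcritical example.
H = [[0.2, 0.3, 0.0],
     [0.1, 0.2, 0.3],
     [0.0, 0.4, 0.1]]
g = [0.5, 0.3, 0.2]
m = 0.8
d = len(g)
M = [[H[i][j] + m * sum(H[i]) * g[j] for j in range(d)] for i in range(d)]

# mu = m f(1) and lambda = 1 + f(1), from the phase-type tails of L
f1, w = 0.0, g[:]
for _ in range(4000):
    w = [sum(w[i] * H[i][j] for i in range(d)) for j in range(d)]
    f1 += sum(w)
mu, lam = m * f1, 1 + f1

# left Perron-Frobenius vector v of M with v 1^t = 1, by power iteration
v = [1.0 / d] * d
for _ in range(4000):
    v = [sum(v[i] * M[i][j] for i in range(d)) for j in range(d)]
    s0 = sum(v); v = [vi / s0 for vi in v]

# g (I - M)^{-1} as the Neumann series sum_n g M^n (valid since rho < 1)
acc, w = [0.0] * d, g[:]
for _ in range(4000):
    acc = [acc[j] + w[j] for j in range(d)]
    w = [sum(w[i] * M[i][j] for i in range(d)) for j in range(d)]

h_hat = [(1 + m) / (1 - mu) * v[j] - (mu + m) / (1 - mu) * g[j] for j in range(d)]
g_hat = [(1 - mu) / lam * acc[j] for j in range(d)]
print(abs(sum(h_hat) - 1) < 1e-8, abs(sum(g_hat) - 1) < 1e-8)  # prints True True
```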
21. R-positively recurrent case

Proposition 4. In the critical case, when ρ = 1, we have

P(Z^(n) ≠ 0) ~ n^{−1} (1 + m)^{−1} β u,

and

[n^{−1} Z^(n) | Z^(n) ≠ 0, Z^(0) = e_i] →d X v,

where X is exponentially distributed with mean (1 + m) β^{−1}. If, furthermore, a vector w is such that v w^t = 0, then

[n^{−1/2} Z^(n) w^t | Z^(n) ≠ 0, Z^(0) = e_i] →d Y ((1 + m) β^{−1} w² v^t)^{1/2},   (25)

where w² = (w_1², w_2², ...) and Y has a Laplace distribution with parameter 1.
22. R-positively recurrent case

Proposition 5. In the supercritical case, when ρ > 1, we have

P(Z^(n) ≠ 0) → (ρ − 1)(1 + m)^{−1} β u,

and

[ρ^{−n} Z^(n) | Z^(n) ≠ 0, Z^(0) = e_i] →d X v,

where X is exponentially distributed with mean (1 + m)(ρ − 1)^{−1} β^{−1}. If, furthermore, a vector w is such that v w^t = 0, then

[ρ^{−n/2} Z^(n) w^t | Z^(n) ≠ 0, Z^(0) = e_i] →d Y ((1 + m)(ρ − 1)^{−1} β^{−1} w² v^t)^{1/2},

where w² = (w_1², w_2², ...) and Y has a Laplace distribution with parameter 1.
23. References

[1] Jagers, P. and Sagitov, S. (2008) General branching processes in discrete time as random trees. Bernoulli 14.
[2] Joffe, A. and Letac, G. (2006) Multitype linear fractional branching processes. J. Appl. Probab. 43.
[3] Hoppe, F. M. (1997) Coupling and the non-degeneracy of the limit in some plasmid reproduction models. Theor. Pop. Biol. 52.
[4] Kesten, H. (1989) Supercritical branching processes with countably many types and the sizes of random Cantor sets. In T. W. Anderson, K. B. Athreya and D. Iglehart (eds), Probability, Statistics and Mathematics: Papers in Honor of Samuel Karlin. New York: Academic Press.
[5] Lambert, A. (2010) The contour of splitting trees is a Lévy process. Ann. Probab. 38.
[6] Moy, S.-T. C. (1967) Extensions of a limit theorem of Everett, Ulam and Harrison on multitype branching processes to a branching process with countably many types. Ann. Math. Statist.
[7] Pollak, E. (1974) Survival probabilities and extinction times for some multitype branching processes. Adv. Appl. Prob. 6.
[8] Seneta, E. (2006) Non-negative matrices and Markov chains. Springer Series in Statistics No. 21, Springer, New York.
[9] Seneta, E. and Tavaré, S. (1983) Some stochastic models for plasmid copy number. Theor. Pop. Biol. 23.
Stochastic2010 Page 1 Long-Time Properties of Countable-State Markov Chains Tuesday, March 23, 2010 2:14 PM Homework 2: if you turn it in by 5 PM on 03/25, I'll grade it by 03/26, but you can turn it in
More informationQuestion Points Score Total: 70
The University of British Columbia Final Examination - April 204 Mathematics 303 Dr. D. Brydges Time: 2.5 hours Last Name First Signature Student Number Special Instructions: Closed book exam, no calculators.
More informationMath 597/697: Solution 5
Math 597/697: Solution 5 The transition between the the ifferent states is governe by the transition matrix 0 6 3 6 0 2 2 P = 4 0 5, () 5 4 0 an v 0 = /4, v = 5, v 2 = /3, v 3 = /5 Hence the generator
More informationIEOR 6711: Stochastic Models I Professor Whitt, Thursday, November 29, Weirdness in CTMC s
IEOR 6711: Stochastic Models I Professor Whitt, Thursday, November 29, 2012 Weirdness in CTMC s Where s your will to be weird? Jim Morrison, The Doors We are all a little weird. And life is a little weird.
More informationA Note on Bisexual Galton-Watson Branching Processes with Immigration
E extracta mathematicae Vol. 16, Núm. 3, 361 365 (2001 A Note on Bisexual Galton-Watson Branching Processes with Immigration M. González, M. Molina, M. Mota Departamento de Matemáticas, Universidad de
More informationLecture 5. If we interpret the index n 0 as time, then a Markov chain simply requires that the future depends only on the present and not on the past.
1 Markov chain: definition Lecture 5 Definition 1.1 Markov chain] A sequence of random variables (X n ) n 0 taking values in a measurable state space (S, S) is called a (discrete time) Markov chain, if
More informationComputer Vision Group Prof. Daniel Cremers. 11. Sampling Methods
Prof. Daniel Cremers 11. Sampling Methods Sampling Methods Sampling Methods are widely used in Computer Science as an approximation of a deterministic algorithm to represent uncertainty without a parametric
More informationThe Theory behind PageRank
The Theory behind PageRank Mauro Sozio Telecom ParisTech May 21, 2014 Mauro Sozio (LTCI TPT) The Theory behind PageRank May 21, 2014 1 / 19 A Crash Course on Discrete Probability Events and Probability
More informationProbability Distributions
Lecture : Background in Probability Theory Probability Distributions The probability mass function (pmf) or probability density functions (pdf), mean, µ, variance, σ 2, and moment generating function (mgf)
More informationThe range of tree-indexed random walk
The range of tree-indexed random walk Jean-François Le Gall, Shen Lin Institut universitaire de France et Université Paris-Sud Orsay Erdös Centennial Conference July 2013 Jean-François Le Gall (Université
More informationSTAT STOCHASTIC PROCESSES. Contents
STAT 3911 - STOCHASTIC PROCESSES ANDREW TULLOCH Contents 1. Stochastic Processes 2 2. Classification of states 2 3. Limit theorems for Markov chains 4 4. First step analysis 5 5. Branching processes 5
More informationDepartment of Applied Mathematics Faculty of EEMCS. University of Twente. Memorandum No Birth-death processes with killing
Department of Applied Mathematics Faculty of EEMCS t University of Twente The Netherlands P.O. Box 27 75 AE Enschede The Netherlands Phone: +3-53-48934 Fax: +3-53-48934 Email: memo@math.utwente.nl www.math.utwente.nl/publications
More informationAdventures in Stochastic Processes
Sidney Resnick Adventures in Stochastic Processes with Illustrations Birkhäuser Boston Basel Berlin Table of Contents Preface ix CHAPTER 1. PRELIMINARIES: DISCRETE INDEX SETS AND/OR DISCRETE STATE SPACES
More informationA Conceptual Proof of the Kesten-Stigum Theorem for Multi-type Branching Processes
Classical and Modern Branching Processes, Springer, New Yor, 997, pp. 8 85. Version of 7 Sep. 2009 A Conceptual Proof of the Kesten-Stigum Theorem for Multi-type Branching Processes by Thomas G. Kurtz,
More informationComputer Vision Group Prof. Daniel Cremers. 11. Sampling Methods: Markov Chain Monte Carlo
Group Prof. Daniel Cremers 11. Sampling Methods: Markov Chain Monte Carlo Markov Chain Monte Carlo In high-dimensional spaces, rejection sampling and importance sampling are very inefficient An alternative
More informationQUEUEING FOR AN INFINITE BUS LINE AND AGING BRANCHING PROCESS
QUEUEING FOR AN INFINITE BUS LINE AND AGING BRANCHING PROCESS VINCENT BANSAYE & ALAIN CAMANES Abstract. We study a queueing system with Poisson arrivals on a bus line indexed by integers. The buses move
More informationThe Perron Frobenius theorem and the Hilbert metric
The Perron Frobenius theorem and the Hilbert metric Vaughn Climenhaga April 7, 03 In the last post, we introduced basic properties of convex cones and the Hilbert metric. In this post, we loo at how these
More informationCELEBRATING 50 YEARS OF THE APPLIED PROBABILITY TRUST
J. Appl. Prob. Spec. Vol. 51A, 359 376 (2014) Applied Probability Trust 2014 CELEBATING 50 YEAS OF THE APPLIED POBABILITY TUST Edited by S. ASMUSSEN, P. JAGES, I. MOLCHANOV and L. C. G. OGES Part 8. Markov
More informationRandom trees and branching processes
Random trees and branching processes Svante Janson IMS Medallion Lecture 12 th Vilnius Conference and 2018 IMS Annual Meeting Vilnius, 5 July, 2018 Part I. Galton Watson trees Let ξ be a random variable
More informationCHUN-HUA GUO. Key words. matrix equations, minimal nonnegative solution, Markov chains, cyclic reduction, iterative methods, convergence rate
CONVERGENCE ANALYSIS OF THE LATOUCHE-RAMASWAMI ALGORITHM FOR NULL RECURRENT QUASI-BIRTH-DEATH PROCESSES CHUN-HUA GUO Abstract The minimal nonnegative solution G of the matrix equation G = A 0 + A 1 G +
More informationAncestor Problem for Branching Trees
Mathematics Newsletter: Special Issue Commemorating ICM in India Vol. 9, Sp. No., August, pp. Ancestor Problem for Branching Trees K. B. Athreya Abstract Let T be a branching tree generated by a probability
More information