Macroscopic quasi-stationary distribution and microscopic particle systems
Matthieu Jonckheere, UBA-CONICET, BCAM visiting fellow
Coauthors: A. Asselah, P. Ferrari, P. Groisman, J. Martinez, S. Saglietti
BCAM, May 2016
Outline
- Introduction to quasi-stationary distributions: macroscopic model
- Particle systems: microscopic model
- Selection principle and traveling waves
Denying eternity
Most phenomena do not last forever. However, most of them might reach some kind of equilibrium before vanishing. What are we observing when considering a macroscopic stochastic evolution (in biology, physics, population models, telecommunications) that has not vanished for (very) large times?
TRANSIENT / QUASI-STATIONARY / STATIONARY
Two types of quasi-stationarity
If the stochastic evolution has a strong drift towards extinction, the quasi-equilibrium may correspond to a large-deviation event, e.g. observing a player winning at a casino for hours, or a traffic jam lasting more than 10 hours. If it tends to vanish more slowly (spending a long time in a subset of the state space before vanishing), the quasi-equilibrium corresponds to a metastable state, e.g. a bottle of cold beer just before freezing. In the study of population dynamics, this phenomenon is known as the mortality plateau.
Challenges
Both cases (large deviation and metastability) are interesting to study theoretically. These quasi-equilibria are generally difficult to simulate. And when there are several quasi-equilibria (possibly infinitely many), which one has a physical meaning?
Markov process conditioned on non-absorption
Let $X_t \in \mathcal{N}$ (with $\mathcal{N} = \mathbb{N}$ or $\mathbb{R}$) be an irreducible Markov process absorbed at 0, and let $T$ be the absorption time of $X$. Given an initial law $\mu$ and a measurable set $A$, define
$$\varphi_t^\mu(A) = P_\mu(X_t \in A \mid T > t).$$
Kolmogorov (1938) proposed to study the long-time behavior of processes conditioned on non-absorption, i.e. the limit (if it exists) of $\varphi_t^\mu(\cdot)$.
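The conditioned law $\varphi_t^\mu(\cdot)$ can be estimated by brute-force Monte Carlo: run many independent copies of the chain and keep only those that survive up to time $t$. A minimal sketch, assuming a nearest-neighbour walk on $\{0,1,2,\dots\}$ absorbed at 0 (all parameter values are illustrative):

```python
import random

def conditioned_law(p=0.4, x0=5, t=50, runs=20000, seed=1):
    """Monte Carlo estimate of phi_t^mu({x}) = P_mu(X_t = x | T > t) for a
    nearest-neighbour walk on {0,1,2,...} absorbed at 0, started at x0
    (up with probability p, down with probability 1-p)."""
    rng = random.Random(seed)
    counts, survived = {}, 0
    for _ in range(runs):
        x = x0
        for _ in range(t):
            if x == 0:                        # absorbed before time t
                break
            x += 1 if rng.random() < p else -1
        if x > 0:                             # keep only surviving runs
            survived += 1
            counts[x] = counts.get(x, 0) + 1
    return {k: v / survived for k, v in sorted(counts.items())}

law = conditioned_law()
print(law)   # a probability distribution on the positive integers
```

The conditioning is what makes this costly in practice: for large $t$ almost all runs are discarded, which is one motivation for the particle systems discussed later.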
Quasi-stationary distribution
We say that $\nu$ is a quasi-stationary distribution (QSD) if there exists a probability measure $\mu$ such that
$$\lim_{t\to\infty} \varphi_t^\mu(\cdot) = \nu_\mu(\cdot).$$
QSD
Finite state space $S \subset \mathbb{N}$: there exists a unique QSD. Spectral point of view: the QSD is the left eigenvector of $Q$ (the infinitesimal generator of the killed process) associated with its maximal eigenvalue.
For a countable state space, there are different possible scenarios:
1. No QSD.
2. A unique QSD, with convergence to it from any initial distribution.
3. Infinitely many QSDs: the family is parametrized by a parameter $\theta$ (an eigenvalue of the infinitesimal generator): if $\nu$ is a QSD, then $P_\nu(T > t) = \exp(-\theta_\nu t)$. There may exist a maximal $\theta$, corresponding to the so-called minimal QSD.
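On a finite state space the spectral characterization is directly computable: the QSD is the (normalized) left eigenvector of the killed generator for the eigenvalue with largest real part, which is real and simple by Perron-Frobenius. A sketch with an invented 4-state killed generator (the rates are illustrative, not from the talk):

```python
import numpy as np

# Killed generator: rows may sum to less than 0; the missing mass is the
# killing (absorption) rate from that state.
Q = np.array([[-2.0,  1.0,  0.5,  0.0],
              [ 1.0, -3.0,  1.0,  0.5],
              [ 0.0,  2.0, -2.5,  0.5],
              [ 0.0,  0.0,  1.0, -1.0]])
evals, evecs = np.linalg.eig(Q.T)       # eigenvectors of Q^T = left eigenvectors of Q
k = np.argmax(evals.real)               # principal (largest real part) eigenvalue
theta = -evals[k].real                  # decay rate: P_nu(T > t) = exp(-theta t)
nu = evecs[:, k].real
nu = nu / nu.sum()                      # QSD: a probability vector
print(theta, nu)
```

Dividing by the (possibly negative) sum fixes the arbitrary sign of the eigenvector, so `nu` comes out as a genuine probability vector satisfying $\nu Q = -\theta\,\nu$.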
Challenges
Existence, simulation, properties, extremality.
Bibliography on QSDs: van Doorn, Ferrari, Martinez, Pollett, Seneta, Vere-Jones, ... See P. Pollett's bibliography: pkp/papers/qsds/qsds.pdf
Outline: particle systems, microscopic models
1. Branching processes
2. Fleming-Viot
3. N-branching Brownian motion
4. Choose the fittest
Traveling waves for PDEs
An important question in mathematics and physics is the existence of traveling-wave solutions to (in particular parabolic, reaction-diffusion) PDEs, i.e. solutions of the form $u(x,t) = w(x - ct)$, where $c$ is the speed of the traveling wave. Example: the KPP (Kolmogorov-Petrovsky-Piskounov) equation
$$u_t = \tfrac{1}{2} u_{xx} + f(u).$$
There are links with the maximum of branching Brownian motion (McKean, Bramson, ...). In general, there may exist infinitely many solutions, parametrized by their speed. Which speed is selected?
Traveling waves and QSDs
For the KPP equation with $f(u) = u^2 - u$, one can prove that:
- the equation has the same traveling waves as the free-boundary equation obtained for the N-BBM;
- there exists a traveling wave (for a given eigenvalue $\lambda$) if and only if there exists a QSD for an associated Brownian motion with drift.
This can be extended to Lévy process dynamics, i.e. to more general equations.
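The speed-selection phenomenon can be observed numerically: starting from a step initial condition, the KPP front travels at a speed close to the minimal one, $\sqrt{2}$ for $u_t = \tfrac12 u_{xx} + u(1-u)$ (the substitution $u \mapsto 1-u$ relates this logistic form to $f(u) = u^2 - u$). A crude explicit finite-difference sketch; grid parameters are illustrative and the measured speed is only approximate:

```python
import numpy as np

# Explicit Euler for u_t = 1/2 u_xx + u(1 - u) on [0, L) from a step profile.
# Stability: dt * 0.5 / dx^2 = 0.2 < 0.5, so the scheme is stable.
L, dx, dt, T = 200.0, 0.1, 0.004, 40.0
x = np.arange(0.0, L, dx)
u = (x < 10).astype(float)                   # step initial condition
front = []
for n in range(int(T / dt)):
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx ** 2
    lap[0] = lap[-1] = 0.0                   # crude boundary treatment
    u += dt * (0.5 * lap + u * (1 - u))
    if n % int(1 / dt) == 0:                 # record front once per time unit
        front.append(x[np.searchsorted(-u, -0.5)])   # where u crosses 1/2
half = len(front) // 2
speed = (front[-1] - front[half]) / (len(front) - 1 - half)
print(speed)   # close to sqrt(2); convergence is slow (Bramson's log correction)
```

The measured speed sits slightly below $\sqrt{2} \approx 1.414$ at finite times, consistent with the logarithmic Bramson shift mentioned in connection with branching Brownian motion.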
Need for a selection principle
Quoting Fisher (on the velocity of advance of genetic evolutions): "Common sense would, I think, lead us to believe that, though the velocity of advance might be temporarily enhanced by this method, yet ultimately, the velocity of advance would adjust itself so as to be the same irrespective of the initial conditions. If this is so, this equation must omit some essential element of the problem, and it is indeed clear that while a coefficient of diffusion may represent the biological conditions adequately in places where large numbers of individuals of both types are available, it cannot do so at the extreme front and back of the advancing wave, where the numbers of the mutant and the parent gene respectively are small, and where their distribution must be largely sporadic."
The missing link: particle systems
Macroscopic models (QSDs and traveling waves) forget that a population can be very large but finite. Microscopic models intrinsically correspond to finite populations, and they do select the minimal QSD / traveling wave.
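The Fleming-Viot dynamics can be sketched directly: $N$ particles evolve as independent copies of the absorbed process, and a particle that is absorbed is instantly reborn at the position of another particle chosen uniformly at random; the empirical measure then approximates the minimal QSD. A discrete-time caricature for the nearest-neighbour walk (parameters illustrative):

```python
import random

def fleming_viot(N=300, p=0.4, T=3000, x0=10, seed=7):
    """Fleming-Viot particle system for the walk on {0,1,2,...} absorbed
    at 0: N particles move independently (up w.p. p, down w.p. 1-p); a
    particle hitting 0 jumps onto a uniformly chosen *other* particle.
    Returns the empirical distribution of the N particles at time T."""
    rng = random.Random(seed)
    parts = [x0] * N
    for _ in range(T):
        for i in range(N):
            parts[i] += 1 if rng.random() < p else -1
            if parts[i] == 0:                 # absorbed: resample position
                j = rng.randrange(N - 1)
                if j >= i:
                    j += 1                    # uniform over the other N-1
                parts[i] = parts[j]           # always >= 1: nobody sits at 0
    hist = {}
    for v in parts:
        hist[v] = hist.get(v, 0) + 1
    return {k: c / N for k, c in sorted(hist.items())}

emp = fleming_viot()
print(emp)
```

Unlike the naive conditioned Monte Carlo, no runs are ever discarded here, which is why this scheme is usable for large times; the empirical decay of `emp` can be compared with the minimal-QSD formulas below.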
References
- Simulation of quasi-stationary distributions on countable spaces. P. Groisman, M. Jonckheere. Markov Processes and Related Fields, 2013.
- Fleming-Viot selects the minimal quasi-stationary distribution: the Galton-Watson case. A. Asselah, P. Ferrari, P. Groisman, M. Jonckheere. Ann. Inst. H. Poincaré, 2015.
- Front propagation and quasi-stationary distributions: the same selection principle. P. Groisman, M. Jonckheere.
- Kesten-Stigum theorems in L² beyond R-positivity. S. Saglietti, M. Jonckheere.
Thanks
Existence of QSDs
Very few general results exist.
Proposition (Ferrari et al., 1995). Let $X_t \in \mathbb{N}$. If
- $\lim_{x\to\infty} P_x(T > t) = 1$ for every $t$, and
- there exist $\gamma > 0$ and $z \in \mathbb{N}$ such that $E_z(\exp(\gamma T_0)) < \infty$,
then there exists a QSD.
QSDs and QLDs: examples
Birth-and-death process (non-empty M/M/1 queue): $q(x, x+1) = p\,1_{x>0}$, $q(x, x-1) = q\,1_{x>0}$. Infinite family of QSDs; minimal QSD: $\nu^*(x) = c\,(x+1)\,(p/q)^{x/2}$.
Population process with linear drift: $q(x, x+1) = p\,x\,1_{x>0}$, $q(x, x-1) = q\,x\,1_{x>0}$. Infinite family of QSDs; minimal QSD: $\nu^*(x) = c\,(p/q)^{x}$.
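These formulas can be probed numerically by truncating the state space: the principal left eigenvalue of the truncated killed generator converges to the decay parameter, which for the M/M/1 example is $\theta^* = (\sqrt{q} - \sqrt{p})^2$, and the eigenvector approximates the minimal QSD. A sketch (up-jumps from the top state $K$ are treated as extra killing; parameters illustrative):

```python
import numpy as np

# Killed generator of the non-empty M/M/1 walk truncated to {1,...,K}:
# constant diagonal -(p+q) because both the down-jump from state 1 and the
# up-jump from state K leave the truncated space (killing).
p, q, K = 0.2, 0.8, 100
Q = (np.diag([-(p + q)] * K)
     + np.diag([p] * (K - 1), 1)
     + np.diag([q] * (K - 1), -1))
evals, evecs = np.linalg.eig(Q.T)             # left eigenvectors of Q
k = np.argmax(evals.real)
theta = -evals[k].real                        # truncated decay parameter
nu = evecs[:, k].real
nu = nu / nu.sum()                            # approximate minimal QSD
print(theta, (np.sqrt(q) - np.sqrt(p)) ** 2)  # truncated vs exact theta*
```

With these values $\theta^* = 0.2$ and the truncated eigenvalue matches it to a few parts in ten thousand; in the bulk the eigenvector decays like $x\,(p/q)^{x/2}$, consistent (up to indexing conventions) with the formula $c\,(x+1)\,(p/q)^{x/2}$ above.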
ALEA Lat. Am. J. Probab. Math. Stat. 13 1) 337 356 2016) Fleming-Viot processes: two explicit examples Bertrand Cloez and Marie-oémie Thai UMR IRA-SupAgro MISTEA Montpellier France E-mail address: bertrand.cloez@supagro-inra.fr
More informationStochastic Processes. Theory for Applications. Robert G. Gallager CAMBRIDGE UNIVERSITY PRESS
Stochastic Processes Theory for Applications Robert G. Gallager CAMBRIDGE UNIVERSITY PRESS Contents Preface page xv Swgg&sfzoMj ybr zmjfr%cforj owf fmdy xix Acknowledgements xxi 1 Introduction and review
More informationPopulation growth: Galton-Watson process. Population growth: Galton-Watson process. A simple branching process. A simple branching process
Population growth: Galton-Watson process Population growth: Galton-Watson process Asimplebranchingprocess September 27, 2013 Probability law on the n-th generation 1/26 2/26 A simple branching process
More informationEXTINCTION TIMES FOR A GENERAL BIRTH, DEATH AND CATASTROPHE PROCESS
J. Appl. Prob. 41, 1211 1218 (2004) Printed in Israel Applied Probability Trust 2004 EXTINCTION TIMES FOR A GENERAL BIRTH, DEATH AND CATASTROPHE PROCESS BEN CAIRNS and P. K. POLLETT, University of Queensland
More informationDISCRETE STOCHASTIC PROCESSES Draft of 2nd Edition
DISCRETE STOCHASTIC PROCESSES Draft of 2nd Edition R. G. Gallager January 31, 2011 i ii Preface These notes are a draft of a major rewrite of a text [9] of the same name. The notes and the text are outgrowths
More informationRecent results on branching Brownian motion on the positive real axis. Pascal Maillard (Université Paris-Sud (soon Paris-Saclay))
Recent results on branching Brownian motion on the positive real axis Pascal Maillard (Université Paris-Sud (soon Paris-Saclay)) CMAP, Ecole Polytechnique, May 18 2017 Pascal Maillard Branching Brownian
More informationSome mathematical models from population genetics
Some mathematical models from population genetics 5: Muller s ratchet and the rate of adaptation Alison Etheridge University of Oxford joint work with Peter Pfaffelhuber (Vienna), Anton Wakolbinger (Frankfurt)
More informationThe strictly 1/2-stable example
The strictly 1/2-stable example 1 Direct approach: building a Lévy pure jump process on R Bert Fristedt provided key mathematical facts for this example. A pure jump Lévy process X is a Lévy process such
More informationSharpness of second moment criteria for branching and tree-indexed processes
Sharpness of second moment criteria for branching and tree-indexed processes Robin Pemantle 1, 2 ABSTRACT: A class of branching processes in varying environments is exhibited which become extinct almost
More informationSTA 624 Practice Exam 2 Applied Stochastic Processes Spring, 2008
Name STA 624 Practice Exam 2 Applied Stochastic Processes Spring, 2008 There are five questions on this test. DO use calculators if you need them. And then a miracle occurs is not a valid answer. There
More informationLecture 11: Introduction to Markov Chains. Copyright G. Caire (Sample Lectures) 321
Lecture 11: Introduction to Markov Chains Copyright G. Caire (Sample Lectures) 321 Discrete-time random processes A sequence of RVs indexed by a variable n 2 {0, 1, 2,...} forms a discretetime random process
More informationLecture 7. µ(x)f(x). When µ is a probability measure, we say µ is a stationary distribution.
Lecture 7 1 Stationary measures of a Markov chain We now study the long time behavior of a Markov Chain: in particular, the existence and uniqueness of stationary measures, and the convergence of the distribution
More informationContinuous Time Markov Chains
Continuous Time Markov Chains Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 2015 Outline Introduction Continuous-Time Markov
More information4 Branching Processes
4 Branching Processes Organise by generations: Discrete time. If P(no offspring) 0 there is a probability that the process will die out. Let X = number of offspring of an individual p(x) = P(X = x) = offspring
More informationDiscrete solid-on-solid models
Discrete solid-on-solid models University of Alberta 2018 COSy, University of Manitoba - June 7 Discrete processes, stochastic PDEs, deterministic PDEs Table: Deterministic PDEs Heat-diffusion equation
More information