AARMS Homework Exercises
1. For the gamma distribution,
(a) Show that the mgf is M(t) = (1 − βt)^(−α) for t < 1/β.
(b) Use the mgf to find the mean and variance of the gamma distribution.

2. A well-known inequality in probability theory is Markov's inequality. The Markov inequality states that for a nonnegative random variable X with expectation µ,

Prob{X ≥ c} ≤ µ/c,  c > 0.

Verify Markov's inequality.

3. Another well-known inequality in probability theory is Chebyshev's inequality. The Chebyshev inequality states that for any random variable X with finite expectation µ and positive variance σ^2,

Prob{|X − µ| < kσ} ≥ 1 − 1/k^2,  k > 0.

Use the Markov inequality in Exercise 2 with Y^2 = (X − µ)^2 and E(Y^2) = σ^2 to verify the Chebyshev inequality.

4. Suppose the states for three different Markov chains are {1, 2, 3}. The corresponding transition matrices are

P1 = [a1 a2 0; a2 0 a1; 0 a1 a2],  P2 = [a2 a1 a1; 0 a2 0; a1 0 a2],  and  P3 = [0 a1 0; 1 0 1; 0 a2 0],

(rows separated by semicolons; column sums equal one), where 0 < a_i < 1, i = 1, 2, and a1 + a2 = 1.
(a) Draw a directed graph for each chain. Determine whether the corresponding DTMC is reducible or irreducible.
(b) Identify the communicating classes and find their periods.
(c) If the corresponding DTMC is irreducible and recurrent, compute lim_{n→∞} P_i^n. Hint: Compute P_i^2 and P_i^4.
(d) For the recurrent DTMCs, use the Basic Limit Theorems (periodic or aperiodic) to determine the mean recurrence time for each state i. Is the chain positive recurrent or null recurrent?

5. Suppose the transition matrix P of a finite Markov chain is doubly stochastic; that is, the row and column sums equal one:

p_ij ≥ 0,  ∑_{i=1}^N p_ij = 1,  and  ∑_{j=1}^N p_ij = 1.

Prove the following: If an irreducible, aperiodic finite Markov chain (ergodic chain) has a doubly stochastic transition matrix, then all stationary probabilities are equal, π_1 = π_2 = · · · = π_N. In particular, π_i = 1/N.
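As a numerical companion to Exercise 5 (a Python sketch; the particular 3 × 3 matrix is an assumed example, not part of the exercise): for an ergodic chain with a doubly stochastic transition matrix, powers of P converge to a matrix whose every entry is 1/N.

```python
def mat_mult(A, B):
    """Multiply two square matrices stored as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# An example doubly stochastic matrix: each row and each column sums to one.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.2, 0.5]]

Pn = P
for _ in range(50):  # P^51 is effectively the limit matrix here
    Pn = mat_mult(Pn, P)

# Every entry should be close to 1/N = 1/3, matching pi_i = 1/N.
print([round(x, 6) for x in Pn[0]])
```

The second-largest eigenvalue of this P has modulus about 0.26, so fifty multiplications are far more than enough for convergence to machine precision.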
6. Suppose the size of a population remains constant from generation to generation; the size equals N. The dynamics of a particular gene in this population are modeled as follows. Suppose the gene has two alleles, A and a; therefore, individual genotypes are either AA, Aa, or aa. Let the random variable X_n denote the number of A alleles in the population in the nth generation, n = 0, 1, 2, .... Then X_n ∈ {0, 1, 2, ..., 2N}. Assume random mating of individuals, so that the genes in generation n + 1 are found by sampling with replacement from the genes in generation n (Ewens, 1979). Then the one-step transition probability has a binomial probability distribution with probability of success X_n/(2N); i.e., if X_n = i, then the one-step transition probability is the binomial pdf b(2N, i/(2N)),

p_ji = (2N choose j) (i/(2N))^j (1 − i/(2N))^(2N−j),  i, j = 0, 1, 2, ..., 2N

(Ewens, 1979; Schinazi, 1999). This model is known as the Wright-Fisher model.
(a) Show that states 0 and 2N are absorbing and states {1, 2, ..., 2N − 1} are transient.
(b) Given X_n = k, show that the mean of X_{n+1} satisfies µ_{X_{n+1}} = E(X_{n+1} | X_n = k) = k. A discrete-time Markov process with this property is called a martingale.
(c) Show that the probability of fixation of allele A is k/(2N), i.e., the probability of absorption into state 2N.

7. Normal cell division results in two identical daughter cells containing the same number of chromosomes as the original cell (2n for a diploid cell). Sometimes a mistake occurs and only one cell is produced having twice the number of chromosomes (4n chromosomes), referred to as endomitosis. When this abnormal cell divides again, it will produce two daughter cells with twice the number of chromosomes (4n chromosomes). Endomitosis can occur again for a cell having 4n chromosomes to produce a cell with 8n chromosomes and, in general, endomitosis occurring in a cell with 2^i n chromosomes produces a cell with 2^(i+1) n chromosomes. Cells with more than two copies of the genes are known as polyploid cells. The incidence of
higher ploidies than four is small. Therefore, it is reasonable to consider a cellular model with only two types: diploid cells and polyploid cells (Jagers, 1975). Let p be the probability of endomitosis, 0 < p < 1/2, q the probability the cell dies, and 1 − p − q the probability of cell division into two daughter cells of the same type. The pgfs for the two cell types are

f_1(t_1, t_2) = (1 − p − q)t_1^2 + p t_2 + q  and  f_2(t_1, t_2) = p t_2 + (1 − p − q)t_2^2 + q.

(a) Discuss why the pgfs have the given forms. Calculate the expectation matrix M. Show that matrix M is reducible.
(b) The pgf for the polyploid cells can be considered separately,

f_2(t) = p t + (1 − p − q)t^2 + q.

Beginning with one polyploid cell, what is the probability that the polyploid cell line will die out?

8. Use the following MATLAB program to compute the expected population growth/year given that the growth/year in a good year is m_1 = 4 and in a bad year is m_2 = 0.5. Let p_1 be the probability of a good year and p_2 the probability of a bad year, p_1 + p_2 = 1. Complete the following table to compare the expected population growth/year in a random environment with the average of the growth rates. Even though the average of the growth rates is greater than one, the expected population growth/year may be less than one.

clear all
m1=4; % Mean growth in Good Year
m2=0.5; % Mean growth in Bad Year
p1=0.8; % Probability of Good Year
sumx=0;
n=10^7; % Total number of years
for i=1:n
  u=rand;
  if u<p1
    x=log(m1);
  else
    x=log(m2);
  end
  sumx=x+sumx;
end
ExpectedGRand=exp(sumx/n)
Average=m1*p1+m2*(1-p1)

p_1     Expected Growth     Average p_1 m_1 + p_2 m_2

9. Suppose Q is the generator matrix of a finite CTMC.
(a) Compute the corresponding transition matrix T of the embedded Markov chain. Show that the CTMC is irreducible.
(b) Find the stationary distribution of the CTMC, lim_{t→∞} P(t)p(0).

10. Use the backward Kolmogorov differential equations dP/dt = PQ for the Poisson process and write differential equations to find the probability of hitting state 3; i.e., p_i(t) = probability of hitting state 3 at time t beginning from state i, i = 0, 1, 2, with p_i(0) = 0, i = 0, 1, 2, and p_3(t) = 1. Solve these three differential equations to find explicit expressions for p_i(t), i = 0, 1, 2.

11. Consider the simple birth and death process with immigration.
(a) Use the mgf M(θ, t) to find expressions for the mean µ(t) and variance σ^2(t) of the process.
(b) Find the limits of the mean and variance, lim_{t→∞} µ(t) and lim_{t→∞} σ^2(t), when λ < µ.
(c) What are the mean and variance of the process if there is no immigration?

12. Suppose a general birth and death process has birth and death rates given by

λ_i = b_0 + b_1 i + b_2 i^2  and  µ_i = d_1 i + d_2 i^2,  for i = 0, 1, 2, ....

(a) Write the forward Kolmogorov differential equations. Then write the differential equations satisfied by the mgf.
(b) Write the partial differential equation for the mgf in the more general case, where λ_i = ∑_{k=0}^n b_k i^k and µ_i = ∑_{k=1}^n d_k i^k.
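The backward-equation calculation in Exercise 10 can be sanity-checked numerically. The sketch below (Python, with an illustrative rate λ = 1 and horizon T = 2, both assumed values) Euler-integrates dp_i/dt = λ(p_{i+1} − p_i) with p_3 ≡ 1 and compares the result with the Poisson tail probability P(N(λt) ≥ 3 − i), which the explicit solution works out to.

```python
import math

lam, T, dt = 1.0, 2.0, 1e-4  # illustrative rate, time horizon, Euler step

# Euler-integrate the backward equations for hitting state 3 in a Poisson
# process: dp_i/dt = lam*(p_{i+1} - p_i) for i = 0, 1, 2, with p_3 = 1.
p = [0.0, 0.0, 0.0, 1.0]
for _ in range(int(T / dt)):
    p = [p[i] + dt * lam * (p[i + 1] - p[i]) for i in range(3)] + [1.0]

# Explicit solution: p_i(T) = P(Poisson(lam*T) >= 3 - i), a Poisson tail.
for i in range(3):
    tail = 1.0 - sum(math.exp(-lam * T) * (lam * T) ** k / math.factorial(k)
                     for k in range(3 - i))
    print(i, round(p[i], 4), round(tail, 4))
```

With this step size the two columns agree to three or four decimal places, which is a useful check on the hand-derived formulas.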
13. The mgf M(θ, t) for a simple birth process is a solution of the following first-order partial differential equation:

∂M/∂t = λ(e^θ − 1) ∂M/∂θ,  M(θ, 0) = e^(nθ).

(a) Differentiate the partial differential equation with respect to θ and evaluate at θ = 0 to write a differential equation for the mean, m(t) = E(X(t)) = ∂M(θ, t)/∂θ |_{θ=0}. Solve the differential equation for m(t).
(b) Differentiate the partial differential equation twice with respect to θ and evaluate at θ = 0 to write a differential equation for the second moment E(X^2(t)). Solve the differential equation for E(X^2(t)).
(c) Use parts (a) and (b) to find the variance σ^2(t) of the process.

14. Consider a model for a lytic viral population. After successful attachment and entry into a host cell, the virus uses the host cell for its own reproduction, killing the cell to release the new virus particles, or virions. The number of virions produced per host cell is referred to as the burst size. We describe a burst-death process studied by Hubbarde et al. (2007). Let µ be the death rate, λ the birth rate, and β the burst size, 0 < µ < λ, where β is some positive integer. Then the pgf for the next generation of virions is

f(z) = µ/(µ + λ) + (λ/(µ + λ)) z^β.

(a) For β = 2, this model is the same as the simple birth and death process and the probability of extinction is µ/λ. Show for β > 2 that the probability of extinction for the burst-death model is less than µ/λ.
(b) Show that if β → ∞ and µ and λ are fixed constants, then the probability of extinction approaches µ/(µ + λ). This means the probability of survival is λ/(µ + λ): even if the burst size is extremely large, the probability of viral survival is less than one (provided the birth and death rates remain constant).

15. The following MATLAB program is for the SIR epidemic model. Graph three sample paths for the case N = 500, I(0) = 1, and R_0 = β/γ = 2. Modify the program and compute the following based on 10,000 sample paths: (1) a probability histogram for the time the epidemic ends (duration
of the epidemic); (2) an estimate of the probability of an outbreak (run until the time t when I(t) = 10 or I(t) = 0, then count the number of times 10 infectives are hit before hitting zero). The estimated value should be close to 1 − (1/R_0)^I(0). How do the duration of an epidemic and the probability of an outbreak change if I(0) = 2 or I(0) = 3, or if R_0 = 3?

%Matlab Program: Three sample paths for the SIR epidemic
clear all
beta=1; g=0.5; N=500; % Parameters
for k=1:3
  clear t s i
  t(1)=0; i(1)=1; s(1)=N-i(1);
  j=1;
  while i(j)>0 & t(j)<100
    u1=rand; u2=rand;
    t(j+1)=-log(u1)/((beta/N)*i(j)*s(j)+g*i(j))+t(j);
    if u2<=(beta/N)*s(j)/((beta/N)*s(j)+g)
      i(j+1)=i(j)+1; s(j+1)=s(j)-1;
    else
      i(j+1)=i(j)-1; s(j+1)=s(j);
    end
    j=j+1;
  end
  tend(k)=t(j); % Time epidemic ends
  l1=stairs(t,i);
  set(l1,'LineWidth',2);
  hold on
end
ylabel('I(t)'); xlabel('Time');
hold off

16. Group Project 1: The following SEIR epidemic model includes births and deaths and an additional class of exposed or latent individuals, E, individuals that are not yet infectious. The differential equations are

dS/dt = Λ − β S I/N − µS
dE/dt = β S I/N − σE − µE
dI/dt = σE − γI − µI
dR/dt = γI − µR,

where S + E + I + R = N. The parameter Λ is the birth/immigration rate, µ is the natural death rate, and σ is the rate at which exposed individuals become infectious. The basic reproduction number of the SEIR epidemic model is

R_0 = βσ / ((γ + µ)(σ + µ))

(Hethcote, 2000).
(a) Show that the total population size approaches Λ/µ.
(b) Set up a table of the infinitesimal transition probabilities for the eight events in a CTMC SEIR model.
(c) Assume Λ = 1, µ = 0.002, β = 1, σ = 0.25, γ = 0.25. If N = Λ/µ, S(0) = N − 1, E(0) = 0, I(0) = 1, and R(0) = 0, use the table of transition probabilities to write a MATLAB program to simulate sample paths of the SEIR model. Compute the probability of an outbreak by computing the probability of hitting E(t) + I(t) = 20 before hitting E(t) + I(t) = 0 (the epidemic ends), based on 10,000 sample paths. What is the probability of an outbreak?
(d) If time permits, change the parameters and the initial conditions to graph several sample paths. Compute the probability of an outbreak given N = Λ/µ, S(0) = N − 1, E(0) = 1, I(0) = 0, and R(0) = 0, and hypothesize a formula for the probability of an outbreak.
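A possible starting point for part (c) of Group Project 1, sketched in Python rather than MATLAB: a single Gillespie-style sample path. The eight event rates are read directly off the differential equations above (the specific ordering of events is an assumption of this sketch, not the required table), and the stopping rule E + I = 20 or E + I = 0 is the one stated in the exercise.

```python
import random

random.seed(0)  # reproducible illustration

# Parameters from part (c): Lambda = 1, mu = 0.002, beta = 1, sigma = gamma = 0.25
Lam, mu, beta, sigma, gamma = 1.0, 0.002, 1.0, 0.25, 0.25
N = Lam / mu
S, E, I, R = N - 1, 0, 1, 0
t = 0.0
while 0 < E + I < 20 and t < 1000:
    n = S + E + I + R
    rates = [Lam,                 # birth into S
             beta * S * I / n,    # infection: S -> E
             sigma * E,           # latent period ends: E -> I
             gamma * I,           # recovery: I -> R
             mu * S, mu * E, mu * I, mu * R]  # natural deaths
    total = sum(rates)
    t += random.expovariate(total)       # exponential time to next event
    u, c = random.random() * total, 0.0
    for event, r in enumerate(rates):    # choose which event fires
        c += r
        if u < c:
            break
    if event == 0:   S += 1
    elif event == 1: S -= 1; E += 1
    elif event == 2: E -= 1; I += 1
    elif event == 3: I -= 1; R += 1
    elif event == 4: S -= 1
    elif event == 5: E -= 1
    elif event == 6: I -= 1
    else:            R -= 1

# E + I == 20 here counts as an outbreak; E + I == 0 means the epidemic ended.
print(t, E + I)
```

Wrapping the loop in a function and repeating it 10,000 times, then counting how often E + I reaches 20 before 0, gives the outbreak probability the exercise asks for.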
17. Group Project 2: Suppose one population moves between two habitats (patches). The population size in each patch changes and can be modeled by the following differential equations:

dx_i/dt = r x_i (1 − x_i/K) + m(x_j − x_i),  i, j = 1, 2, j ≠ i.

(a) Without movement or migration, m = 0, the population grows logistically. What is the stable positive equilibrium in each patch when m = 0?
(b) Let X_i(t), i = 1, 2, denote random variables for the stochastic process. Assume the birth and death rates in each patch are λ_i = r i(1 − i/(2K)) and µ_i = r i^2/(2K), i = 0, 1, 2, ..., 2K. Consider only a single patch. Let r = 2, K = 10, and m = 0. The maximum population size in patch 1 with no migration is N = 2K = 20. Find the infinitesimal generator matrix Q (of size 21 × 21). Compute the expected time to extinction, τ: τ Q̃ = −1, where Q̃ is the infinitesimal generator matrix with the first row and first column deleted, 1 = (1, 1, ..., 1), and τ = (τ_1, ..., τ_N).
(d) Set up a table with six transition probabilities for the births and deaths in each patch and movement between the two patches. Use the transition probabilities to write a MATLAB computer program to graph sample paths for the preceding model with m = 0 and m = 0.1. Estimate the probability of extinction from the MATLAB program with m = 0.1 and compare your results to part (b).
(e) If time permits, modify this patch model to investigate the impact of patch-dependent parameters, r_i, K_i, m_i, on population survival. Discuss your results.

References

Allen, L. J. S. 2011. An Introduction to Stochastic Processes with Applications to Biology. Chapman and Hall/CRC, Boca Raton, FL.
Ewens, W. J. 1979. Mathematical Population Genetics. Springer-Verlag, Berlin, Heidelberg, New York.
Hethcote, H. W. 2000. The mathematics of infectious diseases. SIAM Review 42: 599-653.
Hubbarde, J. E., G. Wild, and L. M. Wahl. 2007. Fixation probabilities when generation times are variable: the burst-death model. Genetics 176:
Jagers, P. 1975. Branching Processes with Biological Applications. John Wiley & Sons, London.
Schinazi, R. B. 1999. Classical and Spatial Stochastic Processes. Birkhäuser, Boston.
Math 45 - Homework 5 Solutions. Exercise.3., textbook. The stochastic matrix for the gambler problem has the following form, where the states are ordered as (,, 4, 6, 8, ): P = The corresponding diagram
More informationChapter 5. Continuous-Time Markov Chains. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan
Chapter 5. Continuous-Time Markov Chains Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Continuous-Time Markov Chains Consider a continuous-time stochastic process
More informationAPM 541: Stochastic Modelling in Biology Discrete-time Markov Chains. Jay Taylor Fall Jay Taylor (ASU) APM 541 Fall / 92
APM 541: Stochastic Modelling in Biology Discrete-time Markov Chains Jay Taylor Fall 2013 Jay Taylor (ASU) APM 541 Fall 2013 1 / 92 Outline 1 Motivation 2 Markov Processes 3 Markov Chains: Basic Properties
More informationA review of Continuous Time MC STA 624, Spring 2015
A review of Continuous Time MC STA 624, Spring 2015 Ruriko Yoshida Dept. of Statistics University of Kentucky polytopes.net STA 624 1 Continuous Time Markov chains Definition A continuous time stochastic
More informationStochastic Modelling Unit 1: Markov chain models
Stochastic Modelling Unit 1: Markov chain models Russell Gerrard and Douglas Wright Cass Business School, City University, London June 2004 Contents of Unit 1 1 Stochastic Processes 2 Markov Chains 3 Poisson
More informationContinuous time Markov chains
Continuous time Markov chains Alejandro Ribeiro Dept. of Electrical and Systems Engineering University of Pennsylvania aribeiro@seas.upenn.edu http://www.seas.upenn.edu/users/~aribeiro/ October 16, 2017
More informationMarkov Chains. Arnoldo Frigessi Bernd Heidergott November 4, 2015
Markov Chains Arnoldo Frigessi Bernd Heidergott November 4, 2015 1 Introduction Markov chains are stochastic models which play an important role in many applications in areas as diverse as biology, finance,
More informationDynamic network sampling
Dynamic network sampling Steve Thompson Simon Fraser University thompson@sfu.ca Graybill Conference Colorado State University June 10, 2013 Dynamic network sampling The population of interest has spatial
More informationMathematical Analysis of Epidemiological Models: Introduction
Mathematical Analysis of Epidemiological Models: Introduction Jan Medlock Clemson University Department of Mathematical Sciences 8 February 2010 1. Introduction. The effectiveness of improved sanitation,
More informationStochastic Processes. Theory for Applications. Robert G. Gallager CAMBRIDGE UNIVERSITY PRESS
Stochastic Processes Theory for Applications Robert G. Gallager CAMBRIDGE UNIVERSITY PRESS Contents Preface page xv Swgg&sfzoMj ybr zmjfr%cforj owf fmdy xix Acknowledgements xxi 1 Introduction and review
More information2. Transience and Recurrence
Virtual Laboratories > 15. Markov Chains > 1 2 3 4 5 6 7 8 9 10 11 12 2. Transience and Recurrence The study of Markov chains, particularly the limiting behavior, depends critically on the random times
More informationURN MODELS: the Ewens Sampling Lemma
Department of Computer Science Brown University, Providence sorin@cs.brown.edu October 3, 2014 1 2 3 4 Mutation Mutation: typical values for parameters Equilibrium Probability of fixation 5 6 Ewens Sampling
More informationHow robust are the predictions of the W-F Model?
How robust are the predictions of the W-F Model? As simplistic as the Wright-Fisher model may be, it accurately describes the behavior of many other models incorporating additional complexity. Many population
More informationLIMITING PROBABILITY TRANSITION MATRIX OF A CONDENSED FIBONACCI TREE
International Journal of Applied Mathematics Volume 31 No. 18, 41-49 ISSN: 1311-178 (printed version); ISSN: 1314-86 (on-line version) doi: http://dx.doi.org/1.173/ijam.v31i.6 LIMITING PROBABILITY TRANSITION
More informationPositive and null recurrent-branching Process
December 15, 2011 In last discussion we studied the transience and recurrence of Markov chains There are 2 other closely related issues about Markov chains that we address Is there an invariant distribution?
More informationStochastic Processes (Week 6)
Stochastic Processes (Week 6) October 30th, 2014 1 Discrete-time Finite Markov Chains 2 Countable Markov Chains 3 Continuous-Time Markov Chains 3.1 Poisson Process 3.2 Finite State Space 3.2.1 Kolmogrov
More informationSARAH P. OTTO and TROY DAY
A Biologist's Guide to Mathematical Modeling in Ecology and Evolution SARAH P. OTTO and TROY DAY Univsr?.ltats- und Lender bibliolhek Darmstadt Bibliothek Biotogi Princeton University Press Princeton and
More informationMarkov-modulated interactions in SIR epidemics
Markov-modulated interactions in SIR epidemics E. Almaraz 1, A. Gómez-Corral 2 (1)Departamento de Estadística e Investigación Operativa, Facultad de Ciencias Matemáticas (UCM), (2)Instituto de Ciencias
More informationStochastic Processes
Stochastic Processes 8.445 MIT, fall 20 Mid Term Exam Solutions October 27, 20 Your Name: Alberto De Sole Exercise Max Grade Grade 5 5 2 5 5 3 5 5 4 5 5 5 5 5 6 5 5 Total 30 30 Problem :. True / False
More informationIEOR 6711: Stochastic Models I, Fall 2003, Professor Whitt. Solutions to Final Exam: Thursday, December 18.
IEOR 6711: Stochastic Models I, Fall 23, Professor Whitt Solutions to Final Exam: Thursday, December 18. Below are six questions with several parts. Do as much as you can. Show your work. 1. Two-Pump Gas
More informationLecture 7. We can regard (p(i, j)) as defining a (maybe infinite) matrix P. Then a basic fact is
MARKOV CHAINS What I will talk about in class is pretty close to Durrett Chapter 5 sections 1-5. We stick to the countable state case, except where otherwise mentioned. Lecture 7. We can regard (p(i, j))
More informationLecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes
Lecture Notes 7 Random Processes Definition IID Processes Bernoulli Process Binomial Counting Process Interarrival Time Process Markov Processes Markov Chains Classification of States Steady State Probabilities
More informationDemography April 10, 2015
Demography April 0, 205 Effective Population Size The Wright-Fisher model makes a number of strong assumptions which are clearly violated in many populations. For example, it is unlikely that any population
More informationExamination paper for TMA4265 Stochastic Processes
Department of Mathematical Sciences Examination paper for TMA4265 Stochastic Processes Academic contact during examination: Andrea Riebler Phone: 456 89 592 Examination date: December 14th, 2015 Examination
More informationMarkov chains. Randomness and Computation. Markov chains. Markov processes
Markov chains Randomness and Computation or, Randomized Algorithms Mary Cryan School of Informatics University of Edinburgh Definition (Definition 7) A discrete-time stochastic process on the state space
More information