
Epidemic Models

Maaike Koninx

July 12, 2015

Bachelor thesis
Supervisor: dr. S.G. Cox

Korteweg-de Vries Instituut voor Wiskunde
Faculteit der Natuurwetenschappen, Wiskunde en Informatica
Universiteit van Amsterdam

Abstract

In our everyday lives, we are challenged with many questions concerning our health. In this thesis, we will explore the spread of infectious diseases using a mathematical approach. Our primary goal is to examine multiple epidemic models and their related traits that contribute to the occurrence of an epidemic. We will consider discrete and continuous time epidemic models and argue that for all models the same general result emerges: in the supercritical regime an epidemic may occur, while in the subcritical regime the disease will surely die out. For our discrete time models, we will compute the probability that an infectious disease will die out and discuss the probability that a certain proportion of the population is infected. For our continuous time models, we will explore the conditions that lead to a decrease in the number of infected people. We will argue that if we view the number of infected people as a Markov chain on $\mathbb{N}$, the number of infected people approximates the solution to a differential equation in large populations.

Title: Epidemic Models
Author: Maaike Koninx, maaike.koninx@student.uva.nl
Supervisor: dr. S.G. Cox
Second grader: dhr. dr. B.J.K. Kleijn
Date: July 12, 2015

Korteweg-de Vries Instituut voor Wiskunde
Universiteit van Amsterdam
Science Park 904, 1098 XH Amsterdam

Contents

1. Introduction
2. The Galton-Watson branching process
3. Discrete time SIR epidemic models
   3.1. The Reed-Frost model
   3.2. The Erdős-Rényi model
   3.3. Simulation
4. Continuous time epidemic models
   4.1. The deterministic model
   4.2. Kurtz's Theorem
   4.3. Proof of Lemma 4.5
   4.4. Simulation
5. Conclusion
Populaire samenvatting
Bibliografie
A. Appendix A
B. Appendix B
   B.1. Supporting theorems for Chapter 4
C. Appendix C
   C.1. The number of infected people in a Reed-Frost epidemic
   C.2. The number of infected people on G(n,p)
   C.3. The mean number of infected people for 100 simulations on G(n,p)
   C.4. Number of infected people in a Markov chain model
   C.5. Accessory function for C.4

1. Introduction

In our everyday lives, we are challenged with many questions concerning our health. New infectious diseases are discovered on a regular basis, while older, well-known diseases are still present in our world today. Some of the most well-known infectious diseases have played a major part in history, and even altered its course. For example, in the 1300s a series of epidemics of the plague, also known as the Second plague pandemic, which started with the Black Death, wiped out 30 to 70 percent of the population in Europe. Since then, many more outbreaks of the plague¹ have been well documented, as well as major outbreaks of cholera, influenza, measles, yellow fever, etc. The world is becoming increasingly interconnected and naturally a concern for our health arises. In this thesis, we will explore the spread of infectious diseases using a mathematical approach.

¹ At the time of writing, an outbreak was documented in Madagascar.

Figure 1.1.: A depiction of the Black Death from a 15th century bible.

Our main concern for this thesis is under which circumstances an epidemic occurs. We will define an epidemic as the presence of a certain number of individuals infected with an infectious disease in a population for a substantial amount of time. In general, an epidemic may occur when a single individual or a small proportion of the population is exposed to an infectious disease. If the infectious individuals make contact with other members of the population, the disease is spread throughout the population. Whether or not an epidemic occurs is determined by certain traits of the disease as well as of the population. When the population is exposed to the disease, an epidemic may or may not occur according to these traits. For example, a highly infectious disease and a high rate at which people make contact in a population may lead to an epidemic, while other traits, like the rate at which sick people recover, may lead to a decline in the number of infected people.

The primary goal of this thesis is to examine multiple epidemic models and their related traits that contribute to the occurrence of an epidemic. Furthermore, if an epidemic occurs, we are interested in the number of people that become infected and the probability that the disease will die out eventually. We will support our theoretical findings with computer simulations. We argue that the same general result emerges for all of our epidemic models, which is the following: if the expected number of people a single infected person infects is greater than one, an epidemic may occur; if the expected number of people he infects equals or is less than one, the disease will surely die out.

In Chapter 2, we consider the spread of an infectious disease in a branching process and we will compute the probability that an infectious disease will die out. In Chapter 3 we discuss a discrete time epidemic model, where we will argue that the Reed-Frost SIR model is equivalent to an Erdős-Rényi random graph given the same initial state. Here we will discuss the probability that a certain proportion of the population is infected. In Chapter 4, we introduce two continuous time epidemic models. We argue that if we view the number of infected people as a Markov chain on $\mathbb{N}$, we find by applying Kurtz's Theorem that the number of infected people approximates the solution to a differential equation in large populations.

Acknowledgments. I would like to truly thank my supervisor Sonja Cox for offering me guidance throughout this project. Enjoy!

2. The Galton-Watson branching process

For our first exploration of epidemics, we consider a simple branching process in which an epidemic starts at a so-called patient zero and spreads with a certain probability to a number of his contacts. In their turn, the infected individuals infect their contacts with a certain probability. This process continues as long as the disease is passed on between generations and is called the Galton-Watson branching process.¹

¹ This model is best known not for its application in epidemics, but for its application in genealogy. The model was first introduced by Francis Galton to study the extinction of family names. The terminology used in this chapter is therefore closely related to common word usage in genealogy.

Definition 2.1. Let the random variable $N$ represent the number of people each infected individual spreads the disease on to. The distribution of $N$ is referred to as the offspring distribution, and its expected value $R_0 = E[N]$ is called the basic reproductive number.

We are interested in the probability that the disease lives on for a certain amount of time. The probabilities $p_n^{\mathrm{ext}}$ that the disease has died out by the $n$th generation form a sequence in $[0,1]$ that converges to a point $p^{\mathrm{ext}}$, the probability that the disease dies out eventually. By Theorem 2.2, whether the epidemic dies out or may continue to exist indefinitely depends completely on the offspring distribution. If the expected number of new infections an infected individual causes is at most one, the disease will surely die out. If this number is greater than one, the disease may continue to exist indefinitely. This is stated in the next theorem. Note that in Theorem 2.2 below we do not consider the case where $P[N = 1] = 1$. In this case, every infected individual spreads his disease to exactly one individual. It is clear that $p_n^{\mathrm{ext}} = 0$ for every $n \in \mathbb{N}$, and therefore $p^{\mathrm{ext}} = 0$.

Theorem 2.2. Suppose that $P[N = 1] < 1$. Then the following regimes occur:
(a) If $R_0 \le 1$, then $p^{\mathrm{ext}} = 1$.
(b) If $R_0 > 1$, then $p^{\mathrm{ext}} < 1$.

Proof. Let $X_n$ be the number of infected people in the $n$th generation. We view an individual $i$ from the 1st generation and notice that if $i$ is infected, he can be considered as patient zero in his own branching process. In particular, if $Z_{n+1,i}$ represents the number of infected people in the $(n+1)$th generation that are (in)directly caused by individual $i \le X_1$, then $X_{n+1} = \sum_{i=1}^{X_1} Z_{n+1,i}$ and each $Z_{n+1,i}$ has the same distribution as $X_n$. Furthermore, for $i = 1, 2, \ldots, X_1$ the $\{Z_{n+1,i}\}$ are i.i.d. by definition, and we can deduce the following:

\[
E\big[s^{X_{n+1}} \mid X_1 = j\big] = E\big[s^{\sum_{i=1}^{j} Z_{n+1,i}}\big] = E\big[s^{Z_{n+1,1}}\big] \cdots E\big[s^{Z_{n+1,j}}\big] \overset{\text{i.i.d.}}{=} \big(E[s^{Z_{n+1,1}}]\big)^j = \big(E[s^{X_n}]\big)^j. \tag{2.1}
\]

Now we can introduce the probability generating function $\varphi_n(s)$ of $X_n$, by
\[
\varphi_n(s) = E[s^{X_n}] = \sum_{j=0}^{\infty} E\big[s^{X_n} \mid X_1 = j\big]\, P[X_1 = j] \overset{(2.1)}{=} \sum_{j=0}^{\infty} \big(E[s^{X_{n-1}}]\big)^j P[N = j] = \sum_{j=0}^{\infty} \big(\varphi_{n-1}(s)\big)^j P[N = j].
\]
Note that $\varphi_n(0) = P[X_n = 0] = p_n^{\mathrm{ext}}$. Consequently, setting $s = 0$ in the previous equation gives us
\[
p_n^{\mathrm{ext}} = \sum_{j=0}^{\infty} \big(p_{n-1}^{\mathrm{ext}}\big)^j P[N = j]. \tag{2.2}
\]
Since $p_n^{\mathrm{ext}}$ converges to a unique point $p^{\mathrm{ext}}$ in $[0,1]$, we know that $p_n^{\mathrm{ext}} \approx p_{n-1}^{\mathrm{ext}}$ as $n \to \infty$. We will now argue that by setting $p_n^{\mathrm{ext}} = p_{n-1}^{\mathrm{ext}}$, the smallest solution to the equation (2.2) in $[0,1]$ gives us $p^{\mathrm{ext}}$. This will finalize our proof.

Let $f : [0,1] \to \mathbb{R}$ be defined by $f(x) = \sum_{j=0}^{\infty} x^j P[N = j]$. We deduce the following:

(i) $f$ is non-decreasing, as $f'(x) = \sum_{j=1}^{\infty} j x^{j-1} P[N = j] \ge 0$ for all $x \in [0,1]$.

(ii) $f$ is strictly convex: indeed, since $P[N = 1] < 1$ and $E[N] \ge 1$, we know that $P[N \ge 2] > 0$. Therefore $f''(x) = \sum_{j=2}^{\infty} j(j-1) x^{j-2} P[N = j] > 0$ for all $x \in (0,1]$.

(iii) $f(1) = \sum_{j=0}^{\infty} P[N = j] = 1$.

We can see from Figure 2.1 that in the cases where (a) $R_0 \le 1$ and (b) $R_0 > 1$ the following is true:

(a) In the case where $R_0 \le 1$, the strict convexity of $f$ together with the condition that $f'(1) = E[N] = R_0 \le 1$ forces the graph of $f$ to lie strictly above the function $y = x$ on $[0,1)$. This implies that the sequence $p_n^{\mathrm{ext}}$ is increasing on $[0,1)$ and will converge to $p^{\mathrm{ext}} = 1$.

(b) In the case where $R_0 > 1$, the condition that $f'(1) = R_0 > 1$ together with the above statements forces $f$ to lie below the function $y = x$ on the interval $(a, 1)$ for some $a < 1$, where $a$ is the unique solution to $f(x) = x$ in $[0, 1)$. Indeed, if $p_n^{\mathrm{ext}} \in (a, 1)$ then the sequence $(p_n^{\mathrm{ext}})$ is decreasing, and if $p_n^{\mathrm{ext}} \in [0, a)$ then $(p_n^{\mathrm{ext}})$ is increasing. We conclude that the sequence $p_n^{\mathrm{ext}}$ will converge to $a = p^{\mathrm{ext}} < 1$.

Figure 2.1.: A graph of $f$ in the case where (a) $R_0 \le 1$ and (b) $R_0 > 1$.

Definition 2.3. We will refer to $R_0 < 1$ as the subcritical regime, to $R_0 = 1$ as the critical regime and to $R_0 > 1$ as the supercritical regime.

The previous theorem tells us that if we know the value of $R_0$, we can predict whether a disease will die out or may persist indefinitely in a population. We refer the interested reader to Appendix A for an overview of well-known infectious diseases and their basic reproductive numbers. In particular, if we know the distribution of $N$, we can calculate the extinction probability of the disease by finding the minimal non-negative solution to (2.2). We conclude this chapter by examining the extinction probability for some distributions of $N$.

Example 2.4. Let $0 < p < 1$ and consider $p_k = P[N = k] = p^k(1-p)$ for $k = 0, 1, \ldots$ as the probability of infecting $k$ people. Clearly $p_k \in (0,1)$ for all $k = 0, 1, \ldots$, and these probabilities add to one, seeing as $\sum_{k=0}^{\infty} p_k = (1-p)\sum_{k=0}^{\infty} p^k = (1-p)(1-p)^{-1} = 1$. Therefore the set $\{p_k\}$ is a probability distribution.²

² In fact, $N$ is Geometric$(1-p)$ distributed.

Suppose that $p \le \tfrac{1}{2}$. Since
\[
E[N] = \sum_{k=0}^{\infty} k p_k = (1-p)\sum_{k=0}^{\infty} k p^k = (1-p)\,\frac{p}{(1-p)^2} = \frac{p}{1-p},
\]
we know that $E[N] \le 1$. Seeing as $p_1 < 1$, Theorem 2.2 tells us that $p^{\mathrm{ext}} = 1$. In this case, the disease will almost surely die out at a certain point in time.

Now suppose $p > \tfrac{1}{2}$; then $E[N] > 1$ and by Theorem 2.2 $p^{\mathrm{ext}} < 1$. We are interested in finding the value of $p^{\mathrm{ext}}$.

Therefore we have to find the minimal non-negative solution to $\varphi(x) = x$, where
\[
\varphi(x) = \sum_{k=0}^{\infty} x^k p_k = (1-p)\sum_{k=0}^{\infty} (xp)^k = \frac{1-p}{1-xp}.
\]
Some calculations are required to find that $\varphi(x) = x$ on $[0,1]$ when $x$ equals $1$ or $\frac{1-p}{p}$. Seeing as $p^{\mathrm{ext}} < 1$, we find that $p^{\mathrm{ext}} = \frac{1-p}{p}$. In this case, the disease will live on indefinitely with probability $1 - p^{\mathrm{ext}} = \frac{2p-1}{p}$.

Example 2.5. We can consider the spread of an infectious disease not only by counting infectious individuals but also by counting the healthy ones. Whenever a strand of DNA is exposed to a certain amount of radiation, its molecular composition can be permanently altered. These errors in genetic material are called mutations. Suppose that a healthy strand of DNA replicates itself during radiation and the number of healthy replicas $N$ is Poisson distributed with parameter $\lambda$. The healthy replicas in their turn replicate themselves, with the same offspring distribution $N$. This process is again a Galton-Watson branching process and $E[N] = R_0 = \lambda$, so the Poisson parameter equals the expected number of non-mutated replica DNA strands. The probability $p^{\mathrm{ext}}$ that the healthy DNA strands die out at a certain point in time is the minimal non-negative solution to
\[
x = \varphi(x) = \sum_{k=0}^{\infty} x^k \frac{\lambda^k e^{-\lambda}}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(x\lambda)^k}{k!} = e^{\lambda(x-1)}. \tag{2.3}
\]
The exact solution to this equation cannot be obtained analytically, but we can obtain it numerically. For instance, if $\lambda = 2$ we get $p^{\mathrm{ext}} \approx 0.203$, which tells us that when we expect every healthy DNA strand to generate 2 healthy replicas, the probability that the healthy DNA strands die out is about 20.3%.
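As an illustration of how such a numerical solution can be obtained, the following is a minimal Matlab sketch (it is not one of the scripts from Appendix C; the parameter value and iteration count are arbitrary choices). It iterates equation (2.2) starting from $p_0^{\mathrm{ext}} = 0$, which for a Poisson offspring distribution amounts to iterating $x \mapsto e^{\lambda(x-1)}$:

```matlab
% Fixed-point iteration for the extinction probability of a Galton-Watson
% process with Poisson(lambda) offspring, i.e. the minimal solution of (2.3).
lambda = 2;          % basic reproductive number R_0 = E[N]
p = 0;               % p_0^ext = P[X_0 = 0] = 0
for k = 1:200
    p = exp(lambda*(p - 1));   % p_k^ext = phi(p_{k-1}^ext), cf. (2.2)
end
fprintf('extinction probability for lambda = %g: %.4f\n', lambda, p)
```

For $\lambda = 2$ this returns approximately 0.203, matching the value quoted above.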

3. Discrete time SIR epidemic models

As we saw in Chapter 2, the Galton-Watson branching process is an effective model for describing the extinction rate of an infectious disease that starts at a patient zero. However, the Galton-Watson branching process is hardly an effective model for describing epidemics in real life. In real life epidemics, a healthy person can contract an infectious disease from any of the multiple infected individuals he meets, whereas in the Galton-Watson process a healthy person can only contract the disease from his direct contact in the former generation. Furthermore, in real epidemiology, infectious diseases often do not get discovered until a certain proportion of the population is sick. One cannot easily trace the generations in which the disease has spread, much less find patient zero. Hence we need to find another model that describes the spread of infectious diseases without assuming that the process starts at a patient zero, while allowing the disease to reach healthy people through multiple channels. One such model is the Reed-Frost model.

3.1. The Reed-Frost model

The Reed-Frost model is a discrete time SIR epidemic process. A SIR epidemic process involves three types of individuals: susceptible (S), infectious (I) and resistant (R). In a population of $n$ individuals, during each unit of time, an infected person infects each susceptible with probability $p$, independently of other infected people. The infected person is then assigned as resistant. During the following unit of time, the process continues for the newly infectious group of people. Once resistant, a person cannot become susceptible again, since being resistant equals being immune. The process can be represented as
\[
S \longrightarrow I \longrightarrow R.
\]
A more detailed description is as follows. Let $S(t)$, $I(t)$ and $R(t)$ be the number of susceptible, infectious and resistant people respectively at time $t$. From the assumptions it is clear that $S(t) + I(t) + R(t) = n$. Then $Z(t) = (S(t), I(t), R(t)) \in \{0, 1, \ldots, n\}^3$ is a random variable that represents the state of the $n$ individuals at time $t$. Let $z, \tilde z \in \{0, 1, \ldots, n\}^3$ be two states. If the number of susceptibles $S(z)$ in state $z$ equals $S(z) = I(\tilde z) + S(\tilde z)$, then it is possible to reach state $\tilde z$ from state $z$ in one step (notation: $z \to \tilde z$).

If $\{Z(t)\}_{t \in \mathbb{N}}$ is a random walk on $\{0, 1, \ldots, n\}^3$ and $z \to \tilde z$ is possible, then the transition probabilities equal
\[
\begin{aligned}
P[Z(t+1) = \tilde z \mid Z(t) = z] &= \binom{S(z)}{S(\tilde z)} \big((1-p)^{I(z)}\big)^{S(\tilde z)} \big(1 - (1-p)^{I(z)}\big)^{S(z) - S(\tilde z)} \\
&= \underbrace{\binom{S(z)}{S(\tilde z)}}_{(i)}\ \underbrace{\big((1-p)^{I(z)}\big)^{S(\tilde z)}}_{(ii)}\ \underbrace{\big(1 - (1-p)^{I(z)}\big)^{I(\tilde z)}}_{(iii)}.
\end{aligned} \tag{3.1}
\]

Explanation.
(i) The number of ways to obtain the $S(\tilde z)$ susceptibles out of the $S(z)$ susceptibles.
(ii) The probability that $S(\tilde z)$ people do not get infected. For each susceptible, the probability of not getting infected is $(1-p)^{I(z)}$, so for all $S(\tilde z)$ susceptibles together this probability equals term (ii).
(iii) The probability that $I(\tilde z)$ people get infected. The probability for each susceptible of getting infected is $1 - (1-p)^{I(z)}$, so for all $I(\tilde z)$ of them together this probability equals term (iii). Note that one only has to encounter one infected person to possibly get sick, but we do not exclude the case where one meets multiple infected persons. Therefore, this probability does not equal $p^{I(z) I(\tilde z)}$.

The reader can check that these probabilities add to 1 on the set $\{\tilde z : z \to \tilde z \text{ is possible}\}$.

Remark. Suppose that $\{Z(t)\}_{t \in \mathbb{N}}$ is a random walk on $\{0, 1, \ldots, n\}^3$. Then the conditional transition probability $P[Z(t+1) = \tilde z \mid Z(t) = z, \ldots, Z(0) = z_0]$ depends only on the present state $Z(t)$, i.e.
\[
P[Z(t+1) = \tilde z \mid Z(t) = z, \ldots, Z(0) = z_0] = P[Z(t+1) = \tilde z \mid Z(t) = z].
\]
This means that the future of the process depends only on the present state and not on the history of the process. We call this memoryless property the Markov property, making $Z(t)$ a Markov chain on $\{0, 1, \ldots, n\}^3$.

As can be seen from equation (3.1), the conditional random variable $S(t+1) \mid Z(t)$ has a binomial distribution with parameters $(S(t), (1-p)^{I(t)})$.¹ It can be shown that this distribution approximates a Poisson distribution in a population where $n \to \infty$ and $I(t)$ is a finite number, i.e. in a population with a large number of healthy individuals and a small fraction of infected people.

¹ While $I(t+1) \mid Z(t)$ has a binomial distribution with parameters $(S(t), 1 - (1-p)^{I(t)})$.

Theorem 3.1. Let $t \ge 0$ be fixed and let $\lambda = S(t)(1-p)^{I(t)}$. Then, for finite $I(t)$ and $n \to \infty$, $S(t+1) \mid Z(t)$ is asymptotically Poisson$(\lambda)$ distributed.

Proof. We write $\hat p = (1-p)^{I(t)}$, so that $S(t+1) \mid Z(t) \sim \mathrm{Bin}(S(t), \hat p)$, and the generating function equals
\[
\varphi(s) = E\big[s^{S(t+1)} \mid Z(t)\big] = \sum_{k=0}^{S(t)} s^k \binom{S(t)}{k} \hat p^k (1-\hat p)^{S(t)-k} = \big(1 + (s-1)\hat p\big)^{S(t)} = \Big(1 + \frac{(s-1)\lambda}{S(t)}\Big)^{S(t)} \longrightarrow e^{(s-1)\lambda}
\]
as $n \to \infty$, which is exactly the generating function of a Poisson$(\lambda)$ distribution, as we saw in Example 2.5.

Now that we know that $S(t+1) \mid Z(t)$ is approximately Poisson distributed for large enough $n$, provided that $S(t)$ increases accordingly, one might think this would make calculations involving $S(t+1)$ easier and consequently lead to insights into the Reed-Frost epidemic model. However, since each transition probability depends on the former state, the previous model fails to provide insight into the long-term infection rate. We therefore introduce a new model based on random graphs, which we will argue is equivalent to the Reed-Frost model.

3.2. The Erdős-Rényi model

The Erdős-Rényi (E-R) random graph $G(n, p)$ is a graph on $n$ nodes where each pair of nodes is connected with probability $p$, independently of how other edges are connected. Our model then simulates an epidemic in the following way. A finite number of nodes, which we will refer to as $I(0)$, is infected at random at $t = 0$. During the next unit of time, each infected node infects his direct neighbours, after which he is assigned as resistant. This process continues until all nodes in the connected component of each initially infected individual are infected. Now, if $S(t)$ and $I(t)$ represent the number of susceptibles and infectious at time $t$ respectively, it can be argued that the conditional transition probabilities in our new model equal those of the Reed-Frost model, since they are exactly (i) $\cdot$ (ii) $\cdot$ (iii) as in (3.1); see Example 3.2. We conclude that the two models are equivalent, given that both models start with the same number of infected people $I(0)$.

Example 3.2. We now consider two states $z, \tilde z$ of a graph in which the nodes are assigned as susceptible, infectious or resistant, see Figure 3.1. Suppose it is possible to reach state $\tilde z$ from state $z$ in one timestep, i.e. all susceptible nodes in $z$ are assigned as either susceptible or infectious in state $\tilde z$.

The probability that all the nodes were connected accordingly at $t = 0$ in our model equals (i) $\cdot$ (ii) $\cdot$ (iii) as in (3.1), since:

(i) There are $\binom{S(z)}{S(\tilde z)}$ ways of choosing which of the susceptible nodes in $z$ become infectious in $\tilde z$ (equivalently, which remain susceptible).

(ii) The probability that the susceptible nodes in $\tilde z$ were not connected to the infectious ones in $z$ equals $(1-p)^{I(z)\,S(\tilde z)}$.

(iii) The probability that the infectious nodes in $\tilde z$ were connected to the infectious ones in $z$ equals $\big(1 - (1-p)^{I(z)}\big)^{I(\tilde z)}$.

Figure 3.1.: Two states $z, \tilde z$ on a graph of 6 nodes. The green nodes are susceptible, the red ones are infectious and the yellow ones are resistant. At least one of the solid drawn connections has to be made at $t = 0$ to reach state $\tilde z$ from state $z$ in one timestep.

Through our new model we can deduce some valuable results. We can obtain an upper bound for the number of people that get infected in terms of $R_0$, the expected number of people that every infected person spreads his disease on to. It is clear that $R_0$ equals the expected number of neighbours $pn$ of an infected person. The theorem below states that in the subcritical regime, when $R_0 < 1$, all connected components are sufficiently small so that there is at most a small outbreak. In the supercritical regime a so-called giant component appears, which leads to a large outbreak. We denote by $C_1$ and $C_2$ the largest and second-largest connected component respectively in $G(n, p)$, and by $|C_i|$ the number of nodes in $C_i$.

Theorem 3.3. In an E-R graph $G(n, p)$ with $R_0 = pn$, the following regimes occur.

(i) Subcritical regime: If $R_0 < 1$ then there exists a constant $c$ depending on $R_0$, so that
\[
\lim_{n \to \infty} P\big[\,|C_1| \le c \log(n)\,\big] = 1.
\]

(ii) Critical regime: If $R_0 = 1$ then for all $\alpha > 2/3$ and all $\delta > 0$,
\[
\lim_{n \to \infty} P\Big[\,\frac{|C_1|}{n^{\alpha}} \ge \delta\,\Big] = 0.
\]

Furthermore, if $p^{\mathrm{ext}}_{R_0}$ is the extinction probability of a Galton-Watson branching process with $N \sim \mathrm{Poisson}(R_0)$, i.e. $p^{\mathrm{ext}}_{R_0}$ is the minimal non-negative solution to (2.3) with $\lambda = R_0$, then the following holds.

(iii) Supercritical regime: If $R_0 > 1$ then for some constant $c$ depending on $R_0$ and all $\delta > 0$,
\[
\lim_{n \to \infty} P\Big[\, \Big|\frac{|C_1|}{n} - \big(1 - p^{\mathrm{ext}}_{R_0}\big)\Big| \le \delta \ \text{ and } \ |C_2| \le c \log(n) \,\Big] = 1.
\]

The proof of this theorem exceeds this chapter, but we refer the interested reader to [3].

We used Netlogo (version 5.10) to simulate an E-R graph for various values of $R_0$. For 100 nodes, we connected each pair with probability $p = R_0/100$. The results agreed with Theorem 3.3. Indeed, the largest connected components in the subcritical and critical regimes were relatively small, while in the supercritical case one can easily see the giant component emerge. See Figure 3.2.

Figure 3.2.: Simulation of an E-R random graph on 100 nodes for (a) $R_0 = 0.5$, $|C_1| = 6$; (b) $R_0 = 1$, $|C_1| = 13$; (c) $R_0 = 2$, $|C_1| = 82$. The red nodes are part of the largest component $C_1$; only the connections within $C_1$ are shown.

The previous theorem leads to some interesting results regarding the number of people that get infected when we simulate the spread of an epidemic. We will discuss this for each case:

(i) For $R_0 < 1$ the largest component is at most of logarithmic size with respect to the total population with high probability when $n$ is large, that is, $|C_1| \le c \log(n)$ for some $c > 0$. This implies that every component is at most of logarithmic size. When a finite number $I(0)$ of the nodes are infected, a maximum of $I(0) \cdot c \log(n)$ people eventually become infected. Hence, the outbreak is restricted to at most a logarithmically sized part of the population. Seeing as $\frac{I(0)\, c \log(n)}{n} \to 0$ as $n \to \infty$, this is a negligible proportion of the total population. Therefore this will result in a small outbreak.

(ii) For $R_0 = 1$ the size of the largest component is at most of order $n^{2/3}$ with high probability for large $n$, i.e. $|C_1| \le c\, n^{2/3}$ with high probability for some $c > 0$.

Therefore, with high probability a maximum of $I(0) \cdot c\, n^{2/3}$ people are infected for some $c > 0$, and there is only a small outbreak of the disease, as $\frac{I(0)\, c\, n^{2/3}}{n} \to 0$ as $n \to \infty$.

(iii) For $R_0 > 1$, the size of the largest component in proportion to $n$ converges in probability to $1 - p^{\mathrm{ext}}_{R_0}$ as $n \to \infty$. That is, in large populations a proportion $1 - p^{\mathrm{ext}}_{R_0}$ of the population is interconnected, i.e. $\frac{|C_1|}{n} \to 1 - p^{\mathrm{ext}}_{R_0}$ as $n \to \infty$. We call $C_1$ the giant component. All other components, including the second largest component, are at most of logarithmic size with high probability. If a person in these logarithmically sized components is infected, the disease will be limited to a negligible proportion of the total population as $n \to \infty$, as we saw in (i). If a person in the giant component is infected, this will lead to a large outbreak of the disease in which at least a proportion $1 - p^{\mathrm{ext}}_{R_0}$ of the population is infected.

3.3. Simulation

We ran a simulation to find out more about the number of sick people in a population of $n$ people and to verify the previously stated theoretical bounds. In this section, we are interested in the mean number of infected people for $R_0 = 0.5$, $1$ and $2$ as a function of $n$.

First, we used Matlab (version R2014b) to simulate the spread of an epidemic based on the Reed-Frost epidemic model. For every $R_0$, we chose a certain $n$ and $p = R_0/n$ accordingly. Also, we chose $I(0) = 10$. For our simulation, we allowed the infected people to infect the susceptibles in each iteration, after which they were labeled as resistant. Every new number of susceptibles was selected out of all susceptibles by using the rand function and the transition probabilities stated in (3.1). See Appendix C.1 for the script. Although this procedure ran flawlessly for small $n$, the program failed to accurately produce all transition probabilities for $n > 68$. One might explain this by looking back at the transition probabilities in (3.1) and noticing that for $S(z) > 68$ and some values of $S(\tilde z)$,
\[
\binom{S(z)}{S(\tilde z)} > 2^{64}.
\]
Any value stored as a double requires 64 bits. Hence values larger than $2^{64}$ cannot be accurately stored as a double. It is therefore favorable to simulate the E-R graph, which is equivalent to the Reed-Frost model but does not require the same calculations.

Again, we used Matlab to generate an E-R graph. Using the binornd function, we generated a symmetric $n \times n$ adjacency matrix in which every two nodes are connected with probability $p$.

Next, 10 nodes were assigned as infected, and during the next iteration we allowed them to infect their direct neighbours. During the following iterations, the infected nodes infected their direct neighbours until there were no infected or no healthy nodes left. See Appendix C.2 for the script. For every value of $R_0$, we repeated the previous script for increasing values of $n$ with $p = R_0/n$, until the script exceeded a cpu time of 300 s. For every $n$, which we chose as a power of two starting from $n = 2^4 = 16$, we ran the script 100 times and calculated the mean number of infected people. See Appendix C.3 for this script.

Results. In the subcritical regime, the mean number of infected people was indeed bounded by $c \log(n)$ for $c = 4$. When fitted to a logarithmic function, the data was best fitted to $Y = 1.64 \log(n)$. Though, as can be seen from Figure 3.3, the data disagrees with this fit for some $n$. This could be explained by variance, or maybe the data is best fit by another, non-logarithmic function. In the critical regime, the mean number of infected people was clearly bounded by, and well fitted to, a function of the form $c\, n^{2/3}$, see Figure 3.3. The data appeared to fit the function $Y = 1.10\, n^{2/3}$ very well, confirming the theoretical upper bound from Theorem 3.3.

Figure 3.3.: The mean number of infected nodes in an E-R graph for $I(0) = 10$. The blue markers represent the mean number of infected people, and the dotted lines represent the upper bounds stated in Theorem 3.3 for various values of $c$. Top row: $R_0 = 0.5$, total cpu = 379 s. Bottom row: $R_0 = 1$, total cpu = 399 s.

In the supercritical regime, the mean proportion of infected people appeared to be converging towards $1 - p^{\mathrm{ext}}$ for increasing values of $n$, suggesting the appearance of a giant component. See Figure 3.4. The difference $\delta$ between the data and $1 - p^{\mathrm{ext}}$ is positive for every $n$, which tells us that, in proportion to $n$, even more people are infected than $1 - p^{\mathrm{ext}}$. The data does appear to be converging to $1 - p^{\mathrm{ext}}$, as we can see from the rapidly decreasing value of $\delta$.

Figure 3.4.: The mean proportion of infected nodes in an E-R graph for $R_0 = 2$ and $I(0) = 10$. The blue markers represent the mean proportion of infected people, and the dotted line is the theoretical proportion of people in the giant component. The difference $\delta$ between the data and $1 - p^{\mathrm{ext}}$ is shown on the right. Total cpu = 379 s.

The size of I(0). The probability of a large outbreak of an infectious disease in the supercritical regime depends on the size of $I(0)$. If we take $I(0)$ large enough, there is a substantial probability of infecting a node in $C_1$ and causing a large outbreak. In contrast, if we take $I(0)$ small, there might not be a large outbreak. The probability of a large outbreak depends on $I(0)$ as follows. Suppose we assign each of the initial infections randomly to a node. The probability of assigning an infection to a node that is not in the giant component is equal to $p^{\mathrm{ext}}_{R_0}$, since the giant component is of size $1 - p^{\mathrm{ext}}_{R_0}$ with respect to $n$. Therefore, with probability $(p^{\mathrm{ext}}_{R_0})^{I(0)}$, none of the initial infections are assigned to nodes in the giant component. The probability that at least one of the $I(0)$ infected nodes is in the giant component is $1 - (p^{\mathrm{ext}}_{R_0})^{I(0)}$. Consequently, with probability $1 - (p^{\mathrm{ext}}_{R_0})^{I(0)}$, there is a large outbreak in which at least a proportion $1 - p^{\mathrm{ext}}_{R_0}$ of the total population is infected.

Example 3.4. Suppose $R_0 = 2$, i.e. the probability of infection is $p = \frac{2}{n}$. Then $p^{\mathrm{ext}} \approx 0.203$ (see Example 2.5). Suppose $I(0) = 1$. For a population of size $n$, a proportion $1 - p^{\mathrm{ext}} \approx 0.797$ of the total population is infected with probability $1 - p^{\mathrm{ext}} \approx 0.797$ as $n \to \infty$. For $I(0) = 2$, at least the same proportion of the population is infected with probability $1 - (p^{\mathrm{ext}})^2 \approx 0.959$.

Earlier, we assumed that $I(0)$ was finite. Yet, there is a way of writing $I(0)$ as a function of $n$ such that the previous results still hold. For instance, if we take $I(0) = n^{\alpha} (\log(n))^{-1}$ for $\alpha < 1$ in the subcritical case, a maximum of $I(0) \cdot c \log(n) = c\, n^{\alpha}$ people are infected. Since
\[
\lim_{n \to \infty} \frac{n^{\alpha}}{n} = \lim_{n \to \infty} n^{\alpha - 1} = 0,
\]
this is still a negligible proportion of the total population, so this results in just a small outbreak. The same goes for $I(0) = n^{\alpha - 2/3}$ with $\alpha < 1$ in the critical regime, since again a maximum of $I(0) \cdot n^{2/3} = n^{\alpha}$ people are infected and this proportion again equals $n^{\alpha - 1} \to 0$ as $n \to \infty$.
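To make the simulation procedure of Section 3.3 concrete, the following is a compact Matlab sketch in the spirit of the script in Appendix C.2 (it is not the thesis's own script; the values of $n$, $R_0$, $I(0)$ and the variable names are illustrative):

```matlab
% Illustrative sketch of the E-R epidemic spread simulation of Section 3.3
% (cf. Appendix C.2; parameter values and variable names are not the thesis's).
n = 1024;  R0 = 2;  p = R0/n;  I0 = 10;
A = (rand(n) < p) & (triu(ones(n), 1) > 0);   % random edges on the strict upper triangle
A = A | A';                                   % symmetric adjacency matrix, no self-loops
state = zeros(n, 1);                          % 0 = susceptible, 1 = infectious, 2 = resistant
state(randperm(n, I0)) = 1;                   % infect I(0) nodes at random at t = 0
while any(state == 1)
    infectious = find(state == 1);
    touched = any(A(:, infectious), 2);       % nodes adjacent to at least one infectious node
    state(infectious) = 2;                    % infectious nodes become resistant
    state(touched & state == 0) = 1;          % their susceptible neighbours become infectious
end
fprintf('total number infected: %d out of n = %d\n', sum(state == 2), n)
```

When one of the initial infections lands in the giant component, the final count is typically close to $(1 - p^{\mathrm{ext}}_{R_0})\, n \approx 0.797\, n$ for $R_0 = 2$, in line with Theorem 3.3 and Figure 3.4.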

The SIS epidemic process. We now consider the spread of a disease in which each infected person recovers and becomes susceptible again, as opposed to becoming resistant as in the SIR model:
\[
S \longrightarrow I \longrightarrow S.
\]
An example of a SIS epidemic is the well-known common flu. We will now deduce some results based on previous observations. For $I(\tilde z) = k$, the reader can check that the transition probabilities equal
\[
P[Z(t+1) = \tilde z \mid Z(t) = z] = \binom{S(z)}{k} (1-p)^{I(z)(S(z)-k)} \big\{1 - (1-p)^{I(z)}\big\}^{k}, \tag{3.2}
\]
which implies that the conditional random variable $I(t+1) \mid Z(t) \sim \mathrm{Bin}\big(S(z), 1-(1-p)^{I(z)}\big)$.

We now want to deduce some results similar to Theorem 3.3 that will predict whether there will be a large or small outbreak. Therefore we first have to find a random graph that is equivalent to the SIS epidemic process. An obvious choice is a graph similar to the E-R graph, but with a slight alteration: each infected node infects his direct neighbours, after which he becomes susceptible again. This model indeed has the same transition probabilities as (3.2), but unfortunately does not depict the same process. In the SIS epidemic model, every time a person becomes susceptible after infection, he becomes infected again in one timestep with probability $1 - (1-p)^{I(z)}$. The same probability in our altered E-R graph is either 1 if the person has neighbours (seeing as they are surely infected), or 0 if the person has no neighbours. To properly adjust our altered E-R graph, we would have to randomly reassign neighbours, each with probability $p$, to every person that becomes susceptible again after infection. Gaining further insight into this model would be complicated and unnecessary, as there are easier ways of examining the SIS model. In Chapter 4 we will show that if the number of infected individuals in our SIS model is represented by a Markov chain $X_n(t)$ on $\mathbb{N}$, we can construct a differential equation whose solution is approximated by $\frac{1}{n} X_n(t)$ for large values of $n$.
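Although we will not pursue the graph construction further, the discrete-time SIS chain itself is easy to simulate directly from the binomial law implied by (3.2). The following Matlab sketch does exactly that; it is not one of the appendix scripts, and the parameter values are illustrative:

```matlab
% Direct simulation of the discrete-time SIS chain: by (3.2), the number of new
% infections is Bin(S(t), 1-(1-p)^I(t)), and recovered infectives become susceptible.
n = 1000;  R0 = 2;  p = R0/n;  T = 50;
I = 10;  S = n - I;
Ipath = zeros(1, T+1);  Ipath(1) = I;
for t = 1:T
    newI = binornd(S, 1 - (1-p)^I);   % susceptibles infected during this timestep
    S = S - newI + I;                 % old infectives recover and rejoin the susceptibles
    I = newI;
    Ipath(t+1) = I;
end
plot(0:T, Ipath/n)                    % proportion of infected people over time
```

In the supercritical case the infection typically persists for the whole run, while for $R_0 < 1$ it dies out quickly.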

4. Continuous time epidemic models

We now look into some continuous time epidemic models. In continuous time epidemic models, just like in real life, people can get infected at any point in time and stay infected for any amount of time. We will refer to the rate at which an infectious individual infects each susceptible as $\beta$, and to the rate at which an infected person becomes immune as $\gamma$. Our main goal for this chapter is to examine how $\beta$ and $\gamma$ contribute to the occurrence of an epidemic and how many people are infected. In order to do so, we will have to look into large populations, seeing as continuous time epidemic models are harder to analyse in small populations.

The first continuous time model we discuss is a deterministic model which, as opposed to all previous models, contains no stochastic elements. Deterministic models are often described by differential equations, and the output is fully determined by the parameter values and initial conditions. Second, we will discuss a stochastic model where the number of infected people is represented by a Markov jump process. By applying Kurtz's theorem we argue that in large populations, the number of infected people approximates the solution to a differential equation. We will finalize this chapter with a proof of Kurtz's theorem as well as a simulation of the above results.

4.1. The deterministic model

The SIR model. We denote by $s, i, r$ the proportions of susceptible, infectious and resistant people relative to $n$, i.e. $s = S/n$, $i = I/n$ and $r = R/n$, so that $s(t) + i(t) + r(t) = 1$ for all $t \ge 0$. Clearly, this process can be represented as
\[
s \overset{\beta}{\longrightarrow} i \overset{\gamma}{\longrightarrow} r.
\]
We stated that an infectious individual infects each susceptible at rate $\beta$, where there are $s$ susceptible and $i$ infected people relative to $n$. Consequently, the rate at which all infected people infect all susceptibles equals $\beta i s$. Also, since $\gamma$ is the rate at which each infected person becomes immune, $\gamma i$ is the rate at which all infected people become immune. Therefore we obtain the following system of differential equations:
\[
\frac{ds(t)}{dt} = -\beta i(t) s(t), \tag{4.1}
\]
\[
\frac{di(t)}{dt} = \beta i(t) s(t) - \gamma i(t), \tag{4.2}
\]
\[
\frac{dr(t)}{dt} = \gamma i(t).
\]

By setting $\frac{ds(t)}{dt} = \frac{di(t)}{dt} = 0$, we conclude that an equilibrium arises only when $i(t) = 0$, i.e. when the disease has died out. We consider two cases which eventually lead to an equilibrium, assuming that $i(0) > 0$. First note that the mean duration of an infection is $\gamma^{-1}$ units of time. Since an infected individual infects each susceptible at rate $\beta$, the expected proportion of susceptibles he infects during his infectious period equals $R_0 = \beta/\gamma$. Seeing as $R_0$ is the reproductive number for this model, it is not surprising that the two cases we consider are related to the (sub)critical and the supercritical regimes.

(i) $s(0) \le R_0^{-1}$. Note that $R_0 \le 1$ always satisfies this condition. Since $s(t)$ is decreasing for $t > 0$, one can deduce that
\[
\frac{di}{dt}(t) \le i(t)\big(\beta s(0) - \gamma\big)
\begin{cases}
= 0 & \text{if } s(0) = R_0^{-1} \text{ or } i(t) = 0, \\
< 0 & \text{if } s(0) < R_0^{-1}.
\end{cases}
\]
Seeing as $s(0) = R_0^{-1}$ is not an equilibrium, we conclude that the number of infected people will strictly decrease until $i(t) = 0$ and the disease has died out. No epidemic occurs.

(ii) $s(0) > R_0^{-1}$. Note that this implies $R_0 > 1$. Clearly,
\[
\frac{di}{dt}(t) = i(t)\big(\beta s(t) - \gamma\big)
\begin{cases}
> 0 & \text{if } s(t) > R_0^{-1}, \\
= 0 & \text{if } s(t) = R_0^{-1} \text{ or } i(t) = 0, \\
< 0 & \text{if } s(t) < R_0^{-1}.
\end{cases}
\]
Seeing as $s(0) > R_0^{-1}$, we deduce that the number of infected people will increase until $s(t) = R_0^{-1}$ for some $t$. Hence, an epidemic occurs. Since $s(t) = R_0^{-1}$ is not an equilibrium, the number of infected people then decreases until $i(t) = 0$ and the disease has died out. Obviously, no more than a proportion $R_0^{-1}$ of the susceptibles survive the epidemic uninfected. See Figure 4.1 for how this takes place in the subcritical and the supercritical regime.

Figure 4.1.: Phase portrait for the equations (4.1) and (4.2) in the case where $R_0 = 0.5$ (left) and $R_0 = 2$ (right).
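A numerical integration of this system is straightforward. The sketch below is illustrative, with $\beta = 2$, $\gamma = 1$ and initial proportions $s(0) = 0.99$, $i(0) = 0.01$ chosen by me rather than taken from the thesis; it reproduces the supercritical behaviour described in case (ii):

```matlab
% Numerical solution of the deterministic SIR system (4.1)-(4.2) with ode45.
beta = 2;  gamma = 1;                              % R_0 = beta/gamma = 2
f = @(t, y) [ -beta*y(1)*y(2);                     % ds/dt = -beta*i*s
               beta*y(1)*y(2) - gamma*y(2) ];      % di/dt =  beta*i*s - gamma*i
[t, y] = ode45(f, [0 25], [0.99; 0.01]);           % y = [s; i], and r = 1 - s - i
plot(t, y(:,2))                                    % i(t) rises, peaks, then dies out
```

The proportion of infected people peaks exactly when $s(t)$ crosses $R_0^{-1}$ and then decays to zero, as argued above.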

The SIS model. For our SIS model we get a similar result, with the alteration that infected people become susceptible, as opposed to resistant, at rate $\gamma i$:
\[
\frac{ds(t)}{dt} = -\beta i(t) s(t) + \gamma i(t),
\]
\[
\frac{di(t)}{dt} = \beta i(t) s(t) - \gamma i(t).
\]
It is clear that an equilibrium occurs under either of the following conditions: (i) $i(t) = 0$ and $s(t) = 1$, or (ii) $i(t) = 1 - R_0^{-1}$ and $s(t) = R_0^{-1}$. Clearly,
\[
\frac{di(t)}{dt} = i(t)\big(\beta s(t) - \gamma\big)
\begin{cases}
> 0 & \text{if } s(t) > R_0^{-1}, \\
= 0 & \text{if } s(t) = R_0^{-1} \text{ or } i(t) = 0, \\
< 0 & \text{if } s(t) < R_0^{-1},
\end{cases}
\]
implies that in the case where $s(0) < R_0^{-1}$ the number of infected people decreases, until either $s(t) = R_0^{-1}$ and $i(t) = 1 - R_0^{-1}$ if $R_0 > 1$, or, if $R_0 \le 1$, until the disease has died out. In the case where $s(0) > R_0^{-1}$, and therefore $R_0 > 1$, the number of infected people increases until $i(t) = 1 - R_0^{-1}$. We conclude that if $R_0 > 1$, then for all values of $s(0)$ and $i(0) > 0$ an equilibrium arises between the proportions of susceptible and infected people, where $s(t) = R_0^{-1}$ and $i(t) = 1 - R_0^{-1}$.

4.2. Kurtz's Theorem

Next, we will discuss a continuous time SIS model based on a continuous time stochastic process. We will first introduce the concept of a continuous time Markov chain. Let $(X(t))_{t \ge 0}$ be a right-continuous process with values in $\mathbb{N}$, which is called the state space. Let $\Lambda = (\lambda_{x,y})_{x,y \in \mathbb{N}}$ be a Q-matrix with jump matrix $\Pi$.

Definition 4.1. We call $(X(t))_{t \ge 0}$ a Markov chain on $\mathbb{N}$ with jump rates $(\lambda_{x,y})_{x,y \in \mathbb{N}}$ if, for all $t, h \ge 0$, conditional on $X(t) = x$, $X(t+h)$ is independent of $(X(s))_{s \le t}$, and, as $h \downarrow 0$, uniformly in $t$ and for all $y \in \mathbb{N}$,
\[
P[X(t+h) = y \mid X(t) = x] = \delta_{x,y} + \lambda_{x,y}\, h + o(h),
\]
where $\delta_{x,y}$ is the Kronecker delta and $o(h)$ is the little-o of $h$.

Remark. Recall that $f(h) \in o(h)$ if and only if $\lim_{h \to 0} \frac{f(h)}{h} = 0$.

This implies that for any Markov chain $(X(t))_{t \ge 0}$ on $\mathbb{N}$ and for any states $x, y \in \mathbb{N}$, the process jumps from $x$ to $y$ with a certain probability upon expiration of an exponentially distributed timer, whose parameter depends solely on the pair $(x, y)$.

In the remaining part of this chapter, we consider a family of Markov chains $X_n$, $n \in \mathbb{N}$, on state space $\mathbb{N}$. We will assume that for all $n$, $X_n$ has only a finite number $k$ of jump directions, i.e. $(e_i)_{i=1,\ldots,k} \subset \mathbb{Z}$. For every $n$, we define functions $\lambda_i : \mathbb{R} \to \mathbb{R}_+$ such that the transition rate of jumping from $x$ to $x + e_i$ equals $n \lambda_i(x/n)$ for any state $x \in \mathbb{N}$ of $X_n$. In this case, we call $X_n$ a birth and death process, and the following holds for all $x \in \mathbb{N}$:
\[
P[X_n(t+h) = x + e_i \mid X_n(t) = x] = n h\, \lambda_i(x/n) + o(h),
\]
\[
P[X_n(t+h) = x \mid X_n(t) = x] = 1 - n h \sum_{i=1}^{k} \lambda_i(x/n) + o(h).
\]

Example 4.2. Suppose $X_n(t)$ is a Markov chain on $\mathbb{N}$ that represents the number of infectives at time $t$ in a population of $n$ individuals. Then the jump directions are restricted to belong to $\{-1, 0, 1\}$. As we saw in the former paragraph, for jump directions $e_+ = +1$ and $e_- = -1$ the corresponding functions $\lambda_\pm$ equal $\lambda_-(I/n) = \gamma I/n$ and $\lambda_+(I/n) = \beta \frac{I}{n}\big(1 - \frac{I}{n}\big)$. Therefore, for every state $I \in \mathbb{N}$, the rate of jumping to $I - 1$ equals $n \lambda_-(I/n) = \gamma I$ and the rate of jumping to $I + 1$ equals $n \lambda_+(I/n) = \beta \frac{I(n-I)}{n}$. We obtain the following equations:
\[
P[X_n(t+h) = I+1 \mid X_n(t) = I] = \beta \frac{I(n-I)}{n}\, h + o(h), \tag{4.3}
\]
\[
P[X_n(t+h) = I-1 \mid X_n(t) = I] = \gamma I h + o(h), \tag{4.4}
\]
\[
P[X_n(t+h) = I \mid X_n(t) = I] = 1 - \Big(\beta \frac{I(n-I)}{n} + \gamma I\Big) h + o(h). \tag{4.5}
\]

We will now show that when $n$ goes to infinity, the trajectories $t \mapsto \frac{1}{n} X_n(t)$ converge in probability to the solution $x(t)$ of an ordinary differential equation. In particular, this convergence is uniform and almost sure on $[0, T]$ for any $T > 0$.

Theorem 4.3 (Kurtz's Theorem). Let $\bar e := \max\{|e_i| : i = 1, \ldots, k\}$ for some arbitrary norm. Assume that
\[
\bar\lambda = \max_{i=1,\ldots,k} \sup_{x \in \mathbb{R}} \lambda_i(x)
\]
is finite, and that the function $F : \mathbb{R} \to \mathbb{R}$ defined by
\[
F(x) = \sum_{i=1}^{k} e_i \lambda_i(x)
\]
is Lipschitz-continuous, i.e. there exists a constant $M$ such that $|F(x) - F(y)| \le M |x - y|$ for all $x, y \in \mathbb{R}$.

Assume that $\lim_{n \to \infty} \frac{1}{n} X_n(0) = x(0)$ almost surely, and let $x : \mathbb{R}_+ \to \mathbb{R}$ be the solution to the integral equation
\[
x(t) = x(0) + \int_0^t F(x(s))\, ds. \tag{4.6}
\]
Then for any fixed $\epsilon, T > 0$ and sufficiently large $n$,
\[
P\Big[\sup_{t \le T} \Big|\tfrac{1}{n} X_n(t) - x(t)\Big| \ge \epsilon\Big] \le 2k \exp\Big(-nT\bar\lambda\, h\Big(\frac{\epsilon\, e^{-MT}}{2kT\bar\lambda\bar e}\Big)\Big), \tag{4.7}
\]
where $h(t) = (1+t)\log(1+t) - t$. Moreover,
\[
\lim_{n \to \infty} \sup_{t \le T} \Big|\tfrac{1}{n} X_n(t) - x(t)\Big| = 0 \quad \text{a.s.} \tag{4.8}
\]

This theorem provides a law of large numbers for a birth and death process. The right-hand side of equation (4.7) converges to zero as $n \to \infty$, so we deduce that $\frac{1}{n} X_n(t)$ converges in probability to $x(t)$ for all fixed $t$. Moreover, this convergence is almost sure and uniform for $t \in [0, T]$ for any $T > 0$, i.e. on all compact subsets of $\mathbb{R}_{\ge 0}$. Hence, Kurtz's Theorem provides us with a limit of $\frac{1}{n} X_n$ as $n \to \infty$, as well as a characterisation of the limiting process. Before we prove Kurtz's Theorem, we will first apply it to our SIS model.

Solution for the SIS model. For our SIS model this implies the following. Equations (4.3), (4.4) and (4.5) imply that $F$ equals $F(x) = \lambda_+(x) - \lambda_-(x) = \beta x(1-x) - \gamma x$. We will now provide a solution to $\frac{dx}{dt} = F(x)$ for $\gamma = 1$. For $\beta \ne 1$ and some $y \in \mathbb{R}$, the solution to
\[
\frac{dx}{dt} = F(x) = \beta x(1-x) - x, \qquad x(0) = y,
\]
equals
\[
x(t) = \frac{(\beta - 1)\, y}{(\beta - 1 - \beta y)\, e^{(1-\beta)t} + \beta y}. \tag{4.9}
\]
It is clear that if $\beta > 1$ then
\[
\lim_{t \to \infty} x(t) = \frac{(\beta - 1)\, y}{0 + \beta y} = \frac{\beta - 1}{\beta},
\]
and if $\beta < 1$, then $\lim_{t \to \infty} x(t) = 0$. Since Kurtz's Theorem tells us that $x(t)$ is the probability limit of $\frac{1}{n} X_n(t)$ as $n \to \infty$, i.e. the proportion of infected people in a population of infinite size, we deduce that an epidemic occurs almost surely if $\beta > 1$, i.e. if $R_0 > 1$. In the subcritical regime, when $R_0 < 1$, no epidemic occurs. See Figure 4.2 for the value of $x(t)$ in the subcritical and supercritical regime.
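As a quick sanity check on formula (4.9), one can compare it with a numerical solution of $\frac{dx}{dt} = \beta x(1-x) - x$. The following sketch is my own, with illustrative values $\beta = 2$ and $y = 0.1$:

```matlab
% Compare the explicit SIS solution (4.9) with a numerical solution of dx/dt = beta*x*(1-x) - x.
beta = 2;  y0 = 0.1;                                   % gamma = 1, so R_0 = beta
xexact = @(t) (beta-1)*y0 ./ ((beta-1-beta*y0)*exp((1-beta)*t) + beta*y0);   % formula (4.9)
[t, xnum] = ode45(@(t, x) beta*x*(1-x) - x, [0 10], y0);
max(abs(xnum - xexact(t)))     % small discrepancy; both tend to (beta-1)/beta = 0.5
```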

Figure 4.2.: Solution to (4.6) for our SIS model in the case where $R_0 = 0.5$ (left) and $R_0 = 2$ (right).

Proof of Kurtz's Theorem. Let $(N_i)_{i=1,\ldots,k}$ be a family of independent unit-rate Poisson processes. That is, for each $t > 0$, $N_i(t)$ is Poisson distributed with parameter $t$, for all $i = 1, \ldots, k$. By induction it can be shown that we can write
\[
X_n(t) = X_n(0) + \sum_{i=1}^{k} e_i\, N_i\Big(\int_0^t n \lambda_i\big(X_n(s)/n\big)\, ds\Big).
\]
The proof of this statement is beyond the scope of this thesis. We now introduce the notation $Y_n(t) = \frac{1}{n} X_n(t)$; then it follows that
\[
\begin{aligned}
Y_n(t) &= Y_n(0) + \sum_{i=1}^{k} \frac{e_i}{n}\, N_i\Big(\int_0^t n \lambda_i\big(X_n(s)/n\big)\, ds\Big) \\
&= Y_n(0) + \sum_{i=1}^{k} \frac{e_i}{n}\, \tilde N_i\Big(\int_0^t n \lambda_i\big(X_n(s)/n\big)\, ds\Big) + \sum_{i=1}^{k} \frac{e_i}{n} \int_0^t n \lambda_i\big(X_n(s)/n\big)\, ds \\
&\overset{(i)}{=} Y_n(0) + \sum_{i=1}^{k} \frac{e_i}{n}\, \tilde N_i\Big(\int_0^t n \lambda_i\big(X_n(s)/n\big)\, ds\Big) + \int_0^t \sum_{i=1}^{k} e_i \lambda_i\big(X_n(s)/n\big)\, ds \\
&= Y_n(0) + \sum_{i=1}^{k} \frac{e_i}{n}\, \tilde N_i\Big(\int_0^t n \lambda_i\big(X_n(s)/n\big)\, ds\Big) + \int_0^t F(Y_n(s))\, ds,
\end{aligned} \tag{4.10}
\]
where $\tilde N_i(t) = N_i(t) - t$ is the centered Poisson process. For (i) we used the linearity of the integral. Clearly, we are looking for an expression for $|Y_n(t) - x(t)|$. Combining equations (4.6) and (4.10) gives us
\[
|Y_n(t) - x(t)| \le |Y_n(0) - x(0)| + \int_0^t \big|F(Y_n(s)) - F(x(s))\big|\, ds + \sum_{i=1}^{k} \frac{|e_i|}{n} \Big|\tilde N_i\Big(n \int_0^t \lambda_i\big(X_n(s)/n\big)\, ds\Big)\Big|.
\]

Now we can use the fact that $\lim_{n \to \infty} Y_n(0) = x(0)$ a.s. and that $F$ is $M$-Lipschitz to obtain the following inequality for large $n$:
\[
|Y_n(t) - x(t)| \le \epsilon + M \int_0^t |Y_n(s) - x(s)|\, ds + \sum_{i=1}^{k} \frac{1}{n}\, \epsilon_{n,i}(t), \tag{4.11}
\]
where $\epsilon_{n,i}(t) = |e_i| \big|\tilde N_i\big(n \int_0^t \lambda_i(X_n(s)/n)\, ds\big)\big|$. We are going to prove that the right-hand side of this inequality is small for large values of $n$. First, we will show that $\sum_{i=1}^{k} \frac{1}{n}\epsilon_{n,i}(t)$ is indeed small for large $n$. To obtain this result we need Lemma 4.5 below, which gives an upper bound. Seeing as the $\epsilon_{n,i}(t)$ are non-negative for all $t$ and $i = 1, \ldots, k$, we know that
\[
\Big\{ \sup_{t \le T} \sum_{i=1}^{k} \frac{1}{n}\epsilon_{n,i}(t) \ge \epsilon \Big\} \subseteq \bigcup_{i=1}^{k} \Big\{ \sup_{t \le T} \frac{1}{n}\epsilon_{n,i}(t) \ge \frac{\epsilon}{k} \Big\}.
\]
Hence we deduce that
\[
P\Big[\sup_{t \le T} \sum_{i=1}^{k} \frac{1}{n}\epsilon_{n,i}(t) \ge \epsilon\Big]
\le \sum_{i=1}^{k} P\Big[\sup_{t \le T} \frac{1}{n}\epsilon_{n,i}(t) \ge \frac{\epsilon}{k}\Big]
= \sum_{i=1}^{k} P\Big[\sup_{t \le T} \epsilon_{n,i}(t) \ge \frac{\epsilon n}{k}\Big]. \tag{4.12}
\]
Note that $\lambda_1(t) \le \lambda_2(t)$ for all $t \ge 0$ implies that $P[N_i(\lambda_1(t)) \ge \epsilon] \le P[N_i(\lambda_2(t)) \ge \epsilon]$, and therefore $P[N_i(\lambda_1(t)) - t \ge \epsilon] \le P[N_i(\lambda_2(t)) - t \ge \epsilon]$. Since we know that $\int_0^t \lambda_i(X_n(s)/n)\, ds \le \bar\lambda t$ for all $t \ge 0$, we deduce the following:
\[
\begin{aligned}
\sum_{i=1}^{k} P\Big[\sup_{t \le T} \epsilon_{n,i}(t) \ge \frac{\epsilon n}{k}\Big]
&\le \sum_{i=1}^{k} P\Big[\sup_{t \le T} \bar e\, \Big|\tilde N_i\Big(n \int_0^t \lambda_i\big(X_n(s)/n\big)\, ds\Big)\Big| \ge \frac{\epsilon n}{k}\Big] \\
&\le \sum_{i=1}^{k} P\Big[\sup_{t \le T} \bar e\, \big|\tilde N_i\big(n \bar\lambda t\big)\big| \ge \frac{\epsilon n}{k}\Big]
\le \sum_{i=1}^{k} P\Big[\sup_{0 \le t \le nT\bar\lambda} \big|\tilde N_i(t)\big| \ge \frac{\epsilon n}{k\bar e}\Big], \tag{4.13}
\end{aligned}
\]
where $\bar e := \max\{|e_i| : i = 1, \ldots, k\}$. Since $\tilde N_i(t) = N_i(t) - t$ and $N_i(t)$ is a unit-rate Poisson process, we can apply Lemma 4.5 in estimate (ii) below to obtain an upper bound. Let $Z_n(t) = |Y_n(t) - x(t)|$.

Combining equation (4.11) with (4.12) and (4.13) gives us the following for large $n$:
\[
\begin{aligned}
P\Big[\sup_{t \le T}\Big( Z_n(t) - M \int_0^t Z_n(s)\, ds \Big) \ge 2\epsilon\Big]
&\le P\Big[\sup_{t \le T} \sum_{i=1}^{k} \frac{1}{n}\epsilon_{n,i}(t) \ge \epsilon\Big]
\le \sum_{i=1}^{k} P\Big[\sup_{0 \le t \le nT\bar\lambda} \big|\tilde N_i(t)\big| \ge \frac{\epsilon n}{k\bar e}\Big] \\
&\overset{(ii)}{\le} 2k \exp\Big(-(nT\bar\lambda)\, h\Big(\frac{\epsilon n}{k\bar e\,(nT\bar\lambda)}\Big)\Big)
= 2k \exp\Big(-nT\bar\lambda\, h\Big(\frac{\epsilon}{k\bar e T\bar\lambda}\Big)\Big), \tag{4.14}
\end{aligned}
\]
where $h(t) = (1+t)\log(1+t) - t$. To finalize our proof, we want to show that $Z_n(t)$ is small for all $t \in [0, T]$ and large $n$. In particular, we want to show that
\[
P\Big[\sup_{t \le T} Z_n(t) \ge \epsilon\Big]
\]
is small for large $n$. To obtain this result, we apply Grönwall's lemma¹ to $Z_n(t)$. To see that $Z_n(t)$ is bounded on $[0, T]$, note that $x(t)$ is a continuous function and therefore bounded on closed intervals. As $Y_n(t)$ is driven by Poisson processes, it is clearly bounded on $[0, T]$. Therefore $Z_n(t) = |Y_n(t) - x(t)|$ is bounded on $[0, T]$. By Grönwall's lemma,
\[
Z_n(t) \le 2\epsilon + M \int_0^t Z_n(s)\, ds \quad \text{for all } t \in [0, T]
\]
implies
\[
Z_n(t) \le 2\epsilon\, e^{Mt} \quad \text{for all } t \in [0, T].
\]
Clearly this implies that
\[
\Big\{ \sup_{t \le T}\Big( Z_n(t) - M \int_0^t Z_n(s)\, ds \Big) \le 2\epsilon \Big\} \subseteq \Big\{ \sup_{t \le T} Z_n(t) \le 2\epsilon\, e^{MT} \Big\}.
\]
Hence, we finalize the proof of equation (4.7) by
\[
P\Big[\sup_{t \le T} Z_n(t) \ge 2\epsilon\, e^{MT}\Big] \le P\Big[\sup_{t \le T}\Big( Z_n(t) - M \int_0^t Z_n(s)\, ds \Big) \ge 2\epsilon\Big] \le 2k \exp\Big(-nT\bar\lambda\, h\Big(\frac{\epsilon}{k\bar e T\bar\lambda}\Big)\Big).
\]

¹ See Appendix B.1.

Alternatively, if we substitute $\epsilon$ by $\frac{\epsilon}{2} e^{-MT}$,
\[
P\Big[\sup_{t \le T} Z_n(t) \ge \epsilon\Big] \le 2k \exp\Big(-nT\bar\lambda\, h\Big(\frac{\epsilon\, e^{-MT}}{2k\bar e T\bar\lambda}\Big)\Big),
\]
which proves equation (4.7). To deduce that (4.8) holds, note that for any $\epsilon > 0$
\[
\sum_{n \in \mathbb{N}} P\Big[\sup_{t \le T} Z_n(t) \ge \epsilon\Big] \le \sum_{n \in \mathbb{N}} 2k\, e^{-n c(\epsilon)} = 2k\, \frac{1}{e^{c(\epsilon)} - 1} < \infty,
\]
where
\[
c(\epsilon) = T\bar\lambda\, h\Big(\frac{\epsilon\, e^{-MT}}{2k\bar e T\bar\lambda}\Big) > 0.
\]
We apply the Borel-Cantelli Lemma² to deduce that for any $\epsilon > 0$
\[
P\Big[\limsup_{n \to \infty} \Big\{ \sup_{t \le T} Z_n(t) \ge \epsilon \Big\}\Big] = 0,
\]
which tells us that $\sup_{t \le T} Z_n(t)$ converges to 0 almost surely. Hence
\[
\lim_{n \to \infty} \sup_{t \le T} \Big|\tfrac{1}{n} X_n(t) - x(t)\Big| = 0 \quad \text{a.s.}
\]
must be true.

² See Appendix B.2.

4.3. Proof of Lemma 4.5

The next proof requires some knowledge of martingales.

Definition 4.4. A real-valued process $\{X(t)\}_{t \ge 0}$ which satisfies
\[
E\big[X(t) \mid \{X(u)\}_{u \le s}\big] = X(s) \quad \text{for all } 0 \le s \le t
\]
is called a martingale.

Recall that we still need to prove the following statement.

Lemma 4.5. Let $N$ be a unit-rate Poisson process. Then for any $\epsilon > 0$ and $T > 0$,
\[
P\Big[\sup_{t \le T} |N(t) - t| \ge \epsilon\Big] \le 2\, e^{-T h(\epsilon/T)},
\]
where $h(t) = (1+t)\log(1+t) - t$.

Proof. Let $\theta > 0$ be fixed. Seeing as the exponential is an increasing function, we have
\[
P\Big[\sup_{t \le T} |N(t) - t| \ge \epsilon\Big]
\overset{(i)}{\le} \underbrace{P\Big[\sup_{t \le T} \big(N(t) - t\big) \ge \epsilon\Big]}_{(a)} + \underbrace{P\Big[\sup_{t \le T} \big(t - N(t)\big) \ge \epsilon\Big]}_{(b)}
= P\Big[\sup_{t \le T} e^{\theta(N(t)-t)} \ge e^{\theta\epsilon}\Big] + P\Big[\sup_{t \le T} e^{\theta(t-N(t))} \ge e^{\theta\epsilon}\Big],
\]
where we used Boole's inequality for (i). Next, we are going to find an upper bound for (a) and (b).

Example 4.6. $\{N(t) - t\}_{t \ge 0}$ and $\{t - N(t)\}_{t \ge 0}$ are martingales. The proof of this statement is beyond the scope of this chapter and is left to the reader.

Example 4.7. $\{e^{\theta(N(t)-t)}\}_{t \ge 0}$ and $\{e^{\theta(t-N(t))}\}_{t \ge 0}$ are non-negative submartingales, by Example 4.6 and Jensen's inequality³. The proof of this statement again exceeds this chapter and is left to the reader.

By applying Doob's inequality⁴ we obtain
\[
P\Big[\sup_{t \le T} e^{\theta(N(t)-t)} \ge e^{\theta\epsilon}\Big] \le e^{-\theta\epsilon}\, E\big[e^{\theta(N(T)-T)}\big]
\quad \text{and} \quad
P\Big[\sup_{t \le T} e^{\theta(t-N(t))} \ge e^{\theta\epsilon}\Big] \le e^{-\theta\epsilon}\, E\big[e^{\theta(T-N(T))}\big].
\]

³ See Appendix B.3.
⁴ See Appendix B.4.

Seeing as $N(T) \sim \mathrm{Poisson}(T)$, we know by Example 2.5 that $E[e^{\theta N(T)}] = \varphi(e^{\theta}) = \exp\big(T(e^{\theta}-1)\big)$. Hence we deduce, for (a),
\[
P\Big[\sup_{t \le T} e^{\theta(N(t)-t)} \ge e^{\theta\epsilon}\Big] \le e^{-\theta(\epsilon+T)}\, E\big[e^{\theta N(T)}\big] = e^{-\theta(\epsilon+T)} \exp\big(T(e^{\theta}-1)\big) = \exp\big(-\theta(\epsilon+T) + T(e^{\theta}-1)\big).
\]
We now want to minimize the exponent $-\theta(\epsilon+T) + T(e^{\theta}-1)$ over $\theta$. We obtain this minimum by setting
\[
0 = \frac{d}{d\theta} \exp\big(-\theta(\epsilon+T) + T(e^{\theta}-1)\big) = \big(-(\epsilon+T) + T e^{\theta}\big) \exp\big(-\theta(\epsilon+T) + T(e^{\theta}-1)\big),
\]
i.e. $-(\epsilon+T) + T e^{\theta} = 0$.

Thus, $\theta = \log(1 + \epsilon/T)$. Combining the above yields the following upper bound for (a):
\[
P\Big[\sup_{t \le T} \big(N(t) - t\big) \ge \epsilon\Big] \le \exp\big(-(\epsilon+T)\log(1+\epsilon/T) + \epsilon\big) = \exp\Big(-T\big((1+\epsilon/T)\log(1+\epsilon/T) - \epsilon/T\big)\Big) = \exp\big(-T h(\epsilon/T)\big),
\]
where $h(t) = (1+t)\log(1+t) - t$. We find an upper bound for (b) in a similar fashion:
\[
P\Big[\sup_{t \le T} e^{\theta(t-N(t))} \ge e^{\theta\epsilon}\Big] \le \exp\big(\theta(T-\epsilon) + T(e^{-\theta}-1)\big).
\]
The right-hand side attains its minimum at $\theta = -\log(1-\epsilon/T)$, so that
\[
P\Big[\sup_{t \le T} \big(t - N(t)\big) \ge \epsilon\Big] \le \exp\big(-(T-\epsilon)\log(1-\epsilon/T) - \epsilon\big) = \exp\Big(-T\big((1-\epsilon/T)\log(1-\epsilon/T) + \epsilon/T\big)\Big) = \exp\big(-T h(-\epsilon/T)\big).
\]
Clearly $h(-t) \ge h(t)$ for all $t \in [0, 1]$. Therefore, we conclude that
\[
P\Big[\sup_{t \le T} |N(t) - t| \ge \epsilon\Big] \le e^{-T h(\epsilon/T)} + e^{-T h(-\epsilon/T)} \le 2\, e^{-T h(\epsilon/T)}
\]
holds.

4.4. Simulation

We ran a simulation in Matlab to confirm that the Markov chain $X_n(t)$ described by equations (4.3), (4.4) and (4.5) indeed converges in probability to $x(t)$ in (4.9) for large $n$. Again, we ran the program for $R_0 = 0.5$ and $R_0 = 2$, with $\gamma = 1$, i.e. $\beta = R_0$. The jump matrix and mean sojourn times of $X_n(t)$ were constructed according to equations (4.3), (4.4) and (4.5). We chose the number of infected people at $t = 0$ as ten percent of the population, i.e. $I(0) = n/10$, and constructed the initial distribution of $X_n(t)$ accordingly. Jump times were generated according to the sojourn times using the rand function. Jump directions were generated using the rand function to choose a direction according to the jump probabilities. See Appendix C.4 and C.5 for the scripts. For every value of $R_0$, we repeated the script for increasing values of $n$, which we chose as $2^{10}$, $2^{11}$ and $2^{12}$.
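The listing below is a compact, illustrative reimplementation of this jump-by-jump simulation (the actual scripts are in Appendix C.4 and C.5; the value $n = 2^{10}$ and the variable names are mine). Each sojourn time is exponential with the total rate of (4.3) and (4.4), and the jump direction is drawn proportionally to the two rates:

```matlab
% Sketch of the SIS Markov jump process (4.3)-(4.5), simulated jump by jump.
n = 2^10;  beta = 2;  gamma = 1;  T = 10;
I = round(n/10);  t = 0;                 % I(0) = ten percent of the population
ts = t;  Is = I;
while t < T && I > 0
    up   = beta*I*(n - I)/n;             % rate of I -> I+1, cf. (4.3)
    down = gamma*I;                      % rate of I -> I-1, cf. (4.4)
    t = t + exprnd(1/(up + down));       % exponentially distributed sojourn time
    if rand < up/(up + down)
        I = I + 1;                       % infection
    else
        I = I - 1;                       % recovery
    end
    ts(end+1) = t;  Is(end+1) = I;       %#ok<AGROW>
end
stairs(ts, Is/n)                         % proportion infected, to be compared with x(t)
```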

Results. In the subcritical as well as the supercritical regime, the value of $\frac{1}{n} X_n(t)$ was indeed close to $x(t)$ for all 3 simulations. Naturally, an epidemic occurred in all simulations of the supercritical regime, where $R_0 = 2$, while the disease decreased to a small fraction in all simulations of the subcritical regime, where $R_0 = 0.5$. See Figures 4.3 and 4.4. The difference between $x(t)$ and $\frac{1}{n} X_n(t)$ did not seem to decrease for larger $n$. An explanation for this could be that our largest value of $n$, $2^{12}$, may not be sufficiently large to show convergence.

Figure 4.3.: Number of infected people in a Markov jump process for $R_0 = 0.5$. The blue line equals $x(t)$; the others equal $\frac{1}{n} X_n(t)$ for $n = 2^{10}$ (yellow), $n = 2^{11}$ (orange) and $n = 2^{12}$ (red).

Figure 4.4.: Number of infected people in a Markov jump process for $R_0 = 2$. The blue line equals $x(t)$; the others equal $\frac{1}{n} X_n(t)$ for $n = 2^{10}$ (yellow), $n = 2^{11}$ (orange) and $n = 2^{12}$ (red).


More information

Continuous-time Markov Chains

Continuous-time Markov Chains Continuous-time Markov Chains Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 23, 2017

More information

Network Modelling for Sexually. Transmitted Diseases

Network Modelling for Sexually. Transmitted Diseases Network Modelling for Sexually Transmitted Diseases Thitiya Theparod, MSc Bsc (Hons) Submitted for the degree of Doctor of Philosophy at Lancaster University 2015 Abstract The aim of this thesis is to

More information

MAS1302 Computational Probability and Statistics

MAS1302 Computational Probability and Statistics MAS1302 Computational Probability and Statistics April 23, 2008 3. Simulating continuous random behaviour 3.1 The Continuous Uniform U(0,1) Distribution We have already used this random variable a great

More information

Problem Sheet 1. You may assume that both F and F are σ-fields. (a) Show that F F is not a σ-field. (b) Let X : Ω R be defined by 1 if n = 1

Problem Sheet 1. You may assume that both F and F are σ-fields. (a) Show that F F is not a σ-field. (b) Let X : Ω R be defined by 1 if n = 1 Problem Sheet 1 1. Let Ω = {1, 2, 3}. Let F = {, {1}, {2, 3}, {1, 2, 3}}, F = {, {2}, {1, 3}, {1, 2, 3}}. You may assume that both F and F are σ-fields. (a) Show that F F is not a σ-field. (b) Let X :

More information

Project 1 Modeling of Epidemics

Project 1 Modeling of Epidemics 532 Chapter 7 Nonlinear Differential Equations and tability ection 7.5 Nonlinear systems, unlike linear systems, sometimes have periodic solutions, or limit cycles, that attract other nearby solutions.

More information

Homework # , Spring Due 14 May Convergence of the empirical CDF, uniform samples

Homework # , Spring Due 14 May Convergence of the empirical CDF, uniform samples Homework #3 36-754, Spring 27 Due 14 May 27 1 Convergence of the empirical CDF, uniform samples In this problem and the next, X i are IID samples on the real line, with cumulative distribution function

More information

Convergence of Feller Processes

Convergence of Feller Processes Chapter 15 Convergence of Feller Processes This chapter looks at the convergence of sequences of Feller processes to a iting process. Section 15.1 lays some ground work concerning weak convergence of processes

More information

Transmission in finite populations

Transmission in finite populations Transmission in finite populations Juliet Pulliam, PhD Department of Biology and Emerging Pathogens Institute University of Florida and RAPIDD Program, DIEPS Fogarty International Center US National Institutes

More information

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,

More information

Almost giant clusters for percolation on large trees

Almost giant clusters for percolation on large trees for percolation on large trees Institut für Mathematik Universität Zürich Erdős-Rényi random graph model in supercritical regime G n = complete graph with n vertices Bond percolation with parameter p(n)

More information

An Introduction to Stochastic Epidemic Models

An Introduction to Stochastic Epidemic Models An Introduction to Stochastic Epidemic Models Linda J. S. Allen Department of Mathematics and Statistics Texas Tech University Lubbock, Texas 79409-1042, U.S.A. linda.j.allen@ttu.edu 1 Introduction The

More information

MS 3011 Exercises. December 11, 2013

MS 3011 Exercises. December 11, 2013 MS 3011 Exercises December 11, 2013 The exercises are divided into (A) easy (B) medium and (C) hard. If you are particularly interested I also have some projects at the end which will deepen your understanding

More information

MCMC 2: Lecture 3 SIR models - more topics. Phil O Neill Theo Kypraios School of Mathematical Sciences University of Nottingham

MCMC 2: Lecture 3 SIR models - more topics. Phil O Neill Theo Kypraios School of Mathematical Sciences University of Nottingham MCMC 2: Lecture 3 SIR models - more topics Phil O Neill Theo Kypraios School of Mathematical Sciences University of Nottingham Contents 1. What can be estimated? 2. Reparameterisation 3. Marginalisation

More information

Branching processes. Chapter Background Basic definitions

Branching processes. Chapter Background Basic definitions Chapter 5 Branching processes Branching processes arise naturally in the study of stochastic processes on trees and locally tree-like graphs. After a review of the basic extinction theory of branching

More information

1 Mechanistic and generative models of network structure

1 Mechanistic and generative models of network structure 1 Mechanistic and generative models of network structure There are many models of network structure, and these largely can be divided into two classes: mechanistic models and generative or probabilistic

More information

The greedy independent set in a random graph with given degr

The greedy independent set in a random graph with given degr The greedy independent set in a random graph with given degrees 1 2 School of Mathematical Sciences Queen Mary University of London e-mail: m.luczak@qmul.ac.uk January 2016 Monash University 1 Joint work

More information

P (A G) dp G P (A G)

P (A G) dp G P (A G) First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume

More information

Exponential Distribution and Poisson Process

Exponential Distribution and Poisson Process Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential

More information

Statistics 253/317 Introduction to Probability Models. Winter Midterm Exam Monday, Feb 10, 2014

Statistics 253/317 Introduction to Probability Models. Winter Midterm Exam Monday, Feb 10, 2014 Statistics 253/317 Introduction to Probability Models Winter 2014 - Midterm Exam Monday, Feb 10, 2014 Student Name (print): (a) Do not sit directly next to another student. (b) This is a closed-book, closed-note

More information

Stochastic Differential Equations.

Stochastic Differential Equations. Chapter 3 Stochastic Differential Equations. 3.1 Existence and Uniqueness. One of the ways of constructing a Diffusion process is to solve the stochastic differential equation dx(t) = σ(t, x(t)) dβ(t)

More information

w i w j = w i k w k i 1/(β 1) κ β 1 β 2 n(β 2)/(β 1) wi 2 κ 2 i 2/(β 1) κ 2 β 1 β 3 n(β 3)/(β 1) (2.2.1) (β 2) 2 β 3 n 1/(β 1) = d

w i w j = w i k w k i 1/(β 1) κ β 1 β 2 n(β 2)/(β 1) wi 2 κ 2 i 2/(β 1) κ 2 β 1 β 3 n(β 3)/(β 1) (2.2.1) (β 2) 2 β 3 n 1/(β 1) = d 38 2.2 Chung-Lu model This model is specified by a collection of weights w =(w 1,...,w n ) that represent the expected degree sequence. The probability of an edge between i to is w i w / k w k. They allow

More information

STAT 380 Markov Chains

STAT 380 Markov Chains STAT 380 Markov Chains Richard Lockhart Simon Fraser University Spring 2016 Richard Lockhart (Simon Fraser University) STAT 380 Markov Chains Spring 2016 1 / 38 1/41 PoissonProcesses.pdf (#2) Poisson Processes

More information

Models of Infectious Disease Formal Demography Stanford Summer Short Course James Holland Jones, Instructor. August 15, 2005

Models of Infectious Disease Formal Demography Stanford Summer Short Course James Holland Jones, Instructor. August 15, 2005 Models of Infectious Disease Formal Demography Stanford Summer Short Course James Holland Jones, Instructor August 15, 2005 1 Outline 1. Compartmental Thinking 2. Simple Epidemic (a) Epidemic Curve 1:

More information

On stochastic models for the spread of infections. Jan Pieter Trapman

On stochastic models for the spread of infections. Jan Pieter Trapman On stochastic models for the spread of infections Jan Pieter Trapman c Pieter Trapman, De Bilt 2006 ISBN-10: 90-9020910-7 ISBN-13: 978-90-9020910-4 printed by PrintPartners Ipskamp, Enschede Engraving

More information

A simple branching process approach to the phase transition in G n,p

A simple branching process approach to the phase transition in G n,p A simple branching process approach to the phase transition in G n,p Béla Bollobás Department of Pure Mathematics and Mathematical Statistics Wilberforce Road, Cambridge CB3 0WB, UK b.bollobas@dpmms.cam.ac.uk

More information

4: The Pandemic process

4: The Pandemic process 4: The Pandemic process David Aldous July 12, 2012 (repeat of previous slide) Background meeting model with rates ν. Model: Pandemic Initially one agent is infected. Whenever an infected agent meets another

More information

MATH 56A: STOCHASTIC PROCESSES CHAPTER 6

MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 MATH 56A: STOCHASTIC PROCESSES CHAPTER 6 6. Renewal Mathematically, renewal refers to a continuous time stochastic process with states,, 2,. N t {,, 2, 3, } so that you only have jumps from x to x + and

More information

Question Points Score Total: 70

Question Points Score Total: 70 The University of British Columbia Final Examination - April 204 Mathematics 303 Dr. D. Brydges Time: 2.5 hours Last Name First Signature Student Number Special Instructions: Closed book exam, no calculators.

More information

APPM 2360 Lab 3: Zombies! Due April 27, 2017 by 11:59 pm

APPM 2360 Lab 3: Zombies! Due April 27, 2017 by 11:59 pm APPM 2360 Lab 3: Zombies! Due April 27, 2017 by 11:59 pm 1 Introduction As you already know, in the past month, zombies have overrun much of North America, including all major cities on both the East and

More information

Probability and Measure

Probability and Measure Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability

More information

Lecture 2: Tipping Point and Branching Processes

Lecture 2: Tipping Point and Branching Processes ENAS-962 Theoretical Challenges in Networ Science Lecture 2: Tipping Point and Branching Processes Lecturer: Amin Karbasi Scribes: Amin Karbasi 1 The Tipping Point In most given systems, there is an inherent

More information

Lecture 7: Simulation of Markov Processes. Pasi Lassila Department of Communications and Networking

Lecture 7: Simulation of Markov Processes. Pasi Lassila Department of Communications and Networking Lecture 7: Simulation of Markov Processes Pasi Lassila Department of Communications and Networking Contents Markov processes theory recap Elementary queuing models for data networks Simulation of Markov

More information

The Spreading of Epidemics in Complex Networks

The Spreading of Epidemics in Complex Networks The Spreading of Epidemics in Complex Networks Xiangyu Song PHY 563 Term Paper, Department of Physics, UIUC May 8, 2017 Abstract The spreading of epidemics in complex networks has been extensively studied

More information

Markov Chains and Pandemics

Markov Chains and Pandemics Markov Chains and Pandemics Caleb Dedmore and Brad Smith December 8, 2016 Page 1 of 16 Abstract Markov Chain Theory is a powerful tool used in statistical analysis to make predictions about future events

More information

Universal examples. Chapter The Bernoulli process

Universal examples. Chapter The Bernoulli process Chapter 1 Universal examples 1.1 The Bernoulli process First description: Bernoulli random variables Y i for i = 1, 2, 3,... independent with P [Y i = 1] = p and P [Y i = ] = 1 p. Second description: Binomial

More information

Any live cell with less than 2 live neighbours dies. Any live cell with 2 or 3 live neighbours lives on to the next step.

Any live cell with less than 2 live neighbours dies. Any live cell with 2 or 3 live neighbours lives on to the next step. 2. Cellular automata, and the SIRS model In this Section we consider an important set of models used in computer simulations, which are called cellular automata (these are very similar to the so-called

More information

On a stochastic epidemic SEIHR model and its diffusion approximation

On a stochastic epidemic SEIHR model and its diffusion approximation On a stochastic epidemic SEIHR model and its diffusion approximation Marco Ferrante (1), Elisabetta Ferraris (1) and Carles Rovira (2) (1) Dipartimento di Matematica, Università di Padova (Italy), (2)

More information

Dynamical models of HIV-AIDS e ect on population growth

Dynamical models of HIV-AIDS e ect on population growth Dynamical models of HV-ADS e ect on population growth David Gurarie May 11, 2005 Abstract We review some known dynamical models of epidemics, given by coupled systems of di erential equations, and propose

More information

2. Transience and Recurrence

2. Transience and Recurrence Virtual Laboratories > 15. Markov Chains > 1 2 3 4 5 6 7 8 9 10 11 12 2. Transience and Recurrence The study of Markov chains, particularly the limiting behavior, depends critically on the random times

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

CDA5530: Performance Models of Computers and Networks. Chapter 3: Review of Practical

CDA5530: Performance Models of Computers and Networks. Chapter 3: Review of Practical CDA5530: Performance Models of Computers and Networks Chapter 3: Review of Practical Stochastic Processes Definition Stochastic ti process X = {X(t), t T} is a collection of random variables (rvs); one

More information

Optional Stopping Theorem Let X be a martingale and T be a stopping time such

Optional Stopping Theorem Let X be a martingale and T be a stopping time such Plan Counting, Renewal, and Point Processes 0. Finish FDR Example 1. The Basic Renewal Process 2. The Poisson Process Revisited 3. Variants and Extensions 4. Point Processes Reading: G&S: 7.1 7.3, 7.10

More information

Problems on Evolutionary dynamics

Problems on Evolutionary dynamics Problems on Evolutionary dynamics Doctoral Programme in Physics José A. Cuesta Lausanne, June 10 13, 2014 Replication 1. Consider the Galton-Watson process defined by the offspring distribution p 0 =

More information

The phase transition in the Erdös-Rényi random graph model

The phase transition in the Erdös-Rényi random graph model The phase transition in the Erdös-Rényi random graph model Tom van den Bosch July 17, 2014 Bachelor thesis Supervisor: Guus Regts Korteweg-de Vries Instituut voor Wiskunde Faculteit der Natuurwetenschappen,

More information

Example: physical systems. If the state space. Example: speech recognition. Context can be. Example: epidemics. Suppose each infected

Example: physical systems. If the state space. Example: speech recognition. Context can be. Example: epidemics. Suppose each infected 4. Markov Chains A discrete time process {X n,n = 0,1,2,...} with discrete state space X n {0,1,2,...} is a Markov chain if it has the Markov property: P[X n+1 =j X n =i,x n 1 =i n 1,...,X 0 =i 0 ] = P[X

More information

Inference for partially observed stochastic dynamic systems: A new algorithm, its theory and applications

Inference for partially observed stochastic dynamic systems: A new algorithm, its theory and applications Inference for partially observed stochastic dynamic systems: A new algorithm, its theory and applications Edward Ionides Department of Statistics, University of Michigan ionides@umich.edu Statistics Department

More information

Continuous-Time Markov Chain

Continuous-Time Markov Chain Continuous-Time Markov Chain Consider the process {X(t),t 0} with state space {0, 1, 2,...}. The process {X(t),t 0} is a continuous-time Markov chain if for all s, t 0 and nonnegative integers i, j, x(u),

More information

Time varying networks and the weakness of strong ties

Time varying networks and the weakness of strong ties Supplementary Materials Time varying networks and the weakness of strong ties M. Karsai, N. Perra and A. Vespignani 1 Measures of egocentric network evolutions by directed communications In the main text

More information

Markov Chains and Stochastic Sampling

Markov Chains and Stochastic Sampling Part I Markov Chains and Stochastic Sampling 1 Markov Chains and Random Walks on Graphs 1.1 Structure of Finite Markov Chains We shall only consider Markov chains with a finite, but usually very large,

More information

Notes on uniform convergence

Notes on uniform convergence Notes on uniform convergence Erik Wahlén erik.wahlen@math.lu.se January 17, 2012 1 Numerical sequences We begin by recalling some properties of numerical sequences. By a numerical sequence we simply mean

More information

Die-out Probability in SIS Epidemic Processes on Networks

Die-out Probability in SIS Epidemic Processes on Networks Die-out Probability in SIS Epidemic Processes on etworks Qiang Liu and Piet Van Mieghem Abstract An accurate approximate formula of the die-out probability in a SIS epidemic process on a network is proposed.

More information

Markov processes and queueing networks

Markov processes and queueing networks Inria September 22, 2015 Outline Poisson processes Markov jump processes Some queueing networks The Poisson distribution (Siméon-Denis Poisson, 1781-1840) { } e λ λ n n! As prevalent as Gaussian distribution

More information

Probability Distributions

Probability Distributions Lecture 1: Background in Probability Theory Probability Distributions The probability mass function (pmf) or probability density functions (pdf), mean, µ, variance, σ 2, and moment generating function

More information

Elementary Probability. Exam Number 38119

Elementary Probability. Exam Number 38119 Elementary Probability Exam Number 38119 2 1. Introduction Consider any experiment whose result is unknown, for example throwing a coin, the daily number of customers in a supermarket or the duration of

More information

Homework 3 posted, due Tuesday, November 29.

Homework 3 posted, due Tuesday, November 29. Classification of Birth-Death Chains Tuesday, November 08, 2011 2:02 PM Homework 3 posted, due Tuesday, November 29. Continuing with our classification of birth-death chains on nonnegative integers. Last

More information

3 Continuous Random Variables

3 Continuous Random Variables Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random

More information

Branching within branching: a general model for host-parasite co-evolution

Branching within branching: a general model for host-parasite co-evolution Branching within branching: a general model for host-parasite co-evolution Gerold Alsmeyer (joint work with Sören Gröttrup) May 15, 2017 Gerold Alsmeyer Host-parasite co-evolution 1 of 26 1 Model 2 The

More information

DS-GA 1002 Lecture notes 2 Fall Random variables

DS-GA 1002 Lecture notes 2 Fall Random variables DS-GA 12 Lecture notes 2 Fall 216 1 Introduction Random variables Random variables are a fundamental tool in probabilistic modeling. They allow us to model numerical quantities that are uncertain: the

More information

6.207/14.15: Networks Lecture 3: Erdös-Renyi graphs and Branching processes

6.207/14.15: Networks Lecture 3: Erdös-Renyi graphs and Branching processes 6.207/14.15: Networks Lecture 3: Erdös-Renyi graphs and Branching processes Daron Acemoglu and Asu Ozdaglar MIT September 16, 2009 1 Outline Erdös-Renyi random graph model Branching processes Phase transitions

More information

Scaling limits for random trees and graphs

Scaling limits for random trees and graphs YEP VII Probability, random trees and algorithms 8th-12th March 2010 Scaling limits for random trees and graphs Christina Goldschmidt INTRODUCTION A taste of what s to come We start with perhaps the simplest

More information

Chapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan

Chapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Chapter 2. Poisson Processes Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Outline Introduction to Poisson Processes Definition of arrival process Definition

More information

Q = (c) Assuming that Ricoh has been working continuously for 7 days, what is the probability that it will remain working at least 8 more days?

Q = (c) Assuming that Ricoh has been working continuously for 7 days, what is the probability that it will remain working at least 8 more days? IEOR 4106: Introduction to Operations Research: Stochastic Models Spring 2005, Professor Whitt, Second Midterm Exam Chapters 5-6 in Ross, Thursday, March 31, 11:00am-1:00pm Open Book: but only the Ross

More information

Chapter 3. Erdős Rényi random graphs

Chapter 3. Erdős Rényi random graphs Chapter Erdős Rényi random graphs 2 .1 Definitions Fix n, considerthe set V = {1,2,...,n} =: [n], andput N := ( n 2 be the numberofedgesonthe full graph K n, the edges are {e 1,e 2,...,e N }. Fix also

More information

Convergence of Random Walks and Conductance - Draft

Convergence of Random Walks and Conductance - Draft Graphs and Networks Lecture 0 Convergence of Random Walks and Conductance - Draft Daniel A. Spielman October, 03 0. Overview I present a bound on the rate of convergence of random walks in graphs that

More information

Lecture 2: Review of Basic Probability Theory

Lecture 2: Review of Basic Probability Theory ECE 830 Fall 2010 Statistical Signal Processing instructor: R. Nowak, scribe: R. Nowak Lecture 2: Review of Basic Probability Theory Probabilistic models will be used throughout the course to represent

More information

Model Counting for Logical Theories

Model Counting for Logical Theories Model Counting for Logical Theories Wednesday Dmitry Chistikov Rayna Dimitrova Department of Computer Science University of Oxford, UK Max Planck Institute for Software Systems (MPI-SWS) Kaiserslautern

More information

If g is also continuous and strictly increasing on J, we may apply the strictly increasing inverse function g 1 to this inequality to get

If g is also continuous and strictly increasing on J, we may apply the strictly increasing inverse function g 1 to this inequality to get 18:2 1/24/2 TOPIC. Inequalities; measures of spread. This lecture explores the implications of Jensen s inequality for g-means in general, and for harmonic, geometric, arithmetic, and related means in

More information

8.5 Taylor Polynomials and Taylor Series

8.5 Taylor Polynomials and Taylor Series 8.5. TAYLOR POLYNOMIALS AND TAYLOR SERIES 50 8.5 Taylor Polynomials and Taylor Series Motivating Questions In this section, we strive to understand the ideas generated by the following important questions:

More information

Mathematical Modeling and Analysis of Infectious Disease Dynamics

Mathematical Modeling and Analysis of Infectious Disease Dynamics Mathematical Modeling and Analysis of Infectious Disease Dynamics V. A. Bokil Department of Mathematics Oregon State University Corvallis, OR MTH 323: Mathematical Modeling May 22, 2017 V. A. Bokil (OSU-Math)

More information

Chapter 2 SOME ANALYTICAL TOOLS USED IN THE THESIS

Chapter 2 SOME ANALYTICAL TOOLS USED IN THE THESIS Chapter 2 SOME ANALYTICAL TOOLS USED IN THE THESIS 63 2.1 Introduction In this chapter we describe the analytical tools used in this thesis. They are Markov Decision Processes(MDP), Markov Renewal process

More information