6.2.2 Point processes and counting processes

which yield a system of differential equations

\frac{dp_0(t)}{dt} = -\lambda\, p_0(t),   (6.3)

\frac{dp_k(t)}{dt} = -\lambda\,\big(p_k(t) - p_{k-1}(t)\big),   (6.4)

k \ge 1.   (6.5)

The solution to the system of equations is

p_n(t) = \frac{(\lambda t)^n}{n!}\, e^{-\lambda t},   (6.6)

which is called the Poisson process. We see that the key assumptions underlying a Poisson process are a uniform rate, i.e., constant \lambda; non-bunching, \Pr\{N(\tau) \ge 2\} = o(\tau); and independence of the events.

6.2.2 Point processes and counting processes

More precisely, Eq. (6.6) is called the Poisson counting process. There is also another way to look at the events: the Poisson point process T_k, the stochastic time for the kth event to occur. It can be shown that

\Pr\{T_k \le t\} = \Pr\{N(t) \ge k\}.   (6.7)

Then we can show that

f_{T_k}(x) = \frac{\lambda^k x^{k-1}}{(k-1)!}\, e^{-\lambda x},   (6.8)

which is known as the Gamma distribution of kth order. Note that T_1 is exponentially distributed. It has the memoryless property

\Pr\{T_1 > t + \tau \mid T_1 > t\} = \Pr\{T_1 > \tau\}.   (6.9)

It can be shown that the exponential distribution is the only one with the memoryless property. T_2 can be shown to be T_1^{(1)} + T_1^{(2)}, the sum of two iid copies of T_1. In fact, T_k is the sum of k iid exponentially distributed random variables. Both N(t) and T_k are Markov processes; neither is a stationary process, but both have stationary independent increments.

Relation to two simple ordinary differential equations

We now ask for the expected value and variance of the Poisson counting process:

E[N(t)] = \lambda t.   (6.10)

Hence,

\frac{d}{dt} E[N(t)] = \lambda,   (6.11)

representing a process with constant growth.
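As a quick numerical sanity check of Eqs. (6.6)-(6.8), one can simulate the point process directly from its iid exponential waiting times and compare the empirical law of N(t) with the Poisson formula. The following is a minimal sketch in Python (NumPy); the rate \lambda, the horizon t, and the number of sample paths are arbitrary illustrative choices.

    import numpy as np
    from math import exp, factorial

    rng = np.random.default_rng(0)
    lam, t, n_paths = 2.0, 3.0, 100_000   # illustrative rate and time horizon

    # Waiting times between events are iid exponential(lam); the arrival times
    # T_1, T_2, ... (the point process) are their cumulative sums.
    waits = rng.exponential(scale=1.0 / lam, size=(n_paths, 40))
    arrivals = np.cumsum(waits, axis=1)

    # Counting process: N(t) is the number of T_k that do not exceed t.
    N_t = (arrivals <= t).sum(axis=1)

    # Compare the empirical law of N(t) with p_n(t) = (lam*t)^n e^{-lam*t} / n!, Eq. (6.6).
    for n in range(5):
        empirical = np.mean(N_t == n)
        poisson = (lam * t) ** n * exp(-lam * t) / factorial(n)
        print(f"P(N(t)={n}): simulated {empirical:.4f}, Eq. (6.6) {poisson:.4f}")

    # T_2 = T_1^(1) + T_1^(2) is Gamma of 2nd order, Eq. (6.8): its mean is 2/lam.
    print("mean of T_2:", arrivals[:, 1].mean(), "vs 2/lam =", 2 / lam)

(Forty waiting times per path is far more than needed to reach t = 3 when \lambda = 2, so the truncation does not bias the count.)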

On the other hand, if we consider m_0 identical and independent radioactive nuclei, then the probability that an individual nucleus survives to time t is e^{-\lambda t}. Therefore, the number k (\le m_0) of nuclei surviving to time t is binomially distributed, with mean m(t) satisfying the ODE

\frac{dm}{dt} = -\lambda m(t), \qquad m(0) = m_0.   (6.12)

More importantly, the variance of the binomial distribution shows that for large m_0 the probability distribution has a very small relative variance. Hence it is justified to model the process by the deterministic ODE (6.12).

6.3 Birth-Death Processes in One Dimension

It should not escape our notice that Eq. (6.3) is in fact a master equation. The Poisson process is a discrete-state, continuous-time Markov process with a countable number of states in \mathbb{N}. It represents a simple birth process with a uniform growth rate. A more general class of master equations is the one-dimensional birth-death process, N(t), with transition probability rate (i.e., Eq. 4.55)

W(n \mid m, t) = \mu(m)\,\delta_{n,m+1} + \lambda(m)\,\delta_{n,m-1}.   (6.13)

A birth process and a death process are both represented in Eq. (6.13):

n \to n+1: \quad \mu(n) \ \text{growth rate},   (6.14)

n \to n-1: \quad \lambda(n) \ \text{death rate}.   (6.15)

Therefore, the general master equation has the form

\frac{dp_n(t)}{dt} = \mu(n-1)\,p_{n-1} - \big[\lambda(n)+\mu(n)\big]p_n + \lambda(n+1)\,p_{n+1}.   (6.16)

As for one-dimensional Kolmogorov forward equations, there is no general solution to this time-dependent equation. However, its stationary solution can be found. In particular, if there is no probability flux in the system, then we have the stationary distribution

p_n^{ss} = p_0^{ss} \prod_{k=1}^{n} \frac{\mu(k-1)}{\lambda(k)}.   (6.17)

Rate equation for expected value

Even though one cannot solve the time-dependent master equation (6.16) in general, one can obtain a differential equation for its expected value

E[N(t)] = m(t) = \sum_{n} n\, p_n(t).   (6.18)
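Equation (6.17) is straightforward to evaluate numerically for any given rates. Below is a minimal sketch (illustrative rates and a hand-chosen truncation n_max) that accumulates the products \mu(k-1)/\lambda(k) and then normalizes; for the constant-birth, linear-death rates used here it should reproduce a Poisson distribution, as derived later in Section 6.4.

    import numpy as np

    def stationary_distribution(mu, lam, n_max):
        """p_n^ss proportional to prod_{k=1}^{n} mu(k-1)/lam(k), Eq. (6.17), truncated at n_max."""
        p = np.empty(n_max + 1)
        p[0] = 1.0
        for n in range(1, n_max + 1):
            p[n] = p[n - 1] * mu(n - 1) / lam(n)
        return p / p.sum()

    # Illustrative rates: mu(n) = 5 (uniform birth), lam(n) = n (linear death).
    p = stationary_distribution(mu=lambda n: 5.0, lam=lambda n: float(n), n_max=60)
    n = np.arange(61)
    print("mean     =", n @ p)                       # ~5, as for a Poisson law
    print("variance =", (n ** 2) @ p - (n @ p) ** 2) # ~5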

We have

\frac{dm(t)}{dt} = \sum_{n}\Big\{ n\,\mu(n-1)\,p_{n-1} - n\big[\lambda(n)+\mu(n)\big]p_n + n\,\lambda(n+1)\,p_{n+1} \Big\} = \sum_{n}\big[\mu(n)-\lambda(n)\big]p_n = E[\mu(N)] - E[\lambda(N)].   (6.19)

If the variance of N(t) is very small, then one can approximate

E[\mu(N)] \approx \mu(E[N]) = \mu(m(t)), \qquad E[\lambda(N)] \approx \lambda(E[N]) = \lambda(m(t)).   (6.20)

Then,

\frac{dm}{dt} = \mu(m) - \lambda(m).   (6.21)

We see that the fixed points of the ODE (6.21) are at \mu(m) = \lambda(m), which correspond to the modal values of the stationary distribution in Eq. (6.17). Here is also the place to emphasize the difference between the fixed points of the ODE (6.21) and the stationary distribution (6.17): they are clearly different things, even though they are intimately related. The latter is a generalization of the former; it contains more information, namely the relative importance of the different fixed points. Note that in the literature a fixed point is often referred to as a "steady state", and a stationary distribution is often referred to as a "steady-state distribution". The terminology can be confusing, and one needs to be careful and precise.

6.4 Examples

A simple example

Let us consider the chemical reaction

A \underset{k_{-1}}{\overset{k_1}{\rightleftharpoons}} X,   (6.22)

in which the species A has a fixed concentration a, while the number of molecules of species X, N, changes with time.

Ordinary Differential Equation Based on the Law of Mass Action. First, if we denote the concentration of X by x, then the traditional differential equation model based on the Law of Mass Action gives

\frac{dx}{dt} = k_1 a - k_{-1} x.   (6.23)

The time-dependent solution of the ODE is

x(t) = \left(x(0) - \frac{k_1 a}{k_{-1}}\right) e^{-k_{-1}t} + \frac{k_1 a}{k_{-1}}.   (6.24)
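As a check on Eq. (6.24), one can integrate the mass-action ODE (6.23) numerically and compare with the closed form. A minimal forward-Euler sketch, with illustrative values for k_1 a, k_{-1} and x(0):

    import numpy as np

    k1a, km1, x0 = 10.0, 1.0, 2.0       # illustrative k_1*a, k_{-1}, x(0)
    dt, T = 1e-4, 5.0

    # Forward-Euler integration of dx/dt = k1a - km1*x, Eq. (6.23).
    ts = np.arange(0.0, T + dt, dt)
    x = np.empty_like(ts)
    x[0] = x0
    for i in range(1, len(ts)):
        x[i] = x[i - 1] + dt * (k1a - km1 * x[i - 1])

    # Closed-form solution, Eq. (6.24).
    exact = (x0 - k1a / km1) * np.exp(-km1 * ts) + k1a / km1
    print("max |numerical - exact| =", np.max(np.abs(x - exact)))   # small, O(dt)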

Birth-Death Process Formulation of Chemical Reaction Systems. In terms of a birth-death process, we have

\mu(n) = k_1 a, \qquad \lambda(n) = k_{-1}\, n.   (6.25)

Hence, the master equation for the birth-death process is

\frac{dp_n(t)}{dt} = k_1 a\, p_{n-1} - (k_1 a + k_{-1} n)\, p_n + k_{-1}(n+1)\, p_{n+1}.   (6.26)

We use the method of generating functions to solve the time-dependent problem. Let G(s,t) = E\big[s^{N(t)}\big]; therefore,

\frac{\partial G(s,t)}{\partial t} = k_1 a\,(s-1)\,G(s,t) - k_{-1}(s-1)\,\frac{\partial G(s,t)}{\partial s}.   (6.27)

By the method of characteristics, we have the general solution to this first-order partial differential equation:

G(s,t) = F\!\left[(s-1)e^{-k_{-1}t}\right]\exp\left\{\frac{(s-1)k_1 a}{k_{-1}}\right\},   (6.28)

in which F(\cdot) is an arbitrary function to be determined by the initial condition. Note that F(0) = G(1,t) = 1.

Stationary Poisson Distribution. We see that for t \to \infty,

G(s,\infty) = \exp\left\{\frac{(s-1)k_1 a}{k_{-1}}\right\},   (6.29)

which is the generating function of a Poisson distribution with mean k_1 a/k_{-1}. This is precisely the same result as x(\infty) in Eq. (6.24).

Time-dependent Solution to the Stochastic Model. Furthermore, let us assume the initial distribution for N(t) is N(0) = N_0. Then

G(s,0) = s^{N_0} = F(s-1)\exp\left\{\frac{(s-1)k_1 a}{k_{-1}}\right\}.   (6.30)

Therefore,

G(s,t) = \left[1 + (s-1)e^{-k_{-1}t}\right]^{N_0}\exp\left\{\frac{(s-1)k_1 a}{k_{-1}}\left(1 - e^{-k_{-1}t}\right)\right\}.   (6.31)
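The master equation (6.26) can also be sampled exactly with Gillespie's stochastic simulation algorithm, which provides an independent check of the analytical results. A minimal sketch with illustrative values of k_1 a and k_{-1}; each path is run to a time long compared with the relaxation time 1/k_{-1}, so the empirical mean and variance should both approach the stationary Poisson value k_1 a/k_{-1} implied by Eq. (6.29).

    import numpy as np

    rng = np.random.default_rng(1)
    k1a, km1 = 10.0, 1.0          # illustrative k_1*a and k_{-1}
    N0, T = 0, 10.0               # start empty, run well past the relaxation time 1/km1

    def gillespie_endpoint(n, t_end):
        """One exact path of the birth-death process with mu(n) = k1a, lam(n) = km1*n;
        returns the state N(t_end)."""
        t = 0.0
        while True:
            total = k1a + km1 * n
            t += rng.exponential(1.0 / total)
            if t > t_end:
                return n
            if rng.random() < k1a / total:
                n += 1            # birth:  A -> X
            else:
                n -= 1            # death:  X -> A

    samples = np.array([gillespie_endpoint(N0, T) for _ in range(20_000)])
    print("simulated mean    :", samples.mean(), " vs k1a/km1 =", k1a / km1)
    print("simulated variance:", samples.var(),  " vs k1a/km1 =", k1a / km1)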

The generating function in Eq. (6.31) gives the complete dynamics of the birth-death process, including the expected value and variance:

E[N(t)] = \left[\frac{\partial G(s,t)}{\partial s}\right]_{s=1} = \frac{k_1 a}{k_{-1}} + \left(N_0 - \frac{k_1 a}{k_{-1}}\right) e^{-k_{-1}t},   (6.32)

E\big[N(t)\big(N(t)-1\big)\big] = \left[\frac{\partial^2 G(s,t)}{\partial s^2}\right]_{s=1},   (6.33)

\mathrm{Var}[N(t)] = E\big[N(t)\big(N(t)-1\big)\big] + E[N(t)] - \big(E[N(t)]\big)^2 = \left(N_0\, e^{-k_{-1}t} + \frac{k_1 a}{k_{-1}}\right)\left(1 - e^{-k_{-1}t}\right).   (6.34)

Eq. (6.32) agrees with Eq. (6.24).

Stationary Process. The stationary process has a Poisson distribution according to Eq. (6.29):

p_n^{ss} = \frac{(k_1 a/k_{-1})^n}{n!}\, e^{-k_1 a/k_{-1}}.   (6.35)

And the autocovariance function is

E[N(t)N(0)] - \big(E[N]\big)^2 = \frac{k_1 a}{k_{-1}}\, e^{-k_{-1}t}.   (6.36)
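The moments can also be read off the generating function by finite differences at s = 1, which gives a quick numerical check of Eqs. (6.32) and (6.34). A sketch with illustrative parameter values:

    import numpy as np

    k1a, km1, N0, t = 10.0, 1.0, 3, 0.7     # illustrative parameters

    def G(s):
        """Generating function of N(t), Eq. (6.31)."""
        return (1 + (s - 1) * np.exp(-km1 * t)) ** N0 \
            * np.exp((s - 1) * (k1a / km1) * (1 - np.exp(-km1 * t)))

    h = 1e-4
    dG = (G(1 + h) - G(1 - h)) / (2 * h)                 # E[N(t)]
    d2G = (G(1 + h) - 2 * G(1.0) + G(1 - h)) / h ** 2    # E[N(t)(N(t)-1)]
    mean, var = dG, d2G + dG - dG ** 2

    print("mean:", mean, "  Eq. (6.32):", k1a / km1 + (N0 - k1a / km1) * np.exp(-km1 * t))
    print("var :", var,  "  Eq. (6.34):",
          (N0 * np.exp(-km1 * t) + k1a / km1) * (1 - np.exp(-km1 * t)))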

A system with bistability

We note that the chemical reactions in system (6.22) are first-order, i.e., the differential equation (6.23) based on the Law of Mass Action is linear. The linearity is responsible for the solvability of the model in the previous example. We now consider a system of nonlinear chemical reactions:

A + 2X \underset{k_{-1}}{\overset{k_1}{\rightleftharpoons}} 3X, \qquad X \underset{k_{-2}}{\overset{k_2}{\rightleftharpoons}} B,   (6.37)

in which the species A and B again have fixed concentrations a and b.

Ordinary Differential Equation Based on the Law of Mass Action. The dynamics of the concentration of X, denoted by x(t), follow the ordinary differential equation based on the Law of Mass Action:

\frac{dx}{dt} = b k_{-2} - k_2 x + k_1 a x^2 - k_{-1} x^3.   (6.38)

This equation can be solved by the method of separation of variables, which yields

\left(\frac{x_1 - x}{x_1 - x_0}\right)^{x_3 - x_2}\left(\frac{x_2 - x}{x_2 - x_0}\right)^{x_1 - x_3}\left(\frac{x_3 - x}{x_3 - x_0}\right)^{x_2 - x_1} = \exp\big[-k_{-1}(x_1-x_2)(x_2-x_3)(x_3-x_1)\,t\big],   (6.39)

in which x_0 is the initial condition x(0) = x_0, and x_1 \le x_2 \le x_3 are the three roots of the cubic equation

b k_{-2} - k_2 x + k_1 a x^2 - k_{-1} x^3 = -k_{-1}(x - x_1)(x - x_2)(x - x_3).   (6.40)

Birth-Death Process Formulation of Chemical Reaction Systems. In terms of a birth-death process, we have

\mu(n) = \frac{k_1 a\, n(n-1)}{V} + k_{-2} b V, \qquad \lambda(n) = \frac{k_{-1}\, n(n-1)(n-2)}{V^2} + k_2 n,   (6.41)

in which V is the volume of the chemical reaction system: n/V = x is the concentration of the species X. Therefore, the stationary distribution for the master equation, according to Eq. (6.17), is

p_l^{ss} = p_0^{ss} \prod_{n=1}^{l} \frac{V\big[\alpha\,(n-1)(n-2) + \beta\big]}{n(n-1)(n-2) + \gamma n},   (6.42)

where

\alpha = \frac{k_1 a}{k_{-1}}, \qquad \beta = \frac{k_{-2} b V^2}{k_{-1}}, \qquad \gamma = \frac{k_2 V^2}{k_{-1}}.   (6.43)

The master equation for the birth-death process is

\frac{dp_n(t)}{dt} = \mu(n-1)\,p_{n-1} - \big[\mu(n)+\lambda(n)\big]p_n + \lambda(n+1)\,p_{n+1}.   (6.44)

Chemical Equilibrium and Stationary Poisson Distribution. The chemical reactions in (6.37) accomplish the net transformation A \rightleftharpoons B via the intermediate species X. Hence, if the concentrations of A and B satisfy

\frac{a}{b} = \frac{k_{-1} k_{-2}}{k_1 k_2},   (6.45)

then the system will eventually reach a chemical equilibrium. The condition (6.45) means \alpha = \beta/\gamma, which in turn means that the distribution in (6.42) is Poissonian,

p_l^{eq} = \frac{(\alpha V)^l\, e^{-\alpha V}}{l!},   (6.46)

where \alpha V = k_1 a V/k_{-1} = x^{eq} V is the number of X molecules in chemical equilibrium, and x^{eq} is the equilibrium concentration of X.
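The fixed points of Eq. (6.38) and the exact stationary distribution (6.42) can both be computed numerically. The sketch below uses illustrative rate constants (chosen so that the cubic (6.40) has three positive roots), builds p_n^{ss} directly from the ratios \mu(n-1)/\lambda(n) of Eq. (6.41) (working with logarithms to avoid overflow), and locates its peaks, which should sit near the two stable roots of the cubic multiplied by V.

    import numpy as np

    # Illustrative rate constants and volume, chosen so that (6.40) has three positive roots.
    k1a, km1, k2, km2b, V = 3.0, 1.0, 2.1, 0.1, 100.0

    # Fixed points of dx/dt = km2b - k2*x + k1a*x^2 - km1*x^3, Eq. (6.38).
    print("roots of the cubic:", np.sort(np.roots([-km1, k1a, -k2, km2b]).real))

    # Birth and death rates of Eq. (6.41).
    def mu(n):  return k1a * n * (n - 1) / V + km2b * V
    def lam(n): return km1 * n * (n - 1) * (n - 2) / V**2 + k2 * n

    # Exact stationary distribution, Eqs. (6.17)/(6.42), accumulated in log space.
    n_max = int(4 * V)
    log_p = np.zeros(n_max + 1)
    for n in range(1, n_max + 1):
        log_p[n] = log_p[n - 1] + np.log(mu(n - 1) / lam(n))
    p = np.exp(log_p - log_p.max())
    p /= p.sum()

    # Local maxima of p_n^ss: the bimodal structure of the stationary distribution.
    peaks = [n for n in range(1, n_max) if p[n] > p[n - 1] and p[n] > p[n + 1]]
    print("peaks at n =", peaks, "i.e. concentrations", [n / V for n in peaks])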

6.5 Approximating Birth-Death Processes by Fokker-Planck Equations

We note that the \mu(n) and \lambda(n) in Eq. (6.41) are functions of the parameter V, the volume of the chemical reaction system. When V tends to infinity while the concentrations of A and B, a and b, are kept constant, the number of X molecules in the system is expected to be n \approx Vx, where x is the concentration.

In mathematical terms, if we identify

\frac{1}{V} = dx, \qquad p_n(t) = f(x,t)\,dx, \qquad \frac{\mu(n)}{V} = v(x), \qquad \frac{\lambda(n)}{V} = w(x),   (6.47)

then we have the master equation for a birth-death process

\frac{\partial f(x,t)}{\partial t} = \mu(x-dx)f(x-dx,t) - \big[\mu(x)+\lambda(x)\big]f(x,t) + \lambda(x+dx)f(x+dx,t)
 \approx \frac{\partial}{\partial x}\Big[(wf)(x+dx/2) - (vf)(x-dx/2)\Big]
 \approx \frac{\partial^2}{\partial x^2}\left[\frac{v(x)+w(x)}{2V}\,f(x,t)\right] - \frac{\partial}{\partial x}\Big[\big(v(x)-w(x)\big)f(x,t)\Big],   (6.48)

in which x = n/V and dx = 1/V. Eq. (6.48) is a Fokker-Planck equation with

a(x) = \frac{v(x)+w(x)}{V}, \qquad b(x) = v(x) - w(x).   (6.49)

One notices that when V \to \infty, the diffusion term vanishes, and Eq. (6.48) becomes a first-order partial differential equation whose characteristics are simply given by Eq. (6.38), the traditional deterministic dynamics of the reaction in Eq. (6.37). This relation shows that the ODEs are the rigorous limit of the master equation. The stochastic model is not an alternative to the deterministic kinetics; it is a more complete kinetic description, capable of modeling chemical reaction systems with and without fluctuations.

The Chemical Master Equation

One can look at the problem from a different angle. We shall now focus on the CME. One of the most important features of the CME is the natural parameter V, which tends to infinity in the thermodynamic limit, giving rise to a system of ordinary differential equations. For example, for the birth and death processes with rates \mu_n and \lambda_n, in the thermodynamic limit we have

\mu_n = V\big\{\mu(x) + O(V^{-1})\big\}, \qquad \lambda_n = V\big\{\lambda(x) + O(V^{-1})\big\}.   (6.50)

Now let us consider the standard Fokker-Planck equation:

\frac{\partial f}{\partial t} = \frac{1}{2}\frac{\partial^2}{\partial x^2}\big(a(x)f\big) - \frac{\partial}{\partial x}\big(b(x)f\big).   (6.51)

If we discretize x with a uniform interval \delta, we have

\frac{df(x,t)}{dt} = \left[\frac{a(x-\delta)}{2\delta^2} + \frac{b(x-\delta)}{2\delta}\right] f(x-\delta)
 - \left[\left(\frac{a(x)}{2\delta^2} + \frac{b(x)}{2\delta}\right) + \left(\frac{a(x)}{2\delta^2} - \frac{b(x)}{2\delta}\right)\right] f(x)
 + \left[\frac{a(x+\delta)}{2\delta^2} - \frac{b(x+\delta)}{2\delta}\right] f(x+\delta).   (6.52)
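Equation (6.52) says that a Fokker-Planck equation can be read as a birth-death master equation on a grid of spacing \delta, with jump-up rate a/(2\delta^2) + b/(2\delta) and jump-down rate a/(2\delta^2) - b/(2\delta). The sketch below illustrates this with an example of my own choosing (linear drift b(x) = -x and constant diffusion a(x) = D, i.e., an Ornstein-Uhlenbeck process): it builds those rates on a grid and recovers the Gaussian stationary density proportional to e^{-x^2/D} from the product formula (6.17).

    import numpy as np

    # Ornstein-Uhlenbeck example (an illustrative assumption): b(x) = -x, a(x) = D.
    D = 0.5
    b = lambda x: -x
    a = lambda x: D + 0.0 * x          # constant diffusion, vectorized

    delta = 0.05
    grid = np.arange(-4.0, 4.0 + delta, delta)

    # Jump rates of the discretized Fokker-Planck equation, Eq. (6.52).
    up   = a(grid) / (2 * delta**2) + b(grid) / (2 * delta)   # x -> x + delta
    down = a(grid) / (2 * delta**2) - b(grid) / (2 * delta)   # x -> x - delta

    # Stationary distribution of this birth-death chain via the product formula (6.17).
    log_p = np.zeros(len(grid))
    for k in range(1, len(grid)):
        log_p[k] = log_p[k - 1] + np.log(up[k - 1] / down[k])
    p = np.exp(log_p - log_p.max())
    p /= p.sum() * delta                                      # normalize as a density

    gauss = np.exp(-grid**2 / D) / np.sqrt(np.pi * D)          # exact stationary density
    print("max |p - Gaussian| =", np.max(np.abs(p - gauss)))   # small for small delta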

Therefore, if \mu_n and \lambda_n are of the forms

\mu_n = V\left(\frac{a(x)V}{2} + \frac{b(x)}{2} + O(V^{-1})\right), \qquad \lambda_n = V\left(\frac{a(x)V}{2} - \frac{b(x)}{2} + O(V^{-1})\right),   (6.53)

then the master equation has a legitimate limit with a proper Fokker-Planck equation. In fact,

\lim_{V\to\infty} \frac{\mu_n + \lambda_n}{V^2} = a(x),   (6.54)

\lim_{V\to\infty} \frac{\mu_n - \lambda_n}{V} = b(x).   (6.55)

Now let us take a look at the CME. We realize that for the CME, a(x) \sim 1/V! Hence, there is only a first-order PDE in the limit V \to \infty; the diffusion is a higher-order term. So will this asymptotic expansion be useful as an approximation to the CME? The answer is yes for finite time t, but not for the stationary behavior.

Keizer's Paradox for Systems with Multistability

One can obtain the stationary distribution from Eq. (6.48):

f^{ss}(x) = \frac{C}{v(x)+w(x)}\exp\left\{2V\int^{x}\frac{v(z)-w(z)}{v(z)+w(z)}\,dz\right\},   (6.56)

where C is a normalization constant. On the other hand, from the exact expression for the stationary distribution given in Eq. (6.17), we have in the limit of large V

\ln p_k^{ss} = \sum_{j=1}^{k}\ln\frac{\mu(j-1)}{\lambda(j)} + C = \sum_{j=1}^{k}\left[\ln\frac{v(j/V)}{w(j/V)} + o(V^{-1})\right] + C \approx V\int_{0}^{k/V}\ln\left(\frac{v(z)}{w(z)}\right)dz + C,   (6.57)

in which C = \ln p_0^{ss}. Denoting the corresponding probability density function by \tilde f^{ss}(x) = V p_{Vx}^{ss}, we have

\frac{1}{2V}\ln \tilde f^{ss}(x) = \frac{1}{2V}\ln p_{Vx}^{ss} + \frac{\ln V}{2V} = \frac{1}{2}\int^{x}\ln\left(\frac{v(z)}{w(z)}\right)dz + \frac{\ln V + C}{2V}.   (6.58)

Comparing Eq. (6.58) with Eq. (6.56), both can be written, to leading exponential order in V, in the compact forms

f^{ss}(x) = C_1 V \exp\left\{-2V\int^{x}\frac{q(z)-1}{q(z)+1}\,dz\right\},   (6.59a)

\tilde f^{ss}(x) = C_2 V \exp\left\{-V\int^{x}\ln q(z)\,dz\right\},   (6.59b)

where q(u) = w(u)/v(u).

Both functions have identical local extrema, since \frac{d}{dx}f^{ss}(x) = 0 and \frac{d}{dx}\tilde f^{ss}(x) = 0 lead to the same equation for the roots of q(x) = 1. Furthermore, the corresponding local maxima of f^{ss}(x) and \tilde f^{ss}(x) are asymptotically equivalent in the limit V \to \infty. To show this, let x^* be a root of q(x) = 1. Then we have

f^{ss}(x) = C_1 V \exp\left\{-2V\int_{x^*}^{x}\frac{q(z)-1}{q(z)+1}\,dz\right\} \approx C_1 V \exp\left\{-\frac{V q'(x^*)(x-x^*)^2}{2}\right\},   (6.60a)

and

\tilde f^{ss}(x) = C_2 V \exp\left\{-V\int_{x^*}^{x}\ln q(z)\,dz\right\} \approx C_2 V \exp\left\{-\frac{V q'(x^*)(x-x^*)^2}{2}\right\}.   (6.60b)

The stationary distributions f^{ss}(x) and \tilde f^{ss}(x) therefore have their minima and maxima at the same places, with the same curvatures. However, something is very different between f^{ss}(x) and \tilde f^{ss}(x). To see this, let x_1^* and x_3^* be two roots of q(x) = 1 with q'(x) > 0, each representing a local maximum of f^{ss}(x) and \tilde f^{ss}(x). The important question, then, is which maximum is the larger one. This is determined by the signs of

\int_{x_1^*}^{x_3^*}\ln q(z)\,dz \qquad \text{and} \qquad 2\int_{x_1^*}^{x_3^*}\frac{q(z)-1}{q(z)+1}\,dz.   (6.61)

A simple counterexample showing that the two terms in (6.61) can have different signs can be constructed using q(z) = \dfrac{k_{-1}z^3 + k_2 z}{k_1 a z^2 + k_{-2} b}. Therefore, in the limit V \to \infty, f^{ss}(x) and \tilde f^{ss}(x) can converge to different local maxima. This phenomenon, f^{ss}(x) and \tilde f^{ss}(x) converging to different fixed points of the deterministic dynamics in the limit V \to \infty, is called Keizer's paradox. The reason has to do with the non-uniform convergence

\tilde f^{ss}(x) = \lim_{V\to\infty}\lim_{t\to\infty} V p_{Vx}(t) \neq \lim_{t\to\infty}\lim_{V\to\infty} V p_{Vx}(t) = f^{ss}(x).   (6.62)
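The two quantities in (6.61) are easy to evaluate numerically for the bistable example. The sketch below takes the q(z) of the counterexample with illustrative rate constants, locates the three roots of q(z) = 1, and computes both integrals between the two stable roots by the trapezoid rule; their signs (and magnitudes) then tell which peak dominates f^{ss} and \tilde f^{ss}, respectively, and how strongly.

    import numpy as np

    # Illustrative rate constants; q(z) = w(z)/v(z) for the reactions in (6.37).
    k1a, km1, k2, km2b = 3.0, 1.0, 2.1, 0.1
    v = lambda z: k1a * z**2 + km2b       # v(x) = mu(n)/V in the large-V limit
    w = lambda z: km1 * z**3 + k2 * z     # w(x) = lam(n)/V in the large-V limit
    q = lambda z: w(z) / v(z)

    # The three roots of q(z) = 1, i.e. of km1*z^3 - k1a*z^2 + k2*z - km2b = 0.
    x1, x2, x3 = np.sort(np.roots([km1, -k1a, k2, -km2b]).real)
    print("fixed points:", x1, x2, x3)    # x1 and x3 stable, x2 unstable

    def trapezoid(f, lo, hi, m=20000):
        z = np.linspace(lo, hi, m + 1)
        y = f(z)
        return np.sum((y[1:] + y[:-1]) / 2 * np.diff(z))

    I_exact = trapezoid(lambda z: np.log(q(z)), x1, x3)                  # controls the exact density (6.59b)
    I_diff  = trapezoid(lambda z: 2 * (q(z) - 1) / (q(z) + 1), x1, x3)   # controls the diffusion approximation (6.59a)
    print("integral of ln q        :", I_exact)
    print("integral of 2(q-1)/(q+1):", I_diff)
    # Opposite signs would mean that the diffusion approximation and the exact
    # stationary distribution put their dominant peak at different fixed points.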

6.6 Kramers-Moyal Expansion and van Kampen's Conditional Diffusion

Consider a master equation with a continuous state variable x and transition rate density W(y|x). Writing the gain term in terms of the jump size z and expanding in powers of z,

\frac{\partial f(x,t)}{\partial t} = \int \big[W(x|y)f(y,t) - W(y|x)f(x,t)\big]\,dy
 = \int W(x|x-z)f(x-z,t)\,dz - \int W(x+z|x)f(x,t)\,dz
 = -\frac{\partial}{\partial x}\int z\,W(x+z|x)f(x,t)\,dz + \frac{1}{2}\frac{\partial^2}{\partial x^2}\int z^2\,W(x+z|x)f(x,t)\,dz + \sum_{n\ge 3}\frac{1}{n!}\left(-\frac{\partial}{\partial x}\right)^{\!n}\int z^n\,W(x+z|x)f(x,t)\,dz
 = -\frac{\partial}{\partial x}\big(b(x)f(x,t)\big) + \frac{1}{2}\frac{\partial^2}{\partial x^2}\big(a(x)f(x,t)\big) + \cdots,   (6.63)

where

a(x) = \int z^2\,W(x+z|x)\,dz, \qquad b(x) = \int z\,W(x+z|x)\,dz.   (6.64)

This seemingly elegant derivation, however, has a problem. Note that W(y|x) describes a jump process; it contains many Dirac delta functions, so it is not a smooth function of x and y. Hence, it is not legitimate to expand it in a Taylor series.

Van Kampen developed a more appropriate expansion. Following the ideas from the chemical master equation, he considered the natural parameter V, the system's size, which tends to infinity. Furthermore, the variable in the continuous limit should be the density \varphi = n/V, where n is the discrete random variable of the birth-death process. The limit is such that V \to \infty and n \to \infty, but

n(t) = \varphi(t)\,V + z(t)\,V^{1/2},

in which \varphi(t) satisfies the ordinary differential equation d\varphi/dt = \mu(\varphi) - \lambda(\varphi), and the stochastic process z(t) has continuous paths and satisfies a conditional diffusion equation

\frac{\partial f_z(z,t)}{\partial t} = \frac{a(\varphi(t))}{2}\,\frac{\partial^2 f_z(z,t)}{\partial z^2} - b'(\varphi(t))\,\frac{\partial}{\partial z}\big(z\,f_z(z,t)\big),   (6.65)

where a(x) and b(x) are given in Eq. (6.64).
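For the simple reaction (6.22) the van Kampen construction can be simulated directly: \varphi(t) follows d\varphi/dt = \mu(\varphi) - \lambda(\varphi), while z(t) obeys the linear stochastic differential equation equivalent to the Fokker-Planck equation (6.65), dz = b'(\varphi(t))\,z\,dt + \sqrt{a(\varphi(t))}\,dW, with a = \mu + \lambda and b = \mu - \lambda as in Eq. (6.64). The following Euler-Maruyama sketch uses illustrative constants; at stationarity the variance of z should approach k_1 a/k_{-1}, which after multiplying by V reproduces the Poisson variance of Eq. (6.35).

    import numpy as np

    rng = np.random.default_rng(2)
    k1a, km1 = 10.0, 1.0                  # illustrative k_1*a and k_{-1} for reaction (6.22)

    mu  = lambda phi: k1a                 # birth rate per unit volume
    lam = lambda phi: km1 * phi           # death rate per unit volume
    a   = lambda phi: mu(phi) + lam(phi)  # second jump moment, cf. Eq. (6.64)
    dbd = lambda phi: -km1                # b'(phi), with b = mu - lam

    dt, T, n_paths = 1e-3, 10.0, 2000
    phi = 0.0                             # deterministic part: dphi/dt = mu(phi) - lam(phi)
    z = np.zeros(n_paths)                 # fluctuation part: Eq. (6.65) as an SDE

    for _ in range(int(T / dt)):
        z += dbd(phi) * z * dt + np.sqrt(a(phi) * dt) * rng.standard_normal(n_paths)
        phi += (mu(phi) - lam(phi)) * dt

    print("Var[z] from the conditional diffusion:", z.var())
    print("expected stationary value k1a/km1    :", k1a / km1)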
