07. Stochastic Processes: Applications

University of Rhode Island — DigitalCommons@URI
Nonequilibrium Statistical Physics: Physics Course Materials

07. Stochastic Processes: Applications
Gerhard Müller, University of Rhode Island

Creative Commons License: This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License.

Follow this and additional works at: nonequilibrium_statistical_physics
Part of the Physics Commons

Abstract: Part seven of course materials for Nonequilibrium Statistical Physics (Physics 626), taught by Gerhard Müller at the University of Rhode Island. Entries listed in the table of contents, but not shown in the document, exist only in handwritten form. Documents will be updated periodically as more entries become presentable. Updated with version 2 on 5/3/2016.

Recommended Citation: Müller, Gerhard, "07. Stochastic Processes: Applications" (2015). Nonequilibrium Statistical Physics. Paper 7.

This Course Material is brought to you for free and open access by the Physics Course Materials at DigitalCommons@URI. It has been accepted for inclusion in Nonequilibrium Statistical Physics by an authorized administrator of DigitalCommons@URI. For more information, please contact digitalcommons@etal.uri.edu.

Contents of this Document [ntc7]

7. Stochastic Processes: Applications

Diffusion process [nex27]
Cauchy process [nex98]
Random walk in one dimension [nln60]
Random walk in one dimension: unit steps at unit times [nex34]
Random walk in one dimension: unit steps at random times [nex33]
Random walk in one dimension: tiny steps at frequent times [nex100]
Random walk in Las Vegas: chance and necessity [nex40]
Poisson process [nex25]
Free particle with uncertain position and velocity [nex36]
Fokker-Planck equation with constant coefficients [nex101]
House of the mouse: two-way doors only [nex102]
House of the mouse: some one-way doors [nex103]
House of the mouse: one-way doors only [nex104]
House of the mouse: mouse with inertia [nex105]
House of the mouse: mouse with memory [nex43]
Mixing marbles red and white [nex42]
Random traffic around city block [nex86]
Modeling a Markov chain [nex87]
Ornstein-Uhlenbeck process [nln62] [nex31] [nex41]
Predator-prey system: deterministic, stochastic, observational [nsl3]
Populations with linear birth and death rates I [nex44]
Populations with linear birth and death rates II [nex112]
Populations with linear birth and death rates III [nex130]
Catalyst-driven chemical reaction: stationary state [nex46]
Catalyst-driven chemical reaction: dynamics [nex107]
Catalyst-driven chemical reaction: total rate of reactions [nex108]
Air in leaky tank I: generating function [nex48]
Air in leaky tank II: probability distribution [nex109]
Air in leaky tank III: detailed balance [nex49]
Air in leaky tank IV: evolution of mean and variance [nex110]
Pascal distribution and Planck radiation law [nex50]
Effects of nonlinear death rate I: Malthus-Verhulst equation [nex111]
Effects of nonlinear death rate II: stationarity and fluctuations [nex51]
Modified linear birth rate I: stationarity [nex113]
Modified linear birth rate II: evolution of mean and variance [nex114]
Modified linear birth rate III: generating function [nex115]
Modified linear birth rate IV: probability distribution [nex116]
Bistable chemical system [nex52]
Ultracold neutrons in an ideal Steyerl bottle [nex47]
Random light switch [nex45]

[nex27] Specifications of diffusion process

Examine the diffusion process,

P(x|x_0; t) = \frac{1}{\sqrt{4\pi D t}} \exp\left( -\frac{(x - x_0)^2}{4Dt} \right),

as a special solution of the differential Chapman-Kolmogorov equation by determining the three specifications (two of which are zero):

W(x|x_0) = \lim_{\Delta t \to 0} \frac{1}{\Delta t}\, P(x|x_0; \Delta t),

A(x) = \lim_{\Delta t \to 0} \frac{1}{\Delta t} \int_{|x - x_0| < \epsilon} dx\, (x - x_0)\, P(x|x_0; \Delta t),

B(x) = \lim_{\Delta t \to 0} \frac{1}{\Delta t} \int_{|x - x_0| < \epsilon} dx\, (x - x_0)^2\, P(x|x_0; \Delta t).

Use the results for W(x|x_0), A(x), and B(x) to simplify the differential Chapman-Kolmogorov equation from [nln56] into the diffusion equation.
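As a numerical cross-check (a sketch, not part of the original exercise; D, eps, and dt below are arbitrary illustrative values), the limits defining A(x) and B(x) can be evaluated for the Gaussian propagator at a small but finite dt: the drift specification should vanish by symmetry and the diffusion specification should approach 2D.

```python
import numpy as np

# Numerical check of the specifications for the Gaussian propagator:
# A(x) should vanish and B(x) should approach 2D as dt -> 0.
D, x0, eps = 0.5, 0.0, 1.0

def propagator(x, dt):
    # Gaussian solution of the diffusion equation after elapsed time dt
    return np.exp(-(x - x0)**2 / (4*D*dt)) / np.sqrt(4*np.pi*D*dt)

def jump_moment(l, dt):
    # (1/dt) * integral of (x - x0)^l P(x|x0; dt) over |x - x0| < eps
    x = np.linspace(x0 - eps, x0 + eps, 200001)
    dx = x[1] - x[0]
    return np.sum((x - x0)**l * propagator(x, dt)) * dx / dt

A = jump_moment(1, dt=1e-4)   # drift specification: -> 0 by symmetry
B = jump_moment(2, dt=1e-4)   # diffusion specification: -> 2D
```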

[nex98] Specifications of Cauchy process

Examine the Cauchy process,

P(x|x_0; t) = \frac{1}{\pi} \frac{t}{(x - x_0)^2 + t^2},

as a special solution of the differential Chapman-Kolmogorov equation by determining the three specifications (two of which are zero):

W(x|x_0) = \lim_{\Delta t \to 0} \frac{1}{\Delta t}\, P(x|x_0; \Delta t),

A(x) = \lim_{\Delta t \to 0} \frac{1}{\Delta t} \int_{|x - x_0| < \epsilon} dx\, (x - x_0)\, P(x|x_0; \Delta t),

B(x) = \lim_{\Delta t \to 0} \frac{1}{\Delta t} \int_{|x - x_0| < \epsilon} dx\, (x - x_0)^2\, P(x|x_0; \Delta t).

Random Walk in One Dimension [nln60]

In the following, we analyze this very common process in different ways and also examine variations of it.

Walker takes unit steps at unit times. [nex34]
Position x_n = nl of walker at time t_N = Nτ.
Conditional probability: P(x_n, t_N|0, 0).
Chapman-Kolmogorov equation (discretized version).
Binomial distribution.

Walker takes smaller steps more frequently. [nex100]
Steps left or right with probabilities p and q, respectively.
Limit: l → 0, τ → 0, p − q → 0 with l^2/2τ = D and (p − q)l/τ = v.
Fokker-Planck equation with constant drift and diffusion.

Walker's destination drifts and diffuses. [nex101]
Solution of Fokker-Planck equation via characteristic equation.
Gaussian with drifting peak and runaway broadening.

Walker takes discrete steps randomly in continuous time. [nex33]
Master equation for discrete random variable.
Walker takes step left or right with equal probability.
Mean time interval between steps: τ.
Position distributed via modified Bessel function.
Gaussian function in the limit l → 0, τ → 0 with l^2/2τ = D.

Walk that is random only in time. [nex25]
Poisson process. Walker takes one step in time τ on average (same direction).
Master equation solved via characteristic equation.
Poisson distribution rises as a power law and fades out exponentially.

Random walk in Las Vegas. [nex40]
Gambling is a biased random walk near a precipice.
First bias: the odds favor the casino.
Second bias: the casino has more resources.
Third bias: the gambler imagines Markovian features.

[nex34] Random walk in one dimension: unit steps at unit times

Consider the conditional probability distribution P(n, t_N|0, 0) describing a biased random walk in one dimension as determined by the (discrete) Chapman-Kolmogorov equation,

P(n, t_{N+1}|0, 0) = \sum_m P(n, t_{N+1}|m, t_N)\, P(m, t_N|0, 0),

where t_N = Nτ and

P(n, t_{N+1}|m, t_N) = p\,\delta_{m,n-1} + q\,\delta_{m,n+1}

expresses the instruction that the walker takes a step of unit size forward (with probability p) or backward (with probability q = 1 − p) after one time unit τ. Convert this equation into an equation for the characteristic function

\Phi(k, t_N) = \sum_n e^{ikn}\, P(n, t_N|0, 0),

then solve that equation, and determine P(n, t_N|0, 0) from it, all by elementary means.
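The binomial result that this exercise leads to can be checked by direct simulation. A sketch (the values of p, N, and the probed displacement n are arbitrary illustrations):

```python
import numpy as np
from math import comb

# Monte Carlo check of the binomial distribution for the biased walk:
# P(n, t_N|0, 0) = C(N, (N+n)/2) p^{(N+n)/2} q^{(N-n)/2}, with N + n even.
rng = np.random.default_rng(1)
p, N, trials = 0.6, 10, 200000
steps = rng.choice([1, -1], size=(trials, N), p=[p, 1 - p])
positions = steps.sum(axis=1)       # walker's position after N unit steps

n = 2                               # displacement to test; N + n must be even
empirical = np.mean(positions == n)
exact = comb(N, (N + n)//2) * p**((N + n)//2) * (1 - p)**((N - n)//2)
```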

[nex33] Random walk in one dimension: unit steps at random times

Consider the conditional probability distribution P(n, t|0, 0) describing an unbiased random walk in one dimension as determined by the master equation,

\frac{d}{dt} P(n, t|0, 0) = \sum_m \left[ W(n|m)\, P(m, t|0, 0) - W(m|n)\, P(n, t|0, 0) \right],

with transition rates

W(n|m) = \sigma\,\delta_{n+1,m} + \sigma\,\delta_{n-1,m}.

Here 2σ is the rate at which the walker takes steps of unit size. The mean time interval between steps is then τ = 1/2σ.
(a) Convert the master equation into an ordinary differential equation for the characteristic function,

\Phi(k, t) = \sum_n e^{ikn}\, P(n, t|0, 0),

solve it, and determine the probability distribution P(n, t|0, 0) from it via inverse Fourier transform.
(b) Set nl = x for the position of the walker, where l is the step size, and consider the limit l → 0, σ → ∞ such that l^2 σ = D. Then determine P(x, t|0, 0) in this limit.
(c) Plot P(n, t|0, 0) versus n for various fixed t in comparison with the asymptotic P(x, t|0, 0).
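The solution of (a) is P(n, t|0, 0) = e^{-2σt} I_n(2σt), with I_n the modified Bessel function. A sketch that verifies normalization and zero mean numerically, summing I_n directly from its power series (σ and t are arbitrary illustrative values):

```python
from math import exp, factorial

def bessel_I(n, z, kmax=60):
    # modified Bessel function of the first kind, integer order n >= 0,
    # summed from its power series
    return sum((z/2)**(n + 2*k) / (factorial(k) * factorial(n + k))
               for k in range(kmax))

sigma, t = 0.5, 2.0       # illustrative values; tau = 1/(2 sigma) = 1
z = 2*sigma*t
P = {n: exp(-z) * bessel_I(abs(n), z) for n in range(-30, 31)}

total = sum(P.values())                    # normalization: sum_n P(n,t) = 1
mean = sum(n*pn for n, pn in P.items())    # zero for the unbiased walk
```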

[nex100] Random walk in one dimension: tiny steps at frequent times

Consider the conditional probability distribution P(n, t_N|0, 0) describing a biased random walk in one dimension as determined by the (discrete) Chapman-Kolmogorov equation,

P(n, t_{N+1}|0, 0) = \sum_m P(n, t_{N+1}|m, t_N)\, P(m, t_N|0, 0),

where

P(n, t_{N+1}|m, t_N) = p\,\delta_{m,n-1} + q\,\delta_{m,n+1}

expresses the instruction that the walker takes a step of fixed size forward (with probability p) or backward (with probability q = 1 − p) after one time unit. Set nl = x_n for the position of the walker and Nτ = t_N for the elapsed time, where l is the (constant) step size and τ is the length of the (constant) time unit between steps. Now consider the limit l → 0, τ → 0, p − q → 0 such that l^2/2τ = D and (p − q)l/τ = v. Show that this limit converts the Chapman-Kolmogorov equation into a Fokker-Planck equation with constant coefficients v (drift) and D (diffusion). What is the solution P(x, t|0, 0) in this limit?

[nex40] Random walk in Las Vegas: chance and necessity

A gambler with $1 in his pocket starts playing a game against a casino with infinite monetary resources. In each round of the game, the gambler wins $1 (with probability p) or loses $1 (with probability 1 − p). The game ends when the gambler is bankrupt.
(a) Express the probability P_C that the gambler eventually goes bankrupt as a function of p.
(b) Plot P_C versus p for 0 < p < 1.
(c) For what value of p is it a fair game in the sense that the gambler has a 50% chance of staying in the game forever?
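For a gambler starting with $1, the classical gambler's-ruin result is P_C = 1 for p ≤ 1/2 and P_C = (1 − p)/p for p > 1/2, which makes p = 2/3 the "fair" value asked for in (c). A Monte Carlo sketch (games are truncated at a finite number of rounds, a small bias we accept here):

```python
import numpy as np

# Monte Carlo estimate of the ruin probability at p = 2/3, where the
# exact answer is (1-p)/p = 1/2.
rng = np.random.default_rng(7)
p, trials, max_rounds = 2/3, 20000, 500
steps = np.where(rng.random((trials, max_rounds)) < p, 1, -1).astype(np.int8)
capital = 1 + np.cumsum(steps, axis=1, dtype=np.int32)  # gambler starts with $1
estimate = np.mean(capital.min(axis=1) <= 0)            # went bankrupt at some point
exact = (1 - p)/p                                       # = 1/2 for p = 2/3
```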

[nex25] Poisson process

Consider the Poisson process specified by the master equation,

\frac{\partial}{\partial t} P(n, t|0, 0) = \sum_m \left[ W(n|m)\, P(m, t|0, 0) - W(m|n)\, P(n, t|0, 0) \right], \qquad W(n|m) = \lambda\,\delta_{n-1,m},

for the discrete stochastic variable n = 0, 1, 2, ... and with the initial condition P(n, 0|0, 0) = \delta_{n,0}. Convert the master equation into a differential equation for the generating function G(z, t), then solve that equation, and determine P(n, t|0, 0) via power expansion.

Applications of the Poisson process include the following:
(i) Radioactive decay. Macroscopic sample of radioactive nuclei observed over a time interval that is short compared to the mean decay time of individual nuclei. The average decay rate is λ. P(n, t|0, 0) is the probability that exactly n nuclei have decayed by time t.
(ii) Shot noise. Electrical current in a vacuum tube. Electrons arrive at the anode randomly. The average rate of arrivals is λ. P(n, t|0, 0) is the probability that exactly n electrons have arrived at the anode by time t.
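The solution is the Poisson distribution P(n, t|0, 0) = e^{-λt}(λt)^n/n!. A sketch that checks this by simulating exponential waiting times between events (λ, t, and the probed count n are arbitrary illustrative values):

```python
import numpy as np
from math import exp, factorial

# Simulate the Poisson counting process from its exponential waiting times
# and compare the empirical count distribution with e^{-lam t}(lam t)^n / n!.
rng = np.random.default_rng(3)
lam, t, trials = 2.0, 1.5, 100000
# 25 waiting times per trial is far more than will ever fit into [0, t]
arrivals = np.cumsum(rng.exponential(1/lam, size=(trials, 25)), axis=1)
counts = (arrivals <= t).sum(axis=1)   # number of events up to time t

n = 3
empirical = np.mean(counts == n)
exact = exp(-lam*t) * (lam*t)**n / factorial(n)
```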

[nex36] Free particle with uncertain position and velocity

Consider a physical ensemble of free particles with unit mass moving along the x-axis. The initial positions and velocities, x_0, v_0, are specified by a Gaussian joint probability distribution:

P_0(x_0, v_0) = \frac{1}{2\pi} \exp\left( -\frac{x_0^2}{2} - \frac{v_0^2}{2} \right).

(a) Find the joint probability distribution P(x, v; t) for the position and velocity at time t. Infer from this result the probability distributions P(x; t) and P(v; t) for the position and the velocity separately. Calculate the average position \langle x(t)\rangle and the variance \langle\langle x^2(t)\rangle\rangle thereof.
(b) Find the conditional probability distribution P(x|v; t) for the positions x at time t of particles that have velocity v. Calculate the conditional averages

\langle x^n(t)\rangle_v \doteq \int dx\, x^n\, P(x|v; t), \qquad n = 1, 2,

for the positions of particles that have velocity v, and infer from these results the conditional variance \langle\langle x^2(t)\rangle\rangle_v.

[nex101] Fokker-Planck equation with constant coefficients

Convert the Fokker-Planck equation with constant coefficients of drift and diffusion,

\frac{\partial}{\partial t} P(x, t|x_0) = -A \frac{\partial}{\partial x} P(x, t|x_0) + \frac{1}{2} B \frac{\partial^2}{\partial x^2} P(x, t|x_0),

into an ordinary differential equation for the characteristic function,

\Phi(k, t) \doteq \int_{-\infty}^{+\infty} dx\, e^{ikx}\, P(x, t|x_0).

(a) Solve this differential equation (by elementary means) and infer P(x, t|x_0) via inverse Fourier transform. Use the initial condition P(x, 0|x_0) = \delta(x - x_0).
(b) Identify the mean \langle x\rangle and the variance \langle\langle x^2\rangle\rangle in the solution P(x, t|x_0).
(c) Simplify the solution P(x, t|x_0) for the special case B = 0 (no diffusion).
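The solution of (a) is a Gaussian with mean x_0 + At and variance Bt. A sketch that reproduces these moments by Euler-Maruyama integration of the corresponding Langevin equation dx = A dt + √B dW (all parameter values are illustrative):

```python
import numpy as np

# Euler-Maruyama integration of dx = A dt + sqrt(B) dW; the ensemble of
# paths should match the Fokker-Planck solution: Gaussian with mean
# x0 + A t and variance B t.
rng = np.random.default_rng(5)
A_, B_, x0, t = 1.0, 0.5, 0.0, 2.0
n_steps, n_paths = 400, 50000
dt = t / n_steps
x = np.full(n_paths, x0)
for _ in range(n_steps):
    x += A_*dt + np.sqrt(B_*dt)*rng.standard_normal(n_paths)
```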

[nex102] House of the mouse: two-way doors only

A trained mouse lives in a house with floor plan as shown. A bell rings at regular time intervals, prompting the mouse to go to an adjacent room through any door with equal probability.
(a) If the mouse starts from room A or room B or room C, with what probability is he in each room after two rings of the bell? Begin by constructing the transition matrix W and the initial vector P(0).
(b) Calculate the probability of the mouse being in each room after the bell has rung a great many times. Does it matter where the mouse was initially? Carry out this part in two ways: (i) Solve the left-eigenvector problem of matrix W to get the stationary distribution π. (ii) Calculate the stationary distribution π directly from the detailed-balance condition.

[Floor-plan figure: rooms A, B, C]
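The floor-plan figure is not reproduced here, so the matrix below is only a hypothetical example (three rooms connected pairwise by two-way doors). The procedure, however, is the one the exercise asks for: build W, then obtain the stationary distribution both as a left eigenvector and by repeated application of W.

```python
import numpy as np

# Hypothetical floor plan: rooms A, B, C, each pair connected by one
# two-way door.  W[i, j] = probability of moving from room i to room j
# at a ring of the bell.
W = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

# (i) stationary distribution as the left eigenvector with eigenvalue 1
vals, vecs = np.linalg.eig(W.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

# (ii) the same distribution from repeated application of W
P = np.array([1.0, 0.0, 0.0])      # mouse starts in room A
for _ in range(100):
    P = P @ W
```

For this symmetric example both routes give the uniform distribution; with unequal door counts per room the stationary weight of a room is proportional to its number of doors.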

[nex103] House of the mouse: some one-way doors

A trained mouse lives in a house with floor plan as shown in two versions. The house has three rooms and three doors. One or two doors are open one way only. A bell rings at regular time intervals, prompting the mouse to go to an adjacent room through any open door with equal probability.
(a) Construct the transition matrix W for both floor plans.
(b) The one-way doors are incompatible with the detailed-balance condition. Show that the transition matrix is regular, nevertheless, in both cases. For what minimum exponent s does W^s have no zero elements in each case?
(c) Regularity of W guarantees that the probability distribution for the location of the mouse is unique after the bell has rung a great many times. Calculate that stationary distribution for both cases by solving the left-eigenvector problem of matrix W.

[Floor-plan figures: two variants with rooms A, B, C]

[nex104] House of the mouse: one-way doors only

A trained mouse lives in a house with floor plan as shown in two versions. The house has three rooms and three doors. All doors are open one way only. A bell rings at regular time intervals, prompting the mouse to make an attempt to go to an adjacent room through any open door without preference.
(a) Construct the transition matrix W for both floor plans and calculate W^s for s = 1, 2, .... Describe the resulting pattern of change in each case.
(b) Solve the left-eigenvector problem of matrix W in each case and interpret the results.

[Floor-plan figures: two variants with rooms A, B, C]

[nex105] House of the mouse: mouse with inertia

A trained mouse lives in a house with floor plan as shown in two versions. The house has three rooms and three doors. All doors are open one way only. A bell rings at regular time intervals, prompting the mouse with equal probability to either stay put or to make an attempt to go to an adjacent room through any open door without preference.
(a) Construct the transition matrix W for both floor plans and calculate W^s for s = 1, 2, 3. Describe the resulting pattern of change in each case.
(b) Solve the left-eigenvector problem of matrix W in each case and interpret the results.

[Floor-plan figures: two variants with rooms A, B, C]

[nex43] House of the mouse: mouse with memory

A trained mouse lives in a house with floor plan as shown. A bell rings at regular time intervals, prompting the mouse to either stay put or to go to an adjacent room through any door with no preference. When the mouse is in rooms B or C he reacts at the first ring of the bell, but when he is in room A he reacts only at the second ring. What fraction of the time does the mouse spend in each room on average?

[Floor-plan figure: rooms A, B, C]

[nex42] Mixing marbles red and white

There are two white marbles in cup A and four red marbles in cup B. In every step of a Markov process a marble is selected at random from each cup and placed into the opposite cup.
(a) What are the probabilities that after three steps cup A contains (i) two white marbles, (ii) one white and one red marble, (iii) two red marbles?
(b) What is the limiting probability distribution of the three configurations after many steps?
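A sketch of the exact calculation: the state is the number of white marbles in cup A, the transition matrix follows from the draw probabilities, and the limiting distribution in (b) comes out hypergeometric, π = (6/15, 8/15, 1/15) for (two red, mixed, two white) in cup A.

```python
import numpy as np

# State w = number of white marbles in cup A (cup A holds 2 marbles,
# cup B holds 4; there are 2 white marbles in total).
W = np.zeros((3, 3))
for w in range(3):
    p_white_A = w/2            # marble drawn from cup A is white
    p_white_B = (2 - w)/4      # marble drawn from cup B is white
    for out_w in (0, 1):       # 1 = white marble leaves cup A
        for in_w in (0, 1):    # 1 = white marble enters cup A
            pa = p_white_A if out_w else 1 - p_white_A
            pb = p_white_B if in_w else 1 - p_white_B
            if pa*pb > 0:
                W[w, w - out_w + in_w] += pa*pb

P3 = np.array([0.0, 0.0, 1.0])     # start: both white marbles in cup A
for _ in range(3):
    P3 = P3 @ W                    # (a): P3[2], P3[1], P3[0] after 3 steps

P_inf = P3.copy()
for _ in range(300):
    P_inf = P_inf @ W              # limiting distribution for (b)
```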

[nex86] Random traffic around city block

A city block is surrounded by streets with one-way and two-way traffic as shown. At intersections where drivers have a choice they choose randomly from the available options. U-turns are prohibited. With what probability does a car that starts out at point A end up (i) at point B, (ii) at point C?

[Street-map figure: points A, B, C]

[nex87] Modeling a Markov chain

If the following sequence of states observed in a system for the duration of 50 transitions is a Markov process, what would be a reasonable model for its transition matrix W? Given your model, determine the stationary distribution (π_0, π_1) of W and compare it with the frequency of states 0 and 1 occurring in the observed sequence.

Observed sequence (to be read in five rows from left to right): [sequence not reproduced]

Ornstein-Uhlenbeck Process [nln62]

The Fokker-Planck equation of the Ornstein-Uhlenbeck process,

\frac{\partial P}{\partial t} = \frac{\partial}{\partial x}(\kappa x P) + \frac{\gamma}{2} \frac{\partial^2 P}{\partial x^2},   (1)

features the standard (constant) diffusion term, B(x, t) = \gamma > 0, and a position-dependent drift term, A(x, t) = -\kappa x with \kappa > 0. This sort of drift is directed toward a particular position, namely x = 0. In a sense the drift counteracts the diffusion here. The diffusion term alone would broaden and flatten the probability distribution. A normal drift term, as in [nex101], has no effect on the broadening.

If a stationary solution P_S(x) of (1) exists, it must satisfy the equation

\frac{d}{dx}\left[ \kappa x P_S(x) + \frac{\gamma}{2} \frac{d}{dx} P_S(x) \right] = 0.   (2)

Normalizability requires that P_S(\pm\infty) = 0 and P_S'(\pm\infty) = 0. Consequence: the content of the square bracket in (2) must vanish. Separation of variables and integration then yields a Gaussian centered at x = 0,

P_S(x) = \sqrt{\frac{\kappa}{\pi\gamma}}\, e^{-\kappa x^2/\gamma}.   (3)

The ratio \kappa/\gamma in the prefactor and in the exponent represents the competition between diffusion and (restoring) drift.

For the dynamic solution of (1) we take two different approaches:

[nex31] We derive from the second-order PDE (1) for the probability distribution P(x, t) with initial condition P(x, 0) = \delta(x - x_0) a first-order PDE for the characteristic function and then solve that PDE. The result is a Gaussian whose mean and variance relax toward the values of the stationary solution (3). The relaxation rate is governed by the drift coefficient \kappa alone.

[nex41] We search for solutions that permit a product ansatz P(x, t) \doteq U(t)V(x) and aim to express the general solution as a sum of such solutions. This is a reasonable goal for a linear PDE such as (1). The result expresses U(t) as an exponential function and V(x) as a Hermite polynomial multiplied by a Gaussian.

[nex31] Fokker-Planck equation for Ornstein-Uhlenbeck process

Consider the Ornstein-Uhlenbeck process as specified by the Fokker-Planck equation,

\frac{\partial P}{\partial t} = \frac{\partial}{\partial x}(\kappa x P) + \frac{\gamma}{2} \frac{\partial^2 P}{\partial x^2},   (1)

for the conditional probability distribution P(x, t|x_0), where x_0 specifies the initial value of all sample paths: P(x, 0|x_0) = \delta(x - x_0).
(a) Derive from the 2nd-order PDE (1) the 1st-order PDE for the characteristic function:

\frac{\partial \Phi}{\partial t} + \kappa s \frac{\partial \Phi}{\partial s} = -\frac{1}{2}\gamma s^2\, \Phi(s, t), \qquad \Phi(s, t) \doteq \int_{-\infty}^{+\infty} dx\, e^{isx}\, P(x, t|x_0).   (2)

(b) Solve (2) by the method of characteristics,

\frac{dt}{1} = \frac{ds}{\kappa s} = \frac{d\Phi}{-\frac{1}{2}\gamma s^2 \Phi}.   (3)

(c) Infer from the solution \Phi(s, t) an explicit expression for P(x, t|x_0).
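The outcome of (c) is a Gaussian with mean x_0 e^{-κt} and variance (γ/2κ)(1 − e^{-2κt}). A numerical sketch via Euler-Maruyama integration of the associated Langevin equation dx = −κx dt + √γ dW (all parameter values are illustrative):

```python
import numpy as np
from math import exp, sqrt

# Euler-Maruyama for the Langevin equation of the Ornstein-Uhlenbeck
# process; the ensemble mean and variance should relax as
# <x> = x0 e^{-kappa t},  <<x^2>> = (gamma/2 kappa)(1 - e^{-2 kappa t}).
rng = np.random.default_rng(11)
kappa, gam, x0, t = 1.0, 2.0, 3.0, 1.0
n_steps, n_paths = 1000, 50000
dt = t / n_steps
x = np.full(n_paths, x0)
for _ in range(n_steps):
    x += -kappa*x*dt + sqrt(gam*dt)*rng.standard_normal(n_paths)

mean_exact = x0*exp(-kappa*t)                        # relaxing mean
var_exact = (gam/(2*kappa))*(1 - exp(-2*kappa*t))    # relaxing variance
```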

[nex41] Ornstein-Uhlenbeck process: general solution

(a) Show that the Fokker-Planck equation of the Ornstein-Uhlenbeck process can be solved by separation of variables and that the general solution can be expressed in terms of Hermite polynomials:

\frac{\partial P}{\partial t} = \frac{\partial}{\partial x}(\kappa x P) + \frac{\gamma}{2} \frac{\partial^2 P}{\partial x^2}; \qquad P(x, t) = \sum_{n=0}^{\infty} a_n H_n\!\left( \sqrt{\kappa/\gamma}\, x \right) e^{-n\kappa t}\, e^{-\kappa x^2/\gamma}.

(b) Show that a unique stationary solution P_S(x) is approached in the limit t → ∞ for arbitrary initial conditions.
(c) Determine the expansion coefficients a_n for the particular initial distribution P(x, 0) = \delta(x - x_0).

Predator-Prey System [nsl3]

A population of foxes (predator, F) feeds on a population of hares (prey, H). The birth rate of foxes is proportional to the fox population and to the amount of food available. Foxes die naturally, i.e. at a rate proportional to the fox population. Hares die primarily through encounters with foxes and are born at a rate proportional to the hare population.

Deterministic time evolution: Lotka-Volterra model.

\frac{dH}{dt} = aH - bHF, \qquad \frac{dF}{dt} = bHF - dF.

Anharmonic oscillation. Hyperbolic fixed point at (0, 0). Elliptic fixed point at (d/b, a/b). Fluctuations excluded. Environmental effects in initial conditions only.

Stochastic time evolution: master equation.

\frac{\partial}{\partial t} P(H, F, t) = \sum_{H'} \sum_{F'} \left[ W(H, F|H', F')\, P(H', F', t) - W(H', F'|H, F)\, P(H, F, t) \right].

Non-vanishing transition rates:
W(H+1, F|H, F) = aH   (prey is born)
W(H-1, F+1|H, F) = bHF   (predator thrives on prey)
W(H, F-1|H, F) = dF   (predator dies)

Fluctuations now included. Environmental effects in initial conditions and in contingencies of stochastic time evolution.

[Figures: time evolution of the deterministic system (Lotka-Volterra model); computer simulation of the stochastic system (master equation); observation of a real system (two species of mites). Images from Gardiner 1985.]
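A Gillespie-type simulation of the master equation with the three transition rates listed above reproduces noisy oscillations of the kind shown in the simulation panel; the rate constants and initial populations below are arbitrary illustrations.

```python
import numpy as np

# Gillespie simulation of the stochastic Lotka-Volterra master equation.
# Rate constants a, b, d and initial populations are illustrative only.
rng = np.random.default_rng(2)
a, b, d = 1.0, 0.005, 0.6
H, F, t, t_max = 200, 100, 0.0, 5.0
history = [(t, H, F)]
while t < t_max and H > 0 and F > 0:
    r_birth, r_eat, r_death = a*H, b*H*F, d*F     # the three transition rates
    total = r_birth + r_eat + r_death
    t += rng.exponential(1/total)                 # waiting time to next event
    r = rng.random() * total
    if r < r_birth:
        H += 1                                    # prey is born
    elif r < r_birth + r_eat:
        H -= 1; F += 1                            # predator thrives on prey
    else:
        F -= 1                                    # predator dies
    history.append((t, H, F))
```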

[nex44] Populations with linear birth and death rates I

Consider the master equation

\frac{d}{dt} P(n, t) = \sum_m \left[ W(n|m)\, P(m, t) - W(m|n)\, P(n, t) \right]

for the probability distribution P(n, t) of the linear birth-death process. It is specified by the transition rates

W(m|n) = n\lambda\,\delta_{m,n+1} + n\mu\,\delta_{m,n-1},

where λ and μ represent the birth and death rates, respectively, of individuals in some population.
(a) Determine the jump moments \alpha_l(m) = \sum_n (n - m)^l\, W(n|m) for l = 1, 2.
(b) Calculate the time evolution of the mean value \langle n\rangle and the variance \langle\langle n^2\rangle\rangle for the initial condition P(n, 0) = \delta_{n,n_0} by solving the equations of motion for the expectation values,

\frac{d}{dt}\langle n\rangle = \langle \alpha_1(n)\rangle, \qquad \frac{d}{dt}\langle\langle n^2\rangle\rangle = \langle \alpha_2(n)\rangle + 2\langle\langle n\,\alpha_1(n)\rangle\rangle,

introduced in [nln59].
(c) Plot \langle\langle n^2\rangle\rangle versus t for three cases with (i) λ > μ, (ii) λ = μ, and (iii) λ < μ. Interpret the shape of each curve.

[nex112] Populations with linear birth and death rates II

Consider the master equation

\frac{d}{dt} P(n, t) = \sum_m \left[ W(n|m)\, P(m, t) - W(m|n)\, P(n, t) \right]

for the probability distribution P(n, t) of the linear birth-death process with initial population n_0. It is specified by the transition rates

W(m|n) = n\lambda\,\delta_{m,n+1} + n\mu\,\delta_{m,n-1},

where λ and μ represent the birth and death rates, respectively, of individuals in some population.
(a) Convert the master equation into a linear 1st-order PDE for the generating function as follows:

\frac{\partial G}{\partial t} - (z - 1)(\lambda z - \mu)\, \frac{\partial G}{\partial z} = 0, \qquad G(z, t) \doteq \sum_{n=0}^{\infty} z^n P(n, t).

(b) Solve the PDE by using the method of characteristics:

\frac{dt}{1} = \frac{dz}{-(z - 1)(\lambda z - \mu)} = \frac{dG}{0}.

The result reads

G(z, t) = \left( \frac{u(z, t) - \mu}{u(z, t) - \lambda} \right)^{n_0}, \qquad u(z, t) = \frac{\lambda z - \mu}{z - 1}\, e^{-(\lambda - \mu)t}.

(c) Simplify the expression of G(z, t) for the special case λ = μ.
(d) For the case λ = μ calculate the mean \langle n(t)\rangle and the variance \langle\langle n^2(t)\rangle\rangle from derivatives of G(z, t).

[nex130] Populations with linear birth and death rates III

Consider the master equation

\frac{d}{dt} P(n, t) = \sum_m \left[ W(n|m)\, P(m, t) - W(m|n)\, P(n, t) \right]

for the probability distribution P(n, t) of the linear birth-death process with initial population n_0. It is specified by the transition rates

W(m|n) = n\lambda\,\delta_{m,n+1} + n\mu\,\delta_{m,n-1},

where λ and μ represent the birth and death rates, respectively, of individuals in some population. In [nex112] we have determined the following expression for the generating function pertaining to the special case λ = μ of equal birth and death rates:

G(z, t) \doteq \sum_{n=0}^{\infty} z^n P(n, t) = \left( \frac{\lambda(z - 1)t - z}{\lambda(z - 1)t - 1} \right)^{n_0}.

(a) Expand the generating function in powers of z for the special case n_0 = 1 to derive analytic expressions for the probabilities P(n, t).
(b) Verify the normalization condition \sum_n P(n, t) = 1.
(c) Plot P(n, t) versus λt for n = 0, 1, ..., 4 (five curves).
(d) Find the time t_n where P(n, t) reaches its maximum value.
(e) Discuss the compatibility of the paradoxical results (i) \lim_{t\to\infty} P(0, t) = 1 (certainty of death) and from [nex44] (ii) \langle n(t)\rangle = n_0 = 1, (iii) \lim_{t\to\infty} \langle\langle n^2(t)\rangle\rangle = \infty (persistent signs of life and uncertainty).
(f) Explore the determination of analytic expressions of P(n, t) for n_0 = 2, 3, ... or for generic n_0.

[nex46] Catalyst-driven chemical reaction: stationary state

In the chemical reaction A + X ⇌ A + Y, the molecule A is a catalyst at constant concentration. The total number of reacting molecules, n_x + n_y = N, is also constant. K_1 is the probability per unit time that a molecule X interacts with a molecule A to turn into a molecule Y, and K_2 is the probability per unit time that a Y interacts with an A to produce an X. The dynamics may be described by a master equation for P(n, t), where n \doteq n_x, n_y = N - n. The transition rates are

W(m|n) = K_1 n\,\delta_{m,n-1} + K_2 (N - n)\,\delta_{m,n+1}.

To calculate the mean value \langle n(t)\rangle and the variance \langle\langle n^2(t)\rangle\rangle proceed as in [nex44] via the equations of motion,

\frac{d}{dt}\langle n\rangle = \langle \alpha_1(n)\rangle, \qquad \frac{d}{dt}\langle\langle n^2\rangle\rangle = \langle \alpha_2(n)\rangle + 2\langle\langle n\,\alpha_1(n)\rangle\rangle,

with jump moments \alpha_l(m) = \sum_n (n - m)^l\, W(n|m).
(a) Construct the equations of motion for \langle n(t)\rangle, \langle\langle n^2(t)\rangle\rangle.
(b) Infer the long-time asymptotic values \langle n(\infty)\rangle, \langle\langle n^2(\infty)\rangle\rangle directly.
(c) Plot \langle n(\infty)\rangle, \langle\langle n^2(\infty)\rangle\rangle versus γ for K_1 = γ, K_2 = 1 − γ (thus fixing the time scale) and explain the location of the maximum for each quantity.

[nex107] Catalyst-driven chemical reaction: dynamics

In the chemical reaction A + X ⇌ A + Y, the molecule A is a catalyst at constant concentration. The total number of reacting molecules, n_x + n_y = N, is also constant. K_1 is the probability per unit time that a molecule X interacts with a molecule A to turn into a molecule Y, and K_2 is the probability per unit time that a Y interacts with an A to produce an X. The dynamics may be described by a master equation for P(n, t), where n \doteq n_x, n_y = N - n. The transition rates are

W(m|n) = K_1 n\,\delta_{m,n-1} + K_2 (N - n)\,\delta_{m,n+1}.

(a) Solve the equations of motion for \langle n(t)\rangle, \langle\langle n^2(t)\rangle\rangle as constructed in [nex46]. Use initial values \langle n(0)\rangle = n_0, \langle\langle n^2(0)\rangle\rangle = 0.
(b) Plot \langle n(t)\rangle, \langle\langle n^2(t)\rangle\rangle in separate frames for n_0 = 0, K_1 = γ, K_2 = 1 − γ, and various γ. This fixes the time scale. Identify any interesting features in the curves and try to explain them.
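For orientation (a sketch, not the requested analytic solution): the first jump moment gives d⟨n⟩/dt = K_2 N − (K_1 + K_2)⟨n⟩, so the mean relaxes exponentially toward ⟨n(∞)⟩ = N K_2/(K_1 + K_2). A Gillespie simulation of the master equation confirms this stationary mean; the values of N and γ below are illustrative.

```python
import numpy as np

# Gillespie simulation of the catalyst-driven reaction master equation;
# the stationary mean should approach <n>_inf = N K2/(K1 + K2).
rng = np.random.default_rng(4)
N, gamma_ = 100, 0.3
K1, K2 = gamma_, 1 - gamma_            # fixes the time scale
n, t = 0, 0.0                          # n0 = 0: start with no X molecules
samples = []
while t < 200.0:
    rate_down, rate_up = K1*n, K2*(N - n)   # X -> Y and Y -> X events
    total = rate_down + rate_up
    t += rng.exponential(1/total)
    if rng.random() < rate_up/total:
        n += 1
    else:
        n -= 1
    if t > 20.0:                       # discard the initial transient
        samples.append(n)

mean_est = np.mean(samples)
n_inf = N*K2/(K1 + K2)                 # = 70 for these parameters
```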

[nex108] Catalyst-driven chemical reaction: total rate of reactions

In the chemical reaction A + X ⇌ A + Y, the molecule A is a catalyst at constant concentration. The total number of reacting molecules, n_x + n_y = N, is also constant. K_1 is the probability per unit time that a molecule X interacts with a molecule A to turn into a molecule Y, and K_2 is the probability per unit time that a Y interacts with an A to produce an X. The dynamics may be described by a master equation for P(n, t), where n \doteq n_x, n_y = N - n. The transition rates are

W(m|n) = K_1 n\,\delta_{m,n-1} + K_2 (N - n)\,\delta_{m,n+1}.

The total rate of chemical reactions is defined as follows:

R(t) \doteq \sum_{nm} W(n|m)\, P(m, t).

(a) Express R(t) in terms of \langle n(t)\rangle.
(b) Use the result for \langle n(\infty)\rangle from [nex46] to calculate the total rate of chemical reactions in the stationary state. Set K_1 = γ, K_2 = 1 − γ and compare the γ-dependence of R(∞) with that of \langle\langle n^2(\infty)\rangle\rangle from [nex46], which is a measure of the fluctuations in the population of molecules.
(c) Use the result for \langle n(t)\rangle from [nex107] to calculate the time evolution of R(t). Plot R(t) for n_0 = 0, K_1 = γ, K_2 = 1 − γ and various fixed values of γ. The time scale is thus set. Compare the graph of R(t) with the graph of \langle\langle n^2(t)\rangle\rangle from [nex107]. Explain the similarities and differences.

[nex48] Air in leaky tank I: generating function

At time t = 0 a tank of volume V contains n_0 molecules of air (disregarding chemical distinctions). The tank has a tiny leak and exchanges molecules with the environment, which has a constant density ρ of air molecules.
(a) Set up the master equation for the probability distribution P(n, t) under the assumption that a molecule leaves the tank with probability (n/V)dt and enters the tank with probability ρ dt, implying transition rates

W(m|n) = \rho\,\delta_{m,n+1} + (n/V)\,\delta_{m,n-1}.

(b) Derive the following linear partial differential equation (PDE) for the generating function G(z, t) from that master equation:

\frac{\partial G}{\partial t} + \frac{z - 1}{V}\, \frac{\partial G}{\partial z} = \rho(z - 1)\, G, \qquad G(z, t) \doteq \sum_{n=0}^{\infty} z^n P(n, t).

(c) Solve the PDE by the method of characteristics,

\frac{dt}{1} = \frac{V\, dz}{z - 1} = \frac{dG}{\rho(z - 1)G},

to obtain the result

G(z, t) = e^{V\rho(z - 1)\left[1 - e^{-t/V}\right]} \left[ e^{-t/V}(z - 1) + 1 \right]^{n_0}

for the nonequilibrium state.
(d) Find the generating function G(z, ∞) for the equilibrium situation.
(e) If we set n_0/V (density inside) equal to ρ (density outside), the generating function still depends on time. Explain the reason.
(f) Show that for n_0/V = ρ the function G(z, t) at arbitrary t converges, as ρV → ∞, toward the stationary result determined in (e).

[nex109] Air in leaky tank II: probability distribution

At time t = 0 a tank of volume V contains n_0 molecules of air (disregarding chemical distinctions). The tank has a tiny leak and exchanges molecules with the environment, which has a constant density ρ of air molecules.
(a) Infer from the generating function,

G(z, t) = e^{V\rho(z - 1)\left[1 - e^{-t/V}\right]} \left[ e^{-t/V}(z - 1) + 1 \right]^{n_0},

calculated in [nex48], via a power series expansion, the probability distribution,

P(n, t) = e^{-V\rho\left[1 - e^{-t/V}\right]} \sum_{m=0}^{\min(n, n_0)} \frac{n_0!}{m!\,(n_0 - m)!\,(n - m)!}\, (\rho V)^{n - m} \left( 1 - e^{-t/V} \right)^{n + n_0 - 2m} e^{-mt/V},

for the number of molecules in the tank.
(b) Demonstrate consistency of this result with the initial condition: \lim_{t\to 0} P(n, t) = \delta_{n,n_0}.
(c) Show that the equilibrium state is described by a Poisson distribution (i) by expanding the equilibrium generating function G(z, ∞) from [nex48] into a power series, and (ii) by taking the limit t → ∞ of P(n, t).

[nex49] Air in leaky tank III: detailed balance

A tank of volume V has a small leak and exchanges molecules of air with the environment. The environment has a constant density ρ of molecules. The master equation for the probability distribution P(n, t) of air molecules in the container is specified by transition rates of the form

W(m|n) = T_+(n)\,\delta_{m,n+1} + T_-(n)\,\delta_{m,n-1}, \qquad T_+(n) = \rho, \quad T_-(n) = n/V.

(a) Determine the stationary distribution P_s(n) = P(n, t → ∞) from the detailed-balance condition, T_-(n)\,P_s(n) = T_+(n - 1)\,P_s(n - 1), via the recurrence relation derived in [nln17].
(b) Compare the peak position n_p of the stationary distribution with the mean value \langle n\rangle.
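The detailed-balance condition here reads (n/V) P_s(n) = ρ P_s(n−1), i.e. P_s(n) = (ρV/n) P_s(n−1), which iterates to the Poisson distribution with mean ρV; its peak sits at n_p = ⌊ρV⌋ (shared with n_p − 1 when ρV is an integer), at or just below the mean. A sketch (values of ρ and V are illustrative):

```python
from math import exp, factorial

# Iterate the detailed-balance recurrence P_s(n) = (rho V / n) P_s(n-1)
# and compare with the Poisson distribution of mean rho*V.
rho, V = 2.0, 5.0
mu = rho * V                       # mean occupation = 10
N = 60                             # truncation; the tail beyond is negligible
P = [1.0]
for n in range(1, N):
    P.append(P[-1] * mu / n)
Z = sum(P)
P = [pn / Z for pn in P]           # normalize

poisson = [exp(-mu) * mu**n / factorial(n) for n in range(N)]
peak = P.index(max(P))             # peak position n_p
```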

[nex110] Air in leaky tank IV: evolution of mean and variance

At time t = 0 a tank of volume V contains n_0 molecules of air (disregarding chemical distinctions). The tank has a tiny leak and exchanges molecules with the environment, which has a constant density ρ of air molecules. Given the transition rates W(m|n) and the generating function G(z, t) derived in [nex48] for this process, we are in a position to calculate the time evolution of the mean \langle n(t)\rangle and the variance \langle\langle n^2(t)\rangle\rangle in three different ways with moderate effort.
(a) Calculate the factorial moments \langle n(t)\rangle_f and \langle n^2(t)\rangle_f directly from the (known) G(z, t) and infer mean and variance from these factorial moments.
(b) Infer equations of motion for the \langle n^m(t)\rangle_f, m = 1, 2, ... from the (known) PDE for G(z, t). Solve these equations for m = 1, 2. In this system a closed set of equations results for any m.
(c) Calculate the jump moments \alpha_l(m) = \sum_n (n - m)^l\, W(n|m) from the known transition rates and show that the associated equations of motion,

\frac{d}{dt}\langle n\rangle = \langle \alpha_1(n)\rangle, \qquad \frac{d}{dt}\langle\langle n^2\rangle\rangle = \langle \alpha_2(n)\rangle + 2\langle\langle n\,\alpha_1(n)\rangle\rangle,

for the first two moments are equivalent to those previously derived for the factorial moments.

[nex50] Pascal distribution and Planck radiation law

Consider a quantum harmonic oscillator in thermal equilibrium at temperature T. We know from [nex22] that the population of energy levels E_n = n\hbar\omega_0, n = 0, 1, 2, ... is described by the Pascal distribution,

P(n) = (1 - \gamma)\gamma^n, \qquad \gamma = \exp(-\hbar\omega_0/k_B T).

Suppose that the heat bath is provided by a blackbody radiation field of energy density u(ω) and that the oscillator interacts with that field exclusively via emission and absorption of photons with energy \hbar\omega_0. If we describe the approach to thermal equilibrium of the oscillator by a master equation, the transition rates must, therefore, be of the type

W(m|n) = T_+(n)\,\delta_{m,n+1} + T_-(n)\,\delta_{m,n-1},

familiar from birth-death processes (see [nln17]). Here T_+(n) reflects the absorption of a photon and T_-(n) the emission of a photon if the oscillator is in energy level n. Now we assume (as Einstein did) that the transition rates are of the form T_+(n) = Bu(\omega_0) and T_-(n) = A + Bu(\omega_0), where the term A reflects spontaneous emission and the terms Bu(\omega_0) induced emission or induced absorption. Infer from the compatibility condition of these transition rates with the Pascal distribution at equilibrium the T-dependence of u(ω) at fixed frequency ω_0.

38 [nex111] Effects of nonlinear death rates I: Malthus-Verhulst equation
Consider the master equation of the birth-death process with transition rates $W(m|n) = n\lambda\,\delta_{m,n+1} + \left[n\mu + \frac{\gamma}{N}n(n-1)\right]\delta_{m,n-1}$. It describes a population with a linear birth rate, $n\lambda$, and a linear death rate, $n\mu$, as in [nex44]. To account for the unhealthy environment in crowded conditions ($n \sim N$), a nonlinear death rate has been added. The nonlinearity suppresses the continued exponential growth found in [nex44] for the case $\lambda > \mu$ and stabilizes a stationary state.
(a) Construct the equation of motion for $\langle n\rangle(t)$ from the first jump moment as in [nex44]. Then neglect fluctuations by setting $\langle n^2\rangle(t) \simeq \langle n\rangle(t)^2$ and $\langle n\rangle(t) = Nx(t) + O(\sqrt{N})$ to arrive (in leading order) at the Malthus-Verhulst equation,
$$\frac{dx}{dt} = (\lambda-\mu)x - \gamma x^2,$$
for the time-dependence of the (scaled) average population. Derive the analytic solution,
$$x(t) = \frac{x_0\,e^{(\lambda-\mu)t}}{1 + \dfrac{\gamma x_0}{\lambda-\mu}\left[e^{(\lambda-\mu)t} - 1\right]},$$
pertaining to initial value $x_0 = n_0/N$ and parameters $\lambda \neq \mu$.
(b) Show how the exponential long-time asymptotics crosses over to power-law asymptotics in the limit $\lambda - \mu \to 0$.
(c) Find the dependence of the stationary population $x(\infty)$ on $\lambda, \mu, \gamma$.
(d) Plot $x(t)/x_0$ versus $t$ for $\mu = 1$ and (i) various values of $\lambda$ at fixed $\gamma = 0.2$ and (ii) various values of $\gamma$ at fixed $\lambda = 2$. Interpret your results.
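A quick consistency check of the analytic solution against a direct numerical integration of the Malthus-Verhulst equation (a sketch; the step count is chosen for plain-Euler accuracy):

```python
import math

def mv_exact(t, x0, lam, mu, gamma):
    """Analytic solution of dx/dt = (lam-mu)x - gamma x^2, lam != mu."""
    r = lam - mu
    e = math.exp(r * t)
    return x0 * e / (1.0 + (gamma * x0 / r) * (e - 1.0))

def mv_numeric(t, x0, lam, mu, gamma, steps=100000):
    """Euler integration of the Malthus-Verhulst equation."""
    dt = t / steps
    x = x0
    for _ in range(steps):
        x += ((lam - mu) * x - gamma * x * x) * dt
    return x
```

For $\lambda > \mu$ both routes approach the stationary value $x(\infty) = (\lambda-\mu)/\gamma$, the answer to part (c).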

39 [nex51] Effects of nonlinear death rate II: stationarity and fluctuations
Consider the master equation of the birth-death process with transition rates $W(m|n) = (n+1)\lambda\,\delta_{m,n+1} + \left[n\mu + \frac{\gamma}{N}n(n-1)\right]\delta_{m,n-1}$. It describes a population with a linear birth rate, $(n+1)\lambda$, and a linear death rate, $n\mu$. To account for the unhealthy environment under crowded circumstances ($n \sim N$), a nonlinear death rate has been added to the process. Use the recurrence relation, $P_s(n) = [T_+(n-1)/T_-(n)]\,P_s(n-1)$, for the stationary distribution $P_s(n)$ derived in [nln17] from the detailed-balance condition for the following tasks. Consider a system that easily accommodates $N = 20$ individuals of some population with fixed (linear) death rate $\mu = 1$, fixed birth rate $\lambda = 1.5$, and variable environmental factor $\gamma$.
(a) Compute the mean $\langle n\rangle$ and the variance $\langle\langle n^2\rangle\rangle$ for $\gamma = 0.2$.
(b) Plot the distribution $P_s(n)$ versus $n$ for $\gamma = 0.2, 0.4, \ldots, 1.0$ in the same diagram.
(c) Plot the mean $\langle n\rangle$ across the range $0.2 < \gamma < 1$ and compare the result with the function $Nx(\infty)$ derived in [nex111] from the Malthus-Verhulst equation, which ignores fluctuations.
(d) Plot the variance $\langle\langle n^2\rangle\rangle$ across the range $0.2 < \gamma < 1$.
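The recurrence relation translates directly into a few lines of code. This sketch builds $P_s(n)$ for the stated parameters and extracts mean and variance for part (a); the cutoff `nmax` is a numerical convenience (the distribution decays rapidly well below it):

```python
def stationary_dist(N=20, lam=1.5, mu=1.0, gamma=0.2, nmax=200):
    """P_s(n) from P_s(n) = [T+(n-1)/T-(n)] P_s(n-1), then normalized."""
    Tp = lambda n: (n + 1) * lam
    Tm = lambda n: n * mu + (gamma / N) * n * (n - 1)
    P = [1.0]
    for n in range(1, nmax + 1):
        P.append(P[-1] * Tp(n - 1) / Tm(n))
    Z = sum(P)
    return [p / Z for p in P]

def mean_var(P):
    """First moment and variance of a distribution given as a list."""
    m1 = sum(n * p for n, p in enumerate(P))
    m2 = sum(n * n * p for n, p in enumerate(P))
    return m1, m2 - m1 * m1
```

For $\gamma = 0.2$ the distribution peaks near $n = N(\lambda-\mu)/\gamma + 1$, close to the fluctuation-free estimate of [nex111].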

40 [nex113] Modified linear birth rate I: stationarity
Consider a linear birth-death process with a modified birth rate, $W(m|n) = \lambda(n+1)\delta_{m,n+1} + \mu n\,\delta_{m,n-1}$, to be used in the master equation. In [nex44] we had shown that the original linear birth rate $T_+(n) = n\lambda$ leads to the extinction of the population if $\lambda < \mu$.
(a) Use the recurrence relation of [nln17] to show that the modified linear birth rate $T_+(n) = (n+1)\lambda$ leads to a nonvanishing stationary distribution $P_s(n)$ for $\lambda < \mu$. Use the ratio $\gamma = \lambda/\mu$ as parameter of that distribution. Show that $P_s(n)$ is the Pascal distribution (see [nex22]).
(b) Determine the stationary values of $\langle n\rangle$ and $\langle\langle n^2\rangle\rangle$. Describe the differences between the long-time asymptotics of the original linear birth-death process with $\lambda = \mu$ as analyzed in [nex130] and the results obtained here for the case of modified birth rates in the limit $\mu - \lambda \to 0$.
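For part (a), the recurrence gives $P_s(n)/P_s(n-1) = T_+(n-1)/T_-(n) = \lambda n/(\mu n) = \gamma$, so $P_s(n) = (1-\gamma)\gamma^n$. A numerical sketch confirming this:

```python
def pascal_check(lam=0.6, mu=1.0, nmax=500):
    """Build P_s(n) from the detailed-balance recurrence and return the
    largest deviation from the Pascal distribution (1-g) g^n, g = lam/mu."""
    g = lam / mu
    P = [1.0]
    for n in range(1, nmax + 1):
        # T+(n-1) = lam*n for the modified birth rate, T-(n) = mu*n
        P.append(P[-1] * (lam * n) / (mu * n))
    Z = sum(P)
    P = [p / Z for p in P]
    pascal = [(1 - g) * g**n for n in range(nmax + 1)]
    return max(abs(a - b) for a, b in zip(P, pascal))
```

The agreement is exact up to the (exponentially small) truncation of the normalization sum at `nmax`.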

41 [nex114] Modified linear birth rate II: evolution of mean and variance
Consider a linear birth-death process with a modified birth rate to be used in the master equation: $W(m|n) = \lambda(n+1)\delta_{m,n+1} + \mu n\,\delta_{m,n-1}$, $\lambda < \mu$. In [nex44] we had shown that the original linear birth rate $T_+(n) = n\lambda$ leads to the extinction of the population. In [nex113] we showed that the modified linear birth rate $T_+(n) = (n+1)\lambda$ leads to a nonvanishing stationary distribution $P_s(n)$: the Pascal distribution.
(a) Establish from the jump moments, $\alpha_l(n) = \sum_m (m-n)^l\,W(m|n)$, the equations of motion,
$$\frac{d}{dt}\langle n\rangle \doteq \langle\alpha_1(n)\rangle = (\lambda-\mu)\langle n\rangle + \lambda, \qquad \frac{d}{dt}\langle n^2\rangle \doteq \langle\alpha_2(n)\rangle + 2\langle n\,\alpha_1(n)\rangle = 2(\lambda-\mu)\langle n^2\rangle + (3\lambda+\mu)\langle n\rangle + \lambda,$$
for the first and second moment.
(b) Calculate the time evolution of mean $\langle n\rangle(t)$ and variance $\langle\langle n^2\rangle\rangle(t)$ by solving these equations for initial conditions $\langle n\rangle(0) = \langle n^2\rangle(0) = 0$.
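The two moment equations form a closed linear system and are easy to integrate numerically. This sketch compares the result with the closed-form mean $\langle n\rangle(t) = \frac{\lambda}{\mu-\lambda}\left(1 - e^{(\lambda-\mu)t}\right)$ expected from part (b):

```python
import math

def mean_closed(t, lam, mu):
    """Solution of d<n>/dt = (lam-mu)<n> + lam with <n>(0) = 0."""
    r = lam - mu
    return (lam / -r) * (1.0 - math.exp(r * t))

def moments_numeric(t, lam, mu, steps=200000):
    """Euler integration of the coupled first- and second-moment equations."""
    dt = t / steps
    m1 = m2 = 0.0
    for _ in range(steps):
        dm1 = (lam - mu) * m1 + lam
        dm2 = 2 * (lam - mu) * m2 + (3 * lam + mu) * m1 + lam
        m1 += dm1 * dt
        m2 += dm2 * dt
    return m1, m2
```

At long times the moments settle to the Pascal values $\langle n\rangle = \gamma/(1-\gamma)$ and $\langle\langle n^2\rangle\rangle = \gamma/(1-\gamma)^2$ with $\gamma = \lambda/\mu$.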

42 [nex115] Modified linear birth rate III: generating function
Consider a linear birth-death process with a modified birth rate to be used in the master equation: $W(m|n) = \lambda(n+1)\delta_{m,n+1} + \mu n\,\delta_{m,n-1}$, $\lambda < \mu$. In [nex44] we had shown that the original linear birth rate $T_+(n) = n\lambda$ leads to the extinction of the population. In [nex113] we showed that the modified linear birth rate $T_+(n) = (n+1)\lambda$ leads to a nonvanishing stationary distribution $P_s(n)$: the Pascal distribution.
(a) Construct the PDE,
$$\frac{\partial G}{\partial t} - (\lambda z-\mu)(z-1)\frac{\partial G}{\partial z} = \lambda(z-1)G,$$
for the generating function $G(z,t) = \sum_n z^n P(n,t)$ and solve that PDE for the case of zero initial population by the method of characteristics (see e.g. [nex112]). The result reads
$$G(z,t) = \frac{\lambda-\mu}{\lambda z - \mu - \lambda(z-1)e^{(\lambda-\mu)t}}.$$
(b) Calculate mean $\langle n\rangle(t)$ and variance $\langle\langle n^2\rangle\rangle(t)$ from derivatives of $G(z,t)$ for comparison with the results obtained in [nex114] via a different route.
(c) Calculate the stationary distribution $P_s(n)$ from the asymptotic generating function $G(z,\infty)$.
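Useful sanity checks on $G(z,t)$: $G(z,0) = 1$ (zero initial population), $G(1,t) = 1$ (normalization), and $\partial G/\partial z|_{z=1} = \langle n\rangle(t)$. A sketch using a central difference for the derivative:

```python
import math

def G(z, t, lam, mu):
    """Generating function for zero initial population (lam < mu)."""
    e = math.exp((lam - mu) * t)
    return (lam - mu) / (lam * z - mu - lam * (z - 1) * e)

def mean_from_G(t, lam, mu, h=1e-6):
    """<n>(t) = dG/dz at z = 1, via central difference."""
    return (G(1 + h, t, lam, mu) - G(1 - h, t, lam, mu)) / (2 * h)
```

The numerically differentiated mean agrees with $\frac{\lambda}{\mu-\lambda}(1 - e^{(\lambda-\mu)t})$ from [nex114].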

43 [nex116] Modified linear birth rate IV: probability distribution
Consider a linear birth-death process with a modified birth rate to be used in the master equation: $W(m|n) = \lambda(n+1)\delta_{m,n+1} + \mu n\,\delta_{m,n-1}$, $\lambda < \mu$. In [nex44] we had shown that the original linear birth rate $T_+(n) = n\lambda$ leads to the extinction of the population. In [nex113] we showed that the modified linear birth rate $T_+(n) = (n+1)\lambda$ leads to a nonvanishing stationary distribution $P_s(n)$: the Pascal distribution. In [nex115] we have calculated the generating function $G(z,t)$ for the case of zero initial population. Use that result here to calculate the explicit result,
$$P(n,t) = \frac{\mu-\lambda}{\mu - \lambda e^{(\lambda-\mu)t}}\left(\frac{\lambda\left[1 - e^{(\lambda-\mu)t}\right]}{\mu - \lambda e^{(\lambda-\mu)t}}\right)^{\!n},$$
for the time-dependence of the probability distribution $P(n,t)$. Check the limit $t \to 0$ to verify the imposed initial condition and the limit $t \to \infty$ to confirm the result for $P_s(n)$ previously obtained by other means.
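The explicit $P(n,t)$ can be checked for normalization and for both limits; a sketch:

```python
import math

def p_nt(n, t, lam, mu):
    """P(n,t) for the modified linear birth-death process, zero initial
    population, lam < mu."""
    e = math.exp((lam - mu) * t)
    pref = (mu - lam) / (mu - lam * e)
    ratio = lam * (1 - e) / (mu - lam * e)
    return pref * ratio**n
```

At $t = 0$ all weight sits at $n = 0$; as $t \to \infty$ the distribution becomes the Pascal distribution $(1-\gamma)\gamma^n$ with $\gamma = \lambda/\mu$.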

44 [nex52] Bistable chemical system
Consider the master equation of the birth-death process specified by transition rates of the form $W(m|n) = T_+(n)\delta_{m,n+1} + T_-(n)\delta_{m,n-1}$ with $T_+(n) = k_1An(n-1) + k_3A$, $T_-(n) = k_2n(n-1)(n-2) + k_4n$. This process describes two simultaneous chemical reactions, $A + 2X \rightleftharpoons 3X$ and $A \rightleftharpoons X$, that exhibit bistable states in a certain parameter range. The concentration of $A$ is taken to be constant.
(a) Construct a product expression for $P_s(n)$ from the detailed-balance condition as explained in [nln17]. Use the three parameters $B = k_1A/k_2$, $R = k_4/k_2$, $Q = k_3/k_1$.
(b) Show that for $R/Q = 1$ we thus obtain the Poisson distribution.
(c) Plot the solution of the extremum condition $T_+(n-1) = T_-(n)$ as a graph $B$ versus $n$ over the range $0 \leq n \leq 100$ for the two cases (i) $Q = 100$, $R = 500$ and (ii) $Q = 100$, $R = $ … . Identify the extremum positions $n_{\rm extr}$ for $B = 70$ in both cases.
(d) Plot $P_s(n)$ versus $n$ over the range $0 \leq n \leq 100$ for $B = 70$ and cases (i), (ii) for $Q, R$.
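For part (b), with rates measured in units of $k_2$ one has $T_+(n) = Bn(n-1) + BQ$ and $T_-(n) = n(n-1)(n-2) + Rn$; for $R = Q$ the ratio $T_+(n-1)/T_-(n)$ reduces to $B/n$, which is the Poisson recurrence. A sketch verifying this:

```python
import math

def stationary_bistable(B, Q, R, nmax=300):
    """P_s(n) from the detailed-balance product, rates in units of k2
    (so that k3*A/k2 = B*Q)."""
    Tp = lambda n: B * n * (n - 1) + B * Q
    Tm = lambda n: n * (n - 1) * (n - 2) + R * n
    P = [1.0]
    for n in range(1, nmax + 1):
        P.append(P[-1] * Tp(n - 1) / Tm(n))
    Z = sum(P)
    return [p / Z for p in P]
```

With $R = Q$ the resulting $P_s(n)$ coincides with the Poisson distribution of mean $B$ to machine precision.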

45 [nex47] Ultracold neutrons in an ideal Steyerl bottle
Consider a container whose walls are perfect mirrors for ultracold neutrons. At time $t = 0$ the bottle is known to contain exactly $n_0$ neutrons. The decay rate of a neutron is $K$.
(a) Set up the master equation,
$$\frac{\partial}{\partial t}P(n,t) = (n+1)K\,P(n+1,t) - Kn\,P(n,t),$$
for the probability distribution $P(n,t)$, and derive the PDE,
$$\frac{\partial G}{\partial t} + K(z-1)\frac{\partial G}{\partial z} = 0,$$
for the generating function $G(z,t) \doteq \sum_n z^n P(n,t)$.
(b) Solve the PDE by the method of characteristics (see e.g. [nex112]) to obtain
$$G(z,t) = \left[(z-1)e^{-Kt} + 1\right]^{n_0}.$$
(c) Infer therefrom the probability distribution
$$P(n,t) = \frac{n_0!}{n!(n_0-n)!}\,e^{-nKt}\left(1 - e^{-Kt}\right)^{n_0-n}.$$
(d) Determine (via derivatives of the generating function) the average number $\langle n\rangle(t)$ of remaining neutrons and the variance $\langle\langle n^2\rangle\rangle(t)$ thereof.
(e) Design a contour plot of $P(n,t)$ for $0 < n < n_0 = 20$ and $0 < Kt < 3$. Design a line graph of $P(n,t)$ for $0 < Kt < 5$ for fixed $n = 0, 1, 2, 5, 10, 15, 18, 19$. Interpret these graphs.
(f) Derive equations of motion for $\langle n\rangle$ and $\langle n(n-1)\rangle$ directly from the master equation and solve them to reproduce the solutions obtained in (d).
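The result of (c) is a binomial distribution: each of the $n_0$ neutrons survives independently to time $t$ with probability $e^{-Kt}$. A sketch checking normalization and the moments of part (d):

```python
import math

def p_neutrons(n, t, n0, K):
    """P(n,t): binomial distribution with single-neutron survival
    probability s = exp(-K*t)."""
    s = math.exp(-K * t)
    return math.comb(n0, n) * s**n * (1 - s)**(n0 - n)
```

The binomial form immediately gives $\langle n\rangle(t) = n_0 e^{-Kt}$ and $\langle\langle n^2\rangle\rangle(t) = n_0 e^{-Kt}(1 - e^{-Kt})$.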

46 [nex45] Random light switch
The position of the light switch is described by the stochastic variable $X$, which can assume the two values $x = 0$ (lights off) and $x = 1$ (lights on). Some agent switches the lights on/off randomly at the rate $\gamma$. This means that the average interval of continuous brightness/darkness is $\tau = 1/\gamma$.
(a) Set up the master equation for $P(x,t|x_0)$ and solve it.
(b) Find the asymptotic distribution $P_s(x) = \lim_{t\to\infty} P(x,t|x_0)$.
(c) Find the conditional average $\langle X(t)\rangle_{x_0} \doteq \sum_x xP(x,t|x_0)$ and then $\langle X(t)\rangle_s = \lim_{t\to\infty}\langle X(t)\rangle_{x_0}$.
(d) Use the regression theorem, $\langle X(t)X(t')\rangle_s \doteq \sum_{x,x'} xx'\,P(x,t|x',t')P_s(x')$, to determine the (stationary) autocorrelation function $\langle\langle X(t)X(t')\rangle\rangle_s \doteq \langle X(t)X(t')\rangle_s - \langle X(t)\rangle_s\langle X(t')\rangle_s$.
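For reference, the solution of (a) for this two-state telegraph process is $P(x_0,t|x_0) = \frac12(1 + e^{-2\gamma t})$, and the regression theorem then yields $\langle\langle X(t)X(t')\rangle\rangle_s = \frac14 e^{-2\gamma|t-t'|}$. A sketch verifying both:

```python
import math

def P_cond(x, t, x0, gamma):
    """Conditional probability of the symmetric two-state telegraph process."""
    e = math.exp(-2 * gamma * t)
    return 0.5 * (1 + e) if x == x0 else 0.5 * (1 - e)

def autocorr(tau, gamma):
    """Stationary connected autocorrelation via the regression theorem,
    with P_s(x) = 1/2 and <X>_s = 1/2."""
    s = sum(x * xp * P_cond(x, tau, xp, gamma) * 0.5
            for x in (0, 1) for xp in (0, 1))
    return s - 0.25
```

The autocorrelation decays at twice the switching rate because the difference $P(1,t) - P(0,t)$ relaxes with rate $2\gamma$.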

07. Stochastic Processes: Applications
University of Rhode Island, DigitalCommons@URI
Nonequilibrium Statistical Physics, Physics Course Materials, 10-19-2015
Gerhard Müller, University of Rhode Island, gmuller@uri.edu


More information

CHAPTER V. Brownian motion. V.1 Langevin dynamics

CHAPTER V. Brownian motion. V.1 Langevin dynamics CHAPTER V Brownian motion In this chapter, we study the very general paradigm provided by Brownian motion. Originally, this motion is that a heavy particle, called Brownian particle, immersed in a fluid

More information

Asymptotics of rare events in birth death processes bypassing the exact solutions

Asymptotics of rare events in birth death processes bypassing the exact solutions INSTITUTE OF PHYSICS PUBLISHING JOURNAL OF PHYSICS: CONDENSED MATTER J. Phys.: Condens. Matter 9 (27) 6545 (2pp) doi:.88/953-8984/9/6/6545 Asymptotics of rare events in birth death processes bypassing

More information

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License.

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License. University of Rhode Island DigitalCommons@URI Classical Dynamics Physics Course Materials 015 14. Oscillations Gerhard Müller University of Rhode Island, gmuller@uri.edu Creative Commons License This wor

More information

Random Process. Random Process. Random Process. Introduction to Random Processes

Random Process. Random Process. Random Process. Introduction to Random Processes Random Process A random variable is a function X(e) that maps the set of experiment outcomes to the set of numbers. A random process is a rule that maps every outcome e of an experiment to a function X(t,

More information

Stochastic Volatility and Correction to the Heat Equation

Stochastic Volatility and Correction to the Heat Equation Stochastic Volatility and Correction to the Heat Equation Jean-Pierre Fouque, George Papanicolaou and Ronnie Sircar Abstract. From a probabilist s point of view the Twentieth Century has been a century

More information

1 Assignment 1: Nonlinear dynamics (due September

1 Assignment 1: Nonlinear dynamics (due September Assignment : Nonlinear dynamics (due September 4, 28). Consider the ordinary differential equation du/dt = cos(u). Sketch the equilibria and indicate by arrows the increase or decrease of the solutions.

More information

Reaction time distributions in chemical kinetics: Oscillations and other weird behaviors

Reaction time distributions in chemical kinetics: Oscillations and other weird behaviors Introduction The algorithm Results Summary Reaction time distributions in chemical kinetics: Oscillations and other weird behaviors Ramon Xulvi-Brunet Escuela Politécnica Nacional Outline Introduction

More information

A few principles of classical and quantum mechanics

A few principles of classical and quantum mechanics A few principles of classical and quantum mechanics The classical approach: In classical mechanics, we usually (but not exclusively) solve Newton s nd law of motion relating the acceleration a of the system

More information

Continuous-time Markov Chains

Continuous-time Markov Chains Continuous-time Markov Chains Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 23, 2017

More information

Table of contents Lecture 1: Discrete-valued random variables. Common probability distributions and their properties. Bernoulli trials. Binomial, dist

Table of contents Lecture 1: Discrete-valued random variables. Common probability distributions and their properties. Bernoulli trials. Binomial, dist NPTEL COURSE PHYSICAL APPLICATIONS OF STOCHASTIC PROCESSES V. Balakrishnan Department of Physics, Indian Institute of Technology Madras Chennai 600 036, India This course comprises 9 lectures. The notes

More information

Smoluchowski Diffusion Equation

Smoluchowski Diffusion Equation Chapter 4 Smoluchowski Diffusion Equation Contents 4. Derivation of the Smoluchoswki Diffusion Equation for Potential Fields 64 4.2 One-DimensionalDiffusoninaLinearPotential... 67 4.2. Diffusion in an

More information

Solution. For one question the mean grade is ḡ 1 = 10p = 8 and the standard deviation is 1 = g

Solution. For one question the mean grade is ḡ 1 = 10p = 8 and the standard deviation is 1 = g Exam 23 -. To see how much of the exam grade spread is pure luck, consider the exam with ten identically difficult problems of ten points each. The exam is multiple-choice that is the correct choice of

More information

If we want to analyze experimental or simulated data we might encounter the following tasks:

If we want to analyze experimental or simulated data we might encounter the following tasks: Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction

More information

Latent voter model on random regular graphs

Latent voter model on random regular graphs Latent voter model on random regular graphs Shirshendu Chatterjee Cornell University (visiting Duke U.) Work in progress with Rick Durrett April 25, 2011 Outline Definition of voter model and duality with

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

C.W. Gardiner. P. Zoller. Quantum Nois e. A Handbook of Markovian and Non-Markovia n Quantum Stochastic Method s with Applications to Quantum Optics

C.W. Gardiner. P. Zoller. Quantum Nois e. A Handbook of Markovian and Non-Markovia n Quantum Stochastic Method s with Applications to Quantum Optics C.W. Gardiner P. Zoller Quantum Nois e A Handbook of Markovian and Non-Markovia n Quantum Stochastic Method s with Applications to Quantum Optics 1. A Historical Introduction 1 1.1 Heisenberg's Uncertainty

More information

14. Magnetic Field III

14. Magnetic Field III University of Rhode sland DigitalCommons@UR PHY 204: Elementary Physics Physics Course Materials 2015 14. Magnetic Field Gerhard Müller University of Rhode sland, gmuller@uri.edu Creative Commons License

More information

7. Kinetics controlled by fluctuations: Kramers theory of activated processes

7. Kinetics controlled by fluctuations: Kramers theory of activated processes 7. Kinetics controlled by fluctuations: Kramers theory of activated processes Macroscopic kinetic processes (time dependent concentrations) Elementary kinetic process Reaction mechanism Unimolecular processes

More information

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes

Lecture Notes 7 Random Processes. Markov Processes Markov Chains. Random Processes Lecture Notes 7 Random Processes Definition IID Processes Bernoulli Process Binomial Counting Process Interarrival Time Process Markov Processes Markov Chains Classification of States Steady State Probabilities

More information

Brownian Motion. An Undergraduate Introduction to Financial Mathematics. J. Robert Buchanan. J. Robert Buchanan Brownian Motion

Brownian Motion. An Undergraduate Introduction to Financial Mathematics. J. Robert Buchanan. J. Robert Buchanan Brownian Motion Brownian Motion An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan 2010 Background We have already seen that the limiting behavior of a discrete random walk yields a derivation of

More information

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License.

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License. University of Rhode Island DigitalCommons@URI PHY 204: Elementary Physics II Physics Course Materials 2015 12. Magnetic Field I Gerhard Müller University of Rhode Island, gmuller@uri.edu Creative Commons

More information

Dynamical Analysis of Complex Systems 7CCMCS04

Dynamical Analysis of Complex Systems 7CCMCS04 Dynamical Analysis of Complex Systems 7CCMCS04 A Annibale Department of Mathematics King s College London January 2011 2 Contents Introduction and Overview 7 0.1 Brownian Motion....................................

More information

06. Lagrangian Mechanics II

06. Lagrangian Mechanics II University of Rhode Island DigitalCommons@URI Classical Dynamics Physics Course Materials 2015 06. Lagrangian Mechanics II Gerhard Müller University of Rhode Island, gmuller@uri.edu Creative Commons License

More information

Name of the Student: Problems on Discrete & Continuous R.Vs

Name of the Student: Problems on Discrete & Continuous R.Vs Engineering Mathematics 08 SUBJECT NAME : Probability & Random Processes SUBJECT CODE : MA645 MATERIAL NAME : University Questions REGULATION : R03 UPDATED ON : November 07 (Upto N/D 07 Q.P) (Scan the

More information

P 1.5 X 4.5 / X 2 and (iii) The smallest value of n for

P 1.5 X 4.5 / X 2 and (iii) The smallest value of n for DHANALAKSHMI COLLEGE OF ENEINEERING, CHENNAI DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING MA645 PROBABILITY AND RANDOM PROCESS UNIT I : RANDOM VARIABLES PART B (6 MARKS). A random variable X

More information