Stochastic models of biochemical systems
1 Stochastic models of biochemical systems David F. Anderson, Department of Mathematics, University of Wisconsin - Madison. University of Amsterdam, November 14th, 2012
2 Stochastic models of biochemical systems Goal: give a broad introduction to stochastic models of biochemical systems, with minimal technical details. Outline 1. Construct a useful representation for the most common continuous time Markov chain model for population processes. 2. Discuss some computational methods for sensitivity analysis. 3. Discuss various approximate models for these CTMCs.
3 Example: ODE Lotka-Volterra predator-prey model Think of A as a prey and B as a predator. A → 2A (κ1), A + B → 2B (κ2), B → ∅ (κ3), with κ1 = 2, κ2 = .2, κ3 = 2.
4 Example: ODE Lotka-Volterra predator-prey model Think of A as a prey and B as a predator. A → 2A (κ1), A + B → 2B (κ2), B → ∅ (κ3), with κ1 = 2, κ2 = .2, κ3 = 2. Deterministic model. Let x(t) = [# prey at t, # predator at t]^T. Then

ẋ(t) = κ1 x1(t) [1, 0]^T + κ2 x1(t)x2(t) [−1, 1]^T + κ3 x2(t) [0, −1]^T,

or, in integrated form,

x(t) = x(0) + κ1 ∫_0^t x1(s) ds [1, 0]^T + κ2 ∫_0^t x1(s)x2(s) ds [−1, 1]^T + κ3 ∫_0^t x2(s) ds [0, −1]^T.
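The integral equation above is just the ODE ẋ = F(x) in integrated form. A minimal forward-Euler sketch in Python; the rate constant κ2 = 0.002, the initial state, and the step size are illustrative choices (the slide's plotted values are not all recoverable), not values from the slides:

```python
# Forward-Euler sketch of the deterministic Lotka-Volterra model
#   A -> 2A (rate k1),  A + B -> 2B (rate k2),  B -> 0 (rate k3).
def lv_ode(x0, k1=2.0, k2=0.002, k3=2.0, dt=1e-3, T=5.0):
    x1, x2 = x0                              # prey, predator
    traj = [(0.0, x1, x2)]
    n_steps = round(T / dt)
    for n in range(1, n_steps + 1):
        dx1 = k1 * x1 - k2 * x1 * x2         # prey: birth minus predation
        dx2 = k2 * x1 * x2 - k3 * x2         # predator: predation minus death
        x1 += dt * dx1
        x2 += dt * dx2
        traj.append((n * dt, x1, x2))
    return traj

traj = lv_ode((800.0, 400.0))
```

With these parameters the populations oscillate around the equilibrium (κ3/κ2, κ1/κ2) and stay positive over the simulated window.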
5 Lotka-Volterra Think of A as a prey and B as a predator. A → 2A (κ1), A + B → 2B (κ2), B → ∅ (κ3), with κ1 = 2, κ2 = .2, κ3 = 2. (Figure: prey and predator population trajectories.)
6 Biological example: transcription-translation Gene transcription & translation:
G → G + M (κ1, transcription)
M → M + P (κ2, translation)
M → ∅ (κ3, degradation)
P → ∅ (κ4, degradation)
G + P ⇌ B (κ5, κ−5, binding/unbinding of gene)
Cartoon representation¹: the gene switches between states X1 ⇌ X2, each state producing mRNA M at its own rate, with M → ∅ at rate µ.
¹ J. Paulsson, Physics of Life Reviews, 2, 2005.
7 Another example: Viral infection Let 1. T = viral template. 2. G = viral genome. 3. S = viral structure. 4. V = virus. Reactions:
R1) T + stuff → T + G, κ1 = 1
R2) G → T, κ2 = .25
R3) T + stuff → T + S, κ3 = 1
R4) T → ∅, κ4 = .25
R5) S → ∅, κ5 = 2
R6) G + S → V, κ6
R. Srivastava, L. You, J. Summers, and J. Yin, J. Theoret. Biol., 2002. E. Haseltine and J. Rawlings, J. Chem. Phys., 2002. K. Ball, T. Kurtz, L. Popovic, and G. Rempala, Annals of Applied Probability, 2006. W. E, D. Liu, and E. Vanden-Eijnden, J. Comput. Phys., 2006.
8 Some examples E. coli Heat Shock Response Model: 9 species, 18 reactions.² ² Hye Won Kang, presentation at SPA in 2007.
9 Modeling 1. These models (and much more complicated ones) have historically been predominantly modeled using ODEs. 2. However: 2.1 there are often low numbers of molecules, which makes the timing of reactions more random (less averaging), 2.2 when a reaction occurs, the system jumps to a new state by a non-trivial amount.
10 Modeling 1. These models (and much more complicated ones) have historically been predominantly modeled using ODEs. 2. However: 2.1 there are often low numbers of molecules, which makes the timing of reactions more random (less averaging), 2.2 when a reaction occurs, the system jumps to a new state by a non-trivial amount. Researchers (mostly) lived with these shortcomings until the late 1990s and early 2000s, when it was shown that ODE models cannot capture important qualitative behavior of certain models: the λ-phage lysis-lysogeny decision mechanism (Arkin-McAdams 1998); green fluorescent protein. ODEs were often the wrong modeling choice.
11 Specifying infinitesimal behavior Q: What is a better modeling choice? It should be 1. discrete space, since we are counting molecules, and 2. stochastic in its dynamics. Let's return to the development of ODEs. An ordinary differential equation is specified by describing how a function should vary over a small period of time: X(t + Δt) − X(t) ≈ F(X(t)) Δt.
12 Specifying infinitesimal behavior Q: What is a better modeling choice? It should be 1. discrete space, since we are counting molecules, and 2. stochastic in its dynamics. Let's return to the development of ODEs. An ordinary differential equation is specified by describing how a function should vary over a small period of time: X(t + Δt) − X(t) ≈ F(X(t)) Δt. A more precise description (consider a telescoping sum): X(t) = X(0) + ∫_0^t F(X(s)) ds.
13 Infinitesimal behavior for jump processes We are interested in functions that are piecewise constant and random. Changes, when they occur, won't be small. If reaction k occurs at time t, X(t) − X(t−) = ζ_k ∈ Z^d.
14 Infinitesimal behavior for jump processes We are interested in functions that are piecewise constant and random. Changes, when they occur, won't be small. If reaction k occurs at time t, X(t) − X(t−) = ζ_k ∈ Z^d. What is small? The probability of seeing a jump of a particular size: P{X(t + Δt) − X(t) = ζ_k | F_t} ≈ λ_{ζ_k}(t) Δt. Question: can we specify the λ_{ζ_k} in some way that determines X? For the ODE, F depended on X. Maybe λ_{ζ_k} should depend on X?
15 Simple model For example, consider the simple system A + B → C, where one molecule each of A and B is being converted to one of C.
16 Simple model For example, consider the simple system A + B → C, where one molecule each of A and B is being converted to one of C. Intuition for the standard stochastic model: P{reaction occurs in (t, t + Δt] | F_t} ≈ κ X_A(t) X_B(t) Δt, where κ is a positive constant, the reaction rate constant, and F_t is all the information pertaining to the process up through time t. Can we specify a reasonable model satisfying this assumption?
17 Background information: The Poisson process Will view a Poisson process, Y(·), through the lens of an underlying point process. (a) Let {e_i} be i.i.d. exponential random variables with parameter one.
18 Background information: The Poisson process Will view a Poisson process, Y(·), through the lens of an underlying point process. (a) Let {e_i} be i.i.d. exponential random variables with parameter one. (b) Now, put points down on a line with spacing equal to the e_i. (Figure: points on a line separated by the spacings e_1, e_2, e_3, ….) Let Y_1(t) denote the number of points hit by time t; in the figure, Y_1(t) counts the points to the left of t.
19 The Poisson process Let Y_1 be a unit-rate Poisson process. Define Y_λ(t) ≡ Y_1(λt). Then Y_λ is a Poisson process with parameter λ. Intuition: the Poisson process with rate λ is simply the number of points hit (of the unit-rate point process) when we run along the time frame at rate λ. (Figure: the same point process, traversed at rate λ.)
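This construction can be sketched directly in Python: lay down a unit-rate point process from Exp(1) spacings and read off Y_λ(t) = Y_1(λt). The horizon, rate, and seed below are arbitrary illustrative choices:

```python
import random

# Build the jump times of a unit-rate Poisson process from i.i.d. Exp(1)
# spacings, then realize Y_lambda(t) = Y_1(lambda * t) by a time change.
def unit_rate_points(horizon, rng):
    """Points of the unit-rate process up to `horizon`."""
    points, t = [], 0.0
    while True:
        t += rng.expovariate(1.0)            # spacing e_i ~ Exp(1)
        if t > horizon:
            return points
        points.append(t)

def Y_lambda(points, lam, t):
    """Number of unit-rate points hit by time lam * t."""
    return sum(1 for p in points if p <= lam * t)

rng = random.Random(42)
pts = unit_rate_points(1000.0, rng)
count = Y_lambda(pts, lam=10.0, t=50.0)      # Poisson with mean 10 * 50 = 500
```

The empirical count lands near the mean λt, as the intuition on the slide suggests.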
20 The Poisson process There is no reason λ needs to be constant in time, in which case Y_λ(t) ≡ Y(∫_0^t λ(s) ds) is a non-homogeneous Poisson process with propensity/intensity λ(t). Thus P{Y_λ(t + Δt) − Y_λ(t) > 0 | F_t} = 1 − exp{−∫_t^{t+Δt} λ(s) ds} ≈ λ(t) Δt.
21 The Poisson process There is no reason λ needs to be constant in time, in which case Y_λ(t) ≡ Y(∫_0^t λ(s) ds) is a non-homogeneous Poisson process with propensity/intensity λ(t). Thus P{Y_λ(t + Δt) − Y_λ(t) > 0 | F_t} = 1 − exp{−∫_t^{t+Δt} λ(s) ds} ≈ λ(t) Δt. Points: 1. We have changed time to convert a unit-rate Poisson process to one which has rate or intensity or propensity λ(t). 2. Will use similar time changes of unit-rate processes to build the models of interest.
22 Return to models of interest Consider the simple system A + B → C, where one molecule each of A and B is being converted to one of C. Intuition for the standard stochastic model: P{reaction occurs in (t, t + Δt] | F_t} ≈ κ X_A(t) X_B(t) Δt, where κ is a positive constant, the reaction rate constant, and F_t is all the information pertaining to the process up through time t.
23 Models of interest A + B → C. Simple book-keeping says: if X(t) = [X_A(t), X_B(t), X_C(t)]^T gives the state at time t, then X(t) = X(0) + R(t) [−1, −1, 1]^T, where R(t) is the # of times the reaction has occurred by time t and X(0) is the initial condition. Goal: represent R(t) in terms of a Poisson process.
24 Models of interest Recall that for A + B → C our intuition was to specify the infinitesimal behavior P{reaction occurs in (t, t + Δt] | F_t} ≈ κ X_A(t) X_B(t) Δt,
25 Models of interest Recall that for A + B → C our intuition was to specify the infinitesimal behavior P{reaction occurs in (t, t + Δt] | F_t} ≈ κ X_A(t) X_B(t) Δt, and that for a counting process with specified intensity λ(t) we have P{Y_λ(t + Δt) − Y_λ(t) = 1 | F_t} ≈ λ(t) Δt.
26 Models of interest Recall that for A + B → C our intuition was to specify the infinitesimal behavior P{reaction occurs in (t, t + Δt] | F_t} ≈ κ X_A(t) X_B(t) Δt, and that for a counting process with specified intensity λ(t) we have P{Y_λ(t + Δt) − Y_λ(t) = 1 | F_t} ≈ λ(t) Δt. This suggests we can model R(t) = Y(∫_0^t κ X_A(s) X_B(s) ds), where Y is a unit-rate Poisson process.
27 Models of interest Recall that for A + B → C our intuition was to specify the infinitesimal behavior P{reaction occurs in (t, t + Δt] | F_t} ≈ κ X_A(t) X_B(t) Δt, and that for a counting process with specified intensity λ(t) we have P{Y_λ(t + Δt) − Y_λ(t) = 1 | F_t} ≈ λ(t) Δt. This suggests we can model R(t) = Y(∫_0^t κ X_A(s) X_B(s) ds), where Y is a unit-rate Poisson process. Hence

[X_A(t), X_B(t), X_C(t)]^T = X(t) = X(0) + Y(∫_0^t κ X_A(s) X_B(s) ds) [−1, −1, 1]^T.

This equation uniquely determines X for all t.
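Since the intensity κ X_A X_B is constant between jumps, this representation can be simulated directly: each inter-jump time is Exp(κ X_A X_B). A minimal Python sketch, with illustrative initial state and κ:

```python
import random

# Simulate A + B -> C from the random time change representation:
# between jumps the intensity kappa*xA*xB is constant, so the next jump
# of Y(int_0^t kappa*xA*xB ds) comes after an Exp(kappa*xA*xB) wait.
def simulate_abc(xA, xB, xC, kappa, T, rng):
    t, path = 0.0, [(0.0, xA, xB, xC)]
    while xA > 0 and xB > 0:                 # intensity is zero otherwise
        t += rng.expovariate(kappa * xA * xB)
        if t > T:
            break
        xA, xB, xC = xA - 1, xB - 1, xC + 1  # state jumps by (-1, -1, 1)
        path.append((t, xA, xB, xC))
    return path

rng = random.Random(0)
path = simulate_abc(10, 8, 0, kappa=0.1, T=100.0, rng=rng)
```

Along any such path the book-keeping identities X_A + X_C and X_B + X_C are conserved, exactly as the vector form above dictates.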
28 Build up model: Random time change representation of Kurtz Now consider a network of reactions involving d chemical species, S_1, …, S_d: ∑_{i=1}^d ν_ik S_i → ∑_{i=1}^d ν′_ik S_i. Denote the reaction vector as ζ_k = ν′_k − ν_k, so that if reaction k occurs at time t, X(t) = X(t−) + ζ_k.
29 Build up model: Random time change representation of Kurtz Now consider a network of reactions involving d chemical species, S_1, …, S_d: ∑_{i=1}^d ν_ik S_i → ∑_{i=1}^d ν′_ik S_i. Denote the reaction vector as ζ_k = ν′_k − ν_k, so that if reaction k occurs at time t, X(t) = X(t−) + ζ_k. The intensity (or propensity) of the kth reaction is λ_k : Z^d → R. By analogy with before: X(t) = X(0) + ∑_k R_k(t) ζ_k, with X(t) = X(0) + ∑_k Y_k(∫_0^t λ_k(X(s)) ds) ζ_k, where the Y_k are independent, unit-rate Poisson processes.
30 Mass-action kinetics The standard intensity function chosen is mass-action kinetics: λ_k(x) = κ_k ∏_i x_i! / (x_i − ν_ik)!. Example: if S1 → anything, then λ_k(x) = κ_k x_1. Example: if S1 + S2 → anything, then λ_k(x) = κ_k x_1 x_2. Example: if 2S2 → anything, then λ_k(x) = κ_k x_2 (x_2 − 1).
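The falling-factorial product is easy to compute directly. A small Python helper (the function name and the test values are illustrative) reproducing the slide's three examples:

```python
# Mass-action intensity lambda_k(x) = kappa_k * prod_i x_i! / (x_i - nu_ik)!,
# with nu the vector of reactant counts (molecularities) of reaction k.
def mass_action_intensity(kappa, nu, x):
    lam = kappa
    for xi, ni in zip(x, nu):
        for j in range(ni):        # falling factorial xi*(xi-1)*...*(xi-ni+1)
            lam *= (xi - j)
    return lam

# The slide's examples, with x = (x1, x2) = (5, 7) and kappa_k = 2.0:
a = mass_action_intensity(2.0, (1, 0), (5, 7))   # S1 -> ...    : 2 * 5
b = mass_action_intensity(2.0, (1, 1), (5, 7))   # S1 + S2 -> ..: 2 * 5 * 7
c = mass_action_intensity(2.0, (0, 2), (5, 7))   # 2 S2 -> ...  : 2 * 7 * 6
```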
31 Other ways to understand model The infinitesimal generator of a Markov process determines the process: Af(x) := lim_{h→0} (1/h) [E_x f(X(h)) − f(x)].
32 Other ways to understand model The infinitesimal generator of a Markov process determines the process: Af(x) := lim_{h→0} (1/h) [E_x f(X(h)) − f(x)] = lim_{h→0} (1/h) [∑_k (f(x + ζ_k) − f(x)) P(R_k(h) = 1) + O(h)].
33 Other ways to understand model The infinitesimal generator of a Markov process determines the process: Af(x) := lim_{h→0} (1/h) [E_x f(X(h)) − f(x)] = lim_{h→0} (1/h) [∑_k (f(x + ζ_k) − f(x)) P(R_k(h) = 1) + O(h)] = lim_{h→0} (1/h) [∑_k (f(x + ζ_k) − f(x)) λ_k(x) h + O(h)].
34 Other ways to understand model The infinitesimal generator of a Markov process determines the process: Af(x) := lim_{h→0} (1/h) [E_x f(X(h)) − f(x)] = lim_{h→0} (1/h) [∑_k (f(x + ζ_k) − f(x)) P(R_k(h) = 1) + O(h)] = lim_{h→0} (1/h) [∑_k (f(x + ζ_k) − f(x)) λ_k(x) h + O(h)] = ∑_k λ_k(x) (f(x + ζ_k) − f(x)).
35 Other ways to understand model And we have Dynkin's formula (see Ethier and Kurtz, 1986, Ch. 1): E f(X(t)) − f(X(0)) = E ∫_0^t Af(X(s)) ds,
36 Other ways to understand model And we have Dynkin's formula (see Ethier and Kurtz, 1986, Ch. 1): E f(X(t)) − f(X(0)) = E ∫_0^t Af(X(s)) ds. Letting f(y) = 1_x(y), so that E[f(X(t))] = P{X(t) = x} = p_t(x), the above gives the Kolmogorov forward equation (chemical master equation): p′_t(x) = ∑_k λ_k(x − ζ_k) p_t(x − ζ_k) − p_t(x) ∑_k λ_k(x).
37 Equivalence of formulations We now have three ways of making the infinitesimal specification P{X(t + Δt) − X(t) = ζ_k | F_t^X} ≈ λ_k(X(t)) Δt precise: 1. The stochastic equation: X(t) = X(0) + ∑_k Y_k(∫_0^t λ_k(X(s)) ds) ζ_k. 2. The process is Markov with infinitesimal generator (Af)(x) = ∑_k λ_k(x) (f(x + ζ_k) − f(x)). 3. The master (forward) equation for the probability distributions: p′_t(x) = ∑_k λ_k(x − ζ_k) p_t(x − ζ_k) − p_t(x) ∑_k λ_k(x). Fortunately, if the solution of the stochastic equation doesn't blow up, the three are equivalent. This model is an example of a continuous time Markov chain.
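For A + B → C started from X(0) = (2, 2, 0), the chain only visits three states, indexed by the reaction count r = 0, 1, 2, so the forward equation can be integrated by hand. A forward-Euler sketch in Python; κ = 1 and the time step are illustrative choices, not values from the slides:

```python
# Integrate the chemical master equation for A + B -> C with X(0) = (2,2,0).
# In state r (= number of reactions so far), (xA, xB) = (2-r, 2-r), so the
# intensity is kappa * (2-r)^2, which vanishes in the absorbing state r = 2.
kappa, dt, T = 1.0, 1e-4, 10.0
p = [1.0, 0.0, 0.0]                                # p[r] = P{R(t) = r}
lam = [kappa * (2 - r) ** 2 for r in range(3)]     # [4, 1, 0]
for _ in range(round(T / dt)):
    dp = [0.0] * 3
    for r in range(3):
        dp[r] -= lam[r] * p[r]                     # probability flowing out
        if r > 0:
            dp[r] += lam[r - 1] * p[r - 1]         # probability flowing in
    p = [pi + dt * dpi for pi, dpi in zip(p, dp)]
# By t = 10 essentially all mass sits in the absorbing state r = 2.
```

Total probability is conserved by construction, and the mass drains into the absorbing state, matching the stochastic-equation picture of the same chain.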
38 Example: ODE Lotka-Volterra predator-prey model Think of A as a prey and B as a predator. A → 2A (κ1), A + B → 2B (κ2), B → ∅ (κ3), with κ1 = 2, κ2 = .2, κ3 = 2. Deterministic model. Let x(t) = [# prey, # predators]^T: x(t) = x(0) + κ1 ∫_0^t x1(s) ds [1, 0]^T + κ2 ∫_0^t x1(s)x2(s) ds [−1, 1]^T + κ3 ∫_0^t x2(s) ds [0, −1]^T.
39 Example: ODE Lotka-Volterra predator-prey model Think of A as a prey and B as a predator. A → 2A (κ1), A + B → 2B (κ2), B → ∅ (κ3), with κ1 = 2, κ2 = .2, κ3 = 2. Deterministic model. Let x(t) = [# prey, # predators]^T: x(t) = x(0) + κ1 ∫_0^t x1(s) ds [1, 0]^T + κ2 ∫_0^t x1(s)x2(s) ds [−1, 1]^T + κ3 ∫_0^t x2(s) ds [0, −1]^T. Stochastic model. Let X(t) = [# prey, # predators]^T:

X(t) = X(0) + Y_1(κ1 ∫_0^t X_1(s) ds) [1, 0]^T + Y_2(κ2 ∫_0^t X_1(s)X_2(s) ds) [−1, 1]^T + Y_3(κ3 ∫_0^t X_2(s) ds) [0, −1]^T.
40 Another example: Viral infection Let 1. T = viral template. 2. G = viral genome. 3. S = viral structure. 4. V = virus. Reactions:
R1) T + stuff → T + G, κ1 = 1
R2) G → T, κ2 = .25
R3) T + stuff → T + S, κ3 = 1
R4) T → ∅, κ4 = .25
R5) S → ∅, κ5 = 2
R6) G + S → V, κ6
R. Srivastava, L. You, J. Summers, and J. Yin, J. Theoret. Biol., 2002. E. Haseltine and J. Rawlings, J. Chem. Phys., 2002. K. Ball, T. Kurtz, L. Popovic, and G. Rempala, Annals of Applied Probability, 2006. W. E, D. Liu, and E. Vanden-Eijnden, J. Comput. Phys., 2006.
41 Another example: Viral infection Stochastic equations for X = (X_G, X_S, X_T, X_V), writing X_1 = X_G, X_2 = X_S, X_3 = X_T, X_4 = X_V:

X_1(t) = X_1(0) + Y_1(κ1 ∫_0^t X_3(s) ds) − Y_2(κ2 ∫_0^t X_1(s) ds) − Y_6(κ6 ∫_0^t X_1(s)X_2(s) ds)
X_2(t) = X_2(0) + Y_3(κ3 ∫_0^t X_3(s) ds) − Y_5(κ5 ∫_0^t X_2(s) ds) − Y_6(κ6 ∫_0^t X_1(s)X_2(s) ds)
X_3(t) = X_3(0) + Y_2(κ2 ∫_0^t X_1(s) ds) − Y_4(κ4 ∫_0^t X_3(s) ds)
X_4(t) = X_4(0) + Y_6(κ6 ∫_0^t X_1(s)X_2(s) ds).
42 Computational methods These are continuous time Markov chains! Simulation/computation should be easy. The most common simulation methods include 1. Gillespie's algorithm: answer where and when independently. 2. The next reaction method of Gibson and Bruck. Each is an example of discrete event simulation.
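A compact sketch of the "when and where" logic of Gillespie's algorithm in Python; the network encoding (a list of intensity function / state-change vector pairs) and the parameter values are illustrative choices:

```python
import random

# Gillespie (stochastic simulation) algorithm: "when" is an Exp(lam0) wait
# with lam0 the sum of the intensities; "where" picks reaction k with
# probability lam_k / lam0.
def gillespie(x0, reactions, T, rng):
    x, t = list(x0), 0.0
    while True:
        lams = [lam(x) for lam, _ in reactions]
        lam0 = sum(lams)
        if lam0 == 0.0:
            return x                         # absorbed: nothing can fire
        t += rng.expovariate(lam0)           # "when"
        if t > T:
            return x
        u, acc = rng.random() * lam0, 0.0    # "where"
        for (lam, zeta), lk in zip(reactions, lams):
            acc += lk
            if u <= acc:
                x = [xi + zi for xi, zi in zip(x, zeta)]
                break

# A + B -> C with kappa = 1, run until B is exhausted:
rng = random.Random(7)
rxns = [(lambda x: 1.0 * x[0] * x[1], (-1, -1, 1))]
xT = gillespie((20, 10, 0), rxns, T=1000.0, rng=rng)
```

Started from (20, 10, 0), the chain is absorbed at (10, 0, 10), forced by the stoichiometry, long before the horizon T.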
43 Numerical methods Each exact method produces sample paths that can approximate values such as (which I will talk about tomorrow at CWI) E f(X(t)) ≈ (1/n) ∑_{i=1}^n f(X^{[i]}(t)), or sensitivities (d/dθ) E f(θ, X_θ(t)). For example: 1. Means, e.g. expected virus yield. 2. Variances. 3. Probabilities.
44 Numerical methods Each exact method produces sample paths that can approximate values such as (which I will talk about tomorrow at CWI) E f(X(t)) ≈ (1/n) ∑_{i=1}^n f(X^{[i]}(t)), or sensitivities (d/dθ) E f(θ, X_θ(t)). For example: 1. Means, e.g. expected virus yield. 2. Variances. 3. Probabilities. Problem: solving using these algorithms can be computationally expensive: 1. Each path may require a significant number of computational steps. 2. May require a significant number of paths. Solution: need to use novel stochastic representations to get good methods.
45 Specific computational problem: Gradient estimation/sensitivity analysis We have X_θ(t) = X_θ(0) + ∑_k Y_k(∫_0^t λ_k(θ, X_θ(s)) ds) ζ_k, with θ ∈ R^s, and we define J(θ) = E f(θ, X_θ(t)). We know how to estimate J(θ) using Monte Carlo.
46 Specific computational problem: Gradient estimation/sensitivity analysis We have X_θ(t) = X_θ(0) + ∑_k Y_k(∫_0^t λ_k(θ, X_θ(s)) ds) ζ_k, with θ ∈ R^s, and we define J(θ) = E f(θ, X_θ(t)). We know how to estimate J(θ) using Monte Carlo. However, what if we want J′(θ) = (d/dθ) E f(θ, X_θ(t))? Thus, we want to know how sensitive our statistic is to perturbations in θ. This tells us, for example: 1. Robustness of the system to perturbations in parameters. 2. Which parameters we need to estimate well from data, etc.
47 Specific computational problem: Gradient estimation/sensitivity analysis We have X_θ(t) = X_θ(0) + ∑_k Y_k(∫_0^t λ_k(θ, X_θ(s)) ds) ζ_k, with θ ∈ R^s, and we define J(θ) = E f(θ, X_θ(t)). We know how to estimate J(θ) using Monte Carlo. However, what if we want J′(θ) = (d/dθ) E f(θ, X_θ(t))? Thus, we want to know how sensitive our statistic is to perturbations in θ. This tells us, for example: 1. Robustness of the system to perturbations in parameters. 2. Which parameters we need to estimate well from data, etc. There are multiple methods. We will consider: finite differences.
48 Finite differencing This method is pretty straightforward and is therefore the most widely used.
49 Finite differencing This method is pretty straightforward and is therefore the most widely used. Simply note that J′(θ) = (J(θ + ɛ) − J(θ))/ɛ + O(ɛ) = E[(f(θ + ɛ, X_{θ+ɛ}(t)) − f(θ, X_θ(t)))/ɛ] + O(ɛ). Centered differencing reduces the bias to O(ɛ²).
50 Finite differencing This method is pretty straightforward and is therefore the most widely used. Simply note that J′(θ) = (J(θ + ɛ) − J(θ))/ɛ + O(ɛ) = E[(f(θ + ɛ, X_{θ+ɛ}(t)) − f(θ, X_θ(t)))/ɛ] + O(ɛ). Centered differencing reduces the bias to O(ɛ²). The usual finite difference estimator is D_N(ɛ) = (1/N) ∑_{i=1}^N (f(θ + ɛ, X^{[i]}_{θ+ɛ}(t)) − f(θ, X^{[i]}_θ(t)))/ɛ. Letting δ > 0 be some desired accuracy (for a confidence interval), we need N so that Var(D_N(ɛ))^{1/2} ≈ δ.
51 Finite differencing Want Var(D_N(ɛ))^{1/2} ≈ δ, with D_N(ɛ) = (1/N) ∑_{i=1}^N (f(θ + ɛ, X^{[i]}_{θ+ɛ}(t)) − f(θ, X^{[i]}_θ(t)))/ɛ. If the paths are generated independently, then Var(D_N(ɛ)) = N^{−1} ɛ^{−2} Var(f(θ + ɛ, X^{[i]}_{θ+ɛ}(t)) − f(θ, X^{[i]}_θ(t))) = O(N^{−1} ɛ^{−2}), implying N^{−1/2} ɛ^{−1} = O(δ), i.e. N = O(ɛ^{−2} δ^{−2}). Terrible. Worse than for expectations. How about common random numbers for variance reduction?
52 Common random numbers It's exactly what it sounds like. Reuse the random numbers used in the generation of X^{[i]}_{θ+ɛ}(t) and X^{[i]}_θ(t). Why?
53 Common random numbers It's exactly what it sounds like. Reuse the random numbers used in the generation of X^{[i]}_{θ+ɛ}(t) and X^{[i]}_θ(t). Why? Because: Var(f(θ + ɛ, X^{[i]}_{θ+ɛ}(t)) − f(θ, X^{[i]}_θ(t))) = Var(f(θ + ɛ, X^{[i]}_{θ+ɛ}(t))) + Var(f(θ, X^{[i]}_θ(t))) − 2 Cov(f(θ + ɛ, X^{[i]}_{θ+ɛ}(t)), f(θ, X^{[i]}_θ(t))). So, if we can couple the random variables, we can get a variance reduction! Sometimes substantial.
54 Common random numbers In the context of Gillespie's algorithm, we simply reuse all the same random numbers (uniforms). This can be achieved simply by setting the seed of the random number generator before generating X_{θ+ɛ} and X_θ.
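A toy illustration of the seed-reuse idea in Python. The model, a pure birth process ∅ → A with rate θ so that E X(T) = θT and J′(θ) = T exactly, is an illustrative stand-in for a full Gillespie simulation, not a model from the slides:

```python
import random

# Pure birth process 0 -> A with rate theta: X(T) ~ Poisson(theta * T),
# so J(theta) = theta*T and the exact sensitivity is J'(theta) = T.
def birth_count(theta, T, seed):
    rng, t, n = random.Random(seed), 0.0, 0
    while True:
        t += rng.expovariate(theta)
        if t > T:
            return n
        n += 1

def fd_samples(theta, eps, T, n_paths, crn):
    """Per-path finite difference samples; CRN reuses the seed for both paths."""
    out = []
    for i in range(n_paths):
        seed_hi = i if crn else 2 * i          # same seed iff CRN
        seed_lo = i if crn else 2 * i + 1
        d = birth_count(theta + eps, T, seed_hi) - birth_count(theta, T, seed_lo)
        out.append(d / eps)
    return out

def mean_var(xs):
    m = sum(xs) / len(xs)
    return m, sum((x - m) ** 2 for x in xs) / len(xs)

m_crn, v_crn = mean_var(fd_samples(2.0, 0.1, 5.0, 500, crn=True))
m_ind, v_ind = mean_var(fd_samples(2.0, 0.1, 5.0, 500, crn=False))
# Both estimate J'(2) = T = 5; the coupled version has far smaller variance.
```

Reusing the seed makes both paths read the same uniform stream, so the two counts ride the same underlying point process and their difference is small.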
55 Common random numbers CRN + Gillespie is a good idea: 1. Costs little in terms of implementation. 2. Variance reduction and gains in efficiency can be huge. Thus, it is probably the most common method used today. But: over time, the processes decouple, often completely. Can we do better?
56 Coupling Using common random numbers in the previous fashion is a way of coupling the two processes together.
57 Coupling Using common random numbers in the previous fashion is a way of coupling the two processes together. Is there a natural way to couple processes using the random time change representation? Can we couple the Poisson processes?
58 Coupling Using common random numbers in the previous fashion is a way of coupling the two processes together. Is there a natural way to couple processes using the random time change representation? Can we couple the Poisson processes? Answer: yes, in multiple ways. I will show one which works very well.
59 How do we generate processes simultaneously Suppose I want to generate: 1. A Poisson process with intensity 13.1. 2. A Poisson process with intensity 13.
60 How do we generate processes simultaneously Suppose I want to generate: 1. A Poisson process with intensity 13.1. 2. A Poisson process with intensity 13. We could let Y_1 and Y_2 be independent, unit-rate Poisson processes, and set Z_{13.1}(t) = Y_1(13.1 t), Z_13(t) = Y_2(13 t). Using this representation, these processes are independent and, hence, not coupled.
61 How do we generate processes simultaneously Suppose I want to generate: 1. A Poisson process with intensity 13.1. 2. A Poisson process with intensity 13. We could let Y_1 and Y_2 be independent, unit-rate Poisson processes, and set Z_{13.1}(t) = Y_1(13.1 t), Z_13(t) = Y_2(13 t). Using this representation, these processes are independent and, hence, not coupled. The variance of the difference is large: Var(Z_{13.1}(t) − Z_13(t)) = Var(Y_1(13.1 t)) + Var(Y_2(13 t)) = 26.1 t.
62 How do we generate processes simultaneously Suppose I want to generate: 1. A Poisson process with intensity 13.1. 2. A Poisson process with intensity 13.
63 How do we generate processes simultaneously Suppose I want to generate: 1. A Poisson process with intensity 13.1. 2. A Poisson process with intensity 13. We could instead let Y_1 and Y_2 be independent unit-rate Poisson processes, and set Z_{13.1}(t) = Y_1(13 t) + Y_2(.1 t), Z_13(t) = Y_1(13 t). The variance of the difference is much smaller: Var(Z_{13.1}(t) − Z_13(t)) = Var(Y_2(.1 t)) = .1 t. Using a fact: the sum of independent Poisson processes is again a Poisson process.
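The variance claim is easy to check by simulation; in the Python sketch below, the sample size, seed, and time t are arbitrary illustrative choices:

```python
import random

# Split coupling: Z_13.1(t) = Y1(13 t) + Y2(0.1 t), Z_13(t) = Y1(13 t),
# so the difference is Y2(0.1 t) ~ Poisson(0.1 t), variance 0.1 t,
# versus 26.1 t for the independent construction.
def poisson_sample(mean, rng):
    """Y(mean) for a unit-rate Poisson process Y, via Exp(1) spacings."""
    n, s = 0, rng.expovariate(1.0)
    while s <= mean:
        n += 1
        s += rng.expovariate(1.0)
    return n

rng, t = random.Random(1), 10.0
diffs = []
for _ in range(2000):
    shared = poisson_sample(13.0 * t, rng)    # Y1(13 t), used by both Z's
    extra = poisson_sample(0.1 * t, rng)      # Y2(0.1 t)
    diffs.append((shared + extra) - shared)   # Z_13.1(t) - Z_13(t)
mean_d = sum(diffs) / len(diffs)
var_d = sum((d - mean_d) ** 2 for d in diffs) / len(diffs)
# mean_d and var_d should both sit near 0.1 * t = 1, far below 26.1 * t = 261.
```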
64 How do we generate processes simultaneously More generally, suppose we want 1. a non-homogeneous Poisson process with intensity f(t), and 2. a non-homogeneous Poisson process with intensity g(t).
65 How do we generate processes simultaneously More generally, suppose we want 1. a non-homogeneous Poisson process with intensity f(t), and 2. a non-homogeneous Poisson process with intensity g(t). We can let Y_1, Y_2, and Y_3 be independent, unit-rate Poisson processes and define

Z_f(t) = Y_1(∫_0^t f(s) ∧ g(s) ds) + Y_2(∫_0^t f(s) − (f(s) ∧ g(s)) ds),
Z_g(t) = Y_1(∫_0^t f(s) ∧ g(s) ds) + Y_3(∫_0^t g(s) − (f(s) ∧ g(s)) ds),

where a ∧ b = min(a, b).
66 How do we generate processes simultaneously More generally, suppose we want 1. a non-homogeneous Poisson process with intensity f(t), and 2. a non-homogeneous Poisson process with intensity g(t). We can let Y_1, Y_2, and Y_3 be independent, unit-rate Poisson processes and define

Z_f(t) = Y_1(∫_0^t f(s) ∧ g(s) ds) + Y_2(∫_0^t f(s) − (f(s) ∧ g(s)) ds),
Z_g(t) = Y_1(∫_0^t f(s) ∧ g(s) ds) + Y_3(∫_0^t g(s) − (f(s) ∧ g(s)) ds),

where a ∧ b = min(a, b) and we are using that, for example, Y_1(∫_0^t f(s) ∧ g(s) ds) + Y_2(∫_0^t f(s) − (f(s) ∧ g(s)) ds) = Y(∫_0^t f(s) ds), where Y is a unit-rate Poisson process.
67 Parameter sensitivities. Couple the processes:

X_{θ+ɛ}(t) = X_{θ+ɛ}(0) + ∑_k Y_{k,1}(∫_0^t λ_k^{θ+ɛ}(X_{θ+ɛ}(s)) ∧ λ_k^θ(X_θ(s)) ds) ζ_k + ∑_k Y_{k,2}(∫_0^t λ_k^{θ+ɛ}(X_{θ+ɛ}(s)) − λ_k^{θ+ɛ}(X_{θ+ɛ}(s)) ∧ λ_k^θ(X_θ(s)) ds) ζ_k,

X_θ(t) = X_θ(0) + ∑_k Y_{k,1}(∫_0^t λ_k^{θ+ɛ}(X_{θ+ɛ}(s)) ∧ λ_k^θ(X_θ(s)) ds) ζ_k + ∑_k Y_{k,3}(∫_0^t λ_k^θ(X_θ(s)) − λ_k^{θ+ɛ}(X_{θ+ɛ}(s)) ∧ λ_k^θ(X_θ(s)) ds) ζ_k.
68 Parameter sensitivities. Theorem³ Suppose (X_{θ+ɛ}, X_θ) satisfy the coupling. Then, for any T > 0 there is a C_{T,f} > 0 for which E sup_{t ≤ T} (f(θ + ɛ, X_{θ+ɛ}(t)) − f(θ, X_θ(t)))² ≤ C_{T,f} ɛ. ³ David F. Anderson, An Efficient Finite Difference Method for Parameter Sensitivities of Continuous Time Markov Chains, SIAM Journal on Numerical Analysis, Vol. 50, No. 5, 2012.
69 Parameter sensitivities. Theorem³ Suppose (X_{θ+ɛ}, X_θ) satisfy the coupling. Then, for any T > 0 there is a C_{T,f} > 0 for which E sup_{t ≤ T} (f(θ + ɛ, X_{θ+ɛ}(t)) − f(θ, X_θ(t)))² ≤ C_{T,f} ɛ. This lowers the variance of the estimator from O(N^{−1} ɛ^{−2}) to O(N^{−1} ɛ^{−1}): lowered by an order of magnitude (in ɛ). Point: a deeper mathematical understanding led to a better computational method. ³ David F. Anderson, An Efficient Finite Difference Method for Parameter Sensitivities of Continuous Time Markov Chains, SIAM Journal on Numerical Analysis, Vol. 50, No. 5, 2012.
70 Analysis Theorem Suppose (X_{θ+ɛ}, X_θ) satisfy the coupling. Then, for any T > 0 there is a C_{T,f} > 0 for which E sup_{t ≤ T} (f(θ + ɛ, X_{θ+ɛ}(t)) − f(θ, X_θ(t)))² ≤ C_{T,f} ɛ. Proof:
71 Analysis Theorem Suppose (X_{θ+ɛ}, X_θ) satisfy the coupling. Then, for any T > 0 there is a C_{T,f} > 0 for which E sup_{t ≤ T} (f(θ + ɛ, X_{θ+ɛ}(t)) − f(θ, X_θ(t)))² ≤ C_{T,f} ɛ. Proof: Key observation of the proof: X_{θ+ɛ}(t) − X_θ(t) = M_{θ,ɛ}(t) + ∫_0^t [F_{θ+ɛ}(X_{θ+ɛ}(s)) − F_θ(X_θ(s))] ds, where most of the jumps have vanished. Now work on the martingale and absolutely continuous parts.
72 Example: gene transcription and translation G → G + M (2), M → M + P (1), M → ∅ (k), P → ∅ (1). Want (d/dθ) E[X^θ_protein(3)], θ = 1/4.
73 Example: gene transcription and translation G → G + M (2), M → M + P (1), M → ∅ (k), P → ∅ (1). Want (d/dθ) E[X^θ_protein(3)], θ = 1/4.

Method      R       95% CI   # updates   CPU time
Likelihood  689,…   ±…       …           …,56.6 s
CMC         246,…   ±…       …           …,364.8 s
CRP/CRN     25,…    ±…       …           …
CFD         4,…     ±…       …           …

Table: each finite difference method used ɛ = 1/4. The exact value is J′(1/4) = …
74 Comparison from 5,… samples each with ɛ = 1/… (Figure: estimator variance versus time for the Coupled Finite Differences, Common Reaction Path, Crude Monte Carlo, and Girsanov Transformation methods.)
75 Example: genetic toggle switch

∅ ⇄ X_1 with intensities λ_1(X), λ_2(X); ∅ ⇄ X_2 with intensities λ_3(X), λ_4(X), (1)

with intensity functions λ_1(X(t)) = α_1/(1 + X_2(t)^β), λ_2(X(t)) = X_1(t), λ_3(X(t)) = α_2/(1 + X_1(t)^γ), λ_4(X(t)) = X_2(t), and parameter choice α_1 = 5, α_2 = 16, β = 2.5, γ = 1. Begin the process with initial condition [0, 0] and consider the sensitivity of X_1 as a function of α_1.
76 Example: genetic toggle switch (Figure: time plot, to T = 4, of the variance of the Coupled Finite Difference estimator versus the Common Reaction Path estimator for the model (1). Each plot was generated using 1,… sample paths; a perturbation of ɛ = 1/1… was used.)
77 Are these representations only good for simulation? LLN and ODEs (Tom Kurtz, 1970s) Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1).
78 Are these representations only good for simulation? LLN and ODEs (Tom Kurtz, 1970s) Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1). Under mild assumptions, we have λ_k(X_N(t)) = λ_k(N X̄_N(t)) = N λ̄_k(X̄_N(t)).
79 Are these representations only good for simulation? LLN and ODEs (Tom Kurtz, 1970s) Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1). Under mild assumptions, we have λ_k(X_N(t)) = λ_k(N X̄_N(t)) = N λ̄_k(X̄_N(t)). Then X_N(t) = X_N(0) + ∑_k Y_k(∫_0^t λ_k(X_N(s)) ds) ζ_k becomes X̄_N(t) = X̄_N(0) + ∑_k N^{−1} Y_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k.
80 Are these representations only good for simulation? LLN and ODEs (Tom Kurtz, 1970s) Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1). Under mild assumptions, we have λ_k(X_N(t)) = λ_k(N X̄_N(t)) = N λ̄_k(X̄_N(t)). Then X_N(t) = X_N(0) + ∑_k Y_k(∫_0^t λ_k(X_N(s)) ds) ζ_k becomes X̄_N(t) = X̄_N(0) + ∑_k N^{−1} Y_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k. Use that lim_N sup_{u ≤ U} |N^{−1} Y(Nu) − u| = 0,
81 Are these representations only good for simulation? LLN and ODEs (Tom Kurtz, 1970s) Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1). Under mild assumptions, we have λ_k(X_N(t)) = λ_k(N X̄_N(t)) = N λ̄_k(X̄_N(t)). Then X_N(t) = X_N(0) + ∑_k Y_k(∫_0^t λ_k(X_N(s)) ds) ζ_k becomes X̄_N(t) = X̄_N(0) + ∑_k N^{−1} Y_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k. Use that lim_N sup_{u ≤ U} |N^{−1} Y(Nu) − u| = 0 to find that X̄_N(t) converges to the solution of the classical ODE x(t) = x(0) + ∑_k ∫_0^t λ̄_k(x(s)) ds ζ_k ≡ x(0) + ∫_0^t F(x(s)) ds.
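A quick numerical check of the law of large numbers on the simplest network, ∅ → A (intensity Nκ1) and A → ∅ (intensity κ2 x), whose scaled limit solves ẋ = κ1 − κ2 x. This toy model and all parameter values are illustrative choices, not from the slides:

```python
import random, math

# Gillespie simulation of 0 -> A (intensity N*k1) and A -> 0 (intensity k2*x);
# the scaled endpoint X_N(T)/N should be within O(N^{-1/2}) of the ODE value
# x(T) = k1/k2 + (x0 - k1/k2) * exp(-k2*T).
def scaled_endpoint(N, k1, k2, x0, T, rng):
    x, t = round(N * x0), 0.0
    while True:
        lam_birth, lam_death = N * k1, k2 * x
        lam0 = lam_birth + lam_death
        t += rng.expovariate(lam0)
        if t > T:
            return x / N
        if rng.random() * lam0 <= lam_birth:
            x += 1
        else:
            x -= 1

rng = random.Random(3)
k1, k2, x0, T = 1.0, 1.0, 0.0, 3.0
x_ode = k1 / k2 + (x0 - k1 / k2) * math.exp(-k2 * T)   # = 1 - e^{-3}
x_scaled = scaled_endpoint(10_000, k1, k2, x0, T, rng)
```

For N = 10,000 the fluctuations of X̄_N around the ODE solution are on the order of N^{−1/2} = 0.01.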
82 Diffusions? Argument due to Tom Kurtz Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1).
83 Diffusions? Argument due to Tom Kurtz Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1). Then X_N(t) = X_N(0) + ∑_k Y_k(∫_0^t λ_k(X_N(s)) ds) ζ_k becomes X̄_N(t) = X̄_N(0) + ∑_k N^{−1} Y_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k.
84 Diffusions? Argument due to Tom Kurtz Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1). Then X_N(t) = X_N(0) + ∑_k Y_k(∫_0^t λ_k(X_N(s)) ds) ζ_k becomes X̄_N(t) = X̄_N(0) + ∑_k N^{−1} Y_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k. Use that N^{−1/2} (Y_k(Nu) − Nu) ≈ W_k(u),
85 Diffusions? Argument due to Tom Kurtz Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1). Then X_N(t) = X_N(0) + ∑_k Y_k(∫_0^t λ_k(X_N(s)) ds) ζ_k becomes X̄_N(t) = X̄_N(0) + ∑_k N^{−1} Y_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k. Use that N^{−1/2} (Y_k(Nu) − Nu) ≈ W_k(u) to find that X̄_N(t) is well approximated by the chemical Langevin process

X̄(t) = X̄(0) + ∑_k ζ_k ∫_0^t λ̄_k(X̄(s)) ds + N^{−1/2} ∑_k ζ_k ∫_0^t √(λ̄_k(X̄(s))) dW_k(s),

or dX̄(t) = F(X̄(t)) dt + N^{−1/2} ∑_k ζ_k √(λ̄_k(X̄(t))) dW_k(t).
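An Euler-Maruyama sketch of the chemical Langevin equation for the same toy network ∅ → A (rate κ1), A → ∅ (rate κ2 X̄) used above, in concentration form; the model, parameters, and clipping guard are illustrative choices:

```python
import random, math

# dX = (k1 - k2*X) dt + N^{-1/2} [ sqrt(k1) dW1 - sqrt(k2*X) dW2 ],
# simulated with Euler-Maruyama. The max(..., 0) guard keeps the sqrt
# argument nonnegative if the noise pushes X slightly below zero.
def cle_endpoint(N, k1, k2, x0, T, dt, rng):
    x = x0
    for _ in range(round(T / dt)):
        drift = k1 - k2 * x
        noise = (math.sqrt(k1) * rng.gauss(0.0, 1.0)
                 - math.sqrt(max(k2 * x, 0.0)) * rng.gauss(0.0, 1.0))
        x += drift * dt + math.sqrt(dt / N) * noise
    return x

rng = random.Random(5)
xT = cle_endpoint(N=10_000, k1=1.0, k2=1.0, x0=0.0, T=3.0, dt=1e-3, rng=rng)
ode = 1.0 - math.exp(-3.0)     # deterministic (LLN) limit of the same network
```

The diffusion stays within O(N^{−1/2}) of the deterministic limit, which is the point of the Langevin approximation.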
86 Central limit theorem - Kurtz / Van Kampen Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1), and let x(t) = ODE solution. Set U_N(t) = √N (X̄_N(t) − x(t)) = (X_N(t) − N x(t)) / √N.
87 Central limit theorem - Kurtz / Van Kampen Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1), and let x(t) = ODE solution. Set U_N(t) = √N (X̄_N(t) − x(t)) = (X_N(t) − N x(t)) / √N. Then, writing Ỹ_k(u) = Y_k(u) − u for the centered Poisson process,

U_N(t) = ∑_k N^{−1/2} Ỹ_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k + √N ∫_0^t (F(X̄_N(s)) − F(x(s))) ds
≈ ∑_k N^{−1/2} Ỹ_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k + ∫_0^t DF(x(s)) U_N(s) ds.
88 Central limit theorem - Kurtz / Van Kampen Suppose X_N(t) = O(N). Denote concentrations via X̄_N(t) = N^{−1} X_N(t) = O(1), and let x(t) = ODE solution. Set U_N(t) = √N (X̄_N(t) − x(t)) = (X_N(t) − N x(t)) / √N. Then, writing Ỹ_k(u) = Y_k(u) − u for the centered Poisson process,

U_N(t) = ∑_k N^{−1/2} Ỹ_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k + √N ∫_0^t (F(X̄_N(s)) − F(x(s))) ds
≈ ∑_k N^{−1/2} Ỹ_k(N ∫_0^t λ̄_k(X̄_N(s)) ds) ζ_k + ∫_0^t DF(x(s)) U_N(s) ds.

Use the martingale central limit theorem to show that N^{−1/2} Ỹ_k(N ·) ⇒ W_k(·), and get U_N ⇒ U, where U(t) = ∑_k ζ_k W_k(∫_0^t λ̄_k(x(s)) ds) + ∫_0^t DF(x(s)) U(s) ds.
89 Thanks! References: 1. David F. Anderson, An Efficient Finite Difference Method for Parameter Sensitivities of Continuous Time Markov Chains, SIAM Journal on Numerical Analysis, Vol. 50, No. 5, 2012. 2. David F. Anderson and Thomas G. Kurtz, Continuous Time Markov Chain Models for Chemical Reaction Networks, in Design and Analysis of Biomolecular Circuits, Springer, 2011, Eds. Heinz Koeppl et al. Funding: NSF-DMS.
More informationFiltrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition
Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,
More informationSupporting Information for The Stochastic Quasi-Steady-State Assumption: Reducing the Model but Not the Noise
TWCCC Texas Wisconsin California Control Consortium Technical report number 2011-03 Supporting Information for The Stochastic Quasi-Steady-State Assumption: Reducing the Model but Not the Noise Rishi Srivastava
More informationComparison of Two Parameter Estimation Techniques for Stochastic Models
East Tennessee State University Digital Commons @ East Tennessee State University Electronic Theses and Dissertations 8-2015 Comparison of Two Parameter Estimation Techniques for Stochastic Models Thomas
More informationExtending the multi-level method for the simulation of stochastic biological systems
Extending the multi-level method for the simulation of stochastic biological systems Christopher Lester Ruth E. Baker Michael B. Giles Christian A. Yates 29 February 216 Abstract The multi-level method
More informationNested Stochastic Simulation Algorithm for Chemical Kinetic Systems with Disparate Rates. Abstract
Nested Stochastic Simulation Algorithm for Chemical Kinetic Systems with Disparate Rates Weinan E Department of Mathematics and PACM, Princeton University, Princeton, NJ 08544, USA Di Liu and Eric Vanden-Eijnden
More informationIntroduction to Stochastic Processes with Applications in the Biosciences
Introduction to Stochastic Processes with Applications in the Biosciences David F. Anderson University of Wisconsin at Madison Copyright c 213 by David F. Anderson. Contents 1 Introduction 4 1.1 Why study
More informationON THE APPROXIMATION OF STOCHASTIC CONCURRENT CONSTRAINT PROGRAMMING BY MASTER EQUATION
ON THE APPROXIMATION OF STOCHASTIC CONCURRENT CONSTRAINT PROGRAMMING BY MASTER EQUATION Luca Bortolussi 1,2 1 Department of Mathematics and Informatics University of Trieste luca@dmi.units.it 2 Center
More informationNested Stochastic Simulation Algorithm for KMC with Multiple Time-Scales
Nested Stochastic Simulation Algorithm for KMC with Multiple Time-Scales Eric Vanden-Eijnden Courant Institute Join work with Weinan E and Di Liu Chemical Systems: Evolution of an isothermal, spatially
More informationKinetic Monte Carlo. Heiko Rieger. Theoretical Physics Saarland University Saarbrücken, Germany
Kinetic Monte Carlo Heiko Rieger Theoretical Physics Saarland University Saarbrücken, Germany DPG school on Efficient Algorithms in Computational Physics, 10.-14.9.2012, Bad Honnef Intro Kinetic Monte
More informationLAW OF LARGE NUMBERS FOR THE SIRS EPIDEMIC
LAW OF LARGE NUMBERS FOR THE SIRS EPIDEMIC R. G. DOLGOARSHINNYKH Abstract. We establish law of large numbers for SIRS stochastic epidemic processes: as the population size increases the paths of SIRS epidemic
More informationLecture 4 The stochastic ingredient
Lecture 4 The stochastic ingredient Luca Bortolussi 1 Alberto Policriti 2 1 Dipartimento di Matematica ed Informatica Università degli studi di Trieste Via Valerio 12/a, 34100 Trieste. luca@dmi.units.it
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationIntroduction to Random Diffusions
Introduction to Random Diffusions The main reason to study random diffusions is that this class of processes combines two key features of modern probability theory. On the one hand they are semi-martingales
More informationCybergenetics: Control theory for living cells
Department of Biosystems Science and Engineering, ETH-Zürich Cybergenetics: Control theory for living cells Corentin Briat Joint work with Ankit Gupta and Mustafa Khammash Introduction Overview Cybergenetics:
More informationMultilevel Monte Carlo for Lévy Driven SDEs
Multilevel Monte Carlo for Lévy Driven SDEs Felix Heidenreich TU Kaiserslautern AG Computational Stochastics August 2011 joint work with Steffen Dereich Philipps-Universität Marburg supported within DFG-SPP
More informationStochastic Chemical Kinetics
Stochastic Chemical Kinetics Joseph K Scott November 10, 2011 1 Introduction to Stochastic Chemical Kinetics Consider the reaction I + I D The conventional kinetic model for the concentration of I in a
More informationLecture 1 Modeling in Biology: an introduction
Lecture 1 in Biology: an introduction Luca Bortolussi 1 Alberto Policriti 2 1 Dipartimento di Matematica ed Informatica Università degli studi di Trieste Via Valerio 12/a, 34100 Trieste. luca@dmi.units.it
More informationContinuous-time Markov Chains
Continuous-time Markov Chains Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 23, 2017
More informationExponential Distribution and Poisson Process
Exponential Distribution and Poisson Process Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 215 Outline Introduction Exponential
More informationBayesian Inference for DSGE Models. Lawrence J. Christiano
Bayesian Inference for DSGE Models Lawrence J. Christiano Outline State space-observer form. convenient for model estimation and many other things. Bayesian inference Bayes rule. Monte Carlo integation.
More informationp 1 ( Y p dp) 1/p ( X p dp) 1 1 p
Doob s inequality Let X(t) be a right continuous submartingale with respect to F(t), t 1 P(sup s t X(s) λ) 1 λ {sup s t X(s) λ} X + (t)dp 2 For 1 < p
More informationStochastic Differential Equations.
Chapter 3 Stochastic Differential Equations. 3.1 Existence and Uniqueness. One of the ways of constructing a Diffusion process is to solve the stochastic differential equation dx(t) = σ(t, x(t)) dβ(t)
More informationFrequency Spectra and Inference in Population Genetics
Frequency Spectra and Inference in Population Genetics Although coalescent models have come to play a central role in population genetics, there are some situations where genealogies may not lead to efficient
More informationBernardo D Auria Stochastic Processes /12. Notes. March 29 th, 2012
1 Stochastic Calculus Notes March 9 th, 1 In 19, Bachelier proposed for the Paris stock exchange a model for the fluctuations affecting the price X(t) of an asset that was given by the Brownian motion.
More informationComputational complexity analysis for Monte Carlo approximations of classically scaled population processes
Computational complexity analysis for Monte Carlo approximations of classically scaled population processes David F. Anderson, Desmond J. Higham, Yu Sun arxiv:52.588v3 [math.a] 4 Jun 28 June 5, 28 Abstract
More informationMA 777: Topics in Mathematical Biology
MA 777: Topics in Mathematical Biology David Murrugarra Department of Mathematics, University of Kentucky http://www.math.uky.edu/~dmu228/ma777/ Spring 2018 David Murrugarra (University of Kentucky) Lecture
More information1 Delayed Renewal Processes: Exploiting Laplace Transforms
IEOR 6711: Stochastic Models I Professor Whitt, Tuesday, October 22, 213 Renewal Theory: Proof of Blackwell s theorem 1 Delayed Renewal Processes: Exploiting Laplace Transforms The proof of Blackwell s
More informationMultifidelity Approaches to Approximate Bayesian Computation
Multifidelity Approaches to Approximate Bayesian Computation Thomas P. Prescott Wolfson Centre for Mathematical Biology University of Oxford Banff International Research Station 11th 16th November 2018
More informationA Note on the Central Limit Theorem for a Class of Linear Systems 1
A Note on the Central Limit Theorem for a Class of Linear Systems 1 Contents Yukio Nagahata Department of Mathematics, Graduate School of Engineering Science Osaka University, Toyonaka 560-8531, Japan.
More informationPredator-Prey Population Dynamics
Predator-Prey Population Dynamics Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ October 2,
More informationc 2018 Society for Industrial and Applied Mathematics
SIAM J. APPL. MATH. Vol. 78, No. 5, pp. 2692 2713 c 2018 Society for Industrial and Applied Mathematics SOME NETWORK CONDITIONS FOR POSITIVE RECURRENCE OF STOCHASTICALLY MODELED REACTION NETWORKS DAVID
More informationGene Expression as a Stochastic Process: From Gene Number Distributions to Protein Statistics and Back
Gene Expression as a Stochastic Process: From Gene Number Distributions to Protein Statistics and Back June 19, 2007 Motivation & Basics A Stochastic Approach to Gene Expression Application to Experimental
More informationMalliavin Calculus in Finance
Malliavin Calculus in Finance Peter K. Friz 1 Greeks and the logarithmic derivative trick Model an underlying assent by a Markov process with values in R m with dynamics described by the SDE dx t = b(x
More informationLecture on Parameter Estimation for Stochastic Differential Equations. Erik Lindström
Lecture on Parameter Estimation for Stochastic Differential Equations Erik Lindström Recap We are interested in the parameters θ in the Stochastic Integral Equations X(t) = X(0) + t 0 µ θ (s, X(s))ds +
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationThis is now an algebraic equation that can be solved simply:
Simulation of CTMC Monday, November 23, 2015 1:55 PM Homework 4 will be posted by tomorrow morning, due Friday, December 11 at 5 PM. Let's solve the Kolmogorov forward equation for the Poisson counting
More informationLatent voter model on random regular graphs
Latent voter model on random regular graphs Shirshendu Chatterjee Cornell University (visiting Duke U.) Work in progress with Rick Durrett April 25, 2011 Outline Definition of voter model and duality with
More informationStochastic Simulation of Biochemical Reactions
1 / 75 Stochastic Simulation of Biochemical Reactions Jorge Júlvez University of Zaragoza 2 / 75 Outline 1 Biochemical Kinetics 2 Reaction Rate Equation 3 Chemical Master Equation 4 Stochastic Simulation
More informationMethods of Data Analysis Random numbers, Monte Carlo integration, and Stochastic Simulation Algorithm (SSA / Gillespie)
Methods of Data Analysis Random numbers, Monte Carlo integration, and Stochastic Simulation Algorithm (SSA / Gillespie) Week 1 1 Motivation Random numbers (RNs) are of course only pseudo-random when generated
More informationBayesian Inference for DSGE Models. Lawrence J. Christiano
Bayesian Inference for DSGE Models Lawrence J. Christiano Outline State space-observer form. convenient for model estimation and many other things. Preliminaries. Probabilities. Maximum Likelihood. Bayesian
More informationSMSTC (2007/08) Probability.
SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................
More informationMartingale Problems. Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi
s Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi Lectures on Probability and Stochastic Processes III Indian Statistical Institute, Kolkata 20 24 November
More informationDynamic Risk Measures and Nonlinear Expectations with Markov Chain noise
Dynamic Risk Measures and Nonlinear Expectations with Markov Chain noise Robert J. Elliott 1 Samuel N. Cohen 2 1 Department of Commerce, University of South Australia 2 Mathematical Insitute, University
More informationWeek 9 Generators, duality, change of measure
Week 9 Generators, duality, change of measure Jonathan Goodman November 18, 013 1 Generators This section describes a common abstract way to describe many of the differential equations related to Markov
More informationStability of Stochastic Differential Equations
Lyapunov stability theory for ODEs s Stability of Stochastic Differential Equations Part 1: Introduction Department of Mathematics and Statistics University of Strathclyde Glasgow, G1 1XH December 2010
More information(implicitly assuming time-homogeneity from here on)
Continuous-Time Markov Chains Models Tuesday, November 15, 2011 2:02 PM The fundamental object describing the dynamics of a CTMC (continuous-time Markov chain) is the probability transition (matrix) function:
More informationDynamic Partitioning of Large Discrete Event Biological Systems for Hybrid Simulation and Analysis
Dynamic Partitioning of Large Discrete Event Biological Systems for Hybrid Simulation and Analysis Kaustubh R. Joshi 1, Natasha Neogi 1, and William H. Sanders 1 Coordinated Science Laboratory, University
More informationDerivation of Itô SDE and Relationship to ODE and CTMC Models
Derivation of Itô SDE and Relationship to ODE and CTMC Models Biomathematics II April 23, 2015 Linda J. S. Allen Texas Tech University TTU 1 Euler-Maruyama Method for Numerical Solution of an Itô SDE dx(t)
More informationPositive Harris Recurrence and Diffusion Scale Analysis of a Push Pull Queueing Network. Haifa Statistics Seminar May 5, 2008
Positive Harris Recurrence and Diffusion Scale Analysis of a Push Pull Queueing Network Yoni Nazarathy Gideon Weiss Haifa Statistics Seminar May 5, 2008 1 Outline 1 Preview of Results 2 Introduction Queueing
More informationCIMPA SCHOOL, 2007 Jump Processes and Applications to Finance Monique Jeanblanc
CIMPA SCHOOL, 27 Jump Processes and Applications to Finance Monique Jeanblanc 1 Jump Processes I. Poisson Processes II. Lévy Processes III. Jump-Diffusion Processes IV. Point Processes 2 I. Poisson Processes
More informationReflected Brownian Motion
Chapter 6 Reflected Brownian Motion Often we encounter Diffusions in regions with boundary. If the process can reach the boundary from the interior in finite time with positive probability we need to decide
More informationSensitivity analysis of complex stochastic processes
Sensitivity analysis of complex stochastic processes Yannis Pantazis join work with Markos Katsoulakis University of Massachusetts, Amherst, USA Partially supported by DOE contract: DE-SC0002339 February,
More informationSTOCHASTIC MODELING OF BIOCHEMICAL REACTIONS
STOCHASTIC MODELING OF BIOCHEMICAL REACTIONS Abhyudai Singh and João Pedro Hespanha* Department of Electrical and Computer Engineering University of California, Santa Barbara, CA 93101. Abstract The most
More informationDiscrepancies between extinction events and boundary equilibria in reaction networks
Discrepancies between extinction events and boundary equilibria in reaction networks David F. nderson Daniele Cappelletti September 2, 208 bstract Reaction networks are mathematical models of interacting
More informationABC methods for phase-type distributions with applications in insurance risk problems
ABC methods for phase-type with applications problems Concepcion Ausin, Department of Statistics, Universidad Carlos III de Madrid Joint work with: Pedro Galeano, Universidad Carlos III de Madrid Simon
More informationPoint Process Control
Point Process Control The following note is based on Chapters I, II and VII in Brémaud s book Point Processes and Queues (1981). 1 Basic Definitions Consider some probability space (Ω, F, P). A real-valued
More informationAdaptive timestepping for SDEs with non-globally Lipschitz drift
Adaptive timestepping for SDEs with non-globally Lipschitz drift Mike Giles Wei Fang Mathematical Institute, University of Oxford Workshop on Numerical Schemes for SDEs and SPDEs Université Lille June
More informationLecture 4: Ito s Stochastic Calculus and SDE. Seung Yeal Ha Dept of Mathematical Sciences Seoul National University
Lecture 4: Ito s Stochastic Calculus and SDE Seung Yeal Ha Dept of Mathematical Sciences Seoul National University 1 Preliminaries What is Calculus? Integral, Differentiation. Differentiation 2 Integral
More informationLet (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t
2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition
More informationSTAT STOCHASTIC PROCESSES. Contents
STAT 3911 - STOCHASTIC PROCESSES ANDREW TULLOCH Contents 1. Stochastic Processes 2 2. Classification of states 2 3. Limit theorems for Markov chains 4 4. First step analysis 5 5. Branching processes 5
More informationLMI Methods in Optimal and Robust Control
LMI Methods in Optimal and Robust Control Matthew M. Peet Arizona State University Lecture 15: Nonlinear Systems and Lyapunov Functions Overview Our next goal is to extend LMI s and optimization to nonlinear
More informationThe Wiener Itô Chaos Expansion
1 The Wiener Itô Chaos Expansion The celebrated Wiener Itô chaos expansion is fundamental in stochastic analysis. In particular, it plays a crucial role in the Malliavin calculus as it is presented in
More informationLognormal Moment Closures for Biochemical Reactions
Lognormal Moment Closures for Biochemical Reactions Abhyudai Singh and João Pedro Hespanha Abstract In the stochastic formulation of chemical reactions, the dynamics of the the first M -order moments of
More informationstochnotes Page 1
stochnotes110308 Page 1 Kolmogorov forward and backward equations and Poisson process Monday, November 03, 2008 11:58 AM How can we apply the Kolmogorov equations to calculate various statistics of interest?
More informationMatrix Solutions to Linear Systems of ODEs
Matrix Solutions to Linear Systems of ODEs James K. Peterson Department of Biological Sciences and Department of Mathematical Sciences Clemson University November 3, 216 Outline 1 Symmetric Systems of
More informationIntroduction Probabilistic Programming ProPPA Inference Results Conclusions. Embedding Machine Learning in Stochastic Process Algebra.
Embedding Machine Learning in Stochastic Process Algebra Jane Hillston Joint work with Anastasis Georgoulas and Guido Sanguinetti, School of Informatics, University of Edinburgh 16th August 2017 quan col....
More informationCan we do statistical inference in a non-asymptotic way? 1
Can we do statistical inference in a non-asymptotic way? 1 Guang Cheng 2 Statistics@Purdue www.science.purdue.edu/bigdata/ ONR Review Meeting@Duke Oct 11, 2017 1 Acknowledge NSF, ONR and Simons Foundation.
More informationStat 451 Lecture Notes Monte Carlo Integration
Stat 451 Lecture Notes 06 12 Monte Carlo Integration Ryan Martin UIC www.math.uic.edu/~rgmartin 1 Based on Chapter 6 in Givens & Hoeting, Chapter 23 in Lange, and Chapters 3 4 in Robert & Casella 2 Updated:
More informationFE610 Stochastic Calculus for Financial Engineers. Stevens Institute of Technology
FE610 Stochastic Calculus for Financial Engineers Lecture 3. Calculaus in Deterministic and Stochastic Environments Steve Yang Stevens Institute of Technology 01/31/2012 Outline 1 Modeling Random Behavior
More informationSTAT 331. Martingale Central Limit Theorem and Related Results
STAT 331 Martingale Central Limit Theorem and Related Results In this unit we discuss a version of the martingale central limit theorem, which states that under certain conditions, a sum of orthogonal
More informationBranching Processes II: Convergence of critical branching to Feller s CSB
Chapter 4 Branching Processes II: Convergence of critical branching to Feller s CSB Figure 4.1: Feller 4.1 Birth and Death Processes 4.1.1 Linear birth and death processes Branching processes can be studied
More informationWhat do we know about EnKF?
What do we know about EnKF? David Kelly Kody Law Andrew Stuart Andrew Majda Xin Tong Courant Institute New York University New York, NY April 10, 2015 CAOS seminar, Courant. David Kelly (NYU) EnKF April
More informationLarge Deviations for Small-Noise Stochastic Differential Equations
Chapter 22 Large Deviations for Small-Noise Stochastic Differential Equations This lecture is at once the end of our main consideration of diffusions and stochastic calculus, and a first taste of large
More informationLecture 8. Instructor: Haipeng Luo
Lecture 8 Instructor: Haipeng Luo Boosting and AdaBoost In this lecture we discuss the connection between boosting and online learning. Boosting is not only one of the most fundamental theories in machine
More informationUniformly Uniformly-ergodic Markov chains and BSDEs
Uniformly Uniformly-ergodic Markov chains and BSDEs Samuel N. Cohen Mathematical Institute, University of Oxford (Based on joint work with Ying Hu, Robert Elliott, Lukas Szpruch) Centre Henri Lebesgue,
More informationLarge Deviations for Small-Noise Stochastic Differential Equations
Chapter 21 Large Deviations for Small-Noise Stochastic Differential Equations This lecture is at once the end of our main consideration of diffusions and stochastic calculus, and a first taste of large
More informationEquilibrium Time, Permutation, Multiscale and Modified Multiscale Entropies for Low-High Infection Level Intracellular Viral Reaction Kinetics
Equilibrium Time, Permutation, Multiscale and Modified Multiscale Entropies for Low-High Infection Level Intracellular Viral Reaction Kinetics Fariborz Taherkhani 1, Farid Taherkhani 2* 1 Department of
More informationOn the numerical solution of the chemical master equation with sums of rank one tensors
ANZIAM J. 52 (CTAC21) pp.c628 C643, 211 C628 On the numerical solution of the chemical master equation with sums of rank one tensors Markus Hegland 1 Jochen Garcke 2 (Received 2 January 211; revised 4
More informationDISCRETE-TIME STOCHASTIC MODELS, SDEs, AND NUMERICAL METHODS. Ed Allen. NIMBioS Tutorial: Stochastic Models With Biological Applications
DISCRETE-TIME STOCHASTIC MODELS, SDEs, AND NUMERICAL METHODS Ed Allen NIMBioS Tutorial: Stochastic Models With Biological Applications University of Tennessee, Knoxville March, 2011 ACKNOWLEDGEMENT I thank
More informationarxiv: v2 [math.na] 20 Dec 2016
SAIONARY AVERAGING FOR MULI-SCALE CONINUOUS IME MARKOV CHAINS USING PARALLEL REPLICA DYNAMICS ING WANG, PER PLECHÁČ, AND DAVID ARISOFF arxiv:169.6363v2 [math.na 2 Dec 216 Abstract. We propose two algorithms
More informationSDE Coefficients. March 4, 2008
SDE Coefficients March 4, 2008 The following is a summary of GARD sections 3.3 and 6., mainly as an overview of the two main approaches to creating a SDE model. Stochastic Differential Equations (SDE)
More informationEnKF and filter divergence
EnKF and filter divergence David Kelly Andrew Stuart Kody Law Courant Institute New York University New York, NY dtbkelly.com December 12, 2014 Applied and computational mathematics seminar, NIST. David
More information