Sensitivity analysis of complex stochastic processes


1 Sensitivity analysis of complex stochastic processes. Yannis Pantazis, joint work with Markos Katsoulakis, University of Massachusetts, Amherst, USA. Partially supported by DOE contract DE-SC. February, 2013

2 Outline Introduction Discrete-time MC Continuous-time MC p53 reaction network Langevin process ZGB lattice model

3 Sensitivity analysis - Definition. Definition: Sensitivity analysis is the study of the impact on the output of a mathematical model or system caused by perturbations in the input. Q: What could be an input? A: Initial data (deterministic or random), intrinsic model parameters (such as temperature, pressure, masses, reaction rates, etc.). Q: What could be an output? A: Output data, observables of the process (such as mean values, variances, higher moments, correlations, etc.), histograms. An input parameter is sensitive when even small changes to this parameter may significantly alter the system output.

4 Sensitivity analysis - Applications Perspectives on the design and control of multiscale systems, Braatz et al., 2006

5 Sensitivity analysis - Applications. Optimal experimental design: Construct experiments in such a way that the parameters can be estimated from the resulting experimental data with the highest statistical quality. Various optimality criteria (A-optimality, D-optimality, E-optimality, etc.) are based on the Fisher information matrix. Robustness: The persistence of a system in a desirable state under (controllable and/or uncontrollable) perturbations of external conditions. Identifiability: The ability of the experimental data and/or the model to estimate the model parameters with high fidelity. Reliability: To ensure that the performance of a system meets some pre-specified target with a given probability. Uncertainty quantification: The science of quantitative characterization and reduction of uncertainties in applications. For instance, not only the values of the model parameters might contain errors but also the model itself could be an approximation.

6 Sensitivity analysis - Global vs Local. Global sensitivity analysis: Study the effect of an input parameter on the output over a wide range of values or under a specified distribution. Insensitive input parameters can be treated without much effort (assign a nominal value) while sensitive input parameters should be treated carefully. Variance-based methods for global SA: Sobol indices (decompose the variance in an ANOVA-like manner), Fourier amplitude sensitivity test (FAST), Morris method. Information-based methods for global SA: relative entropy (or Kullback-Leibler divergence), mutual information (quantifies the input-output mutual dependence). Local sensitivity analysis: Fix the input parameters to a value and study the effect of small perturbations around the fixed parameter on the output. Local SA is typically a prerequisite for global SA.

7 Outline Introduction Discrete-time MC Continuous-time MC p53 reaction network Langevin process ZGB lattice model

8 Deterministic SA. System of ODEs: $\dot{y} = f(t, y; \theta)$, $y(0) = y_0 \in \mathbb{R}^N$. Goal: Perform SA on the model parameters $\theta \in \mathbb{R}^K$. Define the sensitivity indices $s_k = \partial y / \partial \theta_k$. A new system of ODEs is derived and augmented to the previous one: $\dot{s}_k = \frac{\partial f}{\partial y} s_k + \frac{\partial f}{\partial \theta_k}$, $k = 1, \ldots, K$
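The augmented sensitivity system can be sketched numerically. This is a minimal, hypothetical example (not from the talk): for the scalar model $\dot{y} = -\theta y$, the sensitivity $s = \partial y / \partial \theta$ solves $\dot{s} = -\theta s - y$, and both equations are integrated together with forward Euler.

```python
def sensitivity_euler(theta, y0, t_end, dt=1e-4):
    # Integrate y' = -theta*y jointly with its sensitivity s = dy/dtheta,
    # which solves s' = (df/dy)*s + df/dtheta = -theta*s - y.
    y, s = y0, 0.0
    n = int(round(t_end / dt))
    for _ in range(n):
        # tuple assignment so both updates use the previous (y, s)
        y, s = y + dt * (-theta * y), s + dt * (-theta * s - y)
    return y, s
```

The exact solution $y(t) = y_0 e^{-\theta t}$ gives $s(t) = -t\,y_0 e^{-\theta t}$, so the numerical output can be checked directly against it.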

9 Stochastic SA - Observable-based. By stochastic process we merely mean either discrete-time Markov chains (DTMC) with possibly separable state space or continuous-time jump Markov chains (CTMC) with countable state space. Well-mixed reaction networks, spatially-extended systems on lattices, and discretizations of stochastic differential equations are models which belong to the above classes of stochastic processes. A typical example of stochastic SA: $\{x_t\}$ is a MC, $\theta \in \mathbb{R}$ is a model parameter and $f$ is a test function (or observable). Compute $S(\theta, t) = \partial_\theta \, \mathbb{E}_{P_t^\theta}[f(x)]$

10 Stochastic SA - Observable-based. Finite difference approach: approximate the parameter sensitivity with a finite difference, $S(\theta, t) = \frac{\mathbb{E}_{P_t^{\theta+\epsilon}}[f(x)] - \mathbb{E}_{P_t^{\theta}}[f(x)]}{\epsilon} + O(\epsilon)$. Applying typical sampling approaches, $\hat{S}(\theta, t) = \frac{1}{n\epsilon} \sum_{i=1}^{n} \left( f(x_t^{\theta+\epsilon,i}) - f(x_t^{\theta,i}) \right)$ with variance $\mathrm{Var}\!\left(f(x_t^{\theta+\epsilon}) - f(x_t^{\theta})\right) = \mathrm{Var}(f(x_t^{\theta+\epsilon})) + \mathrm{Var}(f(x_t^{\theta})) - 2\,\mathrm{Cov}\!\left(f(x_t^{\theta+\epsilon}), f(x_t^{\theta})\right)$. Important quantities: bias due to the finite difference; variance due to sampling two different stochastic processes.
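The covariance term in the variance formula above is why the two processes are often coupled with common random numbers. A minimal sketch, using a toy model of my own choosing rather than anything from the talk: estimate $\partial_\theta \mathbb{E}[X]$ for $X \sim \mathrm{Exp}(\theta)$ (exact value $-1/\theta^2$), driving both the nominal and the perturbed sample with the same uniform.

```python
import math, random

def fd_sensitivity_exp(theta, eps=1e-3, n=200_000, seed=0):
    # Forward finite difference with common random numbers: the same
    # uniform u drives both samples (inverse-CDF sampling), which makes
    # Cov(f(x^{theta+eps}), f(x^theta)) large and shrinks the variance.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = rng.random()
        x_pert = -math.log1p(-u) / (theta + eps)
        x_base = -math.log1p(-u) / theta
        total += x_pert - x_base
    return total / (n * eps)
```

Without the shared `u` (independent samples per process) the same estimator would need orders of magnitude more samples for comparable accuracy.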

11 Stochastic SA - Observable-based. Likelihood ratio method, P. Glynn. The parameter sensitivity is rewritten as $S(\theta, t) = \partial_\theta \mathbb{E}_{P_t^\theta}[f(x)] = \int f(x) \, \partial_\theta P_t^\theta(x) \, dx = \mathbb{E}_{P_t^\theta}\!\left[f(x) \, \partial_\theta \log P_t^\theta(x)\right] = \mathbb{E}_{Q_{0,t}^\theta}\!\left[f(x) \, \partial_\theta \log Q_{0,t}^\theta(x)\right]$ where $Q_{0,t}$ is the path distribution of the process in the interval $[0, t]$. Derivative-free approach. Variance increases with time.
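The identity above can be sampled directly: only the nominal process is simulated, and each sample is weighted by the score $\partial_\theta \log$-density. A sketch on the same toy distribution $X \sim \mathrm{Exp}(\theta)$ (an illustrative choice, not a model from the talk), where the log-density is $\log\theta - \theta x$ and the score is $1/\theta - x$:

```python
import random

def lr_sensitivity_exp(theta, n=400_000, seed=1):
    # Likelihood-ratio (score function) estimator of d/dtheta E[X]:
    # E[ X * d/dtheta log p_theta(X) ] with score = 1/theta - x.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(theta)
        total += x * (1.0 / theta - x)
    return total / n
```

Note the trade-off stated on the slide: no perturbed simulation and no bias in $\epsilon$, but the variance of the score-weighted estimator grows with the time horizon for path functionals.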

12 Stochastic SA - Density-based. A more holistic approach based on the probability densities of the process. Assuming that $x_t^\theta \sim P_t^\theta(x)\,dx$ and $x_t^{\theta+\epsilon} \sim P_t^{\theta+\epsilon}(x)\,dx$, why not compare the distributions directly? Majda and Gershgorin (2010) compared the distributions between a coarse-grained climate model and a statistical climate model based on relative entropy. Komorowski et al. (2011) computed the Fisher information matrix of the path distribution of a linearized CTMC process.

13 Outline Introduction Discrete-time MC Continuous-time MC p53 reaction network Langevin process ZGB lattice model

14 Preliminaries. Restrict, for the moment, to DTMC processes and to the steady state regime. Steady state means that the probability density of the process at any time instant is independent of the particular time instant. For the parameter vector $\theta \in \mathbb{R}^K$, a DTMC, $\{\sigma_m\}_{m \in \mathbb{Z}_+}$, is defined with path distribution at the time instants $0, 1, \ldots, M$ given by $Q_{0,M}^\theta(\sigma_0, \ldots, \sigma_M) = \mu^\theta(\sigma_0)\, p^\theta(\sigma_0, \sigma_1) \cdots p^\theta(\sigma_{M-1}, \sigma_M)$ where $\mu^\theta(\sigma)$ is the stationary (or invariant) probability density function and $p^\theta(\sigma, \sigma')$ is the transition probability function. Similarly for the DTMC process, $\{\bar{\sigma}_m\}_{m \in \mathbb{Z}_+}$, induced from the parameter vector $\theta + \epsilon$.

15 Relative entropy in path-space. Suggestion: perform parameter sensitivity analysis by comparing the path distributions utilizing relative entropy. Definition: The path-wise relative entropy of $Q_{0,M}^\theta$ w.r.t. $Q_{0,M}^{\theta+\epsilon}$ is $R\!\left(Q_{0,M}^\theta \,\|\, Q_{0,M}^{\theta+\epsilon}\right) := \int \log\!\left(\frac{dQ_{0,M}^\theta}{dQ_{0,M}^{\theta+\epsilon}}\right) dQ_{0,M}^\theta$. Properties: (i) $R(Q_{0,M}^\theta \| Q_{0,M}^{\theta+\epsilon}) \geq 0$; (ii) $R(Q_{0,M}^\theta \| Q_{0,M}^{\theta+\epsilon}) = 0$ iff $Q_{0,M}^\theta = Q_{0,M}^{\theta+\epsilon}$ a.e.; (iii) $R(Q_{0,M}^\theta \| Q_{0,M}^{\theta+\epsilon}) \leq R(Q_{0,M+1}^\theta \| Q_{0,M+1}^{\theta+\epsilon})$

16 Relative entropy decomposition. It is not difficult to see that the path-wise relative entropy at the stationary regime decomposes into two parts, one independent of time and one that grows linearly with time: $R\!\left(Q_{0,M}^\theta \,\|\, Q_{0,M}^{\theta+\epsilon}\right) = \int \mu^\theta(\sigma_0) \prod_{i=0}^{M-1} p^\theta(\sigma_i, \sigma_{i+1}) \, \log \frac{\mu^\theta(\sigma_0) \prod_{i=0}^{M-1} p^\theta(\sigma_i, \sigma_{i+1})}{\mu^{\theta+\epsilon}(\sigma_0) \prod_{i=0}^{M-1} p^{\theta+\epsilon}(\sigma_i, \sigma_{i+1})} \, d\sigma_0 \cdots d\sigma_M = M \, H\!\left(Q_{0,M}^\theta \,\|\, Q_{0,M}^{\theta+\epsilon}\right) + R\!\left(\mu^\theta \,\|\, \mu^{\theta+\epsilon}\right)$ where $H(Q_{0,M}^\theta \| Q_{0,M}^{\theta+\epsilon})$ is the relative entropy rate (RER): $H\!\left(Q_{0,M}^\theta \,\|\, Q_{0,M}^{\theta+\epsilon}\right) = \mathbb{E}_{\mu^\theta}\!\left[\int p^\theta(\sigma, \sigma') \log \frac{p^\theta(\sigma, \sigma')}{p^{\theta+\epsilon}(\sigma, \sigma')} \, d\sigma'\right]$
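For a chain with a small finite state space the RER formula above is a plain double sum. A minimal sketch for a two-state chain (the transition matrices are made up for illustration):

```python
import math

def stationary_two_state(p01, p10):
    # Invariant law of a two-state chain: 0->1 w.p. p01, 1->0 w.p. p10.
    return (p10 / (p01 + p10), p01 / (p01 + p10))

def rer_dtmc(P, Q, mu):
    # Relative entropy rate H(P || Q) per step:
    # sum over states of mu(s) * sum over s' of p(s,s') log(p/q).
    return sum(mu[i] * P[i][j] * math.log(P[i][j] / Q[i][j])
               for i in range(2) for j in range(2))
```

With `P = [[0.7, 0.3], [0.2, 0.8]]` and a perturbed `Q`, the result is nonnegative and vanishes exactly when the two matrices coincide, matching properties (i) and (ii) of the previous slide.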

17 SA - Relative entropy rate. For long times, RER is a sensible measure of parameter sensitivity. Properties: It infers information about the path distribution; consequently, it contains information not only about the invariant measure but also about the stationary dynamics, such as metastable dynamics, exit times, time correlations, etc. No explicit knowledge of the invariant measure is needed; thus, it is suitable for reaction networks and non-equilibrium steady state systems. RER is an observable with a known test function, hence it is tractable and statistical estimators can provide its value easily and efficiently. Different $\epsilon$'s correspond to SA in different directions; however, the perturbed process does not need to be simulated.

18 SA - Fisher information matrix. RER resembles the discrete-derivative SA method. Q: Can we avoid the (discrete) perturbation of $\theta$ by $\epsilon$? A: Yes, by constructing the corresponding Fisher Information Matrix (FIM). It holds, under smoothness assumptions in $\theta$, that $H\!\left(Q_{0,M}^\theta \,\|\, Q_{0,M}^{\theta+\epsilon}\right) = \frac{1}{2} \epsilon^T F_H\!\left(Q_{0,M}^\theta\right) \epsilon + O(|\epsilon|^3)$ where the Fisher information matrix is defined as $F_H\!\left(Q_{0,M}^\theta\right) = \mathbb{E}_{\mu^\theta}\!\left[\int p^\theta(\sigma, \sigma') \, \nabla_\theta \log p^\theta(\sigma, \sigma') \, \nabla_\theta \log p^\theta(\sigma, \sigma')^T \, d\sigma'\right]$
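The quadratic expansion can be verified numerically on a toy chain. In this sketch (my own illustrative parameterization, not from the talk) only the flip probability $p(0,1) = \theta$ of a two-state DTMC depends on the parameter, so the scalar FIM reduces to $\mu^\theta(0)\,(1/\theta + 1/(1-\theta))$, and the directly computed RER should match $\frac{1}{2}\epsilon^2 F_H$ up to $O(\epsilon^3)$.

```python
import math

def fim_two_state(theta, p10=0.2):
    # Path-space FIM when only p(0,1) = theta depends on the parameter;
    # state 1's row is theta-independent and contributes nothing.
    mu0 = p10 / (theta + p10)
    return mu0 * (1.0 / theta + 1.0 / (1.0 - theta))

def rer_two_state(theta, eps, p10=0.2):
    # Directly computed relative entropy rate between the theta-chain
    # and the (theta+eps)-chain, weighted by the invariant law.
    mu0 = p10 / (theta + p10)
    return mu0 * (theta * math.log(theta / (theta + eps))
                  + (1.0 - theta) * math.log((1.0 - theta) / (1.0 - theta - eps)))
```

For $\theta = 0.3$ and $\epsilon = 10^{-3}$ the two quantities agree to within the expected cubic remainder.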

19 SA - Fisher information matrix. The FIM constitutes a derivative-free sensitivity analysis method. In optimal experimental design, maximizing the determinant of the FIM constitutes the D-optimality test, while minimizing the trace of the inverse of the FIM constitutes the A-optimality test. Robustness to parameter perturbations as well as parameter identifiability can be inferred from the FIM: a wider FIM implies a more robust model, while a steeper FIM implies more identifiable parameters. Eigenvalue analysis of the FIM gives the most/least sensitive directions.

20 Preliminaries. A continuous-time jump Markov process is fully determined by the transition rates $c(\sigma, \sigma')$. The total rate is defined by $\lambda(\sigma) = \sum_{\sigma'} c(\sigma, \sigma')$. For the parameter vector $\theta \in \mathbb{R}^K$, a CTMC, $\{\sigma_t\}_{t \in \mathbb{R}_+}$, is defined. Its path distribution in the interval $[0, T]$ is given by $Q_{0,T}^\theta\!\left(\{\sigma_t\}_{t \in [0,T]}\right)$. Similarly for the CTMC, $\{\bar{\sigma}_t\}_{t \in \mathbb{R}_+}$, induced from the parameter vector $\theta + \epsilon$.

21 Relative entropy decomposition. Q: How to compute the Radon-Nikodym derivative $\frac{dQ_{0,T}^\theta}{dQ_{0,T}^{\theta+\epsilon}}$? A: Apply the Girsanov theorem, which asserts that $\frac{dQ_{0,T}^\theta}{dQ_{0,T}^{\theta+\epsilon}}(\{\sigma_t\}) = \frac{\mu^\theta(\sigma_0)}{\mu^{\theta+\epsilon}(\sigma_0)} \exp\!\left\{ \int_0^T \log \frac{c^\theta(\sigma_{s^-}, \sigma_s)}{c^{\theta+\epsilon}(\sigma_{s^-}, \sigma_s)} \, dN_s - \int_0^T \left[\lambda^\theta(\sigma_s) - \lambda^{\theta+\epsilon}(\sigma_s)\right] ds \right\}$. Substituting the above Girsanov formula into the path-wise relative entropy and exploiting the fact that we are at the steady state regime, we obtain $R\!\left(Q_{0,T}^\theta \,\|\, Q_{0,T}^{\theta+\epsilon}\right) = T\, H\!\left(Q_{0,T}^\theta \,\|\, Q_{0,T}^{\theta+\epsilon}\right) + R\!\left(\mu^\theta \,\|\, \mu^{\theta+\epsilon}\right)$. The Relative Entropy Rate for a CTMC is given by $H\!\left(Q_{0,T}^\theta \,\|\, Q_{0,T}^{\theta+\epsilon}\right) = \mathbb{E}_{\mu^\theta}\!\left[ \sum_{\sigma' \in E} c^\theta(\sigma, \sigma') \log \frac{c^\theta(\sigma, \sigma')}{c^{\theta+\epsilon}(\sigma, \sigma')} - \left(\lambda^\theta(\sigma) - \lambda^{\theta+\epsilon}(\sigma)\right) \right]$
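The CTMC RER formula is again a closed-form sum for small state spaces. A sketch for a two-state chain with jump rates $a$ ($0 \to 1$) and $b$ ($1 \to 0$), my own toy instance: each state has a single exit channel, so $\lambda(\sigma)$ equals that channel's rate and every per-state term has the form $c \log(c/c') - (c - c')$, which is nonnegative.

```python
import math

def rer_ctmc(a, b, a_p, b_p):
    # RER between two-state CTMCs with rates (a, b) and perturbed (a_p, b_p),
    # weighted by the invariant law of the unperturbed chain.
    mu0, mu1 = b / (a + b), a / (a + b)
    h0 = a * math.log(a / a_p) - (a - a_p)   # c log(c/c') - (lambda - lambda')
    h1 = b * math.log(b / b_p) - (b - b_p)
    return mu0 * h0 + mu1 * h1
```

As with the DTMC case, the result vanishes iff the perturbed rates coincide with the nominal ones.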

22 Fisher information matrix. Generalize the notion of Fisher information to the case of CTMC processes. The FIM is defined as $F_H\!\left(Q_{0,T}^\theta\right) := \mathbb{E}_{\mu^\theta}\!\left[ \sum_{\sigma' \in E} c^\theta(\sigma, \sigma') \, \nabla_\theta \log c^\theta(\sigma, \sigma') \, \nabla_\theta \log c^\theta(\sigma, \sigma')^T \right]$

23 Fisher information matrix (continued). Similar derivations can be obtained for time-inhomogeneous Markov chains, semi-Markov processes, and probably for processes with memory. Sensitivity analysis on the logarithmic scale is also possible.

24 Connection with previous methods. Pinsker (or Csiszar-Kullback-Pinsker) inequality: $\left\|Q_{0,T}^\theta - Q_{0,T}^{\theta+\epsilon}\right\|_{TV} \leq \sqrt{2 R\!\left(Q_{0,T}^\theta \,\|\, Q_{0,T}^{\theta+\epsilon}\right)}$. If $Q_{0,T}^\theta = q_{0,T}^\theta \, dx$ and respectively $Q_{0,T}^{\theta+\epsilon} = q_{0,T}^{\theta+\epsilon} \, dx$, then $\left| \mathbb{E}_{Q_{0,T}^\theta}[f] - \mathbb{E}_{Q_{0,T}^{\theta+\epsilon}}[f] \right| \leq \|f\|_\infty \left\| q_{0,T}^\theta - q_{0,T}^{\theta+\epsilon} \right\|_1 \leq \|f\|_\infty \sqrt{2 R\!\left(q_{0,T}^\theta \,\|\, q_{0,T}^{\theta+\epsilon}\right)}$. Thus, if RER is insensitive to a parameter then any observable is insensitive to this particular parameter. Overall, in terms of specific observables, RER and FIM are conservative estimates of their parameter sensitivity.

25 Outline Introduction Discrete-time MC Continuous-time MC p53 reaction network Langevin process ZGB lattice model

26 Well-mixed reaction systems - p53. The p53 gene model is an extensively studied system which plays a crucial role in effective tumor suppression in humans, as its universal inactivation in cancer cells suggests. The p53 gene is activated in response to DNA damage and it constitutes a negative feedback loop with the oncogene protein Mdm2. The reactions and their rates are:

Event 1: production of x, rate $c_1(\sigma) = b_x$
Event 2: degradation of x, rate $c_2(\sigma) = a_x x + a_k y \frac{x}{x + k}$
Event 3: $x \to x + y_0$, rate $c_3(\sigma) = b_y x$
Event 4: $y_0 \to y$, rate $c_4(\sigma) = a_0 y_0$
Event 5: degradation of y, rate $c_5(\sigma) = a_y y$

Here $x$ corresponds to p53 and $y_0$ to the Mdm2 precursor, while $y$ corresponds to Mdm2. Parameter vector: $\theta = [b_x, a_x, a_k, k, b_y, a_0, a_y]^T$.
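A network like this is typically simulated with the stochastic simulation algorithm (Gillespie SSA). The sketch below uses the rates from the table above, but the parameter values and initial counts are placeholders chosen for illustration, not the values used in the talk.

```python
import random

# Placeholder parameters (hypothetical, for illustration only).
THETA = dict(b_x=90.0, a_x=0.002, a_k=1.7, k=0.01, b_y=1.1, a_0=0.8, a_y=0.8)

# State changes (dx, dy0, dy) for events 1..5 of the p53 network.
STOICH = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 1), (0, 0, -1)]

def rates(x, y0, y, th):
    return [th["b_x"],
            th["a_x"] * x + th["a_k"] * y * x / (x + th["k"]),
            th["b_y"] * x,
            th["a_0"] * y0,
            th["a_y"] * y]

def ssa(t_end, seed=0, th=THETA):
    # Gillespie SSA: exponential waiting time with the total rate,
    # then pick one event with probability proportional to its rate.
    rng = random.Random(seed)
    x, y0, y, t = 20, 10, 10, 0.0
    while t < t_end:
        c = rates(x, y0, y, th)
        lam = sum(c)
        t += rng.expovariate(lam)
        r, acc = rng.random() * lam, 0.0
        for ci, (dx, dy0, dy) in zip(c, STOICH):
            acc += ci
            if r <= acc:
                x, y0, y = x + dx, y0 + dy0, y + dy
                break
    return x, y0, y
```

Because every consumption rate carries a factor of the species it consumes, molecule counts can never go negative along the trajectory.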

27 p53 - Concentrations. Figure: Molecule counts of p53, Mdm2 precursor and Mdm2 over time. Concentration oscillations as well as time delays (phase shifts) between the species are present due to the negative feedback loop. Furthermore, the concentration of p53 periodically hits zero and, since negative concentrations are not allowed, the stochastic characteristics of p53 are far from Gaussian.

28 p53 - RER. Figure, upper panel: the Relative Entropy Rate over time for perturbations of b_x (blue), k (green) and a_y (red) by +10%. The relaxation time of the RER as an observable is ultra fast. Lower panel: RER for the perturbation directions e1, ..., e7, computed either directly (blue and red bars) or based on the FIM (green bars).

29 p53 - FIM. Figure: The proposed path-wise Fisher Information Matrix (left), based on RER, together with the FIM based on the LNA (right) (Komorowski et al., Sensitivity, robustness, and identifiability in stochastic chemical kinetics models, PNAS, 2011). Rows and columns are indexed by b_x, a_x, a_k, k, b_y, a_0, a_y.

30 p53 - Comparison. Figure: Time series of the species (p53, Mdm2 precursor, Mdm2) for the unperturbed parameter regime (blue), when b_x is perturbed by +10% (red), and when b_y is perturbed by +10%. The same sequence of random numbers is used in all cases.

31 Langevin out-of-equilibrium process - Definition. SDE system with $N$ particles: $dq_t = \frac{1}{m} p_t \, dt$, $dp_t = F(q_t) \, dt - \frac{\gamma}{m} p_t \, dt + \sigma \, dB_t$. Force: $F(q) = -\nabla_q V(q) + \alpha G(q)$. Interaction potential: $V(q) = \sum_{i,\, j<i} V_M(|q_i - q_j|)$. Morse potential: $V_M(r) = D_e \left(1 - e^{-a(r - r_e)}\right)^2$. The divergence-free component ($\nabla_q \cdot G = 0$) is taken to be a simple antisymmetric force: $G_i(q) = q_{i+1} - q_{i-1}$, $i = 1, \ldots, N$. If $\alpha = 0$ then the Langevin process is reversible. If $\alpha \neq 0$ then the Langevin process is out of equilibrium.

32 Langevin out-of-equilibrium process - Scheme. Sensitivity analysis on the Morse parameters: $\theta = [D_e, a, r_e]$. The Langevin process is time-discretized utilizing the BBK integrator: $p_{i+\frac{1}{2}} = p_i + F(q_i)\frac{\Delta t}{2} - \frac{\gamma}{m} p_i \frac{\Delta t}{2} + \sigma \Delta W_i$, $\quad q_{i+1} = q_i + \frac{1}{m} p_{i+\frac{1}{2}} \Delta t$, $\quad p_{i+1} = p_{i+\frac{1}{2}} + F(q_{i+1})\frac{\Delta t}{2} - \frac{\gamma}{m} p_{i+\frac{1}{2}} \frac{\Delta t}{2} + \sigma \Delta W_{i+\frac{1}{2}}$. Transition probability of the discretized process: $P(q, p, q', p') = P(q' \mid q, p)\, P(p' \mid q', q, p)$ where $P(q' \mid q, p) = \frac{1}{Z_0} \exp\!\left( -\frac{m^2}{\sigma^2 \Delta t^3} \left| q' - q - \left(p + F(q)\frac{\Delta t}{2} - p \frac{\Delta t \gamma}{2m}\right)\frac{\Delta t}{m} \right|^2 \right)$ and $P(p' \mid q', q, p) = \frac{1}{Z_1} \exp\!\left( -\frac{1}{\sigma^2 \Delta t} \left| \left(1 + \frac{\Delta t \gamma}{2m}\right) p' - \left( \frac{m}{\Delta t}(q' - q) + F(q')\frac{\Delta t}{2} \right) \right|^2 \right)$
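A one-particle BBK step can be sketched as follows. This is a simplified, single-degree-of-freedom illustration (no Morse N-body force), and splitting the noise into two half-step Gaussian increments of variance $\Delta t / 2$ is one common convention, assumed here rather than taken from the slides.

```python
import math, random

def bbk_step(q, p, force, dt, m=1.0, gamma=1.0, sigma=1.0, rng=random):
    # Half kick: force, friction, and a half-step noise increment.
    p += 0.5 * dt * (force(q) - (gamma / m) * p) \
         + sigma * rng.gauss(0.0, math.sqrt(0.5 * dt))
    # Full drift of the position.
    q += dt * p / m
    # Second half kick with a fresh half-step noise increment.
    p += 0.5 * dt * (force(q) - (gamma / m) * p) \
         + sigma * rng.gauss(0.0, math.sqrt(0.5 * dt))
    return q, p
```

A quick sanity check: with a harmonic force and $m = \gamma = \sigma = 1$, fluctuation-dissipation gives a stationary momentum variance of $\sigma^2/(2\gamma) = 0.5$, which the long-run average of $p^2$ should approach up to $O(\Delta t)$ discretization error.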

33 Langevin out-of-equilibrium process - FIM. Figure, upper plots: Level sets (or neutral spaces) of the FIM in the $(D_e, a, r_e)$ parameter space for the reversible case ($\alpha = 0$). Lower plots: Level sets for the irreversible case ($\alpha = 0.1$).

34 ZGB - Definition. ZGB (Ziff-Gulari-Barshad) is a simplified spatio-temporal CO oxidation model without diffusion. Despite being an idealized model, the ZGB model incorporates the basic mechanisms for the dynamics of adsorbate species during CO oxidation on catalytic surfaces. The rate of the kth event at the jth site, given that the current configuration is $\sigma$, is denoted by $c_k(j; \sigma)$; n.n. stands for nearest neighbors:

Event 1, CO adsorption: rate $(1 - \sigma(j)^2)\, k_1$
Event 2, O$_2$ adsorption: rate $(1 - \sigma(j)^2)(1 - k_1) \cdot \frac{\#\text{vacant n.n.}}{\text{total n.n.}}$
Event 3, CO + O $\to$ CO$_2$ + desorption: rate $\frac{1}{2}\sigma(j)(1 + \sigma(j))\, k_2 \cdot \frac{\#\text{O n.n.}}{\text{total n.n.}}$
Event 4, O + CO $\to$ CO$_2$ + desorption: rate $\frac{1}{2}\sigma(j)(\sigma(j) - 1)\, k_2 \cdot \frac{\#\text{CO n.n.}}{\text{total n.n.}}$

35 ZGB - RER. Figure, upper plot: Relative entropy rate as a function of time for perturbations of both k_1 (solid line) and k_2 (dashed line). An equilibration time until the process reaches its metastable regime is evident. Lower plot: RER for various perturbation directions. The most sensitive parameter is k_1.

36 ZGB - Configurations. Figure: Typical configurations obtained by perturbing the most and the least sensitive parameter. The comparison with the reference (original) configuration reveals the differences between the most and least sensitive perturbation directions.

37 Conclusions. Sensitivity analysis of steady state stochastic processes utilizing RER and FIM. Even though path-wise distributions were considered, the resulting observables were easy to compute. Long-time stochastic dynamics are also taken into consideration. No need to know the stationary distribution. Reference: A Relative Entropy Rate Method for Path Space Sensitivity Analysis of Stationary Complex Stochastic Dynamics, M.K. - Y.P., J. of Chemical Physics (2013).


More information

Exact Simulation of Multivariate Itô Diffusions

Exact Simulation of Multivariate Itô Diffusions Exact Simulation of Multivariate Itô Diffusions Jose Blanchet Joint work with Fan Zhang Columbia and Stanford July 7, 2017 Jose Blanchet (Columbia/Stanford) Exact Simulation of Diffusions July 7, 2017

More information

Basic modeling approaches for biological systems. Mahesh Bule

Basic modeling approaches for biological systems. Mahesh Bule Basic modeling approaches for biological systems Mahesh Bule The hierarchy of life from atoms to living organisms Modeling biological processes often requires accounting for action and feedback involving

More information

Bayesian Regularization

Bayesian Regularization Bayesian Regularization Aad van der Vaart Vrije Universiteit Amsterdam International Congress of Mathematicians Hyderabad, August 2010 Contents Introduction Abstract result Gaussian process priors Co-authors

More information

I forgot to mention last time: in the Ito formula for two standard processes, putting

I forgot to mention last time: in the Ito formula for two standard processes, putting I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy

More information

Information geometry for bivariate distribution control

Information geometry for bivariate distribution control Information geometry for bivariate distribution control C.T.J.Dodson + Hong Wang Mathematics + Control Systems Centre, University of Manchester Institute of Science and Technology Optimal control of stochastic

More information

The dynamics of small particles whose size is roughly 1 µmt or. smaller, in a fluid at room temperature, is extremely erratic, and is

The dynamics of small particles whose size is roughly 1 µmt or. smaller, in a fluid at room temperature, is extremely erratic, and is 1 I. BROWNIAN MOTION The dynamics of small particles whose size is roughly 1 µmt or smaller, in a fluid at room temperature, is extremely erratic, and is called Brownian motion. The velocity of such particles

More information

Endogenous Information Choice

Endogenous Information Choice Endogenous Information Choice Lecture 7 February 11, 2015 An optimizing trader will process those prices of most importance to his decision problem most frequently and carefully, those of less importance

More information

Stochastic Petri Nets. Jonatan Lindén. Modelling SPN GSPN. Performance measures. Almost none of the theory. December 8, 2010

Stochastic Petri Nets. Jonatan Lindén. Modelling SPN GSPN. Performance measures. Almost none of the theory. December 8, 2010 Stochastic Almost none of the theory December 8, 2010 Outline 1 2 Introduction A Petri net (PN) is something like a generalized automata. A Stochastic Petri Net () a stochastic extension to Petri nets,

More information

Exact Simulation of Diffusions and Jump Diffusions

Exact Simulation of Diffusions and Jump Diffusions Exact Simulation of Diffusions and Jump Diffusions A work by: Prof. Gareth O. Roberts Dr. Alexandros Beskos Dr. Omiros Papaspiliopoulos Dr. Bruno Casella 28 th May, 2008 Content 1 Exact Algorithm Construction

More information

arxiv: v2 [cond-mat.stat-mech] 11 Aug 2015

arxiv: v2 [cond-mat.stat-mech] 11 Aug 2015 arxiv:4.83v2 [cond-mat.stat-mech] Aug 25 Effect of diffusion in simple discontinuous absorbing transition models Salete Pianegonda Instituto de Física, Universidade Federal do Rio Grande do Sul, Caixa

More information

Gaussian processes. Chuong B. Do (updated by Honglak Lee) November 22, 2008

Gaussian processes. Chuong B. Do (updated by Honglak Lee) November 22, 2008 Gaussian processes Chuong B Do (updated by Honglak Lee) November 22, 2008 Many of the classical machine learning algorithms that we talked about during the first half of this course fit the following pattern:

More information

Numerical methods in molecular dynamics and multiscale problems

Numerical methods in molecular dynamics and multiscale problems Numerical methods in molecular dynamics and multiscale problems Two examples T. Lelièvre CERMICS - Ecole des Ponts ParisTech & MicMac project-team - INRIA Horizon Maths December 2012 Introduction The aim

More information

Conjugate Predictive Distributions and Generalized Entropies

Conjugate Predictive Distributions and Generalized Entropies Conjugate Predictive Distributions and Generalized Entropies Eduardo Gutiérrez-Peña Department of Probability and Statistics IIMAS-UNAM, Mexico Padova, Italy. 21-23 March, 2013 Menu 1 Antipasto/Appetizer

More information

Uncertainty quantification and systemic risk

Uncertainty quantification and systemic risk Uncertainty quantification and systemic risk Josselin Garnier (Université Paris Diderot) with George Papanicolaou and Tzu-Wei Yang (Stanford University) February 3, 2016 Modeling systemic risk We consider

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

Sequential Monte Carlo Methods for Bayesian Computation

Sequential Monte Carlo Methods for Bayesian Computation Sequential Monte Carlo Methods for Bayesian Computation A. Doucet Kyoto Sept. 2012 A. Doucet (MLSS Sept. 2012) Sept. 2012 1 / 136 Motivating Example 1: Generic Bayesian Model Let X be a vector parameter

More information

An Introduction to Stochastic Epidemic Models

An Introduction to Stochastic Epidemic Models An Introduction to Stochastic Epidemic Models Linda J. S. Allen Department of Mathematics and Statistics Texas Tech University Lubbock, Texas 79409-1042, U.S.A. linda.j.allen@ttu.edu 1 Introduction The

More information

Lecture 12: Detailed balance and Eigenfunction methods

Lecture 12: Detailed balance and Eigenfunction methods Miranda Holmes-Cerfon Applied Stochastic Analysis, Spring 2015 Lecture 12: Detailed balance and Eigenfunction methods Readings Recommended: Pavliotis [2014] 4.5-4.7 (eigenfunction methods and reversibility),

More information

Asymptotics for posterior hazards

Asymptotics for posterior hazards Asymptotics for posterior hazards Pierpaolo De Blasi University of Turin 10th August 2007, BNR Workshop, Isaac Newton Intitute, Cambridge, UK Joint work with Giovanni Peccati (Université Paris VI) and

More information

Efficient robust optimization for robust control with constraints Paul Goulart, Eric Kerrigan and Danny Ralph

Efficient robust optimization for robust control with constraints Paul Goulart, Eric Kerrigan and Danny Ralph Efficient robust optimization for robust control with constraints p. 1 Efficient robust optimization for robust control with constraints Paul Goulart, Eric Kerrigan and Danny Ralph Efficient robust optimization

More information

Extending the Tools of Chemical Reaction Engineering to the Molecular Scale

Extending the Tools of Chemical Reaction Engineering to the Molecular Scale Extending the Tools of Chemical Reaction Engineering to the Molecular Scale Multiple-time-scale order reduction for stochastic kinetics James B. Rawlings Department of Chemical and Biological Engineering

More information

Adaptive Monte Carlo methods

Adaptive Monte Carlo methods Adaptive Monte Carlo methods Jean-Michel Marin Projet Select, INRIA Futurs, Université Paris-Sud joint with Randal Douc (École Polytechnique), Arnaud Guillin (Université de Marseille) and Christian Robert

More information

Gradient-based Monte Carlo sampling methods

Gradient-based Monte Carlo sampling methods Gradient-based Monte Carlo sampling methods Johannes von Lindheim 31. May 016 Abstract Notes for a 90-minute presentation on gradient-based Monte Carlo sampling methods for the Uncertainty Quantification

More information

Poisson random measure: motivation

Poisson random measure: motivation : motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps

More information

On different notions of timescales in molecular dynamics

On different notions of timescales in molecular dynamics On different notions of timescales in molecular dynamics Origin of Scaling Cascades in Protein Dynamics June 8, 217 IHP Trimester Stochastic Dynamics out of Equilibrium Overview 1. Motivation 2. Definition

More information

16. Working with the Langevin and Fokker-Planck equations

16. Working with the Langevin and Fokker-Planck equations 16. Working with the Langevin and Fokker-Planck equations In the preceding Lecture, we have shown that given a Langevin equation (LE), it is possible to write down an equivalent Fokker-Planck equation

More information

Existence, Uniqueness and Stability of Invariant Distributions in Continuous-Time Stochastic Models

Existence, Uniqueness and Stability of Invariant Distributions in Continuous-Time Stochastic Models Existence, Uniqueness and Stability of Invariant Distributions in Continuous-Time Stochastic Models Christian Bayer and Klaus Wälde Weierstrass Institute for Applied Analysis and Stochastics and University

More information

Estimation of probability density functions by the Maximum Entropy Method

Estimation of probability density functions by the Maximum Entropy Method Estimation of probability density functions by the Maximum Entropy Method Claudio Bierig, Alexey Chernov Institute for Mathematics Carl von Ossietzky University Oldenburg alexey.chernov@uni-oldenburg.de

More information

Statistics & Data Sciences: First Year Prelim Exam May 2018

Statistics & Data Sciences: First Year Prelim Exam May 2018 Statistics & Data Sciences: First Year Prelim Exam May 2018 Instructions: 1. Do not turn this page until instructed to do so. 2. Start each new question on a new sheet of paper. 3. This is a closed book

More information

On Input Design for System Identification

On Input Design for System Identification On Input Design for System Identification Input Design Using Markov Chains CHIARA BRIGHENTI Masters Degree Project Stockholm, Sweden March 2009 XR-EE-RT 2009:002 Abstract When system identification methods

More information

If we want to analyze experimental or simulated data we might encounter the following tasks:

If we want to analyze experimental or simulated data we might encounter the following tasks: Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction

More information

Derivation of the GENERIC form of nonequilibrium thermodynamics from a statistical optimization principle

Derivation of the GENERIC form of nonequilibrium thermodynamics from a statistical optimization principle Derivation of the GENERIC form of nonequilibrium thermodynamics from a statistical optimization principle Bruce Turkington Univ. of Massachusetts Amherst An optimization principle for deriving nonequilibrium

More information

Cybergenetics: Control theory for living cells

Cybergenetics: Control theory for living cells Department of Biosystems Science and Engineering, ETH-Zürich Cybergenetics: Control theory for living cells Corentin Briat Joint work with Ankit Gupta and Mustafa Khammash Introduction Overview Cybergenetics:

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

Decay rates for partially dissipative hyperbolic systems

Decay rates for partially dissipative hyperbolic systems Outline Decay rates for partially dissipative hyperbolic systems Basque Center for Applied Mathematics Bilbao, Basque Country, Spain zuazua@bcamath.org http://www.bcamath.org/zuazua/ Numerical Methods

More information

Stochastic2010 Page 1

Stochastic2010 Page 1 Stochastic2010 Page 1 Extinction Probability for Branching Processes Friday, April 02, 2010 2:03 PM Long-time properties for branching processes Clearly state 0 is an absorbing state, forming its own recurrent

More information

Effective dynamics for the (overdamped) Langevin equation

Effective dynamics for the (overdamped) Langevin equation Effective dynamics for the (overdamped) Langevin equation Frédéric Legoll ENPC and INRIA joint work with T. Lelièvre (ENPC and INRIA) Enumath conference, MS Numerical methods for molecular dynamics EnuMath

More information

Robotics. Control Theory. Marc Toussaint U Stuttgart

Robotics. Control Theory. Marc Toussaint U Stuttgart Robotics Control Theory Topics in control theory, optimal control, HJB equation, infinite horizon case, Linear-Quadratic optimal control, Riccati equations (differential, algebraic, discrete-time), controllability,

More information

Entrance Exam, Differential Equations April, (Solve exactly 6 out of the 8 problems) y + 2y + y cos(x 2 y) = 0, y(0) = 2, y (0) = 4.

Entrance Exam, Differential Equations April, (Solve exactly 6 out of the 8 problems) y + 2y + y cos(x 2 y) = 0, y(0) = 2, y (0) = 4. Entrance Exam, Differential Equations April, 7 (Solve exactly 6 out of the 8 problems). Consider the following initial value problem: { y + y + y cos(x y) =, y() = y. Find all the values y such that the

More information

Local vs. Nonlocal Diffusions A Tale of Two Laplacians

Local vs. Nonlocal Diffusions A Tale of Two Laplacians Local vs. Nonlocal Diffusions A Tale of Two Laplacians Jinqiao Duan Dept of Applied Mathematics Illinois Institute of Technology Chicago duan@iit.edu Outline 1 Einstein & Wiener: The Local diffusion 2

More information

Path integrals for classical Markov processes

Path integrals for classical Markov processes Path integrals for classical Markov processes Hugo Touchette National Institute for Theoretical Physics (NITheP) Stellenbosch, South Africa Chris Engelbrecht Summer School on Non-Linear Phenomena in Field

More information

Brownian motion and the Central Limit Theorem

Brownian motion and the Central Limit Theorem Brownian motion and the Central Limit Theorem Amir Bar January 4, 3 Based on Shang-Keng Ma, Statistical Mechanics, sections.,.7 and the course s notes section 6. Introduction In this tutorial we shall

More information

A variational radial basis function approximation for diffusion processes

A variational radial basis function approximation for diffusion processes A variational radial basis function approximation for diffusion processes Michail D. Vrettas, Dan Cornford and Yuan Shen Aston University - Neural Computing Research Group Aston Triangle, Birmingham B4

More information

Minimum Message Length Analysis of the Behrens Fisher Problem

Minimum Message Length Analysis of the Behrens Fisher Problem Analysis of the Behrens Fisher Problem Enes Makalic and Daniel F Schmidt Centre for MEGA Epidemiology The University of Melbourne Solomonoff 85th Memorial Conference, 2011 Outline Introduction 1 Introduction

More information

Numerical Methods with Lévy Processes

Numerical Methods with Lévy Processes Numerical Methods with Lévy Processes 1 Objective: i) Find models of asset returns, etc ii) Get numbers out of them. Why? VaR and risk management Valuing and hedging derivatives Why not? Usual assumption:

More information

Stochastic Processes around Central Dogma

Stochastic Processes around Central Dogma Stochastic Processes around Central Dogma Hao Ge haoge@pu.edu.cn Beijing International Center for Mathematical Research Biodynamic Optical Imaging Center Peing University, China http://www.bicmr.org/personal/gehao/

More information

GWAS V: Gaussian processes

GWAS V: Gaussian processes GWAS V: Gaussian processes Dr. Oliver Stegle Christoh Lippert Prof. Dr. Karsten Borgwardt Max-Planck-Institutes Tübingen, Germany Tübingen Summer 2011 Oliver Stegle GWAS V: Gaussian processes Summer 2011

More information

Introduction Probabilistic Programming ProPPA Inference Results Conclusions. Embedding Machine Learning in Stochastic Process Algebra.

Introduction Probabilistic Programming ProPPA Inference Results Conclusions. Embedding Machine Learning in Stochastic Process Algebra. Embedding Machine Learning in Stochastic Process Algebra Jane Hillston Joint work with Anastasis Georgoulas and Guido Sanguinetti, School of Informatics, University of Edinburgh 16th August 2017 quan col....

More information

Onsager theory: overview

Onsager theory: overview Onsager theory: overview Pearu Peterson December 18, 2006 1 Introduction Our aim is to study matter that consists of large number of molecules. A complete mechanical description of such a system is practically

More information

Algorithms for Uncertainty Quantification

Algorithms for Uncertainty Quantification Algorithms for Uncertainty Quantification Lecture 9: Sensitivity Analysis ST 2018 Tobias Neckel Scientific Computing in Computer Science TUM Repetition of Previous Lecture Sparse grids in Uncertainty Quantification

More information

Stationary Priors for Bayesian Target Tracking

Stationary Priors for Bayesian Target Tracking Stationary Priors for Bayesian Target Tracking Brian R. La Cour Applied Research Laboratories The University of Texas at Austin Austin, Texas, U.S.A. Email: blacour@arlut.utexas.edu 969 Abstract A dynamically-based

More information

Gaussian Process Approximations of Stochastic Differential Equations

Gaussian Process Approximations of Stochastic Differential Equations Gaussian Process Approximations of Stochastic Differential Equations Cédric Archambeau Dan Cawford Manfred Opper John Shawe-Taylor May, 2006 1 Introduction Some of the most complex models routinely run

More information

Computational methods for continuous time Markov chains with applications to biological processes

Computational methods for continuous time Markov chains with applications to biological processes Computational methods for continuous time Markov chains with applications to biological processes David F. Anderson anderson@math.wisc.edu Department of Mathematics University of Wisconsin - Madison Penn.

More information