Bayesian inference for stochastic differential mixed effects models - initial steps
Gavin Whitaker, 2nd May 2012
Supervisors: RJB and AG

Outline
- Mixed effects
- Stochastic differential equations (SDEs)
- Bayesian inference for SDEs
- Toy model (Roberts and Stramer, 2001)
SDE models

Consider an Itô process {X_t, t ≥ 0} satisfying

    dX_t = α(X_t, θ) dt + β^{1/2}(X_t, θ) dW_t

- X_t is the value of the process at time t
- α(X_t, θ) is the drift
- β(X_t, θ) is the diffusion coefficient
- W_t is standard Brownian motion
- X_0 is the vector of initial conditions

Seek a numerical solution via (for example) the Euler-Maruyama approximation

    ΔX_t ≡ X_{t+Δt} − X_t = α(X_t, θ) Δt + β^{1/2}(X_t, θ) ΔW_t,  where ΔW_t ~ N(0, I Δt)
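As a concrete sketch of the Euler-Maruyama recursion above (not from the slides; the drift and diffusion functions here are illustrative placeholders):

```python
import math
import random

def euler_maruyama(x0, alpha, beta, dt, n_steps, rng):
    """Simulate a scalar SDE dX_t = alpha(X_t) dt + sqrt(beta(X_t)) dW_t
    on a grid of width dt, using the Euler-Maruyama recursion
    X_{t+dt} = X_t + alpha(X_t) dt + sqrt(beta(X_t)) dW_t."""
    path = [x0]
    for _ in range(n_steps):
        x = path[-1]
        dW = rng.gauss(0.0, math.sqrt(dt))  # Brownian increment ~ N(0, dt)
        path.append(x + alpha(x) * dt + math.sqrt(beta(x)) * dW)
    return path

rng = random.Random(42)
# Ornstein-Uhlenbeck-type example: drift pulls towards 0, constant diffusion
path = euler_maruyama(5.0, lambda x: -0.5 * x, lambda x: 0.1, 0.01, 1000, rng)
```

Setting the diffusion to zero recovers the deterministic Euler scheme for the corresponding ODE, which gives a cheap sanity check of the recursion.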
SDE models: CIR model

    dX_t = (θ_1 − θ_2 X_t) dt + θ_3 √X_t dW_t

- Used to model short term interest rates
- The process is mean reverting
Figure: Numerical solution of the CIR model with θ_1 = 1, θ_2 = 0.1, X_0 = 15, for two values of θ_3 (one trace with θ_3 = 0.5).
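Mean reversion is easy to check numerically. The sketch below (not the talk's code) applies an Euler-Maruyama step to the CIR model with the slide's parameter values, taking θ_3 = 0.5 as an assumed illustrative value; the long-run mean of the process is θ_1/θ_2 = 10.

```python
import math
import random

def simulate_cir(x0, theta1, theta2, theta3, dt, n_steps, rng):
    """Euler-Maruyama for dX_t = (theta1 - theta2 X_t) dt + theta3 sqrt(X_t) dW_t.
    The max(x, 0) guard keeps the square root defined if the Euler path
    briefly dips below zero."""
    x = x0
    path = [x]
    for _ in range(n_steps):
        dW = rng.gauss(0.0, math.sqrt(dt))
        x = x + (theta1 - theta2 * x) * dt + theta3 * math.sqrt(max(x, 0.0)) * dW
        path.append(x)
    return path

rng = random.Random(1)
path = simulate_cir(15.0, 1.0, 0.1, 0.5, 0.1, 5000, rng)
# the process reverts towards its long-run mean theta1/theta2 = 10
tail_mean = sum(path[2500:]) / len(path[2500:])
```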
SDE models: aphid growth model

- Aphids are also known as plant lice, or greenfly
- They are small sap-sucking insects
- Some species of ants farm aphids for the honeydew they release; these dairying ants milk the aphids by stroking them with their antennae
SDE models: aphid growth model

    d( N_t )   ( λN_t − µN_t C_t )      ( λN_t + µN_t C_t   λN_t )^{1/2}
     ( C_t ) = (      λN_t       ) dt + (      λN_t         λN_t )       dW_t

- N_t is the aphid population size at time t
- C_t is the cumulative population size at time t
- This model is an SDE approximation to an underlying stochastic kinetic model with birth rate λN_t and death rate µN_t C_t
Figure: Numerical solution of the aphid growth model (N_t and C_t against time), λ = 1.75, µ = 0.001.
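A minimal simulation sketch of this bivariate SDE (my illustration, not the authors' code): the square root of the 2x2 diffusion matrix is taken via its Cholesky factor, which is enough for generating correlated Gaussian increments; the initial values N_0 = C_0 = 5 are assumptions.

```python
import math
import random

def simulate_aphid(n0, c0, lam, mu, dt, n_steps, rng):
    """Euler-Maruyama for the aphid growth SDE: drift (lam*N - mu*N*C, lam*N),
    diffusion matrix [[lam*N + mu*N*C, lam*N], [lam*N, lam*N]]."""
    n, c = n0, c0
    ns, cs = [n], [c]
    for _ in range(n_steps):
        a = lam * n + mu * n * c      # Var(dN)/dt
        b = lam * n                   # Cov(dN, dC)/dt, which also equals Var(dC)/dt
        # Cholesky factor of [[a, b], [b, b]] (valid since a >= b >= 0)
        l11 = math.sqrt(a)
        l21 = b / l11 if l11 > 0 else 0.0
        l22 = math.sqrt(max(b - l21 * l21, 0.0))
        w1 = rng.gauss(0.0, math.sqrt(dt))
        w2 = rng.gauss(0.0, math.sqrt(dt))
        dn = (lam * n - mu * n * c) * dt + l11 * w1
        dc = lam * n * dt + l21 * w1 + l22 * w2
        n = max(n + dn, 0.0)          # keep the population approximation non-negative
        c = c + dc
        ns.append(n)
        cs.append(c)
    return ns, cs

rng = random.Random(3)
ns, cs = simulate_aphid(5.0, 5.0, 1.75, 0.001, 0.01, 2000, rng)
```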
Mixed effects SDE models

- What if experimental units are not identical?
- Suppose the units have common parameters θ but different parameters b^i
- We treat the b^i as random effects with a population profile

This gives a stochastic differential mixed-effects model for the experimental units:

    dX^i_t = α(X^i_t, θ, b^i) dt + β^{1/2}(X^i_t, θ, b^i) dW^i_t,   i = 1, ..., M

- Differences between units are down to different realisations of the Brownian motion paths W^i_t and the random effects b^i
- This allows us to split the total variation into within- and between-individual components
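To make the hierarchy concrete, here is an illustrative sketch (my own toy SDE, not the talk's model): each unit draws its own random effect b^i from an assumed log-normal population profile, then drives its own path with its own Brownian motion.

```python
import math
import random

def simulate_unit(x0, theta, b, dt, n_steps, rng):
    """One unit's path for the placeholder SDE dX_t = -b X_t dt + sqrt(theta) dW_t,
    whose drift depends on the unit-specific random effect b."""
    x = x0
    for _ in range(n_steps):
        x += -b * x * dt + math.sqrt(theta) * rng.gauss(0.0, math.sqrt(dt))
    return x

rng = random.Random(7)
theta = 0.2                        # common parameter, shared across units
M = 5                              # number of experimental units
# random effects b^i drawn from a (hypothetical) log-normal population profile
b = [math.exp(rng.gauss(0.0, 0.3)) for _ in range(M)]
finals = [simulate_unit(10.0, theta, b[i], 0.01, 500, rng) for i in range(M)]
```

Between-unit variation here comes from both the b^i and the independent Brownian paths, mirroring the decomposition described above.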
Bayesian inference for SDEs

- Problematic due to the intractability of the transition density characterising the process; in other words, we typically can't solve an SDE analytically
- So we could just work with the Euler approximation
- Given data d at equidistant times t_0, t_1, ..., t_n, the Euler approximation may be unsatisfactory for Δt = t_{i+1} − t_i
- We therefore adopt a data augmentation scheme
Bayesian inference for SDEs

Introduce a partition of [t_i, t_{i+1}] as

    t_i = τ_{im} < τ_{im+1} < ... < τ_{(i+1)m} = t_{i+1},  where Δτ ≡ τ_{k+1} − τ_k = (t_{i+1} − t_i)/m

- Apply the Euler approximation over each interval of width Δτ
- This introduces m − 1 latent values between every pair of observations
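The bookkeeping for this partition can be sketched with a small helper (hypothetical, not from the talk):

```python
def augmented_grid(obs_times, m):
    """Insert m - 1 equally spaced latent time points between each pair of
    observation times, returning the full tau grid of the augmented path."""
    taus = []
    for t0, t1 in zip(obs_times, obs_times[1:]):
        dtau = (t1 - t0) / m
        taus.extend(t0 + k * dtau for k in range(m))
    taus.append(obs_times[-1])
    return taus

# n = 2 observation intervals with m = 4, so the grid has n*m + 1 = 9 points,
# with m - 1 = 3 latent times strictly inside each interval
grid = augmented_grid([0.0, 1.0, 2.0], m=4)
```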
Bayesian inference for SDEs

Formulate the joint posterior for parameters and latent values:

    d = (x_{t_0}, x_{t_1}, ..., x_{t_n})
    x = (x_{τ_1}, x_{τ_2}, ..., x_{τ_{m−1}}, x_{τ_{m+1}}, ..., x_{τ_{nm−1}}) = latent path
    (x, d) = (x_{τ_0}, x_{τ_1}, ..., x_{τ_m}, x_{τ_{m+1}}, ..., x_{τ_{nm}}) = augmented path
Bayesian inference for SDEs

Formulate the joint posterior for parameters and latent data as

    π(θ, x | d) ∝ π(θ) π(x, d | θ) = π(θ) ∏_{i=0}^{nm−1} π(x_{τ_{i+1}} | x_{τ_i}, θ)

where

    π(x_{τ_{i+1}} | x_{τ_i}, θ) = φ( x_{τ_{i+1}} ; x_{τ_i} + α(x_{τ_i}, θ) Δτ , β(x_{τ_i}, θ) Δτ )

and φ(· ; µ, Σ) denotes the Gaussian density with mean µ and variance Σ. The posterior distribution is typically analytically intractable.
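The product of Gaussian transition densities is most conveniently evaluated as a sum of log densities. A sketch for a scalar process (the drift and diffusion functions are illustrative placeholders):

```python
import math

def euler_loglik(path, dtau, alpha, beta, theta):
    """log pi(x, d | theta) under the Euler approximation: each increment
    x_{tau_{i+1}} | x_{tau_i} is N(x + alpha(x, theta)*dtau, beta(x, theta)*dtau)."""
    ll = 0.0
    for x, x_next in zip(path, path[1:]):
        mean = x + alpha(x, theta) * dtau
        var = beta(x, theta) * dtau
        ll += -0.5 * math.log(2.0 * math.pi * var) - (x_next - mean) ** 2 / (2.0 * var)
    return ll

# toy check: zero drift, unit diffusion -> a single standard Gaussian increment
ll = euler_loglik([0.0, 1.0], 1.0, lambda x, th: 0.0, lambda x, th: th, 1.0)
```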
A Gibbs sampling approach

We therefore sample via an MCMC scheme, e.g. a Gibbs sampler alternating between draws of

    θ | x, d
    x | θ, d

- The last step can be done (for example) in blocks of length m − 1 between observations
- Metropolis-within-Gibbs updates may be needed
- Problem: the mixing is poor for large m
Toy model

Consider the SDE

    dX_t = θ^{−1/2} dW_t

- Suppose that we have observations X_0 = x_0 = 0 and X_1 = x_1
- Set τ_i = i/m for i = 0, 1, ..., m, so that (x, d) = (x_0, x_{1/m}, x_{2/m}, ..., x_{(m−1)/m}, x_1), with x_0 and x_1 observed and the path in between forming a bridge

Under the Euler approximation,

    X_{i/m} | x_{(i−1)/m}, θ ~ N( x_{(i−1)/m} , 1/(mθ) )
Toy model

Hence

    π(x, d | θ) = ∏_{i=1}^{m} √(mθ/(2π)) exp{ −(mθ/2) (x_{i/m} − x_{(i−1)/m})² }
                ∝ θ^{m/2} exp{ −(θ/2) m ∑_{i=1}^{m} (x_{i/m} − x_{(i−1)/m})² }

Take the prior θ ~ Exp(1). The full conditional for θ is

    π(θ | x, d) ∝ π(θ) π(x, d | θ)
Toy model

    π(θ | x, d) ∝ θ^{m/2} exp{ −(θ/2) m ∑_{i=1}^{m} (x_{i/m} − x_{(i−1)/m})² } exp(−θ)
               = θ^{m/2} exp{ −θ ( Σ_X/2 + 1 ) }

where Σ_X = m ∑_{i=1}^{m} (x_{i/m} − x_{(i−1)/m})². Therefore

    θ | x, d ~ Γ( m/2 + 1 , Σ_X/2 + 1 )
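Drawing from this Gamma full conditional is a one-liner, with one trap worth flagging: Python's `random.gammavariate` is parameterised by shape and *scale*, so the rate Σ_X/2 + 1 goes in as its reciprocal (a sketch, not the talk's code):

```python
import random

def draw_theta(m, sigma_x, rng):
    """Draw theta | x, d ~ Gamma(m/2 + 1, rate = sigma_x/2 + 1)."""
    shape = m / 2.0 + 1.0
    rate = sigma_x / 2.0 + 1.0
    return rng.gammavariate(shape, 1.0 / rate)  # gammavariate takes (shape, scale)

rng = random.Random(11)
draws = [draw_theta(10, 1.5, rng) for _ in range(20000)]
# Gamma(shape 6, rate 1.75) has mean 6/1.75
mean_est = sum(draws) / len(draws)
```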
Toy model

Under the linear Gaussian structure of this simple SDE, the full conditional x | θ, d can be sampled using

    X_{i/m} | x_1, θ = (i/m) x_1 + Z_{i/m}/√θ,   i = 1, 2, ..., m

where {Z_t, 0 ≤ t ≤ 1} is a standard Brownian bridge, that is, a standard Brownian motion conditioned to hit 0 at time 0 and at time 1.
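A standard Brownian bridge on the grid i/m can be built from a Brownian motion via Z_t = W_t − t W_1 (one convenient construction among several), after which the conditioned path follows directly. A sketch:

```python
import math
import random

def sample_bridge_path(x1, theta, m, rng):
    """Sample X_{i/m} | x_1, theta = (i/m) x_1 + Z_{i/m} / sqrt(theta),
    where Z is a standard Brownian bridge built as Z_t = W_t - t W_1."""
    # Brownian motion on the grid 0, 1/m, ..., 1
    w = [0.0]
    for _ in range(m):
        w.append(w[-1] + rng.gauss(0.0, math.sqrt(1.0 / m)))
    # subtract the pinned endpoint: Z_0 = Z_1 = 0
    z = [w_i - (i / m) * w[-1] for i, w_i in enumerate(w)]
    return [(i / m) * x1 + z[i] / math.sqrt(theta) for i in range(m + 1)]

rng = random.Random(5)
path = sample_bridge_path(0.9, 1.0, 10, rng)
# endpoints are pinned: path[0] == 0 and path[-1] == x1
```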
Toy model

Simulated data:
- Take x_0 = 0, θ = 1
- Simulate x_1 using the Euler scheme, giving the data d = (0, x_1)

MCMC scheme: we perform a run of 1000 iterations of the scheme with no thinning, initialised with θ^(0) = 1, the prior mean.
- Step 1: update the discretised Brownian bridge which hits x_1 at t = 1
- Step 2: draw θ from its full conditional distribution, θ | x, d ~ Γ( m/2 + 1 , Σ_X/2 + 1 )
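Putting the two steps together gives the whole toy sampler. This is a sketch of the scheme described above (the bridge construction Z_t = W_t − t W_1 and the data value x_1 = 0.5 are my choices for illustration):

```python
import math
import random

def gibbs_toy(x1, m, n_iters, rng):
    """Gibbs sampler for the toy model dX_t = theta^{-1/2} dW_t with
    observations x_0 = 0 and x_1, alternating x | theta, d and theta | x, d."""
    theta = 1.0  # initialise at the prior mean
    thetas = []
    for _ in range(n_iters):
        # Step 1: update the discretised Brownian bridge given theta
        w = [0.0]
        for _ in range(m):
            w.append(w[-1] + rng.gauss(0.0, math.sqrt(1.0 / m)))
        z = [w_i - (i / m) * w[-1] for i, w_i in enumerate(w)]
        x = [(i / m) * x1 + z[i] / math.sqrt(theta) for i in range(m + 1)]
        # Step 2: draw theta from Gamma(m/2 + 1, rate = Sigma_X/2 + 1)
        sigma_x = m * sum((b - a) ** 2 for a, b in zip(x, x[1:]))
        theta = rng.gammavariate(m / 2.0 + 1.0, 1.0 / (sigma_x / 2.0 + 1.0))
        thetas.append(theta)
    return thetas

rng = random.Random(2012)
thetas = gibbs_toy(x1=0.5, m=10, n_iters=1000, rng=rng)
```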
Toy model: results

Figure: Trace plots for log(1/θ) with m = 10, m = 100 and m = 1000.
Figure: Autocorrelation plots for log(1/θ) with m = 10, m = 100 and m = 1000.
Toy model: results

- Mixing gets even worse for larger m
- Why is this happening?
- Try to quantify the mixing time by considering the parameter update
- The parameter update can be rewritten in a clever way...
Toy model: what's going wrong?

Recall that

    X_{i/m} | x_1, θ = (i/m) x_1 + Z_{i/m}/√θ,   i = 1, 2, ..., m

Now

    Σ_X = m ∑_{i=1}^{m} ( [ (i/m) x_1 + Z_{i/m}/√θ ] − [ ((i−1)/m) x_1 + Z_{(i−1)/m}/√θ ] )²
        = m ∑_{i=1}^{m} ( (1/√θ) [ Z_{i/m} − Z_{(i−1)/m} ] + x_1/m )²
Toy model: what's going wrong?

Expanding out,

    Σ_X = m ∑_{i=1}^{m} ( (1/√θ) [ Z_{i/m} − Z_{(i−1)/m} ] + x_1/m )²
        = (1/θ) Σ_Z + x_1² + (2x_1/√θ) ∑_{i=1}^{m} [ Z_{i/m} − Z_{(i−1)/m} ]

Now the final sum telescopes:

    ∑_{i=1}^{m} [ Z_{i/m} − Z_{(i−1)/m} ] = (Z_{1/m} − Z_0) + (Z_{2/m} − Z_{1/m}) + ... + (Z_1 − Z_{(m−1)/m}) = Z_1 − Z_0
Toy model: what's going wrong?

So

    Σ_X = (1/θ) Σ_Z + x_1² + (2x_1/√θ) [ Z_1 − Z_0 ]

But Z_0 = Z_1 = 0 since Z is a standard Brownian bridge. Thus

    Σ_X = (1/θ) Σ_Z + x_1²,  where Σ_Z = m ∑_{i=1}^{m} ( Z_{i/m} − Z_{(i−1)/m} )²

Using properties of Gaussian random variables, we have Σ_Z ~ χ²_{m−1}.
Toy model: what's going wrong?

Now

    θ | x, d ~ Γ( m/2 + 1 , Σ_X/2 + 1 )

If H ~ Γ( m/2 + 1 , 1 ) then

    θ_new = H / ( Σ_X/2 + 1 ) = H / ( x_1²/2 + χ²_{m−1}/(2θ_old) + 1 )
Toy model: what's going wrong?

- For large m, approximate H and χ²_{m−1} by Normal random variables
- Roberts and Stramer then use a suitable Taylor expansion of the expression for θ_new to give

    θ_new ≈ θ_old { 1 + (2/m)^{1/2} (W_1 − W_2) + (2/m) ( W_2² − W_1 W_2 − x_1² θ_old / 2 ) + ... }

  where W_1 and W_2 are independent N(0, 1) random variables
- Hence θ_new = θ_old { 1 + O(m^{−1/2}) }
- Mixing time is O(m)
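The O(m) mixing claim can be probed empirically by iterating the representation θ_new = H / (x_1²/2 + χ²_{m−1}/(2θ_old) + 1) directly and watching the relative step size shrink like m^{−1/2}; this numerical sketch (not from the slides, with x_1 = 0.5 assumed) exploits the fact that χ²_{m−1} is a Gamma((m−1)/2, scale 2) variable.

```python
import random

def theta_chain(x1, m, n_iters, rng):
    """Iterate theta_new = H / (x1^2/2 + chi2_{m-1}/(2 theta_old) + 1),
    with H ~ Gamma(m/2 + 1, 1), recording relative step sizes."""
    theta = 1.0
    rel_steps = []
    for _ in range(n_iters):
        h = rng.gammavariate(m / 2.0 + 1.0, 1.0)
        chi2 = rng.gammavariate((m - 1) / 2.0, 2.0)  # chi-squared with m-1 df
        theta_new = h / (x1 ** 2 / 2.0 + chi2 / (2.0 * theta) + 1.0)
        rel_steps.append(abs(theta_new - theta) / theta)
        theta = theta_new
    return rel_steps

rng = random.Random(9)
small_m = sum(theta_chain(0.5, 100, 2000, rng)) / 2000
large_m = sum(theta_chain(0.5, 10000, 2000, rng)) / 2000
# the relative step size shrinks roughly like m^{-1/2}, so the chain with
# larger m moves much more slowly and needs O(m) iterations to mix
```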
Future work

- Construct MCMC schemes for arbitrary nonlinear diffusion processes
- Naive schemes with a block update for the path
- Better schemes that use a reparameterisation
- Joint updates of path and parameters (particle MCMC)
- Application to mixed effects SDEs: aphid model, real data examples
References

Roberts, G. O. and Stramer, O. On inference for partially observed nonlinear diffusion models using the Metropolis-Hastings algorithm. Biometrika, 88(3), 2001.

Gillespie, C. S. and Golightly, A. Bayesian inference for generalized stochastic population growth models with application to aphids. JRSS Series C (Applied Statistics), 59(2), 2010.
More informationProbability Distributions
Probability Distributions In Chapter, we ephasized the central role played by probability theory in the solution of pattern recognition probles. We turn now to an exploration of soe particular exaples
More informationma x = -bv x + F rod.
Notes on Dynaical Systes Dynaics is the study of change. The priary ingredients of a dynaical syste are its state and its rule of change (also soeties called the dynaic). Dynaical systes can be continuous
More informationA Note on the Applied Use of MDL Approximations
A Note on the Applied Use of MDL Approxiations Daniel J. Navarro Departent of Psychology Ohio State University Abstract An applied proble is discussed in which two nested psychological odels of retention
More informationGEE ESTIMATORS IN MIXTURE MODEL WITH VARYING CONCENTRATIONS
ACTA UIVERSITATIS LODZIESIS FOLIA OECOOMICA 3(3142015 http://dx.doi.org/10.18778/0208-6018.314.03 Olesii Doronin *, Rostislav Maiboroda ** GEE ESTIMATORS I MIXTURE MODEL WITH VARYIG COCETRATIOS Abstract.
More informationChapter 4: Hypothesis of Diffusion-Limited Growth
Suary This section derives a useful equation to predict quantu dot size evolution under typical organoetallic synthesis conditions that are used to achieve narrow size distributions. Assuing diffusion-controlled
More informationC na (1) a=l. c = CO + Clm + CZ TWO-STAGE SAMPLE DESIGN WITH SMALL CLUSTERS. 1. Introduction
TWO-STGE SMPLE DESIGN WITH SMLL CLUSTERS Robert G. Clark and David G. Steel School of Matheatics and pplied Statistics, University of Wollongong, NSW 5 ustralia. (robert.clark@abs.gov.au) Key Words: saple
More informationSimulation of Discrete Event Systems
Siulation of Discrete Event Systes Unit 9 Queueing Models Fall Winter 207/208 Prof. Dr.-Ing. Dipl.-Wirt.-Ing. Sven Tackenberg Benedikt Andrew Latos M.Sc.RWTH Chair and Institute of Industrial Engineering
More informationCourse Notes for EE227C (Spring 2018): Convex Optimization and Approximation
Course Notes for EE7C (Spring 018: Convex Optiization and Approxiation Instructor: Moritz Hardt Eail: hardt+ee7c@berkeley.edu Graduate Instructor: Max Sichowitz Eail: sichow+ee7c@berkeley.edu October 15,
More informationSupplementary to Learning Discriminative Bayesian Networks from High-dimensional Continuous Neuroimaging Data
Suppleentary to Learning Discriinative Bayesian Networks fro High-diensional Continuous Neuroiaging Data Luping Zhou, Lei Wang, Lingqiao Liu, Philip Ogunbona, and Dinggang Shen Proposition. Given a sparse
More informationSome Perspective. Forces and Newton s Laws
Soe Perspective The language of Kineatics provides us with an efficient ethod for describing the otion of aterial objects, and we ll continue to ake refineents to it as we introduce additional types of
More informationProjectile Motion with Air Resistance (Numerical Modeling, Euler s Method)
Projectile Motion with Air Resistance (Nuerical Modeling, Euler s Method) Theory Euler s ethod is a siple way to approxiate the solution of ordinary differential equations (ode s) nuerically. Specifically,
More informationIntelligent Systems: Reasoning and Recognition. Artificial Neural Networks
Intelligent Systes: Reasoning and Recognition Jaes L. Crowley MOSIG M1 Winter Seester 2018 Lesson 7 1 March 2018 Outline Artificial Neural Networks Notation...2 Introduction...3 Key Equations... 3 Artificial
More informationPHY307F/407F - Computational Physics Background Material for Expt. 3 - Heat Equation David Harrison
INTRODUCTION PHY37F/47F - Coputational Physics Background Material for Expt 3 - Heat Equation David Harrison In the Pendulu Experient, we studied the Runge-Kutta algorith for solving ordinary differential
More informationUncertainty Propagation and Nonlinear Filtering for Space Navigation using Differential Algebra
Uncertainty Propagation and Nonlinear Filtering for Space Navigation using Differential Algebra M. Valli, R. Arellin, P. Di Lizia and M. R. Lavagna Departent of Aerospace Engineering, Politecnico di Milano
More informationPattern Recognition and Machine Learning. Bishop Chapter 11: Sampling Methods
Pattern Recognition and Machine Learning Chapter 11: Sampling Methods Elise Arnaud Jakob Verbeek May 22, 2008 Outline of the chapter 11.1 Basic Sampling Algorithms 11.2 Markov Chain Monte Carlo 11.3 Gibbs
More informationAn Improved Particle Filter with Applications in Ballistic Target Tracking
Sensors & ransducers Vol. 72 Issue 6 June 204 pp. 96-20 Sensors & ransducers 204 by IFSA Publishing S. L. http://www.sensorsportal.co An Iproved Particle Filter with Applications in Ballistic arget racing
More informationPattern Recognition and Machine Learning. Learning and Evaluation for Pattern Recognition
Pattern Recognition and Machine Learning Jaes L. Crowley ENSIMAG 3 - MMIS Fall Seester 2017 Lesson 1 4 October 2017 Outline Learning and Evaluation for Pattern Recognition Notation...2 1. The Pattern Recognition
More informationA MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION
A eshsize boosting algorith in kernel density estiation A MESHSIZE BOOSTING ALGORITHM IN KERNEL DENSITY ESTIMATION C.C. Ishiekwene, S.M. Ogbonwan and J.E. Osewenkhae Departent of Matheatics, University
More information