Bayesian reconstruction of free energy profiles from umbrella samples

1 Bayesian reconstruction of free energy profiles from umbrella samples. Thomas Stecher, with Gábor Csányi and Noam Bernstein. 21st November 2012.

2 Motivation: Motivation and Goals

Many conventional methods, such as thermodynamic integration and umbrella integration, implicitly assume a smooth free energy. Can we improve on them by making the smoothness assumption explicit? The aim is an efficient technique for analysing umbrella-sampling data that uses as much of the available information as possible, as an alternative to WHAM and UI. A Bayesian framework provides a consistent way to analyse noisy data.

3 Background Theory / Umbrella Sampling: Umbrella sampling in multiple windows

Free energy in a collective variable u = s(x):

    F(u) = -(1/β) ln P(u),    P(u) = (1/Z) ∫ e^{-βV(x)} δ(s(x) - u) dx

Thermal barriers make direct sampling impossible. Torrie and Valleau (1974): add umbrella potential(s) w(u) ≈ -F(u) to sample a specific region, and repeat over successive windows. Each window gives

    F(u) = F_b(u) - w(u) + C

Complication: C will vary from window to window.
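
To make the unbiasing relation F(u) = F_b(u) - w(u) + C concrete, here is a minimal numpy sketch (illustration only, not the talk's method) that unbiases a histogram from a single harmonic-umbrella window; the strength κ, centre u0, units, and synthetic samples are made-up values:

```python
import numpy as np

beta = 1.0 / 0.593            # 1/(kT) in (kcal/mol)^-1 near 298 K (assumed units)
kappa, u0 = 100.0, 1.0        # hypothetical harmonic umbrella w(u) = kappa/2 (u - u0)^2

# samples of the collective variable from one biased simulation (dummy data)
u = np.random.default_rng(0).normal(u0, 0.05, size=10_000)

counts, edges = np.histogram(u, bins=40)
centres = 0.5 * (edges[:-1] + edges[1:])
mask = counts > 0

F_b = -np.log(counts[mask] / counts.sum()) / beta   # biased free energy F_b(u)
w = 0.5 * kappa * (centres[mask] - u0) ** 2         # umbrella potential w(u)
F = F_b - w                                         # unbiased, up to the constant C
```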

4 Background Theory / Gaussian Process Regression: Bayes' Theorem

Tackles inverse probability problems: given some data, what are the model parameters? One-line derivation:

    P(f|y) P(y) = P(y|f) P(f) = P(y, f)   ⇒   P(f|y) = P(y|f) P(f) / P(y)

    posterior = likelihood × prior / evidence

P(y) = ∫ P(y|f) P(f) df is a normalisation constant.
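
A toy numerical reading of the same identity, with hypothetical numbers: two candidate models with a prior and a likelihood for some observed data y, normalised to a posterior:

```python
prior = {"A": 0.5, "B": 0.5}          # P(f)
likelihood = {"A": 0.8, "B": 0.2}     # P(y | f) for the observed y

evidence = sum(likelihood[f] * prior[f] for f in prior)             # P(y)
posterior = {f: likelihood[f] * prior[f] / evidence for f in prior}
print(posterior)                      # {'A': 0.8, 'B': 0.2}
```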

5 Background Theory / Gaussian Process Regression: Gaussian Process Prior

A prior distribution over functions: a collection of random variables, any finite number of which have a joint Gaussian distribution. Think: an infinite-dimensional multivariate Gaussian. Defined by a mean function and a covariance function,

    m(x) = ⟨f(x)⟩
    k(x₁, x₂) = ⟨(f(x₁) - m(x₁))(f(x₂) - m(x₂))⟩

6–14 Background Theory / Gaussian Process Regression: Samples from a Gaussian Process Prior

[Figures: successive draws from the zero-mean prior m(x) = 0 with squared-exponential covariance k(x₁, x₂) = σ_f² exp(-(x₁ - x₂)²/(2l²)), shown for l = 1.0, σ_f = 1.0, and in later panels also for l = 2.0 (the second σ_f value is lost in the transcription).]
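
A short sketch of how such prior draws can be generated, assuming the squared-exponential covariance above; the grid, seed, and jitter are arbitrary choices:

```python
import numpy as np

def se_kernel(x1, x2, l=1.0, sigma_f=1.0):
    """Squared-exponential covariance k(x1, x2) = sigma_f^2 exp(-(x1-x2)^2/(2 l^2))."""
    return sigma_f**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / l**2)

x = np.linspace(-5, 5, 200)
K = se_kernel(x, x, l=1.0, sigma_f=1.0)

# draw three functions from the zero-mean prior; jitter keeps K positive definite
rng = np.random.default_rng(1)
samples = rng.multivariate_normal(np.zeros_like(x), K + 1e-10 * np.eye(len(x)), size=3)
```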

15 Background Theory / Gaussian Process Regression: Likelihood and posterior distribution

Observations with Gaussian noise, y | f(x) ~ N(f(x), Σ_y), result in a Gaussian process posterior:

    f̄(x*) = kᵀ(x*) (K + Σ_y)⁻¹ y
    cov(f(x*₁), f(x*₂)) = k(x*₁, x*₂) - kᵀ(x*₁) (K + Σ_y)⁻¹ k(x*₂)

where (k(x*))_i = k(x*, x_i) and K_ij = k(x_i, x_j).
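
These formulas translate almost line-for-line into numpy; a sketch with dummy observations and an assumed noise level:

```python
import numpy as np

def se_kernel(x1, x2, l=1.0, sigma_f=1.0):
    return sigma_f**2 * np.exp(-0.5 * (x1[:, None] - x2[None, :])**2 / l**2)

# noisy observations (dummy data)
x_obs = np.array([-4.0, -2.0, 0.0, 1.0, 3.0])
y = np.sin(x_obs) + 0.1 * np.random.default_rng(2).normal(size=x_obs.size)
Sigma_y = 0.1**2 * np.eye(x_obs.size)            # Gaussian noise covariance

x_star = np.linspace(-5, 5, 100)
K = se_kernel(x_obs, x_obs)
k_star = se_kernel(x_obs, x_star)                # columns are k(x*)

A = np.linalg.solve(K + Sigma_y, k_star)         # (K + Sigma_y)^-1 k(x*)
mean = A.T @ y                                   # posterior mean
cov = se_kernel(x_star, x_star) - k_star.T @ A   # posterior covariance
```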

16–21 Background Theory / Gaussian Process Regression: Samples from the posterior

[Figures: successive draws from the Gaussian process posterior.]

22 New methodology / Histograms: Learning from umbrella-sampled histograms

Bin probabilities → biased free energies → unbiased free energies, via F(u) = F_b(u) - w(u) + C. Use the unbiased free energies as input to GPR. Two complications:

1. The true likelihood (noise) is not Gaussian. Approximate: treat the multinomial bin probabilities as Gaussian and propagate the noise to first order to the free energies.
2. Need to account for the unknown constants C.
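
A sketch of the first-order noise propagation under the stated approximation: with N samples in a window, the multinomial variance Var(p̂_i) ≈ p̂_i(1 - p̂_i)/N propagates through F_i = -(1/β) ln p̂_i - w_i + C to Var(F_i) ≈ (1/β)² (1 - p̂_i)/n_i. The bin counts and units here are dummies:

```python
import numpy as np

beta = 1.0 / 0.593                     # 1/(kT), assumed kcal/mol units
counts = np.array([3, 25, 120, 260, 310, 210, 60, 12])   # dummy bin counts n_i
N = counts.sum()
p = counts / N                         # multinomial bin probabilities

w = np.zeros_like(p)                   # umbrella bias at bin centres (zeros here)
F = -np.log(p) / beta - w              # unbiased free energies, up to constant C
# first-order (delta-method) propagation of the multinomial variance to F
var_F = (1.0 / beta)**2 * (1.0 - p) / counts
```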

23 New methodology / Histograms: Accounting for the unknown free energy offsets

Two equivalent ideas:

1. Model the unknown constants explicitly, with a flat (uninformative) prior:

    f̄(x*) = kᵀ(x*) K_y⁻¹ (y - Hᵀ β̄)
    cov(f(x*₁), f(x*₂)) = k(x*₁, x*₂) - kᵀ(x*₁) K_y⁻¹ k(x*₂)
                          + kᵀ(x*₁) K_y⁻¹ Hᵀ [H K_y⁻¹ Hᵀ]⁻¹ H K_y⁻¹ k(x*₂)

   where β̄ = [H K_y⁻¹ Hᵀ]⁻¹ H K_y⁻¹ y.

2. Use free energy differences as input. The linearity of the difference operation is crucial.
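
A minimal sketch of idea 1, assuming H is an indicator matrix (one row per window) and that the query points carry no offset basis; this is the formulas above in numpy, not the talk's actual code:

```python
import numpy as np

def posterior_with_offsets(Ky, k_star, y, H):
    """GP posterior with per-window offsets marginalised under a flat prior.

    Ky:     (n, n) noisy Gram matrix K + Sigma_y
    k_star: (n, m) cross-covariances k(x*) for m query points
    y:      (n,)   free energy readings, each still shifted by its window offset
    H:      (w, n) indicators, H[j, i] = 1 if reading i belongs to window j
    """
    Ky_inv_H = np.linalg.solve(Ky, H.T)                       # K_y^-1 H^T
    beta = np.linalg.solve(H @ Ky_inv_H, H @ np.linalg.solve(Ky, y))
    mean = k_star.T @ np.linalg.solve(Ky, y - H.T @ beta)
    # covariance correction for the uncertainty in the estimated offsets
    R = H @ np.linalg.solve(Ky, k_star)                       # H K_y^-1 k(x*)
    cov_corr = R.T @ np.linalg.solve(H @ Ky_inv_H, R)
    # full covariance = k(x*, x*') - k^T K_y^-1 k + cov_corr
    return mean, cov_corr
```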

24 New methodology / UI: Derivative information (UI)

Mean forces come from either constrained simulations (TI) or restrained simulations (UI: Kästner and Thiel, 2005). For a harmonic umbrella,

    ∂F(u)/∂u |_{u = ū_i} ≈ -κ_i (ū_i - u_i)

The derivative of a Gaussian process (a linear operation) is another Gaussian process:

    k_{f,f'}(x₁, x₂) = ⟨f(x₁) ∂f(x₂)/∂x₂⟩ = ∂⟨f(x₁) f(x₂)⟩/∂x₂ = ∂k(x₁, x₂)/∂x₂
    k_{f',f'}(x₁, x₂) = ∂²k(x₁, x₂)/∂x₁∂x₂
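
For the squared-exponential covariance used earlier, these derivative kernels have closed forms; a sketch with a finite-difference sanity check:

```python
import numpy as np

def se(x1, x2, l=1.0, sf=1.0):
    d = x1[:, None] - x2[None, :]
    return sf**2 * np.exp(-0.5 * d**2 / l**2)

def se_f_df(x1, x2, l=1.0, sf=1.0):
    """k_{f,f'}(x1, x2) = dk/dx2 = k(x1, x2) (x1 - x2)/l^2."""
    d = x1[:, None] - x2[None, :]
    return se(x1, x2, l, sf) * d / l**2

def se_df_df(x1, x2, l=1.0, sf=1.0):
    """k_{f',f'}(x1, x2) = d^2 k/(dx1 dx2) = k(x1, x2) (1/l^2 - (x1 - x2)^2/l^4)."""
    d = x1[:, None] - x2[None, :]
    return se(x1, x2, l, sf) * (1.0 / l**2 - d**2 / l**4)

# quick finite-difference check of k_{f,f'}
x, h = np.array([0.3]), 1e-6
fd = (se(x, x + h) - se(x, x - h)) / (2 * h)
assert np.allclose(fd, se_f_df(x, x), atol=1e-6)
```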

25 New methodology / UI: Derivative information (UI)

This results in the posterior

    f̄(x*) = k_{f,f'}ᵀ(x*) (K_{f',f'} + Σ_y)⁻¹ y
    cov(f(x*₁), f(x*₂)) = k(x*₁, x*₂) - k_{f,f'}ᵀ(x*₁) (K_{f',f'} + Σ_y)⁻¹ k_{f,f'}(x*₂)

where (k_{f,f'}(x*))_i = k_{f,f'}(x*, x_i) and (K_{f',f'})_ij = k_{f',f'}(x_i, x_j).
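
Putting the pieces together, a sketch that reconstructs a profile from noisy mean-force (derivative) readings alone, with dummy data and an assumed noise level; this follows the spirit of BRUSH(d), not the actual implementation:

```python
import numpy as np

def se_f_df(x1, x2, l=1.0, sf=1.0):       # k_{f,f'} = dk/dx2 for the SE kernel
    d = x1[:, None] - x2[None, :]
    return sf**2 * np.exp(-0.5 * d**2 / l**2) * d / l**2

def se_df_df(x1, x2, l=1.0, sf=1.0):      # k_{f',f'} = d^2 k/(dx1 dx2)
    d = x1[:, None] - x2[None, :]
    return sf**2 * np.exp(-0.5 * d**2 / l**2) * (1.0 / l**2 - d**2 / l**4)

x_obs = np.linspace(-3, 3, 12)             # window centres with mean-force readings
dy = np.cos(x_obs)                         # dummy mean forces dF/du
Sigma_y = 0.05**2 * np.eye(x_obs.size)     # assumed mean-force noise

x_star = np.linspace(-3, 3, 200)
alpha = np.linalg.solve(se_df_df(x_obs, x_obs) + Sigma_y, dy)
F_star = se_f_df(x_star, x_obs) @ alpha    # reconstructed profile, up to a constant
```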

26–27 New methodology / Hybrid: Using derivative information in conjunction with histograms

Define a joint Gaussian process over function values and their derivatives:

    K_{f f'} = [ K(x_y, x_y)            K_{f,f'}(x_y, x_y')  ]
               [ K_{f,f'}ᵀ(x_y, x_y')   K_{f',f'}(x_y', x_y') ]

Assume no correlation between function and derivative noise, so the joint noise matrix is block-diagonal: Σ_{y y'} = Σ_y ⊕ Σ_y'.

BRUSH: Bayesian Reconstruction of free energy from Umbrella Samples using Histograms. Variants: BRUSH(h) (histograms), BRUSH(d) (derivatives), BRUSH(h+d) (both).
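
A sketch of assembling the joint Gram and noise matrices under the assumptions above (uncorrelated value and derivative noise; inputs and noise levels are dummies):

```python
import numpy as np

def se(x1, x2, l=1.0, sf=1.0):
    d = x1[:, None] - x2[None, :]
    return sf**2 * np.exp(-0.5 * d**2 / l**2)

def se_f_df(x1, x2, l=1.0, sf=1.0):
    d = x1[:, None] - x2[None, :]
    return se(x1, x2, l, sf) * d / l**2

def se_df_df(x1, x2, l=1.0, sf=1.0):
    d = x1[:, None] - x2[None, :]
    return se(x1, x2, l, sf) * (1.0 / l**2 - d**2 / l**4)

x_h = np.linspace(-3, 3, 30)    # inputs carrying histogram (value) readings
x_d = np.linspace(-3, 3, 12)    # inputs carrying mean-force (derivative) readings

# joint Gram matrix over function values and derivatives
K_joint = np.block([
    [se(x_h, x_h),          se_f_df(x_h, x_d)],
    [se_f_df(x_h, x_d).T,   se_df_df(x_d, x_d)],
])

# uncorrelated noise on the two data types gives a block-diagonal joint noise
Sigma_joint = np.block([
    [0.2**2 * np.eye(30),   np.zeros((30, 12))],
    [np.zeros((12, 30)),    0.05**2 * np.eye(12)],
])
```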

28 Performance / 1D: 1D performance

Test system: a slice of alanine dipeptide. Reference reconstruction: 5 × 5 ns MD, umbrella strength 1 kcal/mol. Methods are compared via the average RMS error over repeated reconstructions, for an optimised set of parameters:

- Umbrella strength: 5, 15, 25, 50, 100 kcal/mol
- Number of windows: 24, 12, 8
- Number of bins: 20, 40, 80 (WHAM); 20–100 (GP)

29 Performance / 1D: 1D performance

[Figure: RMS error/k_BT versus sampling time/ps for BRUSH(h+d), BRUSH(h), BRUSH(d), UI (spline), UI (K&T), and WHAM.]

30–33 Performance / 1D: Representative reconstructions

[Figures: representative reconstructions at 6 ps and 12 ps of sampling, plotting F/k_BT (from -3 to 3) against Ψ (from -π to π) for the reference, WHAM, and BRUSH(h+d).]

34 Performance / 1D: 1D Conclusions

- Outperforms conventional methods such as WHAM and UI
- Can use different types of data systematically
- Uncertainty estimates are intrinsic to the method, and accurate
- Smooth reconstructions

35 Performance / 2D: Two and more collective variables

Want to reconstruct from derivative information. The conventional approach is a least-squares fit of radial basis (RB) functions, which has known problems with high condition numbers.

[Figure: free energy surface over the Ramachandran angles Φ and Ψ, each from -π to π.]

36 Performance / 2D: Varying the number of windows (derivative readings)

[Figure: RMS error (kcal/mol) versus number of windows at 5 ps/window, for BRUSH(d), RB with optimised l, RB with l = π/3, RB with l = π/3 using SVD, and the RMS noise in the data.]

37 Performance / 2D: Varying the number of windows (derivative readings)

[Figures: GP (BRUSH(d)) and RB reconstructions over Φ, Ψ (each from -π to π) for 18, 72, and 2304 = 48² windows, at 5 ps/window.]

38 Performance / 2D: Varying the simulation length

[Figure: RMS error (kcal/mol) versus trajectory length (ps) for BRUSH(d) and RB with l = π/3.]

39 Performance / 2D: Varying the simulation length

[Figures: GP and RB reconstructions over Φ, Ψ (each from -π to π) with 2304 = 48² windows, at 0.25, 0.5, and 5 ps/window.]

40–41 Conclusions and Outlook

- More efficient use of data within a Bayesian framework
- Concurrent use of different types of information is possible: histograms and derivatives
- Outperforms WHAM and UI
- Outperforms radial basis functions in several regimes

Outlook:
- Go beyond the Gaussian noise approximation?
- Learn from the MD trajectory directly? (Density estimation)
