An Implementation of Dynamic Causal Modelling

Christian Himpe

Overview

Contents:
1. Intro
2. Model
3. Parameter Estimation
4. Examples

Motivation

How are brain regions coupled?
How does the connectivity change in an experimental context?

DCM Cycle

1. Form Hypothesis
2. Data Acquisition (via fMRI, EEG, MEG)
3. Data Preprocessing
4. (Select appropriate Model)
5. Tune model parameters to fit Data

An Example:
1. Hypothesis: Connectivity strengthens for selected brain regions under fear
2. Data Acquisition: Condition mice, trigger stimulus, record data
3. Preprocessing: Low-pass filtering
4. Model Selection: EEG (invasive) model
5. Evaluate with DCM software.

Model Principles

Dynamic
Deterministic
Multiple Inputs + Outputs
Two-Component Model:
- Dynamic Submodel (Coupling)
- Forward Submodel (Signal)
The submodels differ by data acquisition method (fMRI, EEG/MEG).

Dynamic Submodel (fMRI) I

General Description: $\dot z = F(z, u, \theta)$

Bilinear Approximation: $\dot z \approx \frac{\partial F}{\partial z} z + \frac{\partial F}{\partial u} u + \sum_k u_k \frac{\partial^2 F}{\partial z \, \partial u_k} z$

Reparametrization: $\dot z \approx A z + C u + \sum_k u_k B_k z$

Dynamic Submodel (fMRI) II

Dynamic Submodel (EEG) I

Construction: Tripartitioning, Coupling Ruleset (Forward, Backward, Lateral)

Impulse Response Kernel: $p(t) = \begin{cases} \frac{H_e}{\tau_e} t \exp(-t/\tau_e) & (t \geq 0) \\ 0 & (t < 0) \end{cases}$

Sigmoid Function: $S(x) = \frac{2 e_0}{1 + \exp(-r x)} - e_0$

Convolution: $p(t) * u(t) = x \;\Rightarrow\; \ddot x = \frac{H_e}{\tau_e} u(t) - \frac{2}{\tau_e} \dot x(t) - \frac{1}{\tau_e^2} x(t)$
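The kernel and sigmoid are simple enough to state in code; this sketch only fixes the notation above, and the constants used in the sanity checks are illustrative, not the values from the talk.

```cpp
#include <cmath>

// Impulse response kernel p(t) = (H_e / tau_e) * t * exp(-t / tau_e) for
// t >= 0, and 0 for t < 0 (a causal "alpha" kernel peaking at t = tau_e).
double kernel(double t, double He, double tau_e) {
    return t >= 0.0 ? (He / tau_e) * t * std::exp(-t / tau_e) : 0.0;
}

// Sigmoid S(x) = 2 e0 / (1 + exp(-r x)) - e0, an odd function mapping
// membrane potential to firing rate, saturating at +/- e0.
double sigmoid(double x, double e0, double r) {
    return 2.0 * e0 / (1.0 + std::exp(-r * x)) - e0;
}
```

The convolution with this kernel is what turns into the second-order ODE on the slide: rather than evaluating $p(t) * u(t)$ numerically, the implementation integrates $\ddot x$ alongside the other states.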

Dynamic Submodel (EEG) II

Neuronal State Equation:
$\dot x_1 = x_4$
$\dot x_4 = \frac{H_e}{\tau_e}\big((C_F + C_L + \gamma_1 I) S(x_0) + C_U u\big) - \frac{2}{\tau_e} x_4 - \frac{1}{\tau_e^2} x_1$
$\dot x_7 = x_8$
$\dot x_8 = \frac{H_e}{\tau_e}\big((C_B + C_L + \gamma_3 I) S(x_0)\big) - \frac{2}{\tau_e} x_8 - \frac{1}{\tau_e^2} x_7$
$\dot x_0 = x_5 - x_6$
$\dot x_2 = x_5$
$\dot x_3 = x_6$
$\dot x_5 = \frac{H_e}{\tau_e}\big((C_B + C_L) S(x_0) + \gamma_2 S(x_1)\big) - \frac{2}{\tau_e} x_5 - \frac{1}{\tau_e^2} x_2$
$\dot x_6 = \frac{H_i}{\tau_i} \gamma_4 S(x_7) - \frac{2}{\tau_i} x_6 - \frac{1}{\tau_i^2} x_3$

Forward Submodel (fMRI)

Hemodynamic Equation:
$\dot s_i = z_i - \kappa s_i - \gamma (f_i - 1)$
$\dot f_i = s_i$
$\dot v_i = \frac{1}{\tau}\big(f_i - v_i^{1/\alpha}\big)$
$\dot q_i = \frac{1}{\tau}\Big(\frac{f_i}{\rho}\big(1 - (1 - \rho)^{1/f_i}\big) - v_i^{1/\alpha - 1} q_i\Big)$
$y_i = V_0\big(\gamma_1 \rho (1 - q_i) + \gamma_2 (1 - \tfrac{q_i}{v_i}) + (\gamma_3 \rho - 0.02)(1 - v_i)\big)$
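A hedged sketch of one Euler step of these hemodynamic equations for a single region; the parameter values are typical priors from the DCM literature (an assumption, not taken from the talk). At rest ($s=0$, $f=v=q=1$) with no neuronal input the state is a fixed point and the BOLD output is zero, which makes a handy sanity check.

```cpp
#include <cmath>

// Hemodynamic (balloon model) state of one region:
// vasodilatory signal s, blood flow f, volume v, deoxyhemoglobin q.
struct Hemo { double s, f, v, q; };

// One forward Euler step driven by neuronal state z. Parameter defaults
// (kappa, gamma, tau, alpha, rho) are illustrative prior means.
Hemo hemo_step(Hemo h, double z, double dt,
               double kappa = 0.65, double gamma = 0.41,
               double tau = 0.98, double alpha = 0.32, double rho = 0.34) {
    double ds = z - kappa * h.s - gamma * (h.f - 1.0);
    double df = h.s;
    double dv = (h.f - std::pow(h.v, 1.0 / alpha)) / tau;
    double E  = 1.0 - std::pow(1.0 - rho, 1.0 / h.f);  // oxygen extraction
    double dq = (h.f * E / rho - std::pow(h.v, 1.0 / alpha - 1.0) * h.q) / tau;
    return {h.s + dt * ds, h.f + dt * df, h.v + dt * dv, h.q + dt * dq};
}

// BOLD output y = V0 (g1 rho (1-q) + g2 (1 - q/v) + (g3 rho - 0.02)(1 - v)).
double bold(const Hemo& h, double V0 = 0.04, double g1 = 7.0,
            double g2 = 2.0, double g3 = 2.0, double rho = 0.34) {
    return V0 * (g1 * rho * (1.0 - h.q) + g2 * (1.0 - h.q / h.v)
               + (g3 * rho - 0.02) * (1.0 - h.v));
}
```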

Forward Submodel (EEG)

EEG Forward Model:
non-invasive: $y = L K x_0$
invasive: $y = K x_0$

Development

fMRI:
- Balloon Model (Buxton, 1998)
- Hemodynamic Model (Friston, 2000)
- Bayesian Estimation of Dynamical Systems (Friston, 2001)
- Dynamic Causal Modelling for fMRI (Friston, 2003)

EEG:
- Jansen Model (Jansen + Rit, 1995)
- Neural Mass Model (David + Friston, 2003)
- Modelling Event-Related Responses (David + Friston, 2005)
- Dynamic Causal Modelling for EEG/MEG (David + Friston, 2006)

Parameter Overview

fMRI Parameters: Coupling, Scale, Hemodynamic
EEG Parameters: Coupling, Gain, Synaptic, Input Contribution
Both: Drift

Estimation Preconditions

Data Model: $y = h(z, u, \theta) + X\beta + \epsilon$
Gaussian assumption for the parameter distribution
Prior Knowledge
Bayes Rule: $P(\theta \mid y) \propto P(y \mid \theta) \, P(\theta)$
Unknown Covariance → Parametrization of Covariance

EM-Algorithm

Posterior Estimation:
- Mean Estimation → least-squares estimator
- Covariance Estimation → Scoring algorithm
Two-step procedure (EM-Algorithm).

E-Step

1. Residuals: $\bar y = \begin{pmatrix} y - h(\eta_{\theta|y}^i) \\ \eta_\theta - \eta_{\theta|y}^i \end{pmatrix}$
2. Parameter Jacobian (Design Matrix): $\bar J = \begin{pmatrix} J & X \\ 1 & 0 \end{pmatrix}$
3. Covariance Weights: $\bar C_\epsilon = \begin{pmatrix} \sum_i \lambda_i V \otimes \hat Q_i & 0 \\ 0 & C_\theta \end{pmatrix}$
4. Posterior Covariance: $C_{\theta|y} = (\bar J^T \bar C_\epsilon^{-1} \bar J)^{-1}$
5. Posterior Mean: $\eta_{\theta|y}^{n+1} = \eta_{\theta|y}^n + C_{\theta|y} (\bar J^T \bar C_\epsilon^{-1} \bar y)$

M-Step

1. Residual Forming Matrix: $P = \bar C_\epsilon^{-1} - \bar C_\epsilon^{-1} \bar J C_{\theta|y} \bar J^T \bar C_\epsilon^{-1}$
2. 1st NFE/LL Derivative: $g_i = -\frac{1}{2}\operatorname{tr}(P Q_i) + \frac{1}{2}\bar y^T P^T Q_i P \bar y$
3. 2nd NFE/LL Derivatives' Expectation: $H_{ij} = \frac{1}{2}\operatorname{tr}(P Q_i P Q_j)$
4. Hyperparameter Update: $\lambda^{n+1} = \lambda^n + H^{-1} g$

Problems and Improvements

- Large Matrices → Sparse Matrices
- Matrix Multiplications → Parallelization
- Parameter Jacobian → Parallelization
- Inversion/Solving → Cholesky Decomposition
- Traces of (Complicated) Matrix Products → Recycling E-Step
- NFE/LL Derivatives → Parallelization

Some Linear Algebra:
$(AB)^T = B^T A^T$
$\operatorname{tr}(ABC) = \operatorname{tr}(BCA) = \operatorname{tr}(CAB)$
$\operatorname{tr}(AB) = \sum_{ij} A_{ij} B_{ji}$
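The last identity is exactly what makes the trace computations cheap: $\operatorname{tr}(AB)$ can be accumulated in $O(n^2)$ without forming the $O(n^3)$ product $AB$. A small dense sketch (illustrative; the implementation would do this on sparse matrices):

```cpp
#include <array>
#include <cstddef>

// tr(A * B) computed as sum_ij A_ij * B_ji, i.e. without ever materializing
// the matrix product A * B.
template <std::size_t N>
double trace_of_product(const std::array<std::array<double, N>, N>& A,
                        const std::array<std::array<double, N>, N>& B) {
    double t = 0.0;
    for (std::size_t i = 0; i < N; ++i)
        for (std::size_t j = 0; j < N; ++j)
            t += A[i][j] * B[j][i];
    return t;
}
```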

Improved E-Step

With $\bar y$, $\bar J$, $\bar C_\epsilon$ as before, precompute $J_A = \bar C_\epsilon^{-1} \bar J$, $J_B = J_A^T$, and $D = C_{\theta|y} J_B$:

Original: $C_{\theta|y} = (\bar J^T \bar C_\epsilon^{-1} \bar J)^{-1}$ → Improved: $C_{\theta|y} = (J_B \bar J)^{-1}$
Original: $\eta_{\theta|y}^{n+1} = \eta_{\theta|y}^n + C_{\theta|y} (\bar J^T \bar C_\epsilon^{-1} \bar y)$ → Improved: $\Delta\eta_{\theta|y} = D \bar y$, so $\eta_{\theta|y}^{n+1} = \eta_{\theta|y}^n + \Delta\eta_{\theta|y}$

Improved M-Step

Precompute $Q_i^A = Q_i \bar C_\epsilon^{-1}$, $Q_i^B = Q_i^A \bar J$, $Q_i^C = D Q_i^B$, and $p_y = \bar C_\epsilon^{-1} \bar y - J_A \Delta\eta_{\theta|y}$ (so that $p_y = P \bar y$ without forming $P = \bar C_\epsilon^{-1} - \bar C_\epsilon^{-1} \bar J C_{\theta|y} \bar J^T \bar C_\epsilon^{-1}$):

Original: $g_i = -\frac{1}{2}\operatorname{tr}(P Q_i) + \frac{1}{2}\bar y^T P^T Q_i P \bar y$ → Improved: $g_i = \frac{1}{2}\big(p_y^T Q_i p_y - \operatorname{tr}(Q_i^A) + \operatorname{tr}(Q_i^C)\big)$
Original: $H_{ij} = \frac{1}{2}\operatorname{tr}(P Q_i P Q_j)$ → Improved: $H_{ij} = \frac{1}{2}\big(\operatorname{tr}(Q_i^A Q_j^A) - \operatorname{tr}(Q_i^A Q_j^B D) - \operatorname{tr}(Q_j^A Q_i^B D) + \operatorname{tr}(Q_i^C Q_j^C)\big)$
Hyperparameter Update (unchanged): $\lambda^{n+1} = \lambda^n + H^{-1} g$

Implementation Notes

- C++
- Sparse Matrices
- OpenMP
- Runge-Kutta-Fehlberg
- Modular Submodel Classes
- 6500 Lines of Code
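As a flavor of the sparse-matrix plus OpenMP combination in these notes, here is a hypothetical compressed-sparse-row (CSR) matrix-vector product, the kernel such code parallelizes row-wise since rows are independent (pragma shown as a comment). This is a sketch, not the talk's actual code.

```cpp
#include <vector>

// Compressed sparse row (CSR) storage: for row i, the nonzeros live at
// positions rowptr[i] .. rowptr[i+1]-1 of val/col.
struct Csr {
    std::vector<double> val;   // nonzero values
    std::vector<int> col;      // column index of each value
    std::vector<int> rowptr;   // row start offsets (size rows + 1)
};

// Sparse matrix-vector product y = A * x.
std::vector<double> spmv(const Csr& A, const std::vector<double>& x) {
    int rows = static_cast<int>(A.rowptr.size()) - 1;
    std::vector<double> y(rows, 0.0);
    // #pragma omp parallel for  // rows are independent, so this is safe
    for (int i = 0; i < rows; ++i)
        for (int k = A.rowptr[i]; k < A.rowptr[i + 1]; ++k)
            y[i] += A.val[k] * x[A.col[k]];
    return y;
}
```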

Artificial EEG

[Figures: results for artificial EEG data]

Real EEG

[Figures: results for real EEG data]

Challenges

Input Distribution, Stimulus Adaptation, Drift Order, Intrinsic Connections

exit(0); Thank You


More information

Comparing Dynamic Causal Models

Comparing Dynamic Causal Models Comparing Dynamic Causal Models W.D. Penny, K.E. Stephan, A. Mechelli and K.J. Friston, Wellcome Department of Imaging Neuroscience, University College London. October 17, 2003 Abstract This article describes

More information

Linear Models for Regression

Linear Models for Regression Linear Models for Regression Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjin@postech.ac.kr

More information

STAT 518 Intro Student Presentation

STAT 518 Intro Student Presentation STAT 518 Intro Student Presentation Wen Wei Loh April 11, 2013 Title of paper Radford M. Neal [1999] Bayesian Statistics, 6: 475-501, 1999 What the paper is about Regression and Classification Flexible

More information

Radial-Basis Function Networks

Radial-Basis Function Networks Radial-Basis Function etworks A function is radial basis () if its output depends on (is a non-increasing function of) the distance of the input from a given stored vector. s represent local receptors,

More information

First Technical Course, European Centre for Soft Computing, Mieres, Spain. 4th July 2011

First Technical Course, European Centre for Soft Computing, Mieres, Spain. 4th July 2011 First Technical Course, European Centre for Soft Computing, Mieres, Spain. 4th July 2011 Linear Given probabilities p(a), p(b), and the joint probability p(a, B), we can write the conditional probabilities

More information

Will Penny. DCM short course, Paris 2012

Will Penny. DCM short course, Paris 2012 DCM short course, Paris 2012 Ten Simple Rules Stephan et al. Neuroimage, 2010 Model Structure Bayes rule for models A prior distribution over model space p(m) (or hypothesis space ) can be updated to a

More information

Machine Learning - MT & 5. Basis Expansion, Regularization, Validation

Machine Learning - MT & 5. Basis Expansion, Regularization, Validation Machine Learning - MT 2016 4 & 5. Basis Expansion, Regularization, Validation Varun Kanade University of Oxford October 19 & 24, 2016 Outline Basis function expansion to capture non-linear relationships

More information

A new Hierarchical Bayes approach to ensemble-variational data assimilation

A new Hierarchical Bayes approach to ensemble-variational data assimilation A new Hierarchical Bayes approach to ensemble-variational data assimilation Michael Tsyrulnikov and Alexander Rakitko HydroMetCenter of Russia College Park, 20 Oct 2014 Michael Tsyrulnikov and Alexander

More information

John Ashburner. Wellcome Trust Centre for Neuroimaging, UCL Institute of Neurology, 12 Queen Square, London WC1N 3BG, UK.

John Ashburner. Wellcome Trust Centre for Neuroimaging, UCL Institute of Neurology, 12 Queen Square, London WC1N 3BG, UK. FOR MEDICAL IMAGING John Ashburner Wellcome Trust Centre for Neuroimaging, UCL Institute of Neurology, 12 Queen Square, London WC1N 3BG, UK. PIPELINES V MODELS PROBABILITY THEORY MEDICAL IMAGE COMPUTING

More information

Computer Vision Group Prof. Daniel Cremers. 9. Gaussian Processes - Regression

Computer Vision Group Prof. Daniel Cremers. 9. Gaussian Processes - Regression Group Prof. Daniel Cremers 9. Gaussian Processes - Regression Repetition: Regularized Regression Before, we solved for w using the pseudoinverse. But: we can kernelize this problem as well! First step:

More information

HST 583 FUNCTIONAL MAGNETIC RESONANCE IMAGING DATA ANALYSIS AND ACQUISITION A REVIEW OF STATISTICS FOR FMRI DATA ANALYSIS

HST 583 FUNCTIONAL MAGNETIC RESONANCE IMAGING DATA ANALYSIS AND ACQUISITION A REVIEW OF STATISTICS FOR FMRI DATA ANALYSIS HST 583 FUNCTIONAL MAGNETIC RESONANCE IMAGING DATA ANALYSIS AND ACQUISITION A REVIEW OF STATISTICS FOR FMRI DATA ANALYSIS EMERY N. BROWN AND CHRIS LONG NEUROSCIENCE STATISTICS RESEARCH LABORATORY DEPARTMENT

More information

Probabilistic & Unsupervised Learning

Probabilistic & Unsupervised Learning Probabilistic & Unsupervised Learning Week 2: Latent Variable Models Maneesh Sahani maneesh@gatsby.ucl.ac.uk Gatsby Computational Neuroscience Unit, and MSc ML/CSML, Dept Computer Science University College

More information

Default Priors and Effcient Posterior Computation in Bayesian

Default Priors and Effcient Posterior Computation in Bayesian Default Priors and Effcient Posterior Computation in Bayesian Factor Analysis January 16, 2010 Presented by Eric Wang, Duke University Background and Motivation A Brief Review of Parameter Expansion Literature

More information

GWAS IV: Bayesian linear (variance component) models

GWAS IV: Bayesian linear (variance component) models GWAS IV: Bayesian linear (variance component) models Dr. Oliver Stegle Christoh Lippert Prof. Dr. Karsten Borgwardt Max-Planck-Institutes Tübingen, Germany Tübingen Summer 2011 Oliver Stegle GWAS IV: Bayesian

More information

Lecture 2: From Linear Regression to Kalman Filter and Beyond

Lecture 2: From Linear Regression to Kalman Filter and Beyond Lecture 2: From Linear Regression to Kalman Filter and Beyond Department of Biomedical Engineering and Computational Science Aalto University January 26, 2012 Contents 1 Batch and Recursive Estimation

More information

Chris Bishop s PRML Ch. 8: Graphical Models

Chris Bishop s PRML Ch. 8: Graphical Models Chris Bishop s PRML Ch. 8: Graphical Models January 24, 2008 Introduction Visualize the structure of a probabilistic model Design and motivate new models Insights into the model s properties, in particular

More information

The Laplace Approximation

The Laplace Approximation The Laplace Approximation Sargur N. University at Buffalo, State University of New York USA Topics in Linear Models for Classification Overview 1. Discriminant Functions 2. Probabilistic Generative Models

More information

Computer Vision Group Prof. Daniel Cremers. 4. Gaussian Processes - Regression

Computer Vision Group Prof. Daniel Cremers. 4. Gaussian Processes - Regression Group Prof. Daniel Cremers 4. Gaussian Processes - Regression Definition (Rep.) Definition: A Gaussian process is a collection of random variables, any finite number of which have a joint Gaussian distribution.

More information

Anatomical Background of Dynamic Causal Modeling and Effective Connectivity Analyses

Anatomical Background of Dynamic Causal Modeling and Effective Connectivity Analyses Anatomical Background of Dynamic Causal Modeling and Effective Connectivity Analyses Jakob Heinzle Translational Neuromodeling Unit (TNU), Institute for Biomedical Engineering University and ETH Zürich

More information

Statistical Analysis of fmrl Data

Statistical Analysis of fmrl Data Statistical Analysis of fmrl Data F. Gregory Ashby The MIT Press Cambridge, Massachusetts London, England Preface xi Acronyms xv 1 Introduction 1 What Is fmri? 2 The Scanning Session 4 Experimental Design

More information

A Review of Pseudo-Marginal Markov Chain Monte Carlo

A Review of Pseudo-Marginal Markov Chain Monte Carlo A Review of Pseudo-Marginal Markov Chain Monte Carlo Discussed by: Yizhe Zhang October 21, 2016 Outline 1 Overview 2 Paper review 3 experiment 4 conclusion Motivation & overview Notation: θ denotes the

More information

High-dimensional geometry of cortical population activity. Marius Pachitariu University College London

High-dimensional geometry of cortical population activity. Marius Pachitariu University College London High-dimensional geometry of cortical population activity Marius Pachitariu University College London Part I: introduction to the brave new world of large-scale neuroscience Part II: large-scale data preprocessing

More information

Anytime Planning for Decentralized Multi-Robot Active Information Gathering

Anytime Planning for Decentralized Multi-Robot Active Information Gathering Anytime Planning for Decentralized Multi-Robot Active Information Gathering Brent Schlotfeldt 1 Dinesh Thakur 1 Nikolay Atanasov 2 Vijay Kumar 1 George Pappas 1 1 GRASP Laboratory University of Pennsylvania

More information

Practical Bayesian Optimization of Machine Learning. Learning Algorithms

Practical Bayesian Optimization of Machine Learning. Learning Algorithms Practical Bayesian Optimization of Machine Learning Algorithms CS 294 University of California, Berkeley Tuesday, April 20, 2016 Motivation Machine Learning Algorithms (MLA s) have hyperparameters that

More information

Labor-Supply Shifts and Economic Fluctuations. Technical Appendix

Labor-Supply Shifts and Economic Fluctuations. Technical Appendix Labor-Supply Shifts and Economic Fluctuations Technical Appendix Yongsung Chang Department of Economics University of Pennsylvania Frank Schorfheide Department of Economics University of Pennsylvania January

More information

Part 1: Expectation Propagation

Part 1: Expectation Propagation Chalmers Machine Learning Summer School Approximate message passing and biomedicine Part 1: Expectation Propagation Tom Heskes Machine Learning Group, Institute for Computing and Information Sciences Radboud

More information

Introduction to Machine Learning

Introduction to Machine Learning Introduction to Machine Learning Linear Regression Varun Chandola Computer Science & Engineering State University of New York at Buffalo Buffalo, NY, USA chandola@buffalo.edu Chandola@UB CSE 474/574 1

More information

Event-related fmri. Christian Ruff. Laboratory for Social and Neural Systems Research Department of Economics University of Zurich

Event-related fmri. Christian Ruff. Laboratory for Social and Neural Systems Research Department of Economics University of Zurich Event-related fmri Christian Ruff Laboratory for Social and Neural Systems Research Department of Economics University of Zurich Institute of Neurology University College London With thanks to the FIL

More information

An introduction to Bayesian inference and model comparison J. Daunizeau

An introduction to Bayesian inference and model comparison J. Daunizeau An introduction to Bayesian inference and model comparison J. Daunizeau ICM, Paris, France TNU, Zurich, Switzerland Overview of the talk An introduction to probabilistic modelling Bayesian model comparison

More information

Hierarchical Modeling for Univariate Spatial Data

Hierarchical Modeling for Univariate Spatial Data Hierarchical Modeling for Univariate Spatial Data Geography 890, Hierarchical Bayesian Models for Environmental Spatial Data Analysis February 15, 2011 1 Spatial Domain 2 Geography 890 Spatial Domain This

More information

Observed Brain Dynamics

Observed Brain Dynamics Observed Brain Dynamics Partha P. Mitra Hemant Bokil OXTORD UNIVERSITY PRESS 2008 \ PART I Conceptual Background 1 1 Why Study Brain Dynamics? 3 1.1 Why Dynamics? An Active Perspective 3 Vi Qimnü^iQ^Dv.aamics'v

More information

Machine Learning. Bayesian Regression & Classification. Marc Toussaint U Stuttgart

Machine Learning. Bayesian Regression & Classification. Marc Toussaint U Stuttgart Machine Learning Bayesian Regression & Classification learning as inference, Bayesian Kernel Ridge regression & Gaussian Processes, Bayesian Kernel Logistic Regression & GP classification, Bayesian Neural

More information

Bayesian inference J. Daunizeau

Bayesian inference J. Daunizeau Bayesian inference J. Daunizeau Brain and Spine Institute, Paris, France Wellcome Trust Centre for Neuroimaging, London, UK Overview of the talk 1 Probabilistic modelling and representation of uncertainty

More information

Machine learning strategies for fmri analysis

Machine learning strategies for fmri analysis Machine learning strategies for fmri analysis DTU Informatics Technical University of Denmark Co-workers: Morten Mørup, Kristoffer Madsen, Peter Mondrup, Daniel Jacobsen, Stephen Strother,. OUTLINE Do

More information

Signal Processing for Functional Brain Imaging: General Linear Model (2)

Signal Processing for Functional Brain Imaging: General Linear Model (2) Signal Processing for Functional Brain Imaging: General Linear Model (2) Maria Giulia Preti, Dimitri Van De Ville Medical Image Processing Lab, EPFL/UniGE http://miplab.epfl.ch/teaching/micro-513/ March

More information

Linear Dynamical Systems (Kalman filter)

Linear Dynamical Systems (Kalman filter) Linear Dynamical Systems (Kalman filter) (a) Overview of HMMs (b) From HMMs to Linear Dynamical Systems (LDS) 1 Markov Chains with Discrete Random Variables x 1 x 2 x 3 x T Let s assume we have discrete

More information

Principles of Bayesian Inference

Principles of Bayesian Inference Principles of Bayesian Inference Sudipto Banerjee 1 and Andrew O. Finley 2 1 Biostatistics, School of Public Health, University of Minnesota, Minneapolis, Minnesota, U.S.A. 2 Department of Forestry & Department

More information

Lecture 5: GPs and Streaming regression

Lecture 5: GPs and Streaming regression Lecture 5: GPs and Streaming regression Gaussian Processes Information gain Confidence intervals COMP-652 and ECSE-608, Lecture 5 - September 19, 2017 1 Recall: Non-parametric regression Input space X

More information