Data Analysis I: Single Subject
- Samuel Stone
1 Data Analysis I: Single Subject
2 ON OFF
3
4 The General Linear Model (GLM). Y = Xβ + ε: the fMRI signal (Y, our data) = the design matrix (X, what we CAN explain) × the betas (β, how much of it we CAN explain) + the residuals (ε, what we CANNOT explain). If a voxel cares about the task, then you should be able to predict its time-course from the onset and offset of the task. β captures how much of Y your task/design is predicting.
5 The General Linear Model (GLM). Y = Xβ + ε: the fMRI signal (Y, our data) = the design matrix (X, what we CAN explain) × the betas (β, how much of it we CAN explain) + the residuals (ε, what we CANNOT explain).
6 The General Linear Model (GLM). yᵢ = β₀ + β₁x₁ᵢ + εᵢ, with yᵢ: reaction time; x₁: age. β₀ and β₁ are the model parameters; εᵢ is the random error: how an observation deviates from the population mean.
7 The General Linear Model (GLM). yᵢ = β₀ + β₁x₁ᵢ + εᵢ. Fixed part: β₀ + β₁x₁ᵢ, the mean of Yᵢ (E[Yᵢ]). Random part: εᵢ, the variability of Yᵢ (which needs to have specific properties).
8 The General Linear Model (GLM) [figure: data plotted against AGE] Source: J Mumford
9 Statistical Analysis. Conceptually the analysis is divided into 2 steps: 1. Estimation: estimate the magnitude of the effect of interest (β). 2. Inference: assess the significance of the estimated parameter.
10 Ordinary Least Squares (OLS). e.g. What is the relationship between years of MATLAB experience and coding efficiency?
11 Ordinary Least Squares (OLS). Minimize the error: Σᵢ₌₁ⁿ eᵢ² = Σᵢ₌₁ⁿ (Yᵢ − Ŷᵢ)², where Yᵢ is the actual value and Ŷᵢ = β̂₀ + β̂₁Xᵢ the predicted value.
12 Ordinary Least Squares (OLS). Minimize the error: Σᵢ₌₁ⁿ eᵢ² = Σᵢ₌₁ⁿ (Yᵢ − Ŷᵢ)², where Yᵢ is the actual value and Ŷᵢ = β̂₀ + β̂₁Xᵢ the predicted value.
13 Least Squares Difference. Minimize the error: Σᵢ₌₁ⁿ eᵢ² = Σᵢ₌₁ⁿ (Yᵢ − Ŷᵢ)² = Σᵢ₌₁ⁿ (Yᵢ − β̂₀ − β̂₁Xᵢ)². The minimizing estimates are β̂₁ = Σᵢ (Xᵢ − X̄)(Yᵢ − Ȳ) / Σᵢ (Xᵢ − X̄)² and β̂₀ = Ȳ − β̂₁X̄.
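The closed-form estimates on the slide can be checked numerically. A minimal numpy sketch with synthetic data (variable names and values are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)            # e.g. years of MATLAB experience
y = 2.0 + 0.5 * x + rng.normal(0, 1, 50)   # coding efficiency, with noise

# Closed-form least-squares estimates from the slide
beta1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0 = y.mean() - beta1 * x.mean()

# Same answer from a generic least-squares fit
b1_ref, b0_ref = np.polyfit(x, y, 1)
assert np.allclose([beta1, beta0], [b1_ref, b0_ref])
```

The closed form and the generic solver agree because both minimize the same sum of squared errors.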
14 Massive Univariate Approach. y = β₀ + β₁X₁
15 Massive Univariate Approach. y = β₀ + β₁X₁ + β₂X₂ + … + βₚXₚ
16 Voxel-wise GLM. y₁ = β₀ + β₁X₁,₁ + β₂X₂,₁ + … + βₚXₚ,₁ + ε₁; y₂ = β₀ + β₁X₁,₂ + β₂X₂,₂ + … + βₚXₚ,₂ + ε₂; y₃ = β₀ + β₁X₁,₃ + β₂X₂,₃ + … + βₚXₚ,₃ + ε₃; …; yₙ = β₀ + β₁X₁,ₙ + β₂X₂,ₙ + … + βₚXₚ,ₙ + εₙ
17-21 Voxel-wise GLM
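Because every voxel shares the same design matrix X, the massive univariate fit can be done in one matrix solve: stack the voxel time-courses as columns of Y and solve for all betas at once. A minimal sketch with simulated data (the sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, n_vox = 100, 3, 500                 # time points, regressors, voxels

X = np.column_stack([np.ones(n),          # intercept (β0)
                     rng.normal(size=(n, p - 1))])
true_B = rng.normal(size=(p, n_vox))
Y = X @ true_B + rng.normal(scale=0.1, size=(n, n_vox))

# One least-squares solve fits the GLM at every voxel simultaneously:
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
assert B_hat.shape == (p, n_vox)          # one column of betas per voxel
```

This is exactly the "same model, fitted independently at every voxel" idea; the vectorised solve just avoids an explicit loop over voxels.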
22 Ordinary Least Squares (OLS). Y = Xβ + ε; XᵀY = XᵀXβ̂; (XᵀX)⁻¹XᵀY = (XᵀX)⁻¹(XᵀX)β̂ = β̂
23 Ordinary Least Squares (OLS). Y = Xβ + ε; β̂ = (XᵀX)⁻¹XᵀY; Var(β̂) = σ̂²(XᵀX)⁻¹; t = β̂ / √(Var(β̂)) = (XᵀX)⁻¹XᵀY / √(σ̂²(XᵀX)⁻¹) ~ t(N − p)
24 Ordinary Least Squares (OLS). t = (XᵀX)⁻¹XᵀY / √(σ̂²(XᵀX)⁻¹). Annotated: Y is the data (Nature); X is the design matrix (you); β̂ is the effect of interest; σ̂² comes from the residuals, i.e. the error, what we don't know (subject, scanner, ...).
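The estimation-then-inference pipeline above can be sketched in a few lines of numpy; the design and effect sizes here are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 160
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.8]) + rng.normal(size=n)

# Estimation: beta-hat = (X'X)^-1 X'y
beta = np.linalg.solve(X.T @ X, X.T @ y)

# Inference: t = beta-hat / sqrt(sigma^2 (X'X)^-1), df = N - p
resid = y - X @ beta
df = n - X.shape[1]
sigma2 = resid @ resid / df                   # unbiased error variance
var_beta = sigma2 * np.linalg.inv(X.T @ X)    # Var(beta-hat)
t = beta / np.sqrt(np.diag(var_beta))
```

Each element of `t` tests one parameter against zero with `df` degrees of freedom.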
25 Gauss Markov Assumptions Under the following assumptions, the OLS estimator is the Best Linear Unbiased Estimator (BLUE).
26 Source: J Mumford
27 Gauss Markov Assumptions. Under the following assumptions, the OLS estimator is the Best Linear Unbiased Estimator (BLUE): ε ~ N(0, σ²I) i.i.d.; E[εᵢ] = 0 (error has mean 0); error is homoscedastic: Var(εᵢ) = σ² (errors have the same variance); errors are not autocorrelated: Cov(εᵢ, εⱼ) = 0 (errors are independent of each other).
28 A simple experiment: the FFA. Ss: 1 participant. Design: 16s Faces, 16s Houses (block design). Take voxel [15, 19, 18], in right posterior temporal cortex. Does this voxel care about faces?
29 Model I: Boxcar Regressor Y = X·β
30 Model I: Boxcar Regressor
31 Model I: Boxcar Regressor. β̂ = (XᵀX)⁻¹XᵀY; t(df) = β̂ / √(σ̂²(XᵀX)⁻¹); here t(159) = …
32 Model II: Convolved Boxcar Regressor (Gaussian HRF). β̂ = (XᵀX)⁻¹XᵀY; t(159) = …
33 Model II: Convolved Boxcar Regressor (Gamma HRF). β̂ = (XᵀX)⁻¹XᵀY; t(159) = …
34 Model II: Convolved Boxcar Regressor (Double Gamma HRF). β̂ = (XᵀX)⁻¹XᵀY; t(159) = …
35 What Convolution? Linear Time Invariant basis functions: Canonical HRF; Canonical HRF + temporal & dispersion derivatives; Informed basis set; FIR / Smooth FIR. Non-linear: Inverse Logit; Balloon model.
36 Canonical HRF Stimulus HRF (DGamma)
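Convolving the boxcar with a canonical double-gamma HRF can be sketched as follows. The gamma parameters (peak around 5 s, undershoot around 15 s, undershoot ratio 1/6) are common SPM-like defaults and are an assumption here, not taken from the slides:

```python
import numpy as np
from scipy.stats import gamma

TR = 1.0
t = np.arange(0, 32, TR)

# Double-gamma HRF: positive response minus a smaller, later undershoot
# (shape 6 peaks ~5 s, shape 16 peaks ~15 s; 1/6 ratio is an assumption)
hrf = gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6
hrf /= hrf.sum()                          # normalisation convention

# 16 s ON / 16 s OFF boxcar, as in the Faces/Houses block design
boxcar = np.tile(np.r_[np.ones(16), np.zeros(16)], 5)

# Convolved regressor, trimmed back to the length of the run
regressor = np.convolve(boxcar, hrf)[:len(boxcar)]
```

The convolved regressor rises and falls with the haemodynamic lag instead of switching instantaneously like the raw boxcar, which is why Model II fits the data better than Model I.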
37 HRF Variability Across Subjects. Handwerker et al 2004; Aguirre et al 1998
38 HRF Variability Within Subjects same subject, same session same subject, different session Aguirre et al 1998
39 HRF Variability Across Brain Areas. Handwerker et al 2004
40 Basis Functions I: HRF + Derivatives. Canonical HRF; Temporal derivative; Dispersion derivative
41 Basis Functions II: Fourier Basis Set f(t) h1(t) h2(t) h3(t)
42 Basis Functions III: Informed Basis Functions (FLOBS)
43 Basis Functions IV: FIR
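An FIR basis set estimates one free parameter per post-stimulus time bin, so the design matrix gets one shifted "stick" column per bin. A minimal sketch (onsets and sizes are illustrative):

```python
import numpy as np

def fir_design(onsets, n_scans, n_bins):
    """FIR design matrix: X[t, k] = 1 if a stimulus occurred exactly
    k scans before time t, so each column captures one post-stimulus bin."""
    X = np.zeros((n_scans, n_bins))
    for onset in onsets:
        for k in range(n_bins):
            if onset + k < n_scans:
                X[onset + k, k] = 1
    return X

X = fir_design(onsets=[5, 40, 75], n_scans=100, n_bins=10)
```

The fitted betas for these columns trace out the response shape directly, which is why FIR is flexible but costs one degree of freedom per bin.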
44 Non Linear Approaches. Model all 3 parameters (T, W, H)
45 Slide from M Lindquist. Inverse Logit Model: a superposition of three inverse logit (sigmoid) functions, L(x) = 1 / (1 + e⁻ˣ). Each function has three variable parameters dictating the amplitude, position and slope: h(t|θ) = α₁·L((t − T₁)/D₁) + α₂·L((t − T₂)/D₂) + α₃·L((t − T₃)/D₃). Lindquist & Wager (2007)
46 Slide from M Lindquist Flexibility By shifting the position of the second IL function one can model differences in duration. By shifting the position of all three IL functions one can model differences in onset.
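The inverse-logit superposition above is easy to write down directly. The parameter values below (amplitudes, positions, slopes for the rise, fall and undershoot recovery) are illustrative assumptions, not the fitted values from Lindquist & Wager:

```python
import numpy as np

def L(x):
    """Inverse logit (sigmoid)."""
    return 1.0 / (1.0 + np.exp(-x))

def il_hrf(t, alphas, Ts, Ds):
    """h(t|theta) = sum_i alpha_i * L((t - T_i) / D_i)"""
    return sum(a * L((t - T) / D) for a, T, D in zip(alphas, Ts, Ds))

t = np.linspace(0, 30, 301)
# Illustrative parameters: rise at ~5 s, fall at ~15 s, recovery at ~25 s
h = il_hrf(t, alphas=[1.0, -1.3, 0.3], Ts=[5.0, 15.0, 25.0], Ds=[1.0, 2.5, 2.5])
```

Shifting the T parameters moves the onset and duration of the modelled response, which is the flexibility the next slide describes.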
47 Balloon Model [figure: BOLD response (%) over time, showing the initial dip, the positive BOLD response (initial overshoot), and the post-stimulus undershoot]
48 What Convolution? - Recap. The problem: a bias/variance trade-off: assume a known shape vs. estimate it. X-matrix mis-specification: an incorrect HRF will increase σ².
49 What Convolution? - Recap. Canonical HRF. Pros: 1 parameter only, easy to interpret; easy to integrate at the second level. Cons: biased in favour of a specific shape. Basis function sets. Pros: flexible, can capture greater variability and unexpected, subject-specific differences. Cons: many parameters to estimate (degrading dfs); difficult to interpret; how to aggregate at the 2nd level? Non-linear systems. Pros: potentially very flexible and may better capture parameters and vasculature. Cons: computationally very demanding; parameters not obvious to interpret and aggregate across subjects (e.g., balloon model).
50 X-Matrix Assumptions. Deterministic & known. No regressor is a linear combination of one (or more) other regressors.
51 Multicollinearity. No column of the X matrix can be obtained by linear combination of one or more other columns. The issue is not whether there is/isn't multicollinearity, but rather how much there is. Two main consequences: decreased efficiency of the design degrades power; parameter estimates can vary wildly, even in the wrong direction!
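"How much" multicollinearity there is can be quantified by regressing each column of X on the others; 1/(1 − R²) is the variance inflation factor (VIF). A minimal sketch with two nearly identical regressors (the data are synthetic and the mixing weights are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)   # nearly a copy of x1

def vif(target, others):
    """Variance inflation factor: 1 / (1 - R^2) from regressing
    one regressor on the remaining ones (plus an intercept)."""
    X = np.column_stack([np.ones(len(target))] + list(others))
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    r2 = 1 - resid.var() / target.var()
    return 1.0 / (1.0 - r2)

v = vif(x2, [x1])   # very large here: beta estimates will be unstable
```

The VIF is exactly the factor by which Var(β̂) is inflated relative to an orthogonal design, connecting directly to the two consequences listed above.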
52-56 [Multicollinearity examples] Source: J Mumford
57 Contrasts
58 Contrasts. Design matrix columns: Cond A, D A, Cond B, D B.
59-62 Contrasts. t = β̂ / √(Var(β̂)), with β̂ = [β̂₁, β̂₂, β̂₃, β̂₄, …, β̂ₚ]ᵀ. Design matrix columns: Cond A, D A, Cond B, D B.
63 Contrasts. t = cᵀβ̂ / √(Var(cᵀβ̂)), with contrast vector c = [c₁, c₂, c₃, c₄, …, cₚ]ᵀ. Design matrix columns: Cond A, D A, Cond B, D B.
64-67 Contrasts. t = cᵀβ̂ / √(Var(cᵀβ̂)), with c a vector of zeros except at the entries selecting the effect of interest. Design matrix columns: Cond A, D A, Cond B, D B.
68 Contrasts. t = cᵀβ̂ / √(Var(cᵀβ̂))
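The contrast t-statistic can be sketched end-to-end: build a two-condition block design, fit the GLM, and test Cond A − Cond B with c = [0, 1, −1]. The design timing and effect sizes below are illustrative, not the FFA experiment's actual values:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 128
# Two block regressors (Cond A, Cond B) separated by rest periods
condA = np.tile(np.r_[np.ones(16), np.zeros(48)], 2)
condB = np.roll(condA, 32)
X = np.column_stack([np.ones(n), condA, condB])
y = X @ np.array([10.0, 2.0, 0.5]) + rng.normal(size=n)

beta = np.linalg.pinv(X) @ y
resid = y - X @ beta
sigma2 = resid @ resid / (n - np.linalg.matrix_rank(X))
cov_beta = sigma2 * np.linalg.pinv(X.T @ X)

c = np.array([0.0, 1.0, -1.0])                 # Cond A - Cond B
t = (c @ beta) / np.sqrt(c @ cov_beta @ c)     # t = c'beta / sqrt(c'Var(beta)c)
```

Note how Var(cᵀβ̂) = cᵀ Var(β̂) c generalises the single-parameter t-test from the OLS slides to any linear combination of betas.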
69 Model I: Boxcar Regressor. β̂ = (XᵀX)⁻¹XᵀY; t(159) = …
70 Model II: Convolved Boxcar Regressor (Double Gamma). β̂ = (XᵀX)⁻¹XᵀY; t(159) = …
71 Gauss Markov Assumptions. Under the following assumptions, the OLS estimator is the Best Linear Unbiased Estimator (BLUE): ε ~ N(0, σ²I) i.i.d.; E[εᵢ] = 0 (error has mean 0); error is homoscedastic: Var(εᵢ) = σ² (errors have the same variance); errors are not autocorrelated: Cov(εᵢ, εⱼ) = 0 (errors are independent of each other).
72 Temporal Structure. If residuals are correlated, the estimator is still unbiased but the variance estimate is not, and the df are less than nominal. [figures: residuals with Corr(εₜ, εₜ₋₁) > 0 and Corr(εₜ, εₜ₋₁) < 0]
73 Temporal Structure. If residuals are correlated, the estimator is still unbiased but the variance estimate is not, and the df are less than nominal. This can affect your t-test either way: i. positive autocorrelation: underestimate of the variance, t-tests artificially inflated (actual α up to 0.16 for a nominal α of 0.05); ii. negative autocorrelation: overestimate of the variance, t-tests artificially reduced. Two strategies: i. high-pass filtering; ii. autocorrelation correction ('pre-whitening'; 'colouring').
74 Low frequency noise. Plenty of low-frequency noise: scanner drift, physiological (e.g., basal metabolism), psychological (e.g., learning, alertness changes), left-overs from motion correction, ... [figure: power spectra of noise and signal]
75 HP Filtering Strategy I: SPM Model low drifts to soak up their variance (using a discrete cosine transform basis set).
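The SPM strategy adds discrete cosine transform (DCT) regressors to X so that slow drifts soak up their own variance. A minimal sketch of building such a basis; the 128 s cutoff is SPM's conventional default, and the function name is mine:

```python
import numpy as np

def dct_drift_basis(n_scans, TR, cutoff=128.0):
    """DCT-II regressors whose periods (2 * n_scans * TR / k seconds)
    are at least `cutoff` seconds long (128 s is the usual SPM default)."""
    t = np.arange(n_scans)
    order = int(np.floor(2.0 * n_scans * TR / cutoff))
    basis = [np.cos(np.pi * k * (2 * t + 1) / (2 * n_scans))
             for k in range(1, order + 1)]
    return np.column_stack(basis) if basis else np.empty((n_scans, 0))

drifts = dct_drift_basis(n_scans=160, TR=2.0, cutoff=128.0)
# Append these columns to the design matrix X alongside the task regressors
```

Only cosines slower than the cutoff are included, so task-frequency signal is left for the regressors of interest.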
76 HP Filtering Strategy II: FSL Remove low drifts from the signal: i. Fit a Gaussian-weighted running line
77 HP Filtering Strategy II: FSL. Remove low drifts from the signal: i. Fit a Gaussian-weighted running line; ii. Subtract it from the data (red is pre-HPF, green is post-HPF).
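The fit-then-subtract idea can be sketched with a Gaussian-weighted running mean; this is a simplified stand-in for FSL's Gaussian-weighted running-line fit, and the drift and signal below are synthetic:

```python
import numpy as np

def gaussian_running_mean(y, sigma):
    """Gaussian-weighted running mean: a simplified stand-in for
    FSL's Gaussian-weighted running-line fit of slow drifts."""
    t = np.arange(len(y))
    out = np.empty(len(y), dtype=float)
    for i in t:
        w = np.exp(-0.5 * ((t - i) / sigma) ** 2)
        out[i] = np.sum(w * y) / np.sum(w)
    return out

rng = np.random.default_rng(5)
n = 200
drift = 0.05 * np.arange(n)                      # slow scanner drift
signal = np.sin(2 * np.pi * np.arange(n) / 20)   # task-frequency signal
y = signal + drift + 0.1 * rng.normal(size=n)

trend = gaussian_running_mean(y, sigma=30)
y_hpf = y - trend                                # subtract the slow fit
```

Because the smoothing kernel is much wider than the task period, the trend tracks the drift but not the signal, so subtracting it acts as a high-pass filter.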
78 Model II: Convolved Boxcar Regressor (Double Gamma). β̂ = (XᵀX)⁻¹XᵀY; t(159) = …
79 Model III: High Pass Filter (Double Gamma). β̂ = (XᵀX)⁻¹XᵀY; t(159) = …
80 Autocorrelation. GM assumption: ε ~ N(0, σ²I) i.i.d., Cov(εᵢ, εⱼ) = 0: errors are independent of each other. fMRI data: ε ~ N(0, σ²V), Cov(εᵢ, εⱼ) = σ²Vᵢⱼ; Y = Xβ + ε: errors are NOT independent of each other.
81 Autocorrelation. GM assumption: ε ~ N(0, σ²I) i.i.d., Cov(εᵢ, εⱼ) = 0: errors are independent of each other. [figure: the n×n error covariance matrix, whose off-diagonal entries cov(ε₁, ε₂), …, cov(εᵢ, εⱼ), …, cov(εₙ, ε₁) are all zero under the GM assumption]
82 Autocorrelation: Pre-whitening FSL. If only we knew the structure of the serial correlation in the data (V) we could get rid of it. Given V, there exists a square (non-singular) matrix K such that: i. V = KKᵀ; ii. e = Kε (where ε is a 'well behaved' residual ~ N(0, σ²I)). So we can now clean up (i.e., 'whiten') our GLM: K⁻¹Y = K⁻¹Xβ + K⁻¹e, and Cov(K⁻¹e) = K⁻¹·σ²V·K⁻ᵀ = σ²·K⁻¹KKᵀK⁻ᵀ = σ²I.
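The whitening algebra above can be sketched numerically for an assumed AR(1) structure (V[i,j] = ρ^|i−j|), using a Cholesky factorisation for K; the sizes and ρ are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
n, rho = 120, 0.4

# Assumed AR(1) error correlation: V[i, j] = rho^|i - j|
idx = np.arange(n)
V = rho ** np.abs(idx[:, None] - idx[None, :])

K = np.linalg.cholesky(V)          # V = K K'
W = np.linalg.inv(K)               # whitening matrix K^-1

X = np.column_stack([np.ones(n), rng.normal(size=n)])
eps = K @ rng.normal(size=n)       # correlated noise with Cov = V
y = X @ np.array([1.0, 2.0]) + eps

# Whitened GLM: K^-1 y = K^-1 X beta + K^-1 eps, residuals now white
beta_gls = np.linalg.lstsq(W @ X, W @ y, rcond=None)[0]
```

Pre-multiplying by K⁻¹ turns the correlated-noise regression back into one that satisfies Gauss-Markov, so OLS on the whitened system is again BLUE.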
83-87 Autocorrelation: Pre-whitening FSL. If only we knew the structure of the serial correlation in the data (V) we could get rid of it; but we don't know it. 1. Estimate the autocorrelation structure (for every voxel), by correlating the residual series [e01 e02 e03 … e12] with a copy of itself shifted by lag 1, lag 2, lag 3, …, lag 8, lag 9, …
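Step 1, correlating the residuals with shifted copies of themselves at each lag, can be sketched as follows; the AR(1) residual simulation is just to give the estimator something with known structure:

```python
import numpy as np

def autocorr(e, max_lag):
    """Sample autocorrelation of residuals e at lags 1..max_lag,
    computed by correlating e with a shifted copy of itself."""
    e = e - e.mean()
    denom = e @ e
    return np.array([e[:-k] @ e[k:] / denom for k in range(1, max_lag + 1)])

rng = np.random.default_rng(7)
n, rho = 500, 0.5
# Synthetic AR(1) residuals with known lag-1 correlation rho
e = np.empty(n)
e[0] = rng.normal()
for t in range(1, n):
    e[t] = rho * e[t - 1] + rng.normal()

r = autocorr(e, max_lag=9)   # r[0] should be close to rho
```

In FSL these per-voxel raw estimates are then regularised spatially and temporally (the Tukey taper) before being used to build V.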
88 Autocorrelation: Pre-whitening FSL If only we knew the structure of the serial correlation in the data (V) we could get rid of it; but we don't know it. 1. Estimate the autocorrelation structure (for every voxel) 2. For each lag, smooth estimates spatially ('regularization')
89 Autocorrelation: Pre-whitening FSL. If only we knew the structure of the serial correlation in the data (V) we could get rid of it; but we don't know it. 1. Estimate the autocorrelation structure (for every voxel). 2. For each lag, smooth estimates spatially ('regularization'). 3. Smooth (temporally) the correlation estimate, for each voxel ('Tukey taper'). Woolrich et al, 2001 NeuroImage
90 Autocorrelation: Pre-whitening FSL. Steps 1-3 as above. [figure: raw (rxx) vs. smoothed (ρxx) autocorrelation estimates] Woolrich et al, 2001 NeuroImage
91 Autocorrelation: Pre-whitening FSL. If only we knew the structure of the serial correlation in the data (V) we could get rid of it; but we don't know it. Now that we have an estimate of the structure, we can run a second regression in which we whiten the residuals using the estimated V. The estimated β for the new model are BLUE (under the assumption that V was appropriately modelled).
92 Autocorrelation: pre-whitening SPM Global estimation of the correlations: the correlation of the time series is averaged over voxels. Imposed AR(1) model with correlation 0.2 plus white noise.
93 Model III: High Pass Filter (Double Gamma). β̂ = (XᵀX)⁻¹XᵀY; t(159) = …
94 Model IV: Whitening (Double Gamma). β̂ = (XᵀX)⁻¹XᵀY; t(159) = …
95 Whitening algorithms performance Lenoski 08 IEEE
96 Autocorrelation: Coloring. fMRI data: ε ~ N(0, σ²V), Cov(εᵢ, εⱼ) = σ²Vᵢⱼ; Y = Xβ + ε: errors are NOT independent of each other. Impose a known correlation structure S: SY = SXβ + Sε. If indeed SVSᵀ ≈ SSᵀ, then the error is N(0, σ²SSᵀ). Estimates are unbiased but no longer efficient (according to how well V is 'swamped').
97 Coloring v. Whitening [figure: design power spectrum under coloring vs. pre-whitening]
98 Tomorrow: setting up group fMRI models. Send Jeanette your group model challenges and she may choose to include them in her workshop.
Business School, Brunel University MSc. EC550/5509 Modelling Financial Decisions and Markets/Introduction to Quantitative Methods Prof. Menelaos Karanasos (Room SS69, Tel. 08956584) Lecture Notes 5. The
More information11.1 Gujarati(2003): Chapter 12
11.1 Gujarati(2003): Chapter 12 Time Series Data 11.2 Time series process of economic variables e.g., GDP, M1, interest rate, echange rate, imports, eports, inflation rate, etc. Realization An observed
More informationCh 2: Simple Linear Regression
Ch 2: Simple Linear Regression 1. Simple Linear Regression Model A simple regression model with a single regressor x is y = β 0 + β 1 x + ɛ, where we assume that the error ɛ is independent random component
More informationEconometrics Honor s Exam Review Session. Spring 2012 Eunice Han
Econometrics Honor s Exam Review Session Spring 2012 Eunice Han Topics 1. OLS The Assumptions Omitted Variable Bias Conditional Mean Independence Hypothesis Testing and Confidence Intervals Homoskedasticity
More informationLecture 3: Multiple Regression
Lecture 3: Multiple Regression R.G. Pierse 1 The General Linear Model Suppose that we have k explanatory variables Y i = β 1 + β X i + β 3 X 3i + + β k X ki + u i, i = 1,, n (1.1) or Y i = β j X ji + u
More informationThe General Linear Model Ivo Dinov
Stats 33 Statistical Methods for Biomedical Data The General Linear Model Ivo Dinov dinov@stat.ucla.edu http://www.stat.ucla.edu/~dinov Slide 1 Problems with t-tests and correlations 1) How do we evaluate
More informationFunctional Causal Mediation Analysis with an Application to Brain Connectivity. Martin Lindquist Department of Biostatistics Johns Hopkins University
Functional Causal Mediation Analysis with an Application to Brain Connectivity Martin Lindquist Department of Biostatistics Johns Hopkins University Introduction Functional data analysis (FDA) and causal
More informationStatistical Analysis of Functional ASL Images
Statistical Analysis of Functional ASL Images Daniel B. Rowe, Ph.D. Department of Mathematics, Statistics, and Computer Science Department of Biophysics Department of EE and CS 1 Outline: 1. Control/Label
More informationBayesian inference J. Daunizeau
Bayesian inference J. Daunizeau Brain and Spine Institute, Paris, France Wellcome Trust Centre for Neuroimaging, London, UK Overview of the talk 1 Probabilistic modelling and representation of uncertainty
More informationLECTURE 2 LINEAR REGRESSION MODEL AND OLS
SEPTEMBER 29, 2014 LECTURE 2 LINEAR REGRESSION MODEL AND OLS Definitions A common question in econometrics is to study the effect of one group of variables X i, usually called the regressors, on another
More informationMachine learning strategies for fmri analysis
Machine learning strategies for fmri analysis DTU Informatics Technical University of Denmark Co-workers: Morten Mørup, Kristoffer Madsen, Peter Mondrup, Daniel Jacobsen, Stephen Strother,. OUTLINE Do
More informationSimple Linear Regression
Simple Linear Regression Christopher Ting Christopher Ting : christophert@smu.edu.sg : 688 0364 : LKCSB 5036 January 7, 017 Web Site: http://www.mysmu.edu/faculty/christophert/ Christopher Ting QF 30 Week
More informationECONOMETRICS HONOR S EXAM REVIEW SESSION
ECONOMETRICS HONOR S EXAM REVIEW SESSION Eunice Han ehan@fas.harvard.edu March 26 th, 2013 Harvard University Information 2 Exam: April 3 rd 3-6pm @ Emerson 105 Bring a calculator and extra pens. Notes
More informationHeteroscedasticity and Autocorrelation
Heteroscedasticity and Autocorrelation Carlo Favero Favero () Heteroscedasticity and Autocorrelation 1 / 17 Heteroscedasticity, Autocorrelation, and the GLS estimator Let us reconsider the single equation
More informationMatematické Metody v Ekonometrii 7.
Matematické Metody v Ekonometrii 7. Multicollinearity Blanka Šedivá KMA zimní semestr 2016/2017 Blanka Šedivá (KMA) Matematické Metody v Ekonometrii 7. zimní semestr 2016/2017 1 / 15 One of the assumptions
More informationWellcome Trust Centre for Neuroimaging, UCL, UK.
Bayesian Inference Will Penny Wellcome Trust Centre for Neuroimaging, UCL, UK. SPM Course, Virginia Tech, January 2012 What is Bayesian Inference? (From Daniel Wolpert) Bayesian segmentation and normalisation
More informationcovariance function, 174 probability structure of; Yule-Walker equations, 174 Moving average process, fluctuations, 5-6, 175 probability structure of
Index* The Statistical Analysis of Time Series by T. W. Anderson Copyright 1971 John Wiley & Sons, Inc. Aliasing, 387-388 Autoregressive {continued) Amplitude, 4, 94 case of first-order, 174 Associated
More informationE 4160 Autumn term Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test
E 4160 Autumn term 2016. Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test Ragnar Nymoen Department of Economics, University of Oslo 24 October
More information1. The OLS Estimator. 1.1 Population model and notation
1. The OLS Estimator OLS stands for Ordinary Least Squares. There are 6 assumptions ordinarily made, and the method of fitting a line through data is by least-squares. OLS is a common estimation methodology
More informationLecture 4: Multivariate Regression, Part 2
Lecture 4: Multivariate Regression, Part 2 Gauss-Markov Assumptions 1) Linear in Parameters: Y X X X i 0 1 1 2 2 k k 2) Random Sampling: we have a random sample from the population that follows the above
More informationIntroduction to Econometrics
Introduction to Econometrics T H I R D E D I T I O N Global Edition James H. Stock Harvard University Mark W. Watson Princeton University Boston Columbus Indianapolis New York San Francisco Upper Saddle
More information11. Further Issues in Using OLS with TS Data
11. Further Issues in Using OLS with TS Data With TS, including lags of the dependent variable often allow us to fit much better the variation in y Exact distribution theory is rarely available in TS applications,
More informationMS&E 226. In-Class Midterm Examination Solutions Small Data October 20, 2015
MS&E 226 In-Class Midterm Examination Solutions Small Data October 20, 2015 PROBLEM 1. Alice uses ordinary least squares to fit a linear regression model on a dataset containing outcome data Y and covariates
More informationFöreläsning /31
1/31 Föreläsning 10 090420 Chapter 13 Econometric Modeling: Model Speci cation and Diagnostic testing 2/31 Types of speci cation errors Consider the following models: Y i = β 1 + β 2 X i + β 3 X 2 i +
More informationIntroduction to Econometrics. Heteroskedasticity
Introduction to Econometrics Introduction Heteroskedasticity When the variance of the errors changes across segments of the population, where the segments are determined by different values for the explanatory
More informationLinear Regression. Junhui Qian. October 27, 2014
Linear Regression Junhui Qian October 27, 2014 Outline The Model Estimation Ordinary Least Square Method of Moments Maximum Likelihood Estimation Properties of OLS Estimator Unbiasedness Consistency Efficiency
More informationReliability of inference (1 of 2 lectures)
Reliability of inference (1 of 2 lectures) Ragnar Nymoen University of Oslo 5 March 2013 1 / 19 This lecture (#13 and 14): I The optimality of the OLS estimators and tests depend on the assumptions of
More informationContents. Data. Introduction & recap Variance components Hierarchical model RFX and summary statistics Variance/covariance matrix «Take home» message
SPM course, CRC, Liege,, Septembre 2009 Contents Group analysis (RF) Variance components Hierarchical model RF and summary statistics Variance/covariance matrix «Tae home» message C. Phillips, Centre de
More information