New Machine Learning Methods for Neuroimaging


New Machine Learning Methods for Neuroimaging
Aapo Hyvärinen
Gatsby Computational Neuroscience Unit, University College London, UK
Dept of Computer Science, University of Helsinki, Finland

Outline
- Resting-state networks in EEG/MEG: variants of independent component analysis (ICA)
- Testing ICA: are the components reliable? Intersubject consistency provides a plausible null hypothesis
- Causal analysis / effective connectivity: directionality of connectivities estimated using non-Gaussianity
- Analysis of differences between many connectivity matrices: dynamic connectivity, intersubject connectivity differences
- Decoding in EEG/MEG: a pipeline using ICA and wide-band rhythmic activity

The brain at rest
Topics: the brain at rest; independent component analysis of fMRI at rest; issues in ICA of EEG/MEG (different sparsities, Fourier-ICA, spatial ICA)
The subject's brain is measured while the subject has no task and receives no stimulation. Measurements are made by functional magnetic resonance imaging (fMRI), electroencephalography (EEG), or magnetoencephalography (MEG).
Why is this data so interesting?
- Not dependent on subjective choices in experimental design (e.g. stimulation protocol, task)
- Not much analysis has been done so far
- A completely new viewpoint: rich internal dynamics

Is anything happening in the brain at rest?
- Some brain areas are actually more active at rest
- Default-mode network(s) in PET and fMRI (Raichle, 2001)
- Brain activity is intrinsic, not just responses to stimulation
How can we analyse the resting state in more detail? (Raichle, 2010, based on Shulman et al, 1997)

Independent component analysis (ICA)
Supervised methods cannot be used: there is no teaching signal. Instead, assume a linear mixing model
x_i = Σ_j a_ij s_j   (1)
where the x_i are observed variables and the s_j are independent latent variables. ICA finds both the a_ij and the s_j by maximising the sparsity of the s_j. Sparsity: the probability density has heavy tails and a peak at zero, in contrast to a Gaussian density.
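As a concrete illustration of the mixing model (1), here is a minimal NumPy sketch (not from the talk; a hand-rolled symmetric FastICA with the tanh nonlinearity) that mixes two sparse, Laplacian-distributed sources and recovers them up to permutation and sign:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two sparse (heavy-tailed) independent sources s_j: Laplacian-distributed.
n = 5000
S = rng.laplace(size=(2, n))

# Unknown mixing matrix a_ij; observed signals x_i = sum_j a_ij s_j.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = A @ S

# Whitening: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Symmetric FastICA iteration with the tanh nonlinearity (maximizes sparsity).
W = rng.standard_normal((2, 2))
for _ in range(200):
    Y = W @ Z
    G = np.tanh(Y)
    W_new = G @ Z.T / n - np.diag((1 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt                      # symmetric decorrelation

# Recovered sources match the true ones up to permutation and sign:
# each row of the cross-correlation matrix has one entry near 1.
S_hat = W @ Z
C = np.abs(np.corrcoef(S_hat, S))[:2, 2:]
print(C.round(2))
```

In practice one would use a tested implementation (e.g. a FastICA package) rather than this toy loop; the point is only that maximizing sparsity of the outputs recovers the independent latent variables.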

ICA finds resting-state networks in fMRI
a) Medial and b) lateral visual areas, c) auditory system, d) sensory-motor system, e) default-mode network, f) executive control, g) dorsal visual stream (Beckmann et al, 2005). Very similar results are obtained if the subject is watching a movie.

How about EEG and MEG?
EEG and MEG are measurements of the electrical activity of the brain.
- Very high temporal accuracy (millisecond scale)
- Not so high spatial accuracy (less than in fMRI)
- Typically characterized by oscillations, e.g. at around 10 Hz
- Up to 306 time series (signals), each with a very large number of time points
- The information is very different from fMRI

Different sparsities of EEG/MEG data
ICA finds components by maximizing sparsity, but sparsity of what? This depends on the preprocessing and representation. Assume we apply a wavelet or short-time Fourier transform. We then have different kinds of sparsity:
- Sparsity in time: temporally modulated signals
- Sparsity in frequency: narrow-band signals
- Sparsity in space: signals localised on the cortex
These correspond to plain ICA, Fourier-ICA, and spatial ICA, respectively.

Fourier-ICA: joint sparsity in time and frequency
(Hyvärinen, Ramkumar, Parkkonen, Hari, NeuroImage, 2010)

Spatial sparsity (spatial ICA)
Images observed at different time points are modelled as linear sums of source images: the observation at time t is a weighted sum a_t1 s_1 + ... + a_tn s_n of the source images s_j. Maximizing spatial sparsity alone is typical in fMRI; no assumption of temporal independence is made. (Ramkumar et al, Human Brain Mapping, 2012)

Testing ICs: motivation
- ICA algorithms give a fixed number of components and do not tell which ones are reliable (statistically significant)
- How do we know that an estimated component is not just a random effect?
- Algorithmic artifacts (e.g. local minima) are also possible
We developed a statistical test based on inter-subject consistency:
- Do ICA separately on several subjects
- A component is significant if it appears in two or more subjects in a sufficiently similar form
- We formulate a rigorous null hypothesis to quantify this idea (NeuroImage, 2011)

Testing ICs: results
[Figure: two significant independent components, each shown as b) a minimum norm estimate, c) a Fourier spectrum, and d) modulation by stimulation (z scores for conditions beep, spch, v/hf, v/bd, tact).]

Causal analysis: motivation
Topics: introduction; structural equation models; simple measures of causal direction; results
- Effective (directed) connectivity: which area influences/drives which?
- Two fundamental approaches: if the time resolution of the measurements is fast enough, we can use autoregressive modelling (Granger causality); otherwise we need structural equation models
- With EEG/MEG, we should first localize sources, e.g. by ICA
- The sources are often uncorrelated, so it is more meaningful to model dependencies of their envelopes (amplitudes, variances)

Structural equation models
How does an (externally imposed) change in one variable affect the others? The model is
x_i = Σ_{j≠i} b_ij x_j + e_i
This is difficult to estimate: it is not simple regression, and classic methods fail in general. The model can be estimated if (Shimizu et al., JMLR, 2006):
1. the e_i(t) are mutually independent
2. the e_i(t) are non-Gaussian, e.g. sparse
3. the b_ij are acyclic: there is an ordering of the x_i in which all effects point forward
[Figure: an example acyclic network over variables x_1, ..., x_7 with signed connection weights.]
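A small sketch (illustrative only, not the authors' code) of generating data from an acyclic model of this kind: with B strictly lower triangular in the causal ordering, the reduced form x = (I − B)^{-1} e is exactly a linear ICA mixing model, which is why non-Gaussianity of the disturbances makes the b_ij identifiable:

```python
import numpy as np

rng = np.random.default_rng(1)

# Acyclic SEM x_i = sum_{j != i} b_ij x_j + e_i. B is strictly
# lower-triangular in the causal ordering, so every effect points forward.
B = np.array([[0.0, 0.0, 0.0],
              [0.8, 0.0, 0.0],
              [-0.3, 0.5, 0.0]])
n = 1000
E = rng.laplace(size=(3, n))       # independent, non-Gaussian disturbances

# Reduced form: x = (I - B)^{-1} e -- a linear mixing of the e_i,
# i.e. an ICA model with mixing matrix (I - B)^{-1}.
X = np.linalg.solve(np.eye(3) - B, E)

# Sanity check: the structural equations hold for the generated data.
residual = X - B @ X - E
print(np.abs(residual).max())
```

Estimating B from X alone is what the LiNGAM-type methods cited above do; this sketch only shows the generative side of the model.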

Simple measures of causal direction
The very simplest case: choose between the regression models
y = ρx + d   (2)
where d is independent of x, and, symmetrically,
x = ρy + e   (3)
If the data is Gaussian we can estimate ρ = E{xy}, BUT both models then have the same likelihood! For non-Gaussian data, the log-likelihood ratio can be approximated as
R = ρ E{x g(y) − g(x) y}   (4)
where g is a nonlinearity similar to those used in ICA. Choose the direction based on the sign of R! (Hyvärinen and Smith, JMLR, 2013)
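The measure (4) is simple enough to compute directly. A hedged NumPy sketch on toy data where x truly causes y, with tanh as the ICA-style nonlinearity g (a common choice for sparse data; the sign convention below assumes super-Gaussian variables):

```python
import numpy as np

rng = np.random.default_rng(2)

# Ground truth follows model (2): y = rho*x + d, with d independent of x.
n = 200000
x = rng.laplace(size=n)                  # non-Gaussian (sparse) cause
y = 0.8 * x + 0.5 * rng.laplace(size=n)  # disturbance independent of x

# Standardize both variables (the measure assumes unit variance).
x = (x - x.mean()) / x.std()
y = (y - y.mean()) / y.std()

g = np.tanh                              # ICA-style nonlinearity
rho = np.mean(x * y)                     # estimated correlation coefficient
R = rho * np.mean(x * g(y) - g(x) * y)   # approximate log-likelihood ratio

# Positive R suggests x -> y; negative R suggests y -> x.
print(R > 0)
```

With sparse sources, the effect variable y is a sum of two independent terms and is therefore closer to Gaussian than the cause x, which is the asymmetry the nonlinear correlation in R picks up.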

Sample of results on MEG (but for a different method)
Black: positive influence; red: negative influence; green: manually drawn grouping. Here a GARCH model was used (Zhang and Hyvärinen, UAI 2010).

Differences between connectivity matrices: motivation
Topics: motivation; PCA of connectivity matrices; problem with the PCA approach; new decomposition after PCA; results on fcMRI
- Often the data consists of a number of connectivity matrices, from different time windows or from different subjects
- Differences or changes in connectivity can be of great interest: dynamic connectivity, intersubject differences
- Contrast with spatial ICA of fMRI, which analyses static connectivity and commonalities over subjects

Basic approach: PCA of connectivity matrices
- Compute a connectivity matrix C_t in many time windows t = 1, ..., T (e.g. sliding windows to analyse dynamic connectivity), or one per subject
- Vectorize each C_t, subtract the mean, and do PCA on those vectors
- Convert each obtained principal component back to a matrix
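A minimal sketch of this pipeline on synthetic matrices (illustrative only; the variable names are mine, not the paper's). The matrices share one symmetric pattern of variation, which PCA on the vectorized matrices recovers:

```python
import numpy as np

rng = np.random.default_rng(3)

# T connectivity matrices (one per subject or time window) over p regions,
# all varying along one shared symmetric pattern, plus symmetric noise.
T, p = 50, 10
pattern = rng.standard_normal((p, p))
pattern = (pattern + pattern.T) / 2
Cs = []
for _ in range(T):
    noise = rng.standard_normal((p, p))
    Cs.append(rng.standard_normal() * pattern + 0.1 * (noise + noise.T) / 2)

# Vectorize each matrix, subtract the mean, and run PCA (via SVD).
V = np.array([C.ravel() for C in Cs])
V = V - V.mean(axis=0)
_, _, Vt = np.linalg.svd(V, full_matrices=False)

# Convert the leading principal component back to a p x p matrix.
K = Vt[0].reshape(p, p)

# Up to sign and scale, it recovers the shared pattern of variation
# (cosine similarity near 1).
cos = abs(np.sum(K * pattern)) / (np.linalg.norm(K) * np.linalg.norm(pattern))
print(cos)
```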

PCA on connectivity matrices (Leonardi et al, 2013)

Our approach: motivation
A principal component K of connectivity patterns is big and complex. How can we visualize and analyse it further? Solution: use linear components in the original data space. The conventional approach takes a matrix principal component K and uses a rank-one approximation K ≈ ww^T (possibly repeated a couple of times). However, this cannot really analyse connectivities between brain areas: it mainly captures connectivity inside one area (cliques). Moreover, in real data the eigenstructure of K is quite unconventional.

New decomposition of PCA matrices
Assume we are given a principal component (PC) of connectivity matrices, K. We analyse the matrix PC by two orthogonal linear components, obtained by optimizing
max w^T K v  subject to  ||w|| = ||v|| = 1, w^T v = 0   (5)
where each of the vectors w and v is a spatial pattern. This is equivalent to approximating K as wv^T + vw^T. The orthogonality of w and v is essential: otherwise w = v, and we get back the classic rank-one approximation by eigenvalue decomposition.

Theorem on approximating a matrix principal component
Consider the optimization problem on the last slide,
max w^T K v  subject to  ||w|| = ||v|| = 1, w^T v = 0   (6)
where K is obtained from PCA of connectivity matrices.
Theorem 1. The solution is
w = (1/√2)(e_max + e_min),   v = (1/√2)(e_max − e_min)   (7)
where e_max and e_min are the eigenvectors corresponding to the largest and smallest eigenvalues of K, respectively.
Important point: the eigenvectors themselves do not give w and v.
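Theorem 1 is easy to verify numerically. The following sketch (my own check, not the paper's code) builds w and v from the extreme eigenvectors of a random symmetric K and confirms that no random feasible pair beats the closed-form objective value (λ_max − λ_min)/2:

```python
import numpy as np

rng = np.random.default_rng(4)

# A symmetric matrix K standing in for a matrix-valued principal component.
p = 8
K = rng.standard_normal((p, p))
K = (K + K.T) / 2

# Eigenvectors for the largest and smallest eigenvalues (eigh sorts ascending).
eigvals, eigvecs = np.linalg.eigh(K)
e_min, e_max = eigvecs[:, 0], eigvecs[:, -1]

# Closed-form maximizer of w^T K v subject to ||w||=||v||=1, w^T v = 0.
w = (e_max + e_min) / np.sqrt(2)
v = (e_max - e_min) / np.sqrt(2)
obj = w @ K @ v                       # equals (lambda_max - lambda_min) / 2

# Compare against many random feasible (unit-norm, orthogonal) pairs.
best = -np.inf
for _ in range(5000):
    a = rng.standard_normal(p)
    b = rng.standard_normal(p)
    b -= (a @ b) / (a @ a) * a        # project b orthogonal to a
    best = max(best, (a / np.linalg.norm(a)) @ K @ (b / np.linalg.norm(b)))

print(obj, best)
```

Note that w and v mix the two extreme eigenvectors: taking e_max and e_min themselves as the pair gives objective e_max^T K e_min = 0, which is why the eigenvectors alone do not solve the problem.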

Experiments on fMRI: methods
- Publicly available data set: 1000 Functional Connectomes
- 103 subjects used; 177 brain regions
- One connectivity matrix for each subject
- Matlab code available; paper in Neural Computation, 2016

Experiments on fMRI: results, PC #1
w: task-positive network (?); v: default-mode network

Experiments on fMRI: results, PC #2
w: default-mode network; v: frontal executive network

Decoding EEG/MEG based on envelopes
Topics: decoding EEG/MEG; our Spectral Decoding Toolbox; example results
- Consider data with a stimulation or a task but no phase-locked ERPs, e.g. watching a movie or practising mindfulness
- Decoding is popular in fMRI and in evoked EEG/MEG, but less widely used for rhythmic activity in EEG/MEG; few methods are available
- For one-second time windows, how can we decode the stimulation/task?
- We developed a method for decoding rhythmic activity (Kauppi et al, NeuroImage, 2013), with a MATLAB package

Principles of our Spectral Decoding Toolbox (SpeDeBox)
- Data can be wide-band (theta + alpha + beta, etc.); the algorithm finds the optimal frequency band(s)
- Decoding is based on envelopes / energies
- We learn two kinds of weights: a spatial pattern and a spectral profile
- We use components given by ICA: this avoids problems with extremely high-dimensional cortical projections, yet works on brain sources instead of sensors
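A toy version of envelope-based decoding (illustrative only; SpeDeBox itself is a MATLAB package, and the real method learns spatial and spectral weights jointly). Here a single 10 Hz band-power feature with a nearest-class-mean decoder separates two conditions that differ only in oscillation amplitude, not in phase:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two conditions differ in the *envelope* of a 10 Hz rhythm, with random
# phase in every trial (no phase-locked ERP). One-second windows.
fs, n_trials = 100, 200
t = np.arange(fs) / fs
labels = np.arange(n_trials) % 2
X = np.empty((n_trials, fs))
for i, lab in enumerate(labels):
    amp = 1.0 + lab                        # condition modulates amplitude only
    phase = rng.uniform(0, 2 * np.pi)
    X[i] = amp * np.sin(2 * np.pi * 10 * t + phase) + rng.standard_normal(fs)

# Spectral feature: log power in the 10 Hz bin of each one-second window
# (1 s window at fs = 100 Hz gives 1 Hz resolution, so bin 10 is 10 Hz).
power = np.abs(np.fft.rfft(X, axis=1)) ** 2
feat = np.log(power[:, 10])

# Nearest-class-mean decoder, trained on the first half of the trials.
half = n_trials // 2
m0 = feat[:half][labels[:half] == 0].mean()
m1 = feat[:half][labels[:half] == 1].mean()
pred = (np.abs(feat[half:] - m1) < np.abs(feat[half:] - m0)).astype(int)
acc = (pred == labels[half:]).mean()
print(acc)
```

Averaging the raw signals across trials would cancel the random-phase oscillation entirely; the envelope (power) feature is what carries the condition information, which is the point of decoding rhythmic rather than evoked activity.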

Example of SpeDeBox in action
Data with four stimulation conditions: visual, auditory, tactile, and rest.

Conclusion
Exploratory analysis by ICA can give information about internal dynamics during rest, and about activity not directly related to stimulation responses when the stimulation is too complex. We presented various methods:
- Finding EEG/MEG sources by different variants of ICA
- Analyzing their effective connectivity; significance (by consistency) should also be analyzed
- Dynamic connectivity from many connectivity matrices, with further analysis going beyond PCA
- An alternative approach: decoding of rhythmic EEG/MEG


More information

Structural equations and divisive normalization for energy-dependent component analysis

Structural equations and divisive normalization for energy-dependent component analysis Structural equations and divisive normalization for energy-dependent component analysis Jun-ichiro Hirayama Dept. of Systems Science Graduate School of of Informatics Kyoto University 611-11 Uji, Kyoto,

More information

Discovery of Linear Acyclic Models Using Independent Component Analysis

Discovery of Linear Acyclic Models Using Independent Component Analysis Created by S.S. in Jan 2008 Discovery of Linear Acyclic Models Using Independent Component Analysis Shohei Shimizu, Patrik Hoyer, Aapo Hyvarinen and Antti Kerminen LiNGAM homepage: http://www.cs.helsinki.fi/group/neuroinf/lingam/

More information

Dimension reduction, PCA & eigenanalysis Based in part on slides from textbook, slides of Susan Holmes. October 3, Statistics 202: Data Mining

Dimension reduction, PCA & eigenanalysis Based in part on slides from textbook, slides of Susan Holmes. October 3, Statistics 202: Data Mining Dimension reduction, PCA & eigenanalysis Based in part on slides from textbook, slides of Susan Holmes October 3, 2012 1 / 1 Combinations of features Given a data matrix X n p with p fairly large, it can

More information

Dynamic Causal Modelling for EEG/MEG: principles J. Daunizeau

Dynamic Causal Modelling for EEG/MEG: principles J. Daunizeau Dynamic Causal Modelling for EEG/MEG: principles J. Daunizeau Motivation, Brain and Behaviour group, ICM, Paris, France Overview 1 DCM: introduction 2 Dynamical systems theory 3 Neural states dynamics

More information

In: W. von der Linden, V. Dose, R. Fischer and R. Preuss (eds.), Maximum Entropy and Bayesian Methods, Munich 1998, Dordrecht. Kluwer, pp

In: W. von der Linden, V. Dose, R. Fischer and R. Preuss (eds.), Maximum Entropy and Bayesian Methods, Munich 1998, Dordrecht. Kluwer, pp In: W. von der Linden, V. Dose, R. Fischer and R. Preuss (eds.), Maximum Entropy and Bayesian Methods, Munich 1998, Dordrecht. Kluwer, pp. 17-6. CONVERGENT BAYESIAN FORMULATIONS OF BLIND SOURCE SEPARATION

More information

A MULTIVARIATE TIME-FREQUENCY BASED PHASE SYNCHRONY MEASURE AND APPLICATIONS TO DYNAMIC BRAIN NETWORK ANALYSIS. Ali Yener Mutlu

A MULTIVARIATE TIME-FREQUENCY BASED PHASE SYNCHRONY MEASURE AND APPLICATIONS TO DYNAMIC BRAIN NETWORK ANALYSIS. Ali Yener Mutlu A MULTIVARIATE TIME-FREQUENCY BASED PHASE SYNCHRONY MEASURE AND APPLICATIONS TO DYNAMIC BRAIN NETWORK ANALYSIS By Ali Yener Mutlu A DISSERTATION Submitted to Michigan State University in partial fulfillment

More information

A MULTIVARIATE MODEL FOR COMPARISON OF TWO DATASETS AND ITS APPLICATION TO FMRI ANALYSIS

A MULTIVARIATE MODEL FOR COMPARISON OF TWO DATASETS AND ITS APPLICATION TO FMRI ANALYSIS A MULTIVARIATE MODEL FOR COMPARISON OF TWO DATASETS AND ITS APPLICATION TO FMRI ANALYSIS Yi-Ou Li and Tülay Adalı University of Maryland Baltimore County Baltimore, MD Vince D. Calhoun The MIND Institute

More information

Machine learning for pervasive systems Classification in high-dimensional spaces

Machine learning for pervasive systems Classification in high-dimensional spaces Machine learning for pervasive systems Classification in high-dimensional spaces Department of Communications and Networking Aalto University, School of Electrical Engineering stephan.sigg@aalto.fi Version

More information

Brain Computer Interface Using Tensor Decompositions and Multi-way Analysis

Brain Computer Interface Using Tensor Decompositions and Multi-way Analysis 1 Brain Computer Interface Using Tensor Decompositions and Multi-way Analysis Andrzej CICHOCKI Laboratory for Advanced Brain Signal Processing http://www.bsp.brain.riken.jp/~cia/. RIKEN, Brain Science

More information

Nonlinear reverse-correlation with synthesized naturalistic noise

Nonlinear reverse-correlation with synthesized naturalistic noise Cognitive Science Online, Vol1, pp1 7, 2003 http://cogsci-onlineucsdedu Nonlinear reverse-correlation with synthesized naturalistic noise Hsin-Hao Yu Department of Cognitive Science University of California

More information

Generalization in high-dimensional factor models

Generalization in high-dimensional factor models Generalization in high-dimensional factor models DTU Informatics Technical University of Denmark Modern massive data = modern massive headache? MMDS Cluster headache: PET functional imaging shows activation

More information

Effective Connectivity & Dynamic Causal Modelling

Effective Connectivity & Dynamic Causal Modelling Effective Connectivity & Dynamic Causal Modelling Hanneke den Ouden Donders Centre for Cognitive Neuroimaging Radboud University Nijmegen Advanced SPM course Zurich, Februari 13-14, 2014 Functional Specialisation

More information

Independent Component Analysis

Independent Component Analysis 1 Independent Component Analysis Background paper: http://www-stat.stanford.edu/ hastie/papers/ica.pdf 2 ICA Problem X = AS where X is a random p-vector representing multivariate input measurements. S

More information

https://goo.gl/kfxweg KYOTO UNIVERSITY Statistical Machine Learning Theory Sparsity Hisashi Kashima kashima@i.kyoto-u.ac.jp DEPARTMENT OF INTELLIGENCE SCIENCE AND TECHNOLOGY 1 KYOTO UNIVERSITY Topics:

More information

A two-layer ICA-like model estimated by Score Matching

A two-layer ICA-like model estimated by Score Matching A two-layer ICA-like model estimated by Score Matching Urs Köster and Aapo Hyvärinen University of Helsinki and Helsinki Institute for Information Technology Abstract. Capturing regularities in high-dimensional

More information

c Springer, Reprinted with permission.

c Springer, Reprinted with permission. Zhijian Yuan and Erkki Oja. A FastICA Algorithm for Non-negative Independent Component Analysis. In Puntonet, Carlos G.; Prieto, Alberto (Eds.), Proceedings of the Fifth International Symposium on Independent

More information

Estimation of linear non-gaussian acyclic models for latent factors

Estimation of linear non-gaussian acyclic models for latent factors Estimation of linear non-gaussian acyclic models for latent factors Shohei Shimizu a Patrik O. Hoyer b Aapo Hyvärinen b,c a The Institute of Scientific and Industrial Research, Osaka University Mihogaoka

More information

Distinguishing Causes from Effects using Nonlinear Acyclic Causal Models

Distinguishing Causes from Effects using Nonlinear Acyclic Causal Models Distinguishing Causes from Effects using Nonlinear Acyclic Causal Models Kun Zhang Dept of Computer Science and HIIT University of Helsinki 14 Helsinki, Finland kun.zhang@cs.helsinki.fi Aapo Hyvärinen

More information

PERFORMANCE STUDY OF CAUSALITY MEASURES

PERFORMANCE STUDY OF CAUSALITY MEASURES PERFORMANCE STUDY OF CAUSALITY MEASURES T. Bořil, P. Sovka Department of Circuit Theory, Faculty of Electrical Engineering, Czech Technical University in Prague Abstract Analysis of dynamic relations in

More information

Machine Learning. Principal Components Analysis. Le Song. CSE6740/CS7641/ISYE6740, Fall 2012

Machine Learning. Principal Components Analysis. Le Song. CSE6740/CS7641/ISYE6740, Fall 2012 Machine Learning CSE6740/CS7641/ISYE6740, Fall 2012 Principal Components Analysis Le Song Lecture 22, Nov 13, 2012 Based on slides from Eric Xing, CMU Reading: Chap 12.1, CB book 1 2 Factor or Component

More information

Simple-Cell-Like Receptive Fields Maximize Temporal Coherence in Natural Video

Simple-Cell-Like Receptive Fields Maximize Temporal Coherence in Natural Video Simple-Cell-Like Receptive Fields Maximize Temporal Coherence in Natural Video Jarmo Hurri and Aapo Hyvärinen Neural Networks Research Centre Helsinki University of Technology P.O.Box 98, 215 HUT, Finland

More information

Machine Learning. B. Unsupervised Learning B.2 Dimensionality Reduction. Lars Schmidt-Thieme, Nicolas Schilling

Machine Learning. B. Unsupervised Learning B.2 Dimensionality Reduction. Lars Schmidt-Thieme, Nicolas Schilling Machine Learning B. Unsupervised Learning B.2 Dimensionality Reduction Lars Schmidt-Thieme, Nicolas Schilling Information Systems and Machine Learning Lab (ISMLL) Institute for Computer Science University

More information

Eigenvalues, Eigenvectors, and an Intro to PCA

Eigenvalues, Eigenvectors, and an Intro to PCA Eigenvalues, Eigenvectors, and an Intro to PCA Eigenvalues, Eigenvectors, and an Intro to PCA Changing Basis We ve talked so far about re-writing our data using a new set of variables, or a new basis.

More information

Dynamic Causal Modelling for EEG and MEG

Dynamic Causal Modelling for EEG and MEG Dynamic Causal Modelling for EEG and MEG Stefan Kiebel Ma Planck Institute for Human Cognitive and Brain Sciences Leipzig, Germany Overview 1 M/EEG analysis 2 Dynamic Causal Modelling Motivation 3 Dynamic

More information

Electroencephalogram Based Causality Graph Analysis in Behavior Tasks of Parkinson s Disease Patients

Electroencephalogram Based Causality Graph Analysis in Behavior Tasks of Parkinson s Disease Patients University of Denver Digital Commons @ DU Electronic Theses and Dissertations Graduate Studies 1-1-2015 Electroencephalogram Based Causality Graph Analysis in Behavior Tasks of Parkinson s Disease Patients

More information

Eigenvalues, Eigenvectors, and an Intro to PCA

Eigenvalues, Eigenvectors, and an Intro to PCA Eigenvalues, Eigenvectors, and an Intro to PCA Eigenvalues, Eigenvectors, and an Intro to PCA Changing Basis We ve talked so far about re-writing our data using a new set of variables, or a new basis.

More information

Lecture 13. Principal Component Analysis. Brett Bernstein. April 25, CDS at NYU. Brett Bernstein (CDS at NYU) Lecture 13 April 25, / 26

Lecture 13. Principal Component Analysis. Brett Bernstein. April 25, CDS at NYU. Brett Bernstein (CDS at NYU) Lecture 13 April 25, / 26 Principal Component Analysis Brett Bernstein CDS at NYU April 25, 2017 Brett Bernstein (CDS at NYU) Lecture 13 April 25, 2017 1 / 26 Initial Question Intro Question Question Let S R n n be symmetric. 1

More information

Structure in Data. A major objective in data analysis is to identify interesting features or structure in the data.

Structure in Data. A major objective in data analysis is to identify interesting features or structure in the data. Structure in Data A major objective in data analysis is to identify interesting features or structure in the data. The graphical methods are very useful in discovering structure. There are basically two

More information

Lecture'12:' SSMs;'Independent'Component'Analysis;' Canonical'Correla;on'Analysis'

Lecture'12:' SSMs;'Independent'Component'Analysis;' Canonical'Correla;on'Analysis' Lecture'12:' SSMs;'Independent'Component'Analysis;' Canonical'Correla;on'Analysis' Lester'Mackey' May'7,'2014' ' Stats'306B:'Unsupervised'Learning' Beyond'linearity'in'state'space'modeling' Credit:'Alex'Simma'

More information

Simple-Cell-Like Receptive Fields Maximize Temporal Coherence in Natural Video

Simple-Cell-Like Receptive Fields Maximize Temporal Coherence in Natural Video LETTER Communicated by Bruno Olshausen Simple-Cell-Like Receptive Fields Maximize Temporal Coherence in Natural Video Jarmo Hurri jarmo.hurri@hut.fi Aapo Hyvärinen aapo.hyvarinen@hut.fi Neural Networks

More information

Background Mathematics (2/2) 1. David Barber

Background Mathematics (2/2) 1. David Barber Background Mathematics (2/2) 1 David Barber University College London Modified by Samson Cheung (sccheung@ieee.org) 1 These slides accompany the book Bayesian Reasoning and Machine Learning. The book and

More information

Probabilistic Latent Semantic Analysis

Probabilistic Latent Semantic Analysis Probabilistic Latent Semantic Analysis Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjin@postech.ac.kr

More information

Independent Component Analysis. PhD Seminar Jörgen Ungh

Independent Component Analysis. PhD Seminar Jörgen Ungh Independent Component Analysis PhD Seminar Jörgen Ungh Agenda Background a motivater Independence ICA vs. PCA Gaussian data ICA theory Examples Background & motivation The cocktail party problem Bla bla

More information

Principal Component Analysis vs. Independent Component Analysis for Damage Detection

Principal Component Analysis vs. Independent Component Analysis for Damage Detection 6th European Workshop on Structural Health Monitoring - Fr..D.4 Principal Component Analysis vs. Independent Component Analysis for Damage Detection D. A. TIBADUIZA, L. E. MUJICA, M. ANAYA, J. RODELLAR

More information

7. Variable extraction and dimensionality reduction

7. Variable extraction and dimensionality reduction 7. Variable extraction and dimensionality reduction The goal of the variable selection in the preceding chapter was to find least useful variables so that it would be possible to reduce the dimensionality

More information

Discovery of non-gaussian linear causal models using ICA

Discovery of non-gaussian linear causal models using ICA Discovery of non-gaussian linear causal models using ICA Shohei Shimizu HIIT Basic Research Unit Dept. of Comp. Science University of Helsinki Finland Aapo Hyvärinen HIIT Basic Research Unit Dept. of Comp.

More information

Principal Component Analysis (PCA)

Principal Component Analysis (PCA) Principal Component Analysis (PCA) Additional reading can be found from non-assessed exercises (week 8) in this course unit teaching page. Textbooks: Sect. 6.3 in [1] and Ch. 12 in [2] Outline Introduction

More information

ARTEFACT DETECTION IN ASTROPHYSICAL IMAGE DATA USING INDEPENDENT COMPONENT ANALYSIS. Maria Funaro, Erkki Oja, and Harri Valpola

ARTEFACT DETECTION IN ASTROPHYSICAL IMAGE DATA USING INDEPENDENT COMPONENT ANALYSIS. Maria Funaro, Erkki Oja, and Harri Valpola ARTEFACT DETECTION IN ASTROPHYSICAL IMAGE DATA USING INDEPENDENT COMPONENT ANALYSIS Maria Funaro, Erkki Oja, and Harri Valpola Neural Networks Research Centre, Helsinki University of Technology P.O.Box

More information

EEG/MEG Inverse Solution Driven by fmri

EEG/MEG Inverse Solution Driven by fmri EEG/MEG Inverse Solution Driven by fmri Yaroslav Halchenko CS @ NJIT 1 Functional Brain Imaging EEG ElectroEncephaloGram MEG MagnetoEncephaloGram fmri Functional Magnetic Resonance Imaging others 2 Functional

More information

wissen leben WWU Münster

wissen leben WWU Münster MÜNSTER Sparsity Constraints in Bayesian Inversion Inverse Days conference in Jyväskylä, Finland. Felix Lucka 19.12.2012 MÜNSTER 2 /41 Sparsity Constraints in Inverse Problems Current trend in high dimensional

More information

Classification. The goal: map from input X to a label Y. Y has a discrete set of possible values. We focused on binary Y (values 0 or 1).

Classification. The goal: map from input X to a label Y. Y has a discrete set of possible values. We focused on binary Y (values 0 or 1). Regression and PCA Classification The goal: map from input X to a label Y. Y has a discrete set of possible values We focused on binary Y (values 0 or 1). But we also discussed larger number of classes

More information

Independent Component Analysis (ICA) Bhaskar D Rao University of California, San Diego

Independent Component Analysis (ICA) Bhaskar D Rao University of California, San Diego Independent Component Analysis (ICA) Bhaskar D Rao University of California, San Diego Email: brao@ucsdedu References 1 Hyvarinen, A, Karhunen, J, & Oja, E (2004) Independent component analysis (Vol 46)

More information

Independent Component Analysis (ICA)

Independent Component Analysis (ICA) Independent Component Analysis (ICA) Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjin@postech.ac.kr

More information

Advanced Introduction to Machine Learning

Advanced Introduction to Machine Learning 10-715 Advanced Introduction to Machine Learning Homework 3 Due Nov 12, 10.30 am Rules 1. Homework is due on the due date at 10.30 am. Please hand over your homework at the beginning of class. Please see

More information

Independent component analysis: an introduction

Independent component analysis: an introduction Research Update 59 Techniques & Applications Independent component analysis: an introduction James V. Stone Independent component analysis (ICA) is a method for automatically identifying the underlying

More information

Methods for quantifying intra- and inter-subject variability of evoked potential data applied to the multifocal visual evoked potential

Methods for quantifying intra- and inter-subject variability of evoked potential data applied to the multifocal visual evoked potential Journal of Neuroscience Methods 165 (2007) 270 286 Methods for quantifying intra- and inter-subject variability of evoked potential data applied to the multifocal visual evoked potential Sangita Dandekar

More information

Network Modeling and Functional Data Methods for Brain Functional Connectivity Studies. Kehui Chen

Network Modeling and Functional Data Methods for Brain Functional Connectivity Studies. Kehui Chen Network Modeling and Functional Data Methods for Brain Functional Connectivity Studies Kehui Chen Department of Statistics, University of Pittsburgh Nonparametric Statistics Workshop, Ann Arbor, Oct 06,

More information

Dimensionality Reduction Techniques (DRT)

Dimensionality Reduction Techniques (DRT) Dimensionality Reduction Techniques (DRT) Introduction: Sometimes we have lot of variables in the data for analysis which create multidimensional matrix. To simplify calculation and to get appropriate,

More information

Machine Learning: Basis and Wavelet 김화평 (CSE ) Medical Image computing lab 서진근교수연구실 Haar DWT in 2 levels

Machine Learning: Basis and Wavelet 김화평 (CSE ) Medical Image computing lab 서진근교수연구실 Haar DWT in 2 levels Machine Learning: Basis and Wavelet 32 157 146 204 + + + + + - + - 김화평 (CSE ) Medical Image computing lab 서진근교수연구실 7 22 38 191 17 83 188 211 71 167 194 207 135 46 40-17 18 42 20 44 31 7 13-32 + + - - +

More information

STA 414/2104: Lecture 8

STA 414/2104: Lecture 8 STA 414/2104: Lecture 8 6-7 March 2017: Continuous Latent Variable Models, Neural networks With thanks to Russ Salakhutdinov, Jimmy Ba and others Outline Continuous latent variable models Background PCA

More information

Dynamic Causal Modelling for fmri

Dynamic Causal Modelling for fmri Dynamic Causal Modelling for fmri André Marreiros Friday 22 nd Oct. 2 SPM fmri course Wellcome Trust Centre for Neuroimaging London Overview Brain connectivity: types & definitions Anatomical connectivity

More information

THE functional role of simple and complex cells has

THE functional role of simple and complex cells has 37 A Novel Temporal Generative Model of Natural Video as an Internal Model in Early Vision Jarmo Hurri and Aapo Hyvärinen Neural Networks Research Centre Helsinki University of Technology P.O.Box 9800,

More information

Machine Learning 11. week

Machine Learning 11. week Machine Learning 11. week Feature Extraction-Selection Dimension reduction PCA LDA 1 Feature Extraction Any problem can be solved by machine learning methods in case of that the system must be appropriately

More information

Principles of DCM. Will Penny. 26th May Principles of DCM. Will Penny. Introduction. Differential Equations. Bayesian Estimation.

Principles of DCM. Will Penny. 26th May Principles of DCM. Will Penny. Introduction. Differential Equations. Bayesian Estimation. 26th May 2011 Dynamic Causal Modelling Dynamic Causal Modelling is a framework studying large scale brain connectivity by fitting differential equation models to brain imaging data. DCMs differ in their

More information