Estimating representational dissimilarity measures
1 Estimating representational dissimilarity measures. Alexander Walther, MRC Cognition and Brain Sciences Unit, University of Cambridge; Institute of Cognitive Neuroscience, University College London
2 Overview: Distance measures for Representational Similarity Analysis; fMRI pattern noise normalization; Crossvalidated dissimilarity measures; Distance measures vs. pattern classifiers
3 What does Representational Similarity Analysis measure? General linear model with true regression coefficients B and true activity patterns U; ordinary least squares gives the estimated regression coefficients $\hat{B} = (X^T X)^{-1} X^T Y$, which feed into the dissimilarity measure.
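The OLS step above can be sketched in a few lines of numpy; the sizes, design, and noise level here are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy GLM: T time points, K conditions, P voxels (all sizes illustrative).
T, K, P = 100, 4, 20
X = rng.normal(size=(T, K))                      # design matrix
B_true = rng.normal(size=(K, P))                 # true activity patterns U (one row per condition)
Y = X @ B_true + 0.1 * rng.normal(size=(T, P))   # measured time series

# Ordinary least squares: B_hat = (X^T X)^{-1} X^T Y,
# solved as a linear system rather than via an explicit inverse.
B_hat = np.linalg.solve(X.T @ X, X.T @ Y)

print(np.allclose(B_hat, B_true, atol=0.1))  # True
```

With little noise and many time points, the estimated patterns closely recover the true ones; dissimilarity measures are then computed on the rows of `B_hat`.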
4 The Euclidean distance in RSA. [Figure: two fMRI activity patterns $u_A$ and $u_B$ plotted as points in voxel space; their pairwise distances fill the representational dissimilarity matrix.] Squared Euclidean distance: $d(A,B)^2 = \sum_p (u_{A,p} - u_{B,p})^2 = (u_A - u_B)(u_A - u_B)^T$
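A minimal numpy sketch of the squared Euclidean distance between two patterns (the voxel values are illustrative only):

```python
import numpy as np

# Two activity patterns over P = 3 voxels (values are illustrative).
u_a = np.array([0.8, 0.5, 0.3])
u_b = np.array([0.3, 0.6, 0.1])

# Squared Euclidean distance: sum of squared voxel-wise differences,
# equivalently the row-vector product (u_a - u_b)(u_a - u_b)^T.
diff = u_a - u_b
d2 = diff @ diff

print(round(d2, 4))  # 0.3
```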
5 Euclidean distance & Pearson correlation distance. [Figure: activity patterns $u_A$ and $u_B$ as vectors in voxel space; the Euclidean distance $d(A,B)$ is the length of their difference vector.]
6 [Figure continued: the same patterns with an altered baseline.]
7 To compute the correlation, each pattern is first mean-centered: $u_A^* = u_A - \bar{u}_A$, $u_B^* = u_B - \bar{u}_B$.
8 After scaling by the pattern standard deviations $\sigma_{u_A}$ and $\sigma_{u_B}$, the Pearson correlation is the cosine of the angle between the centered patterns, $r = \cos\angle(u_A^*, u_B^*)$, and the correlation distance is $d(A,B) = 1 - r$.
9 Euclidean distance & Pearson correlation distance. The Euclidean distance is invariant to baseline shifts applied to both patterns; the correlation distance is invariant to scaling differences between patterns.
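These two invariances can be checked directly; a small sketch with made-up patterns:

```python
import numpy as np

def sq_euclidean(u, v):
    # Squared Euclidean distance between two pattern vectors.
    d = u - v
    return d @ d

def correlation_distance(u, v):
    # Pearson correlation distance: 1 - r.
    return 1.0 - np.corrcoef(u, v)[0, 1]

u_a = np.array([0.8, 0.5, 0.3, 0.9])
u_b = np.array([0.3, 0.6, 0.1, 0.4])

# Euclidean distance is unchanged when the same baseline is added to both patterns...
shift_ok = np.isclose(sq_euclidean(u_a + 5.0, u_b + 5.0), sq_euclidean(u_a, u_b))
# ...but correlation distance is unchanged when one pattern is rescaled
# (mean-centering and dividing by sigma remove offset and gain).
scale_ok = np.isclose(correlation_distance(2.0 * u_a, u_b),
                      correlation_distance(u_a, u_b))

print(shift_ok, scale_ok)  # True True
```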
10 Reliability of dissimilarity measures. [Figure: split-half reliability analysis — the stimulus-by-stimulus RDM is computed separately on two splits of the data and the two RDMs are correlated; bar plots show RDM split-half reliability (Pearson correlation) for SVM, LDA, Euclidean, correlation, LDC, and LDt at high and low SNR.]
11 Overview: Distance measures for Representational Similarity Analysis; fMRI pattern noise normalization; Crossvalidated dissimilarity measures; Distance measures vs. pattern classifiers
12 Univariate & multivariate noise normalization. Starting from the general linear model betas: univariate normalization divides each voxel's beta by its residual standard deviation, $\hat{b}_p^* = \hat{b}_p / \sigma_{\varepsilon_p}$ for $p = 1 \dots P$ voxels; multivariate normalization uses the estimated error covariance $\hat{\Sigma}_\varepsilon = \frac{1}{T}\hat{\varepsilon}^T \hat{\varepsilon}$ to prewhiten the patterns, $\hat{u} = \hat{b}\,\hat{\Sigma}_\varepsilon^{-1/2}$.
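Multivariate noise normalization can be sketched with an eigendecomposition-based inverse matrix square root; the residuals and pattern values here are simulated, not real data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated GLM residuals: T time points x P voxels, with spatially correlated noise.
T, P = 200, 5
E = rng.normal(size=(T, P)) @ rng.normal(size=(P, P))

# Estimated error covariance: Sigma_hat = (1/T) * E^T E.
Sigma = (E.T @ E) / T

# Inverse matrix square root Sigma^{-1/2} via eigendecomposition
# (Sigma is symmetric positive definite here).
vals, vecs = np.linalg.eigh(Sigma)
Sigma_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T

# Multivariate noise normalization (prewhitening) of the beta patterns.
b_hat = rng.normal(size=(4, P))        # 4 simulated condition patterns
u_hat = b_hat @ Sigma_inv_sqrt

# Sanity check: whitening the residuals themselves yields ~identity covariance.
E_white = E @ Sigma_inv_sqrt
print(np.allclose((E_white.T @ E_white) / T, np.eye(P), atol=1e-8))  # True
```

In practice the covariance estimate is usually regularized (shrunk toward a diagonal) because $P$ typically exceeds $T$ in fMRI regions of interest.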
13 Reliability of dissimilarity measures (split-half reliability figure as on slide 10).
14 From raw to noise-normalized dissimilarity measures. From the fMRI beta weights $\hat{b}_A, \hat{b}_B$: squared Euclidean distance $d(A,B)^2 = (\hat{b}_A - \hat{b}_B)(\hat{b}_A - \hat{b}_B)^T$; after multivariate noise normalization $\hat{u} = \hat{b}\,\hat{\Sigma}_\varepsilon^{-1/2}$, the same formula yields the squared Mahalanobis distance $d(A,B)^2 = (\hat{u}_A - \hat{u}_B)(\hat{u}_A - \hat{u}_B)^T$, which fills the representational dissimilarity matrix.
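The equivalence implied above — squared Euclidean distance on prewhitened patterns equals the squared Mahalanobis distance under the noise covariance — can be verified numerically (toy covariance and patterns, all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

P = 4
b_a = rng.normal(size=P)
b_b = rng.normal(size=P)

# A toy noise covariance (symmetric positive definite by construction).
A = rng.normal(size=(P, P))
Sigma = A @ A.T + P * np.eye(P)

# Squared Mahalanobis distance: (b_a - b_b) Sigma^{-1} (b_a - b_b)^T.
diff = b_a - b_b
d2_mahal = diff @ np.linalg.solve(Sigma, diff)

# Equivalent route: prewhiten with Sigma^{-1/2}, then take the squared
# Euclidean distance between the whitened patterns.
vals, vecs = np.linalg.eigh(Sigma)
Sigma_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
u_a, u_b = b_a @ Sigma_inv_sqrt, b_b @ Sigma_inv_sqrt
d2_eucl = (u_a - u_b) @ (u_a - u_b)

print(np.isclose(d2_mahal, d2_eucl))  # True
```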
15 Overview: Distance measures for Representational Similarity Analysis; fMRI pattern noise normalization; Crossvalidated dissimilarity measures; Distance measures vs. pattern classifiers
16 Conventional distances are positively biased
17 Crossvalidated distance estimates. Patterns are estimated independently in run 1 and run 2, $\hat{u}^{(1)} = u + \varepsilon^{(1)}$ and $\hat{u}^{(2)} = u + \varepsilon^{(2)}$ for each condition. The crossvalidated distance multiplies the pattern difference from the training run by the pattern difference from the testing run: $\hat{d}(A,B) = (\hat{u}_A^{(1)} - \hat{u}_B^{(1)})(\hat{u}_A^{(2)} - \hat{u}_B^{(2)})^T$
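A minimal simulation of the two-run crossvalidated distance (pattern sizes and noise level are made up):

```python
import numpy as np

rng = np.random.default_rng(3)

P = 50
u_a = rng.normal(size=P)   # true condition patterns, shared across runs
u_b = rng.normal(size=P)

# Independent noise in each run gives independent pattern estimates.
noise = 1.0
u_a1, u_a2 = u_a + noise * rng.normal(size=P), u_a + noise * rng.normal(size=P)
u_b1, u_b2 = u_b + noise * rng.normal(size=P), u_b + noise * rng.normal(size=P)

# Crossvalidated distance: difference from run 1 (training) times
# difference from run 2 (testing). Because the noise is independent
# across runs, the noise products average to zero in expectation.
d_cv = (u_a1 - u_b1) @ (u_a2 - u_b2)
print(d_cv)
```

Note that a single crossvalidated estimate can be negative; only its expectation equals the true (non-negative) distance.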
18 Conventional distances are positively biased
19 Conventional distances are positively biased; crossvalidation removes the bias. With estimated patterns $\hat{u}^{(m)} = u + \varepsilon^{(m)}$ (training run $m=1$, testing run $m=2$) and noise independent across runs, expand the crossvalidated estimate: $\hat{d}(A,B) = (\hat{u}_A^{(1)} - \hat{u}_B^{(1)})(\hat{u}_A^{(2)} - \hat{u}_B^{(2)})^T = (u_A + \varepsilon_A^{(1)} - u_B - \varepsilon_B^{(1)})(u_A + \varepsilon_A^{(2)} - u_B - \varepsilon_B^{(2)})^T$. Every product term containing a noise factor has zero expectation, because $E[\varepsilon] = 0$ and the noise in run 1 is independent of the noise in run 2. What remains is $E[\hat{d}(A,B)] = (u_A - u_B)(u_A - u_B)^T = d(u_A, u_B)$, so the crossvalidated estimate is unbiased; a conventional within-run distance instead accumulates the noise-by-noise terms and is positively biased.
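The bias and its removal are easy to demonstrate by simulation. Here the two conditions have identical true patterns, so the true distance is zero; the conventional within-run distance is inflated by noise while the crossvalidated estimate averages to zero (all sizes and noise levels are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

# Two conditions with IDENTICAL true patterns: the true distance is 0.
P, n_sim, noise = 20, 2000, 1.0
u = rng.normal(size=P)

conv, cv = [], []
for _ in range(n_sim):
    # Conventional estimate from a single run: squared noise inflates it.
    a1 = u + noise * rng.normal(size=P)
    b1 = u + noise * rng.normal(size=P)
    conv.append((a1 - b1) @ (a1 - b1))

    # Crossvalidated estimate using an independent second run.
    a2 = u + noise * rng.normal(size=P)
    b2 = u + noise * rng.normal(size=P)
    cv.append((a1 - b1) @ (a2 - b2))

print(np.mean(conv))  # ~ 2 * P * noise^2 = 40, far above the true value 0
print(np.mean(cv))    # ~ 0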
20 Crossvalidation schemes. Write the per-run pattern-difference estimate as $\hat{\delta}^{(m)} = \hat{u}_A^{(m)} - \hat{u}_B^{(m)}$; the crossvalidated distance is $\hat{d}(A,B) = \hat{\delta}^{(\text{train})}\,\hat{\delta}^{(\text{test})T}$. Split-half crossvalidation (four runs): average the differences within each half and multiply across halves, $\hat{d}(A,B) = \frac{1}{2}\big(\hat{\delta}^{(1)} + \hat{\delta}^{(2)}\big)\,\frac{1}{2}\big(\hat{\delta}^{(3)} + \hat{\delta}^{(4)}\big)^T$. Leave-one-run-out crossvalidation: each run's difference is tested against the average of the remaining runs and the folds are averaged, $\hat{d}(A,B) = \frac{1}{4}\sum_{m=1}^{4} \hat{\delta}^{(m)} \big(\frac{1}{3}\sum_{n \neq m} \hat{\delta}^{(n)}\big)^T$, which amounts to averaging the products $\hat{\delta}^{(m)}\hat{\delta}^{(n)T}$ over the $6$ distinct run pairs.
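The leave-one-run-out scheme above can be sketched directly; runs, voxel counts, and noise are simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

P, M = 30, 4                     # voxels, runs (toy sizes)
u_a = rng.normal(size=P)
u_b = rng.normal(size=P)

# Per-run pattern-difference estimates delta^(m) = u_a^(m) - u_b^(m),
# each contaminated by that run's independent noise.
deltas = np.stack([(u_a + rng.normal(size=P)) - (u_b + rng.normal(size=P))
                   for _ in range(M)])

# Leave-one-run-out: each run's difference is paired with the mean of
# the remaining runs' differences; the M folds are then averaged.
d_cv = 0.0
for m in range(M):
    train = np.delete(deltas, m, axis=0).mean(axis=0)
    d_cv += deltas[m] @ train
d_cv /= M

print(d_cv)
```

Because every noise term appears in only one factor of each product, the fold products are unbiased, and averaging folds merely reduces variance.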
21 Reliability of dissimilarity measures (split-half reliability figure as on slide 10).
22 From raw to noise-normalized dissimilarity measures. Squared Euclidean distance from the fMRI beta weights: $d(A,B)^2 = (\hat{b}_A - \hat{b}_B)(\hat{b}_A - \hat{b}_B)^T$; multivariate noise normalization $\hat{u} = \hat{b}\,\hat{\Sigma}_\varepsilon^{-1/2}$ gives the squared Mahalanobis distance $d(A,B)^2 = (\hat{u}_A - \hat{u}_B)(\hat{u}_A - \hat{u}_B)^T$; adding crossvalidation gives the crossvalidated Mahalanobis distance estimate (linear discriminant contrast, LDC): $\hat{d}(A,B) = (\hat{u}_A^{(1)} - \hat{u}_B^{(1)})(\hat{u}_A^{(2)} - \hat{u}_B^{(2)})^T$, which fills the representational dissimilarity matrix.
23 From conventional to crossvalidated dissimilarity measures. Applied to all condition pairs of the fMRI beta weights, the crossvalidated Mahalanobis distance estimate (linear discriminant contrast, LDC) is $\hat{d}(i,j) = (\hat{u}_i^{(1)} - \hat{u}_j^{(1)})(\hat{u}_i^{(2)} - \hat{u}_j^{(2)})^T$. [Figure: representational dissimilarity matrix (FFA) over 62 stimuli spanning faces, bodies, and inanimate objects; t test (FDR = .05); dissimilarity in linear discriminant contrast units.] The crossvalidated estimate preserves the ratios between distances.
24 From crossvalidated Mahalanobis to the linear discriminant t. The training-run pattern difference defines discriminant weights $w = \hat{u}_A^{(1)} - \hat{u}_B^{(1)}$; the LDC is then normalized by its standard error, estimated from the error variance of the data projected onto the discriminant, $\sigma_\varepsilon^2 = w\,\hat{\Sigma}_b\,w^T$ and $SE_{LDC} = \sqrt{\sigma_\varepsilon^2\, c\,(X^T X)^{-1} c^T}$ (with contrast vector $c$ and design matrix $X$). This gives the linear discriminant t value (LDt): $\hat{t}(A,B) = (\hat{u}_A^{(1)} - \hat{u}_B^{(1)})(\hat{u}_A^{(2)} - \hat{u}_B^{(2)})^T / SE_{LDC}$, which fills a t-value representational dissimilarity matrix.
25 Reliability of dissimilarity measures (split-half reliability figure as on slide 10).
26 Overview: Distance measures for Representational Similarity Analysis; fMRI pattern noise normalization; Crossvalidated dissimilarity measures; Distance measures vs. pattern classifiers
27 Distances vs. linear classifiers. [Figure: split-half reliability analysis — RDM split-half reliability (Pearson correlation) for Euclidean, correlation, LDC, LDt, SVM, and LDA, all with multivariate noise normalization, at high and low SNR.]
28 Walther et al. (in review). Key insights: (1) The Euclidean and correlation distances measure dissimilarity differently and are invariant to different pattern transformations. (2) Multivariate noise normalization significantly enhances the reliability of dissimilarity measures. (3) Crossvalidated dissimilarity estimates are unbiased and ratio-scale (with an interpretable zero point). (4) Pairwise classification accuracy is a quantized measure of representational distinctness and significantly less reliable than continuous distance measures.
29 Thanks!
More informationGaussian Process Regression with K-means Clustering for Very Short-Term Load Forecasting of Individual Buildings at Stanford
Gaussian Process Regression with K-means Clustering for Very Short-Term Load Forecasting of Individual Buildings at Stanford Carol Hsin Abstract The objective of this project is to return expected electricity
More informationConfidence Estimation Methods for Neural Networks: A Practical Comparison
, 6-8 000, Confidence Estimation Methods for : A Practical Comparison G. Papadopoulos, P.J. Edwards, A.F. Murray Department of Electronics and Electrical Engineering, University of Edinburgh Abstract.
More informationHigh Dimensional Discriminant Analysis
High Dimensional Discriminant Analysis Charles Bouveyron 1,2, Stéphane Girard 1, and Cordelia Schmid 2 1 LMC IMAG, BP 53, Université Grenoble 1, 38041 Grenoble cedex 9 France (e-mail: charles.bouveyron@imag.fr,
More informationECON Program Evaluation, Binary Dependent Variable, Misc.
ECON 351 - Program Evaluation, Binary Dependent Variable, Misc. Maggie Jones () 1 / 17 Readings Chapter 13: Section 13.2 on difference in differences Chapter 7: Section on binary dependent variables Chapter
More informationPart I. Linear regression & LASSO. Linear Regression. Linear Regression. Week 10 Based in part on slides from textbook, slides of Susan Holmes
Week 10 Based in part on slides from textbook, slides of Susan Holmes Part I Linear regression & December 5, 2012 1 / 1 2 / 1 We ve talked mostly about classification, where the outcome categorical. If
More informationPiotr Majer Risk Patterns and Correlated Brain Activities
Alena My²i ková Piotr Majer Song Song Alena Myšičková Peter N. C. Mohr Peter N. C. Mohr Wolfgang K. Härdle Song Song Hauke R. Heekeren Wolfgang K. Härdle Hauke R. Heekeren C.A.S.E. Centre C.A.S.E. for
More informationIncorporating detractors into SVM classification
Incorporating detractors into SVM classification AGH University of Science and Technology 1 2 3 4 5 (SVM) SVM - are a set of supervised learning methods used for classification and regression SVM maximal
More information