1 Brain Computer Interface: Step-Wise Linear Discriminant Analysis and Linear Regression. Instructor: Dr. Ravi Sankar. Students: Kun Li & David Seebran
2 Brain-Computer Interface. Utilize the electrical activity of the brain as the carrier of the communicated signal. BCI methods can all be viewed as methods for providing the user with control over the variance of the EEG. The means by which such control over the variance has been achieved have often focused on controlling the spectral composition of the EEG.
3 Krusienski et al.'s work. Standard P300 scalp locations: Fz, Cz, Pz. Posterior electrodes: PO7, PO8, Oz.
4 Data Collection. The participant sat upright in front of a video monitor, focused attention on a specified letter of the matrix on the display, and silently counted the number of times the target character intensified, until a new character was specified for selection. The rows and columns were intensified for 100 ms with a 75-ms interval. Sequence = a complete cycle of six row and six column intensifications. Epoch = 15 sequences. Session = 36 epochs. Therefore, each session contains (6+6)*15*36 = 6480 stimuli. For each channel used in the analysis, 800-ms segments of data (192 samples) following each intensification were extracted for the offline analysis. The data segments were concatenated by channel for each intensification, creating a single feature vector corresponding to each stimulus.
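The feature-vector construction described above can be sketched in a few lines. This is a minimal illustration only: the 240 Hz sampling rate (so 800 ms spans 192 samples), the 8-channel montage, and the `extract_features` helper are assumptions, not details taken from the slides.

```python
import numpy as np

FS = 240                    # assumed sampling rate in Hz
SEG = int(0.8 * FS)         # 800-ms segment -> 192 samples
N_CHANNELS = 8              # illustrative montage size

def extract_features(eeg, stim_onsets):
    """One feature vector per stimulus: take the 800-ms segment following
    each intensification on every channel, then concatenate by channel."""
    feats = []
    for onset in stim_onsets:
        segment = eeg[:, onset:onset + SEG]       # (channels, SEG)
        feats.append(segment.reshape(-1))         # concatenated by channel
    return np.array(feats)                        # (n_stimuli, channels*SEG)

# Toy usage: random data standing in for a recording.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((N_CHANNELS, 5000))
onsets = np.arange(0, 4000, 42)    # one intensification every 175 ms at 240 Hz
feats = extract_features(eeg, onsets)
print(feats.shape)   # (96, 1536)
```

Each row of `feats` is one of the single-stimulus feature vectors that the classifier operates on.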
5 Linear Discriminant Analysis. The purpose of discriminant analysis is to classify objects (people, customers, things, etc.) into one of two or more groups based on a set of features that describe the objects. Statistically, the classification rule is to assign an object to the group with the highest conditional probability, which minimizes the total error of classification. This is called Bayes' rule.
6 If there are g groups, Bayes' rule is to assign the object to group i where P(i|x) > P(j|x) for all j ≠ i. Practically, P(i|x) is hard to obtain directly, but it can be computed from P(x|i): P(i|x) = P(x|i) P(i) / Σ_j P(x|j) P(j).
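The posterior computation above is a one-liner in practice. A minimal sketch, where the likelihoods and priors are made-up illustrative numbers (not from the slides):

```python
def posteriors(likelihoods, priors):
    """P(i|x) = P(x|i) P(i) / sum_j P(x|j) P(j)."""
    joint = [l * p for l, p in zip(likelihoods, priors)]
    total = sum(joint)
    return [j / total for j in joint]

# Illustrative numbers only: two groups with equal priors.
post = posteriors(likelihoods=[0.2, 0.05], priors=[0.5, 0.5])
print(post)   # group 1 has the larger posterior, so the object is assigned to it
```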
7 If we assume that each group has a multivariate normal distribution and all groups share the same covariance matrix, we get what is called the Linear Discriminant Analysis formula: f_i = μ_i^T C^-1 x_k − (1/2) μ_i^T C^-1 μ_i + ln(P_i). Assign object k to the group i that has maximum f_i.
8 Example. Factory ABC produces very expensive, high-quality chip rings whose quality is measured in terms of two features: curvature and diameter. The result of quality control by experts is given in the table below. Quality control results: Passed, Passed, Passed, Passed, Not Passed, Not Passed, Not Passed.
9 Here comes a new chip ring that has curvature 2.8 and diameter 5.46. Will this one pass the quality control? We let x = the features of all data; each row represents one object and each column stands for one feature. y = the group of each object; each row represents one object and it has only one column.
10 So x = the 7×2 matrix of features (one row per chip ring, one column per feature), and y = the 7×1 vector of group labels.
11 Let x_i = the feature data for group i: x_1 = the features of the rings that passed, x_2 = the features of those that did not. Let μ_i = the mean of the features in group i, which is the average of the rows of x_i, giving μ_1 and μ_2.
12 Let μ = the global mean vector, that is, the mean of the whole data set. Let x_i^o = the mean-corrected data for group i: x_i^o = x_i − μ.
13 The covariance matrix of group i is c_i = (x_i^o)^T x_i^o / n_i. The pooled within-group covariance matrix is C(r,s) = Σ_i (n_i / n) c_i(r,s), where n = Σ_i n_i.
14 P = the prior probability vector, with P_i = n_i / N; then P_1 = 4/7 and P_2 = 3/7. Discriminant function: f_i = μ_i^T C^-1 x_k − (1/2) μ_i^T C^-1 μ_i + ln(P_i).
15 We assign object k to the group i that has maximum f_i. For the chip ring we want to classify (curvature 2.8 and diameter 5.46), f_2 > f_1; therefore it belongs to group 2 and cannot pass the quality control.
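The worked example can be run end to end. Since the table's numeric curvature/diameter values did not survive transcription, the numbers below are illustrative stand-ins for the four-passed/three-failed split, and `lda_scores` is a hypothetical helper, not the original implementation; it follows the formulas on the preceding slides (global-mean correction, pooled covariance, priors n_i/N).

```python
import numpy as np

# Stand-in data: rows are chip rings, columns are (curvature, diameter).
X = np.array([[2.95, 6.63], [2.53, 7.79], [3.57, 5.65], [3.16, 5.47],  # Passed
              [2.58, 4.46], [2.16, 6.22], [3.27, 3.52]])               # Not Passed
y = np.array([1, 1, 1, 1, 2, 2, 2])

def lda_scores(X, y, x_new):
    """Discriminant score f_i of x_new for each group:
    f_i = mu_i^T C^-1 x_k - (1/2) mu_i^T C^-1 mu_i + ln(P_i)."""
    groups = np.unique(y)
    N = len(y)
    Xo = X - X.mean(axis=0)                     # global-mean-corrected data
    n = np.array([np.sum(y == g) for g in groups])
    mu = np.array([X[y == g].mean(axis=0) for g in groups])
    # c_i = (x_i^o)^T x_i^o / n_i, pooled: C = sum_i (n_i / N) c_i
    C = sum((n[i] / N) * (Xo[y == g].T @ Xo[y == g]) / n[i]
            for i, g in enumerate(groups))
    Cinv = np.linalg.inv(C)
    P = n / N                                   # prior probabilities
    return [mu[i] @ Cinv @ x_new - 0.5 * mu[i] @ Cinv @ mu[i] + np.log(P[i])
            for i in range(len(groups))]

f1, f2 = lda_scores(X, y, np.array([2.8, 5.46]))
print("assigned group:", 1 + int(f2 > f1))   # group 2: fails quality control
```

With these stand-in values the new ring lands in group 2, matching the slide's conclusion.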
16 Mathematically, the idea of LDA is to minimize the within-group scatter and maximize the between-group scatter. Three scatter matrices, called the within-class, between-class, and total scatter matrices, are defined as follows: S_w = Σ_{j=1}^{k} Σ_{x ∈ X_j} (x − c^(j))(x − c^(j))^T, S_b = Σ_{j=1}^{k} n_j (c^(j) − c)(c^(j) − c)^T, S_t = Σ_{i=1}^{n} (x_i − c)(x_i − c)^T, where c^(j) is the centroid of the j-th class and c is the global centroid.
17 It follows from the definitions that S_t = S_b + S_w. In the lower-dimensional space resulting from the linear transformation G, the scatter matrices become: S_w^L = G^T S_w G, S_b^L = G^T S_b G, S_t^L = G^T S_t G.
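The identity S_t = S_b + S_w can be checked numerically on a small random two-class data set (the data below are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
X1 = rng.standard_normal((5, 3))          # class 1 samples (rows)
X2 = rng.standard_normal((4, 3)) + 1.0    # class 2 samples, shifted mean
X = np.vstack([X1, X2])
c = X.mean(axis=0)                        # global centroid

# Within-class, between-class, and total scatter, as defined above.
Sw = sum((Xj - Xj.mean(axis=0)).T @ (Xj - Xj.mean(axis=0)) for Xj in (X1, X2))
Sb = sum(len(Xj) * np.outer(Xj.mean(axis=0) - c, Xj.mean(axis=0) - c)
         for Xj in (X1, X2))
St = (X - c).T @ (X - c)

print(np.allclose(St, Sb + Sw))   # True
```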
18 The optimal transformation G_LDA of LDA is computed by solving the following optimization problem (Duda et al., 2000; Fukunaga, 1990): G_LDA = arg max_G { trace( (S_t^L)^-1 S_b^L ) }. When dealing with binary-class problems, the optimal transformation G^F is given by G^F = S_t^+ (c^(1) − c^(2)).
19 Linear Regression. The linear regression model has the following form: f(x) = x^T ω + b. A popular approach for estimating ω and b is least squares, in which the following objective function is minimized: L(ω, b) = ‖X^T ω + b e − y‖_2^2 = Σ_{i=1}^{n} (f(x_i) − y_i)^2, where X = [x_1, x_2, …, x_n] is the data matrix, e is the vector of all ones, and y is the vector of class labels.
20 Assume that both {x_i} and {y_i} are centered. The bias term b then becomes zero, and we look for a linear model by minimizing L(ω) = ‖X^T ω − y‖_2^2. The optimal ω is given by ω = (n_1 n_2 / n^2) S_t^+ (c^(1) − c^(2)) = (n_1 n_2 / n^2) G^F.
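A quick numerical check, on illustrative random data, that the least-squares solution on centered data points in the same direction as G^F = S_t^+ (c^(1) − c^(2)). The check verifies proportionality only (cosine of 1), without committing to any particular scale factor; the ±1 label encoding is an assumption for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2)
n1, n2 = 6, 4
X = np.vstack([rng.standard_normal((n1, 3)),
               rng.standard_normal((n2, 3)) + 2.0])   # rows are samples
y = np.concatenate([np.ones(n1), -np.ones(n2)])       # +1 / -1 class labels

Xc, yc = X - X.mean(axis=0), y - y.mean()             # center {x_i} and {y_i}
w_ls, *_ = np.linalg.lstsq(Xc, yc, rcond=None)        # argmin ||Xc w - yc||^2

c1, c2 = X[:n1].mean(axis=0), X[n1:].mean(axis=0)     # class centroids
St = Xc.T @ Xc                                        # total scatter matrix
GF = np.linalg.pinv(St) @ (c1 - c2)

# Cosine similarity between the two directions.
cos = w_ls @ GF / (np.linalg.norm(w_ls) * np.linalg.norm(GF))
print(round(float(cos), 6))   # 1.0
```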
21 Choosing ω. A combination of forward and backward stepwise regression is implemented. Starting with no initial model terms, the most statistically significant predictor variable, having a p-value < 0.1, is added to the model. After each new entry to the model, a backward stepwise regression is performed to remove the least significant variables, those having p-values > 0.15.
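The forward/backward loop described above can be sketched on synthetic data. This is an illustration, not the authors' implementation: `stepwise` and `coef_pvalues` are hypothetical helpers, the p-values come from ordinary least squares with a normal approximation to the t distribution (fine for large n), and only the 0.1/0.15 thresholds are taken from the text.

```python
import numpy as np
from math import erfc, sqrt

def coef_pvalues(X, y):
    """Two-sided p-values for the OLS coefficients of y ~ X (no intercept),
    using a normal approximation to the t distribution."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (X.shape[0] - X.shape[1])
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return np.array([erfc(abs(t) / sqrt(2.0)) for t in beta / se])

def stepwise(X, y, p_in=0.1, p_out=0.15):
    """Forward/backward stepwise selection: repeatedly add the most
    significant remaining predictor while its p-value is below p_in, then
    drop any selected predictor whose p-value has risen above p_out."""
    selected = []
    while True:
        rest = [j for j in range(X.shape[1]) if j not in selected]
        trial = [(coef_pvalues(X[:, selected + [j]], y)[-1], j) for j in rest]
        if not trial or min(trial)[0] >= p_in:
            break
        selected.append(min(trial)[1])
        while len(selected) > 1:                      # backward pruning
            p = coef_pvalues(X[:, selected], y)
            worst = int(np.argmax(p))
            if p[worst] <= p_out:
                break
            selected.pop(worst)
    return selected

# Toy usage: y truly depends on features 0 and 2 only.
rng = np.random.default_rng(3)
X = rng.standard_normal((200, 6))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.5 * rng.standard_normal(200)
print(sorted(stepwise(X, y)))
```

The informative features are reliably selected; as the next slide notes, noise features can occasionally slip in, which is exactly the multiple-testing problem discussed there.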
22 Problem. According to these selection rules, the more tests we make, the higher the likelihood of falsely rejecting the null hypothesis. Suppose we set a cutoff of p = 0.05. If H_0 (the null hypothesis) is always true, then we would reject it 5% of the time. But if we had two independent tests, we would falsely reject at least one H_0 with probability 1 − (1 − 0.05)^2 = 0.0975, or almost 10% of the time. If we had 20 independent tests, we would falsely reject at least one H_0 with probability 1 − (1 − 0.05)^20 = 0.6415, almost 2/3 of the time.
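The family-wise error numbers quoted above can be reproduced directly:

```python
# Probability of at least one false rejection among m independent tests,
# each run at significance level alpha.
alpha = 0.05
fwe = {m: 1 - (1 - alpha) ** m for m in (1, 2, 20)}
for m, p in fwe.items():
    print(m, round(p, 4))   # 1 -> 0.05, 2 -> 0.0975, 20 -> 0.6415
```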
23 Thank You!!!
24 References
[Farwell and Donchin, 1988] Farwell LA, Donchin E. Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroenceph Clin Neurophysiol 1988;70.
[Fabiani et al., 1987] Fabiani M, Gratton G, Karis D, Donchin E. Definition, identification, and reliability of measurement of the P300 component of the event-related brain potential. Adv Psychophysiol 1987;2.
[Krusienski et al., 2007] Krusienski DJ, et al. Toward enhanced P300 speller performance. J Neurosci Methods (2007).
[BCI, 2003] BCI Competition II Data Set IIb. Results (Blankertz and Curio); (2003) ii/results/.
[BCI, 2005] BCI Competition III Data Set II. Results; (2005) de/projects/bci/competition iii/results/.
[Blankertz et al., 2004] Blankertz B, Müller KR, Curio G, Vaughan TM, Schalk G, Wolpaw JR, et al. The BCI competition 2003: progress and perspectives in detection and discrimination of EEG single trials. IEEE Trans Biomed Eng 2004;51(6).
[Blankertz et al., 2006] Blankertz B, Müller KR, Krusienski DJ, Schalk G, Wolpaw JR, Schlögl A, et al. The BCI competition III: validating alternative approaches to actual BCI problems. IEEE Trans Neural Syst Rehabil Eng 2006;14(2).
[Kaper et al., 2004] Kaper M, Meinicke P, Grossekathoefer U, Lingner T, Ritter H. BCI competition 2003 - data set IIb: support vector machines for the P300 speller paradigm. IEEE Trans Biomed Eng 2004;51.
[Spencer et al., 2001] Spencer KM, Dien J, Donchin E. Spatiotemporal analysis of the late ERP responses to deviant stimuli. Psychophysiology 2001;38.
25 [Vaughan et al., 2003] Vaughan TM, McFarland DJ, Schalk G, Sellers E, Wolpaw JR. Multichannel data from a brain computer interface (BCI) speller using a P300 (i.e., oddball) protocol. Soc Neurosci Abs 2003.
[Donchin et al., 2000] Donchin E, Spencer KM, Wijesinghe R. The mental prosthesis: assessing the speed of a P300-based brain computer interface. IEEE Trans Rehabil Eng 2000;8:174-9.
[Sellers and Donchin, 2006] Sellers EW, Donchin E. A P300-based brain computer interface: initial tests by ALS patients. Clin Neurophysiol 2006;117.
[Bostanov, 2004] Bostanov V. BCI competition 2003 - data sets Ib and IIb: feature extraction from event-related brain potentials with the continuous wavelet transform and the t-value scalogram. IEEE Trans Biomed Eng 2004;51.
[Meinicke et al., 2002] Meinicke P, Kaper M, Hoppe F, Huemann M, Ritter H. Improving transfer rates in brain computer interface: a case study. NIPS 2002.
[Thulasidas et al., 2006] Thulasidas M, Cuntai G, Wu J. Robust classification of EEG signal for brain computer interface. IEEE Trans Neural Syst Rehabil Eng 2006;14(1):24-9.
[Serby et al., 2005] Serby H, Yom-Tov E, Inbar GF. An improved P300-based brain computer interface. IEEE Trans Neural Syst Rehabil Eng 2005;13.
[Fukunaga, 1990] Fukunaga, K. (1990). Introduction to Statistical Pattern Recognition. Academic Press.
[Duda et al., 2000] Duda, R., Hart, P., & Stork, D. (2000). Pattern Classification. Wiley.
[Jieping Ye] Jieping Ye. Least Squares Linear Discriminant Analysis. Department of Computer Science and Engineering, Arizona State University, Tempe, AZ, USA.
[Kardi Teknomo] Kardi Teknomo. Linear Discriminant Analysis: Numerical Example.
More informationINF Introduction to classifiction Anne Solberg Based on Chapter 2 ( ) in Duda and Hart: Pattern Classification
INF 4300 151014 Introduction to classifiction Anne Solberg anne@ifiuiono Based on Chapter 1-6 in Duda and Hart: Pattern Classification 151014 INF 4300 1 Introduction to classification One of the most challenging
More informationMark your answers ON THE EXAM ITSELF. If you are not sure of your answer you may wish to provide a brief explanation.
CS 189 Spring 2015 Introduction to Machine Learning Midterm You have 80 minutes for the exam. The exam is closed book, closed notes except your one-page crib sheet. No calculators or electronic items.
More informationClassification Methods II: Linear and Quadratic Discrimminant Analysis
Classification Methods II: Linear and Quadratic Discrimminant Analysis Rebecca C. Steorts, Duke University STA 325, Chapter 4 ISL Agenda Linear Discrimminant Analysis (LDA) Classification Recall that linear
More informationLearning Linear Detectors
Learning Linear Detectors Instructor - Simon Lucey 16-423 - Designing Computer Vision Apps Today Detection versus Classification Bayes Classifiers Linear Classifiers Examples of Detection 3 Learning: Detection
More informationPattern Classification
Pattern Classification Introduction Parametric classifiers Semi-parametric classifiers Dimensionality reduction Significance testing 6345 Automatic Speech Recognition Semi-Parametric Classifiers 1 Semi-Parametric
More informationLearning with multiple models. Boosting.
CS 2750 Machine Learning Lecture 21 Learning with multiple models. Boosting. Milos Hauskrecht milos@cs.pitt.edu 5329 Sennott Square Learning with multiple models: Approach 2 Approach 2: use multiple models
More informationMachine Learning. Gaussian Mixture Models. Zhiyao Duan & Bryan Pardo, Machine Learning: EECS 349 Fall
Machine Learning Gaussian Mixture Models Zhiyao Duan & Bryan Pardo, Machine Learning: EECS 349 Fall 2012 1 The Generative Model POV We think of the data as being generated from some process. We assume
More informationCourse in Data Science
Course in Data Science About the Course: In this course you will get an introduction to the main tools and ideas which are required for Data Scientist/Business Analyst/Data Analyst. The course gives an
More informationDiscriminative Direction for Kernel Classifiers
Discriminative Direction for Kernel Classifiers Polina Golland Artificial Intelligence Lab Massachusetts Institute of Technology Cambridge, MA 02139 polina@ai.mit.edu Abstract In many scientific and engineering
More informationStatistics 135 Fall 2008 Final Exam
Name: SID: Statistics 135 Fall 2008 Final Exam Show your work. The number of points each question is worth is shown at the beginning of the question. There are 10 problems. 1. [2] The normal equations
More informationStudy on Classification Methods Based on Three Different Learning Criteria. Jae Kyu Suhr
Study on Classification Methods Based on Three Different Learning Criteria Jae Kyu Suhr Contents Introduction Three learning criteria LSE, TER, AUC Methods based on three learning criteria LSE:, ELM TER:
More informationAdaptive changes of rhythmic EEG oscillations in space implications for brain-machine interface applications.
ESA 29 Adaptive changes of rhythmic EEG oscillations in space implications for brain-machine interface applications. G. Cheron, A.M. Cebolla, M. Petieau, A. Bengoetxea, E. Palmero-Soler, A. Leroy, B. Dan
More informationIntroduction to Machine Learning
Introduction to Machine Learning Bayesian Classification Varun Chandola Computer Science & Engineering State University of New York at Buffalo Buffalo, NY, USA chandola@buffalo.edu Chandola@UB CSE 474/574
More informationTanagra Tutorials. The classification is based on a good estimation of the conditional probability P(Y/X). It can be rewritten as follows.
1 Topic Understanding the naive bayes classifier for discrete predictors. The naive bayes approach is a supervised learning method which is based on a simplistic hypothesis: it assumes that the presence
More informationMIDTERM: CS 6375 INSTRUCTOR: VIBHAV GOGATE October,
MIDTERM: CS 6375 INSTRUCTOR: VIBHAV GOGATE October, 23 2013 The exam is closed book. You are allowed a one-page cheat sheet. Answer the questions in the spaces provided on the question sheets. If you run
More informationLinear Classifiers as Pattern Detectors
Intelligent Systems: Reasoning and Recognition James L. Crowley ENSIMAG 2 / MoSIG M1 Second Semester 2014/2015 Lesson 16 8 April 2015 Contents Linear Classifiers as Pattern Detectors Notation...2 Linear
More information