Slice Oriented Tensor Decomposition of EEG Data for Feature Extraction in Space, Frequency and Time Domains

Qibin Zhao, Cesar F. Caiafa, Andrzej Cichocki, and Liqing Zhang

Laboratory for Advanced Brain Signal Processing, Brain Science Institute, RIKEN, Saitama, Japan
Department of Computer Science and Engineering, Shanghai Jiao Tong University, Shanghai, China

Corresponding author: qbzhao@brain.riken.jp. On leave from the Engineering Faculty, University of Buenos Aires, Argentina.

Abstract. In this paper we apply a novel tensor decomposition model, SOD (slice oriented decomposition), to extract slice features from the multichannel time-frequency representation of EEG signals measured during MI (motor imagery) tasks for application to BCI (brain computer interface). The advantages of the SOD-based feature extraction approach lie in its ability to obtain slice matrix components across the space, time and frequency domains, and discriminative features across different classes, without any prior knowledge of the discriminative frequency bands. Furthermore, the combination of horizontal, lateral and frontal slice features makes our method more robust to outliers. The experimental results demonstrate the effectiveness and robustness of our method.

Key words: Tensor decomposition, EEG, BCI

1 Introduction

Tensors (also known as n-way arrays) are used in a variety of applications ranging from neuroscience and psychometrics to chemometrics [1-3]. From a data analysis viewpoint, tensor decomposition is very attractive because it takes into account spatial and temporal correlations between variables more accurately than 2D matrix factorizations, and it usually provides sparse common factors or hidden components with physiological meaning and interpretation. In most applications, especially in neuroscience (EEG, fMRI), the standard PARAFAC and Tucker models have been used [4-6]. Feature extraction for high-dimensional and noisy data plays an important role in machine learning and pattern recognition. In the real world, the extracted features of an object often have specialized structures, and such

structures are in the form of a 2nd- or even higher-order tensor. Recently, multilinear algebra, the algebra of higher-order tensors, has been applied to analyzing the multifactor structure of image ensembles, EEG signals [7], etc. These methods, such as tensor PCA [8], tensor LDA [9, 10] and tensor subspace analysis [11-13], treat the original data as second- or higher-order tensors. For supervised tensor classification [14], tensor factorization can lead to structured dimensionality reduction by learning multiple interrelated subspaces. Discriminant analysis using tensor representation [15] can avoid the curse-of-dimensionality dilemma and overcome the small-sample-size problem.

In most existing tensor decomposition models, high-dimensional tensors are decomposed into many rank-1 vector components on each mode. The PARAFAC model can be seen as a special case of the Tucker model in which the core tensor is reduced to a super-diagonal tensor. Unlike most existing models such as PARAFAC, Tucker and HOSVD, our SOD model represents a 3D tensor by outer products of slice matrices and corresponding vectors on each tensor mode, rather than by rank-1 components. Therefore, the structure of the tensor data associated with its horizontal, lateral and frontal slices can be captured. Based on the SOD model, we developed a feature extraction framework for single-trial EEG classification.

This paper is organized as follows: in Section 2, the SOD model and its main properties are introduced briefly, and the feature extraction framework based on SOD is then proposed; in Section 3, data analysis results on EEG data are presented and discussed; in Section 4, the main conclusions and future perspectives for improvement are presented.

2 Method

2.1 SOD model

In [16], the Slice Oriented Decomposition (SOD) model was recently proposed as a decomposition method for 3-way tensors that captures the structure of data slices while also providing a compact representation. SOD takes into account the interactions among the three modes of a tensor $\mathbf{Y} \in \mathbb{R}^{I \times J \times K}$ by decomposing it as a sum of elemental (simple) tensors:

$$\hat{\mathbf{Y}} = \sum_{p=1}^{P} \mathcal{H}_p + \sum_{q=1}^{Q} \mathcal{L}_q + \sum_{r=1}^{R} \mathcal{F}_r = \sum_{p=1}^{P} H_p \circ_1 u_p + \sum_{q=1}^{Q} L_q \circ_2 v_q + \sum_{r=1}^{R} F_r \circ_3 w_r, \quad (1)$$

where the matrices $H_p$, $L_q$ and $F_r$ are called matrix components, the vectors $u_p$, $v_q$ and $w_r$ are called vector components, $\mathcal{H}_p, \mathcal{L}_q, \mathcal{F}_r \in \mathbb{R}^{I \times J \times K}$, and $\circ_n$ is the $n$-mode outer product ($n = 1$, $2$ or $3$) defined as follows:

$$[\mathcal{H}]_{ijk} = [H \circ_1 u]_{ijk} = h_{jk} u_i, \quad (2)$$
$$[\mathcal{L}]_{ijk} = [L \circ_2 v]_{ijk} = l_{ik} v_j, \quad (3)$$
$$[\mathcal{F}]_{ijk} = [F \circ_3 w]_{ijk} = f_{ij} w_k. \quad (4)$$
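For concreteness, the three n-mode outer products of Eqs. (2)-(4) and the reconstruction of Eq. (1) can be sketched in a few lines of NumPy. This is only an illustrative sketch of the model definition under the shapes stated above; the function names are ours, and the sketch does not implement the fitting algorithm of [16].

```python
import numpy as np

def mode_outer(M, v, mode):
    """n-mode outer product of a slice matrix M with a vector v (Eqs. 2-4).

    mode 1: M is J x K, v has length I  ->  out[i, j, k] = M[j, k] * v[i]
    mode 2: M is I x K, v has length J  ->  out[i, j, k] = M[i, k] * v[j]
    mode 3: M is I x J, v has length K  ->  out[i, j, k] = M[i, j] * v[k]
    """
    if mode == 1:
        return np.einsum('jk,i->ijk', M, v)
    if mode == 2:
        return np.einsum('ik,j->ijk', M, v)
    if mode == 3:
        return np.einsum('ij,k->ijk', M, v)
    raise ValueError("mode must be 1, 2 or 3")

def sod_reconstruct(H, u, L, v, F, w):
    """Rebuild Y_hat from SOD components as in Eq. (1).

    H: list of P (J x K) horizontal slices, u: list of P length-I vectors
    L: list of Q (I x K) lateral slices,    v: list of Q length-J vectors
    F: list of R (I x J) frontal slices,    w: list of R length-K vectors
    """
    Y_hat = sum(mode_outer(Hp, up, 1) for Hp, up in zip(H, u))
    Y_hat = Y_hat + sum(mode_outer(Lq, vq, 2) for Lq, vq in zip(L, v))
    Y_hat = Y_hat + sum(mode_outer(Fr, wr, 3) for Fr, wr in zip(F, w))
    return Y_hat
```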

The effect of the n-mode outer product is to create simple tensors whose slices are scaled versions of a basic matrix. Equation (1) is illustrated in Fig. 1(a), while the compact SOD representation is shown in Fig. 1(b). When the vector and matrix components are constrained to be nonnegative we arrive at the Non-negative SOD (NN-SOD), for which an Alternating Least Squares (ALS) Newton-based algorithm is available [16].

Fig. 1. The Slice Oriented Decomposition (SOD) model. (a) SOD as a sum of simple tensors. (b) Compact representation of SOD: an I × J × K tensor is represented by P horizontal components (J × K, P << I), Q lateral components (I × K, Q << J) and R frontal components (I × J, R << K).

2.2 Feature extraction

To apply SOD for extracting slice features along the horizontal, lateral and frontal directions, a tensor $X$ can be projected onto the slice matrix components $H_p$, $L_q$ and $F_r$ along each tensor mode as:

$$\hat{u}_p = X_{(1)}\,\mathrm{vect}(H_p), \quad \hat{v}_q = X_{(2)}\,\mathrm{vect}(L_q), \quad \hat{w}_r = X_{(3)}\,\mathrm{vect}(F_r). \quad (5)$$

Then the correlation coefficients between $\hat{u}_p, \hat{v}_q, \hat{w}_r$ and $u_p, v_q, w_r$, calculated by $R_{U\hat{U}}$, $R_{V\hat{V}}$ and $R_{W\hat{W}}$, can be combined as:

$$f = [\mathrm{diag}(R_{U\hat{U}});\ \mathrm{diag}(R_{V\hat{V}});\ \mathrm{diag}(R_{W\hat{W}})], \quad (6)$$

where $R_{U\hat{U}}$ denotes the correlation matrix between the matrix $U$, whose columns are $u_p$, $p = 1 \ldots P$, and $\hat{U}$, whose columns are $\hat{u}_p$, $p = 1 \ldots P$.
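The projection and correlation steps of Eqs. (5)-(6) can be sketched as follows. The mode-n unfolding and vectorization conventions, and the helper names (`unfold`, `corr_diag`, `sod_features`), are our assumptions for illustration rather than the authors' code.

```python
import numpy as np

def unfold(X, mode):
    """Mode-n unfolding X_(n) of a 3-way array (mode in {1, 2, 3})."""
    return np.moveaxis(X, mode - 1, 0).reshape(X.shape[mode - 1], -1)

def corr_diag(A, B):
    """diag(R_AB): correlation between matching columns of A and B."""
    A = A - A.mean(axis=0)
    B = B - B.mean(axis=0)
    num = (A * B).sum(axis=0)
    den = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=0)
    return num / den

def sod_features(X, H, u, L, v, F, w):
    """Feature vector f of Eq. (6) for a single-trial tensor X."""
    # Eq. (5): project each unfolding onto the vectorized slice components
    U_hat = np.column_stack([unfold(X, 1) @ Hp.ravel() for Hp in H])
    V_hat = np.column_stack([unfold(X, 2) @ Lq.ravel() for Lq in L])
    W_hat = np.column_stack([unfold(X, 3) @ Fr.ravel() for Fr in F])
    # Eq. (6): correlate with the class-specific vector components
    return np.concatenate([
        corr_diag(np.column_stack(u), U_hat),
        corr_diag(np.column_stack(v), V_hat),
        corr_diag(np.column_stack(w), W_hat),
    ])
```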

Thus, the vector $f$ denotes the similarity between the original $Y$ and $X$ in terms of the linear combination patterns of all slice matrix components. The vector $f$ can therefore also be interpreted as a group of projection coefficients onto all slice-based tensor bases. For classification in the multi-class case, we first obtain a class-specific group of slice components by applying SOD to the c-th class-averaged tensor data $Y_c$, $c = 1 \ldots C$; in this way we obtain one group of slice components for each class. The features of new tensor data calculated by Eq. (6) can then be used to train a linear classifier.

Fig. 2. Averaged 3-way tensors of the space-frequency-time representation of EEG signals during MI tasks, of size 5 × 49 × 1024 (i.e., channels × frequency × time). (a) Left hand class and (b) right hand class.

3 Experiments and Results

In our application, EEG signals from only 5 electrodes (i.e., C3, Cp3, Cz, Cp4, C4) over the motor cortex were recorded from the scalp at a sampling rate of 256 Hz for 2-class MI-based BCI experiments. In the experimental sessions used for the present study, labeled trials of EEG signals were recorded in the following way: the subjects sat in a comfortable chair with their arms lying relaxed on the armrests. Each trial consists of 2 s of relaxation and 4 s of movement imagination (i.e., left hand or right hand) following a visual cue stimulus. The EEG data are transformed from the time domain to the time-frequency domain using a complex Morlet continuous wavelet transform (CWT) with center frequency ω_c = 1 and bandwidth parameter ω_b = 2. The frequency range from 6 Hz to 30 Hz in 0.5 Hz steps is considered in our application. Thus, we obtain the EEG tensor representation $X \in \mathbb{R}^{N_d \times N_f \times N_t}$, a 3-way array of time-varying EEG wavelet coefficients, where $N_d$, $N_f$, $N_t$ are the numbers of channels, frequency bins, and time points respectively. We only consider the time-frequency power features of the EEG trials, hence a squaring operation is performed on $X$ in advance.
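A minimal sketch of how such a channel × frequency × time power tensor could be built from one trial is given below, assuming PyWavelets for the complex Morlet CWT. The wavelet name "cmor2.0-1.0" (i.e., ω_b = 2, ω_c = 1), the function name and the array shapes are assumptions mirroring the description above, not the authors' implementation.

```python
import numpy as np
import pywt

def trial_to_tensor(eeg, fs=256.0, fmin=6.0, fmax=30.0, fstep=0.5,
                    wavelet_name="cmor2.0-1.0"):
    """Build a (channels x frequency bins x time points) power tensor.

    eeg: array of shape (n_channels, n_samples), one motor-imagery trial.
    Returns |CWT|^2 coefficients on a 6-30 Hz grid with 0.5 Hz spacing.
    """
    freqs = np.arange(fmin, fmax + fstep, fstep)           # 49 frequency bins
    wavelet = pywt.ContinuousWavelet(wavelet_name)
    # Convert target frequencies (Hz) into CWT scales for this wavelet.
    scales = pywt.central_frequency(wavelet) * fs / freqs
    tensor = np.empty((eeg.shape[0], freqs.size, eeg.shape[1]))
    for ch in range(eeg.shape[0]):
        coeffs, _ = pywt.cwt(eeg[ch], scales, wavelet, sampling_period=1.0 / fs)
        tensor[ch] = np.abs(coeffs) ** 2                    # time-frequency power
    return tensor

# Example: a 4 s trial from 5 channels at 256 Hz gives a 5 x 49 x 1024 tensor
# X = trial_to_tensor(np.random.randn(5, 1024))
```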

Fig. 3. Results of NN-SOD with P = Q = R = 2 applied to the class-specific EEG tensors of size 5 × 49 × 1024, i.e., 5 channels, 49 frequency bins and 1024 sample points. (a)(c)(e) are slice components for the left class; (b)(d)(f) are slice components for the right class. (a)(b) show the horizontal slice matrices H_p and combination vectors u_p; (c)(d) the lateral slice matrices L_q and combination vectors v_q; (e)(f) the frontal slice matrices F_r and combination vectors w_r.

In order to find the invariant feature structure across all trials, we first preprocess the EEG tensors by averaging over the trials of the same class as

$$\mathbf{Y}_c = \frac{1}{M} \sum_{i \in \text{class } c} X_i,$$

where $M$ is the number of trials of the c-th class. Fig. 2 shows the 3D averaged tensors for each class.

The SOD model with non-negativity constraints was then applied for slice decomposition of each class-specific tensor (i.e., over the space, frequency and time domains). Fig. 3 presents the decomposition results as horizontal, lateral and frontal slices with 2 components on each mode, i.e., P = Q = R = 2. In the horizontal slice components for the left class (Fig. 3(a)), the time-frequency matrix H_1 is mainly concentrated around 10 Hz (µ-rhythm) throughout the whole 4 s duration of a trial. The vector u_1, which represents the spatial distribution of the corresponding slice, shows that the contribution of slice H_1 decreases from channel C3 to C4; this is the ERS phenomenon. Meanwhile, the slice H_2 and the corresponding vector u_2 demonstrate the ERD phenomenon of decreasing µ-rhythm power over the right motor area of the brain. Similarly to the left class, Fig. 3(b) shows the time-frequency slice components and their distribution over the channel domain for the right class: H_1 denotes an interruption of the µ-rhythm over the left and right hemispheres, and H_2 denotes a low β-rhythm in the 16-20 Hz band over the right hemisphere. Therefore, the significance of ERD/ERS for the left hand and the right hand is not the same for a specific subject. Analogously to the horizontal slices, Figs. 3(c) and 3(d) present the space-time lateral slice components and their distribution vectors in the frequency domain, while Figs. 3(e) and 3(f) present the space-frequency frontal slice components and their distribution vectors in the time domain. In Fig. 3(e), F_1 denotes a continuous β-rhythm concentrated over the left hemisphere.

Fig. 4. r²-values of the slice components. The first 6 components are obtained from the left-class tensor, and the last 6 components from the right-class tensor. The order within each group of 6 components is H_1, H_2, L_1, L_2, F_1, F_2.

In order to find the most discriminative slice components for the two mental tasks, the r²-value is calculated for each slice component. Fig. 4 shows the r²-values of the slice components: the first 6 components are for the left class and the last 6 components for the right class.
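As an illustration of how such per-component discriminability scores could be computed, the sketch below uses the squared point-biserial correlation between each feature and the class label, which is a common definition of the r²-value in BCI work; since the paper does not spell out its exact definition, treat this as an assumption.

```python
import numpy as np

def r2_values(features, labels):
    """Squared point-biserial correlation between each feature and the labels.

    features: array of shape (n_trials, n_features), e.g. Eq. (6) vectors.
    labels:   array of shape (n_trials,), with exactly two class labels.
    Returns one r^2 value per feature (slice component).
    """
    classes = np.unique(labels)
    assert classes.size == 2, "r^2 is defined here for two classes"
    x1 = features[labels == classes[0]]
    x2 = features[labels == classes[1]]
    n1, n2 = len(x1), len(x2)
    # Point-biserial correlation coefficient, then squared
    num = np.sqrt(n1 * n2) / (n1 + n2) * (x1.mean(axis=0) - x2.mean(axis=0))
    den = features.std(axis=0)
    return (num / den) ** 2
```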

It can be clearly seen that H_1 of the left class and H_2 of the right class have the most discriminative ability, which indicates that most of the discriminative information between the left and right classes lies in the spatial distribution of the time-frequency slices. This explains why BCI studies often obtain high performance using spatial filters alone, e.g., the CSP method. However, the CSP algorithm is also known for its tendency to overfit, i.e., to learn a non-discriminative brain rhythm whose frequency range overlaps with that of the most discriminative rhythm. Especially when training samples are few, CSP suffers from the outlier problem because of its strong dependence on the distribution properties of the training data. Compared with CSP, our method is more stable with small training sets and more robust to the nonstationarity of EEG signals. To demonstrate this, we trained an SOD model and an SVM classifier on the first experimental run and tested our method on several subsequent runs. Fig. 5 presents the classification performance over the subsequent runs for subject A and the 8 runs for subject B. The results demonstrate that relatively stable performance is obtained by our method compared with CSP. Therefore, the generalization ability of our method appears more suitable for an online BCI system.

Fig. 5. Classification performance on subsequent runs based on the model trained on the first run: (a) subject A and (b) subject B. Each experimental run contains only 2 trials per class, and the duration of the mental task is 4 s per trial.
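Putting the pieces together, the evaluation protocol described above (fit NN-SOD per class on the first run, extract Eq. (6) features, train a linear classifier, test on later runs) could look roughly as follows. Here `fit_nn_sod` is only a placeholder for the ALS Newton-based NN-SOD algorithm of [16], which is not reproduced here; the use of scikit-learn's linear SVM is likewise an assumption, since the paper does not specify its SVM implementation.

```python
import numpy as np
from sklearn.svm import LinearSVC

def class_average(trials, labels, c):
    """Average the space x frequency x time power tensors of class c (Y_c)."""
    return np.mean([X for X, y in zip(trials, labels) if y == c], axis=0)

def extract_features(trials, components):
    """Stack Eq. (6) feature vectors of all trials over all class-specific groups.

    Reuses sod_features(...) from the feature-extraction sketch in Section 2.2.
    components: one (H, u, L, v, F, w) tuple per class.
    """
    return np.array([
        np.concatenate([sod_features(X, *comp) for comp in components])
        for X in trials
    ])

# --- training on the first run (train_trials, train_labels) ----------------
# fit_nn_sod is a hypothetical stand-in returning (H, u, L, v, F, w) lists
# with P = Q = R = 2 components, as used in the experiments above:
# components = [fit_nn_sod(class_average(train_trials, train_labels, c), 2, 2, 2)
#               for c in (0, 1)]
# clf = LinearSVC().fit(extract_features(train_trials, components), train_labels)

# --- testing on a subsequent run (test_trials, test_labels) ----------------
# accuracy = clf.score(extract_features(test_trials, components), test_labels)
```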

4 Conclusions

In this study, we have presented a novel tensor feature extraction framework for EEG classification based on the SOD algorithm. By applying the non-negative SOD, slice features on each tensor mode can be easily obtained. Data analysis of EEG signals from BCI experiments demonstrates the effectiveness of our method. Compared with traditional tensor learning methods, our method is able to extract slice matrices from tensor data over multiple modes simultaneously, hence the space-frequency, space-time and time-frequency structural features can be captured from 3D tensor data. Classification performance over several experimental runs also confirmed the robustness of our method. To further improve the discriminative ability, class information will additionally be considered in the cost function, and semi-supervised feature extraction will be studied as a next step.

References

1. Smilde, A., Bro, R., Geladi, P.: Multi-way Analysis with Applications in the Chemical Sciences. Wiley (2004)
2. Heiler, M., Schnorr, C.: Controlling Sparseness in Non-negative Tensor Factorization. In: Computer Vision - ECCV 2006: 9th European Conference on Computer Vision, Graz, Austria, May 7-13, 2006, Proceedings (2006)
3. Cichocki, A., Zdunek, R., Choi, S., Plemmons, R., Amari, S.: Non-Negative Tensor Factorization using Alpha and Beta Divergences. In: IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2007), vol. 3 (2007)
4. Mørup, M., Hansen, L., Herrmann, C., Parnas, J., Arnfred, S.: Parallel Factor Analysis as an exploratory tool for wavelet transformed event-related EEG. Neuroimage 29(3) (2006)
5. Miwakeichi, F., Martínez-Montes, E., Valdés-Sosa, P., Nishiyama, N., Mizuhara, H., Yamaguchi, Y.: Decomposing EEG data into space-time-frequency components using Parallel Factor Analysis. Neuroimage 22(3) (2004)
6. Cichocki, A., Zdunek, R., Phan, A.H., Amari, S.: Nonnegative Matrix and Tensor Factorizations: Applications to Exploratory Multi-way Data Analysis and Blind Source Separation. Wiley (November 2009)
7. Mørup, M., Hansen, L., Arnfred, S.: ERPWAVELAB - A toolbox for multi-channel analysis of time-frequency transformed event related potentials. Journal of Neuroscience Methods 161(2) (2007)
8. Yang, J., Zhang, D., Frangi, A., Yang, J.Y.: Two-dimensional PCA: A new approach to appearance-based face representation and recognition. IEEE Trans. Pattern Anal. Mach. Intell. 26(1) (2004)
9. Tao, D., Li, X., Wu, X., Maybank, S.: General Tensor Discriminant Analysis and Gabor Features for Gait Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence 29(10) (2007)
10. Ye, J., Janardan, R., Li, Q.: Two-dimensional linear discriminant analysis. NIPS (2004)
11. Wang, X., Tang, X.: A unified framework for subspace face recognition. IEEE Trans. Pattern Analysis and Machine Intelligence 26(9) (2004)
12. Fu, Y., Huang, T.: Image classification using correlation tensor analysis. IEEE Transactions on Image Processing 17(2) (2008)
13. He, X., Cai, D., Niyogi, P.: Tensor subspace analysis. NIPS (2006)
14. Tao, D., Li, X., Hu, W., Maybank, S., Wu, X.: Supervised tensor learning. In: Proceedings of the Fifth IEEE International Conference on Data Mining (ICDM '05) (2007)
15. Yan, S., Xu, D., Yang, Q., Zhang, L., Tang, X., Zhang, H.: Discriminant analysis with tensor representation. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05) (2005)
16. Caiafa, C.F., Cichocki, A.: Slice Oriented Decomposition (SOD): A New Tensor Decomposition for Representation of 3-way Data. Submitted, February 2009
