WWTP diagnosis based on robust principal component analysis


1 WWTP diagnosis based on robust principal component analysis
Y. Tharrault, G. Mourot, J. Ragot, M.-F. Harkat
Centre de Recherche en Automatique de Nancy, Nancy-Université, CNRS, 2, Avenue de la forêt de Haye, Vandœuvre-lès-Nancy Cedex
7th IFAC Symposium on Fault Detection, Supervision and Safety of Technical Processes, July 3, 2009, Barcelona, Spain
Tharrault, Mourot, Ragot, Harkat (CRAN), WWTP diagnosis with robust PCA, SAFEPROCESS 09

2 Outline
1 Principle of principal component analysis
2 Robust PCA
3 Fault detection and isolation
4 Application to the hydraulic part of a wastewater treatment plant

3 Principle of principal component analysis
Data matrix X ∈ R^{N×n} collected in normal process operation.
PCA: maximization of the variance of the projections T = X P, with T ∈ R^{N×n} the principal component matrix and P ∈ R^{n×n} the projection matrix.
Decomposition in eigenvalues/eigenvectors of the covariance matrix:
Σ = (1/(N−1)) X^T X = P Λ P^T, with P P^T = P^T P = I_n
Σ = [P_l P_{n−l}] [Λ_l 0; 0 Λ_{n−l}] [P_l P_{n−l}]^T
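As a minimal sketch (synthetic data, not the authors' code), the eigendecomposition step above can be written with numpy:

```python
# PCA of a data matrix X (N x n) via eigendecomposition of the sample
# covariance, as on the slide. Data here are synthetic for illustration.
import numpy as np

rng = np.random.default_rng(0)
N, n, l = 200, 5, 2                      # N observations, n variables, l components

# Correlated "normal operation" data, centred
X = rng.standard_normal((N, l)) @ rng.standard_normal((l, n)) \
    + 0.05 * rng.standard_normal((N, n))
X -= X.mean(axis=0)

Sigma = X.T @ X / (N - 1)                # covariance matrix
lam, P = np.linalg.eigh(Sigma)           # eigh returns eigenvalues ascending
order = np.argsort(lam)[::-1]            # reorder to descending variance
lam, P = lam[order], P[:, order]

T = X @ P                                # principal component matrix, N x n
P_l, P_res = P[:, :l], P[:, l:]          # principal / residual eigenvectors
```

The covariance of T is exactly Λ, which is the "maximization of the variance of the projections" property.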

9 Principle of principal component analysis: decomposition
Principal part: X̂ = X C_l, with C_l = P_l P_l^T
Residual part: E = X − X̂ = X (I_n − C_l) = X P_{n−l} P_{n−l}^T
(Figure: decomposition of the data into the principal part X̂ and the residual E in the (x_1, x_2) plane.)
Determination of the number of principal components l.
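The principal/residual split above can be sketched as follows, assuming P_l holds the l leading eigenvectors of the covariance of X (illustrative numpy code, not from the paper):

```python
# Principal / residual decomposition from the slide: X = X_hat + E.
import numpy as np

rng = np.random.default_rng(1)
N, n, l = 100, 4, 2
X = rng.standard_normal((N, n))
X -= X.mean(axis=0)

lam, P = np.linalg.eigh(X.T @ X / (N - 1))
P = P[:, np.argsort(lam)[::-1]]          # eigenvectors, descending eigenvalues
P_l = P[:, :l]

C_l = P_l @ P_l.T                        # projector onto the principal subspace
X_hat = X @ C_l                          # principal part
E = X @ (np.eye(n) - C_l)                # residual part, = X P_{n-l} P_{n-l}^T
```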

13 PCA weakness
Sensitive to outliers. (Figure: principal directions u_1, u_2 estimated with and without outliers in the (z_1, z_2) plane.)
Outliers: data different from the normal operating conditions (faulty data, data obtained during shutdown or startup periods, or data issued from a different operating mode).
Goals: robust PCA with respect to outliers; outlier detection and isolation.

14 Outliers
n = 2 variables (x_1, x_2), X = [x_1 x_2], l = 1.
Outliers 1: green observations. Outliers 2: red observations.
(Figure: data in the (x_1, x_2) plane with the principal directions P_1 and P_2.)

16 Robust PCA
Residual: r(k) = ‖P_{n−l}^T (x(k) − μ)‖², with x(k) an observation, μ the mean of the data X, and P_{n−l} the eigenvector matrix of the robust covariance matrix corresponding to its n − l smallest eigenvalues.
Objective function: PCA minimizes the criterion (1/(N−1)) Σ_{k=1}^{N} r(k) with the constraint P_{n−l}^T P_{n−l} = I_{n−l}.
(Figure: objective and weight functions of classical PCA as functions of r(k).)

17 Robust PCA
Residual: r(k) = ‖P_{n−l}^T (x(k) − μ)‖², with the same definitions as before.
The general scale-M estimator minimizes, under the constraint P_{n−l}^T P_{n−l} = I_{n−l}, the objective function
(1/N) Σ_{k=1}^{N} ρ(r(k)/σ̂)
with ρ the objective function and σ̂ the robust scale of the residual r(k), computed by solving
(1/N) Σ_{k=1}^{N} ρ(r(k)/σ̂) = δ.
(Figure: objective function ρ and weight function w as functions of r(k)/σ̂.)
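A hedged sketch of solving the robust-scale equation (1/N) Σ ρ(r(k)/σ̂) = δ by bisection. Tukey's biweight for ρ, the constant c = 1.56 and δ = 0.5 (50% breakdown) are illustrative choices, not values specified on the slide:

```python
# Robust scale sigma-hat solving mean(rho(r/sigma)) = delta.
# Since rho is bounded and increasing in |u|, the left-hand side is
# decreasing in sigma, so geometric bisection converges.
import numpy as np

def rho_tukey(u, c=1.56):
    """Bounded Tukey biweight objective, normalised so rho(inf) = 1."""
    u = np.abs(u)
    out = np.ones_like(u)
    inside = u <= c
    out[inside] = 1.0 - (1.0 - (u[inside] / c) ** 2) ** 3
    return out

def robust_scale(r, delta=0.5, lo=1e-8, hi=1e8, iters=200):
    """Bisection on sigma: mean(rho(r/sigma)) is decreasing in sigma."""
    r = np.asarray(r, dtype=float)
    for _ in range(iters):
        mid = np.sqrt(lo * hi)               # geometric bisection step
        if np.mean(rho_tukey(r / mid)) > delta:
            lo = mid                         # mean too high -> sigma too small
        else:
            hi = mid
    return np.sqrt(lo * hi)

rng = np.random.default_rng(2)
r = np.abs(rng.standard_normal(1000))        # clean residuals
r[:50] = 50.0                                # 5% gross outliers
s = robust_scale(r)                          # stays near the clean scale
```

Because ρ is bounded, the 5% of gross outliers can contribute at most 5% of the left-hand side, so they barely move σ̂.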

18 Robust PCA: initialization with a robust covariance matrix
V = [ Σ_{i=1}^{N−1} Σ_{j=i+1}^{N} w(i,j) (x(i) − x(j)) (x(i) − x(j))^T ] / [ Σ_{i=1}^{N−1} Σ_{j=i+1}^{N} w(i,j) ]
where the weights w(i,j) themselves are defined by:
w(i,j) = exp( −(β/2) (x(i) − x(j))^T Σ^{−1} (x(i) − x(j)) )
with β a tuning parameter. For β = 0, V = 2Σ.
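The weighted pairwise covariance V can be sketched directly from the formula above (synthetic data; the β = 0 case reproduces V = 2Σ exactly):

```python
# Weighted pairwise covariance V from the slide. beta is the tuning
# parameter; the classical covariance Sigma appears inside the weights.
import numpy as np

def robust_cov(X, beta):
    N, n = X.shape
    Sigma_inv = np.linalg.inv(np.cov(X, rowvar=False))
    num = np.zeros((n, n))
    den = 0.0
    for i in range(N - 1):
        d = X[i + 1:] - X[i]                           # all pairs (i, j), j > i
        m = np.einsum('kj,jl,kl->k', d, Sigma_inv, d)  # squared Mahalanobis distances
        w = np.exp(-0.5 * beta * m)                    # pairwise weights
        num += (d * w[:, None]).T @ d
        den += w.sum()
    return num / den

rng = np.random.default_rng(3)
X = rng.standard_normal((300, 3))
V0 = robust_cov(X, beta=0.0)                           # beta = 0 -> all weights 1
```

With β = 0 every weight is 1 and the pairwise sum collapses to twice the classical covariance; increasing β downweights pairs involving outlying observations.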

19 Robust PCA: scale M-estimator
The previous estimator is only robust to outliers having a projection onto the residual space.
The scale M-estimator maximizes the following criterion with the constraint P_l^T P_l = I_l:
(1/N) Σ_{k=1}^{N} ρ( ‖P_l^T (x(k) − μ)‖² / σ̂ )
with σ̂ the robust scale of the residual r and ρ the objective function.

20 Fault detection and isolation: reconstruction
The reconstruction x̂_R minimizes the influence of a fault:
x̂_R(k) = x(k) − Ξ_R f_R
with x(k) an observation, f_R the (unknown) fault magnitude and Ξ_R the matrix of reconstruction directions.
For example, to reconstruct 2 variables (R = {2, 4}) among 5 variables:
Ξ_R = [0 1 0 0 0; 0 0 0 1 0]^T
Estimation of the fault magnitude: f̂_R = argmin_{f_R} D_R(k), with D_R(k) = x̂_R(k)^T P Λ^{−1} P^T x̂_R(k).

23 Fault detection and isolation
The reconstructed vector x̂_R(k) of the vector x(k) is given by:
x̂_R(k) = ( I − Ξ_R (Ξ_R^T P Λ^{−1} P^T Ξ_R)^{−1} Ξ_R^T P Λ^{−1} P^T ) x(k)
Condition of reconstruction: existence of (Ξ_R^T P Λ^{−1} P^T Ξ_R)^{−1}, i.e. the matrix Ξ_R^T P Λ^{−1} P^T Ξ_R must be of full rank.
To reconstruct a fault, it must be projected at least onto the principal space (r ≤ l) or onto the residual space (r ≤ n − l). The number of reconstructed variables r must therefore satisfy
r ≤ max(n − l, l)
with r the number of reconstructed variables, l the number of principal components and n the number of variables.
The maximum number of reconstructions is
Σ_{r=1}^{max(n−l, l) − 1} C_n^r
where C_n^r denotes the number of combinations of r variables among n.
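A numpy sketch of the reconstruction formula and of the reconstruction count above (synthetic data; the fault magnitudes are arbitrary illustrative values):

```python
# Reconstruction of variables R from the full PCA matrices P, Lambda.
# Xi holds the columns of the identity indexed by R.
import numpy as np
from math import comb

rng = np.random.default_rng(4)
N, n, l = 300, 5, 2
X = rng.standard_normal((N, n))
X -= X.mean(axis=0)
lam, P = np.linalg.eigh(X.T @ X / (N - 1))
W = P @ np.diag(1.0 / lam) @ P.T            # P Lambda^{-1} P^T

R = [1, 3]                                  # variables 2 and 4, 0-based indices
Xi = np.eye(n)[:, R]                        # reconstruction directions
G = np.linalg.inv(Xi.T @ W @ Xi)            # must be invertible (full rank)
x = X[0] + Xi @ np.array([5.0, -3.0])       # observation with a fault on R
x_rec = (np.eye(n) - Xi @ G @ Xi.T @ W) @ x # reconstructed observation

def D(v):
    """Indicator D_R = v^T P Lambda^{-1} P^T v."""
    return v @ W @ v

# number of reconstructions to examine: sum_{r=1}^{max(n-l,l)-1} C(n, r)
n_rec = sum(comb(n, r) for r in range(1, max(n - l, l)))
```

The reconstruction operator annihilates anything in the span of Ξ_R, so x_rec is the same whether or not the fault Ξ_R f_R is present, and D is minimized.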

27 Fault detection and isolation
Fault detection indicator: D_R(k) = x̂_R(k)^T P Λ^{−1} P^T x̂_R(k)
Fault detection: a fault is detected if D_R(k) > γ²_α, with γ²_α the detection threshold of the indicator D_R.
Fault isolation: for the faulty observations, the faulty variables R̂ are determined as follows:
R̂ = arg_{R ∈ I} { D_R(k) < γ²_α }
with γ²_α the detection threshold of the indicator D_R and I the set of all combinations of possible reconstruction directions.
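A sketch of the detection/isolation logic above; the numeric value used for the threshold γ²_α is an illustrative chi-square quantile, not the one used in the presentation:

```python
# Detect when the indicator exceeds its threshold, then isolate by searching
# the reconstruction set I for directions that bring it back under threshold.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(5)
N, n, l = 500, 5, 2
X = rng.standard_normal((N, n))
X -= X.mean(axis=0)
lam, P = np.linalg.eigh(X.T @ X / (N - 1))
W = P @ np.diag(1.0 / lam) @ P.T

def D_R(x, R=()):
    """Indicator after reconstructing the variables in R (empty R: raw indicator)."""
    if R:
        Xi = np.eye(n)[:, list(R)]
        G = np.linalg.inv(Xi.T @ W @ Xi)
        x = (np.eye(n) - Xi @ G @ Xi.T @ W) @ x
    return x @ W @ x

gamma2 = 15.1                                # illustrative chi2(5) 99% quantile
x = 10.0 * np.eye(n)[:, 2]                   # faulty observation: fault on variable 3

detected = D_R(x) > gamma2                   # detection test
I = [R for r in range(1, max(n - l, l)) for R in combinations(range(n), r)]
isolated = [R for R in I if D_R(x, R) < gamma2]
```

Reconstructing the truly faulty variable removes the fault entirely, so its indicator drops (here to zero, since the faulty observation sits at the nominal mean).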

30 Fault detection and isolation: reduction of the number of reconstructions
To determine collinear projected directions, a global indicator K is built:
K(R_1, R_2) = max{ d̂(R_1, R_2), d̃(R_1, R_2) }
where R_1 and R_2 are sets of reconstructed variables, d̂(R_1, R_2) is the distance between the two subspaces projected onto the principal space and d̃(R_1, R_2) the distance between the two subspaces projected onto the residual space:
d̂(R_1, R_2) = ‖ Ξ̂_{R_1} (Ξ̂_{R_1}^T Ξ̂_{R_1})^{−1} Ξ̂_{R_1}^T − Ξ̂_{R_2} (Ξ̂_{R_2}^T Ξ̂_{R_2})^{−1} Ξ̂_{R_2}^T ‖_2
d̃(R_1, R_2) = ‖ Ξ̃_{R_1} (Ξ̃_{R_1}^T Ξ̃_{R_1})^{−1} Ξ̃_{R_1}^T − Ξ̃_{R_2} (Ξ̃_{R_2}^T Ξ̃_{R_2})^{−1} Ξ̃_{R_2}^T ‖_2
with Ξ̂_{R_1} = Λ_l^{1/2} P_l^T Ξ_{R_1} and Ξ̃_{R_1} = Λ_{n−l}^{1/2} P_{n−l}^T Ξ_{R_1}.
For example, to reconstruct 2 variables (R_1 = {2, 4}) among 5 variables: Ξ_{R_1} = [0 1 0 0 0; 0 0 0 1 0]^T.
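A sketch of the indicator K(R_1, R_2) as the spectral-norm distance between projectors onto the projected reconstruction subspaces, using the Λ^{1/2} scaling as printed on the slide (synthetic data, single-variable direction sets):

```python
# Indicator K(R1, R2) = max of the subspace distances in the principal
# and residual spaces, each measured as ||proj_1 - proj_2||_2.
import numpy as np

rng = np.random.default_rng(6)
N, n, l = 400, 5, 2
X = rng.standard_normal((N, n))
X -= X.mean(axis=0)
lam, P = np.linalg.eigh(X.T @ X / (N - 1))
order = np.argsort(lam)[::-1]
lam, P = lam[order], P[:, order]
P_l, P_res = P[:, :l], P[:, l:]
L_l, L_res = np.diag(lam[:l] ** 0.5), np.diag(lam[l:] ** 0.5)

def proj(A):
    """Orthogonal projector onto the column space of A."""
    return A @ np.linalg.inv(A.T @ A) @ A.T

def K(R1, R2):
    Xi1, Xi2 = np.eye(n)[:, R1], np.eye(n)[:, R2]
    d_hat = np.linalg.norm(proj(L_l @ P_l.T @ Xi1) - proj(L_l @ P_l.T @ Xi2), 2)
    d_til = np.linalg.norm(proj(L_res @ P_res.T @ Xi1) - proj(L_res @ P_res.T @ Xi2), 2)
    return max(d_hat, d_til)
```

K = 0 means the two direction sets project onto the same subspaces, so only one of them needs to be kept.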

31 Fault detection and isolation
Algorithm for the determination of the detectable and isolable faults:
1. r = 1.
2. Compute the indicator K(R_1, R_2) for all available directions (R_1 ∈ I and R_2 ∈ I).
   If K(R_1, R_2) is equal to zero: only a set of potentially faulty variables may be determined, i.e. the faulty variables are associated with the indices R_1, or R_2, or R_1 and R_2. It is then only required to keep one direction, for example R_1.
   If K(R_1, R_2) is close to zero: the fault magnitude has to be large to ensure fault isolation.
   Otherwise the faults are isolable.
3. r = r + 1. While r < max(l, n − l), go to step 2.

32 Application to the hydraulic part of a wastewater treatment plant
Description of the station (Figure: plant layout with bar screen, sump, pumping station, two biology tanks, overflow and discharge to the Sûre; pumps for sludge recirculation, sludge extraction and surface-sludge extraction):
1 Water level before the bar screen (H1)
2 Command of the cleaning of the bar screens
3 Water level after the bar screen (H2)
4 Water level in the sump (H3)
5 Pumping station (C4)
6 Flow (Q5)
Water level in the overflow (H6)
Construction of the data matrix
Dynamic process: temporal lags. Non-linear process: transformed variables.
x(k) = [ H1(k) H2(k) H3(k) Q5(k) H6(k) tanh((Q5(k−1) − 550)/150) H1(k−1) H6(k−1) C4(k) ]^T   (1)
The data matrix X consists of N observations of the vector x(k).
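Assembling the observation vector x(k) of Eq. (1) can be sketched as follows; the signal arrays here are synthetic stand-ins for the plant measurements:

```python
# Build the 9-variable observation vector x(k) with one-step temporal lags
# and the tanh-transformed flow, then stack observations into X.
import numpy as np

rng = np.random.default_rng(7)
T = 100                                           # number of time steps
H1, H2, H3, Q5, H6, C4 = (rng.standard_normal(T) for _ in range(6))
Q5 = 550 + 150 * Q5                               # flow around the tanh midpoint

def x(k):
    """Observation vector x(k); k >= 1 because of the one-step lags."""
    return np.array([
        H1[k], H2[k], H3[k], Q5[k], H6[k],
        np.tanh((Q5[k - 1] - 550) / 150),         # static non-linearity of the flow
        H1[k - 1], H6[k - 1],                     # one-step temporal lags
        C4[k],
    ])

X = np.vstack([x(k) for k in range(1, T)])        # data matrix, N = T-1 rows
```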

33 Application to the hydraulic part of a wastewater treatment plant
Construction of the model: 4 principal components are determined.
(Figure: measurement and estimation of H1 (cm) over time, comparing the classical and robust estimations and their residuals.)

34 Application to the hydraulic part of a wastewater treatment plant
Analysis of the reconstruction directions: the maximum number of reconstructions is then equal to 255 (max(n − l, l) − 1 = 4).
(Figure: indicator K for r = 1.)
The number of useful reconstructions can be reduced to 202.

35 Application to the hydraulic part of a wastewater treatment plant
Fault detection
Figure: Fault detection with the Mahalanobis distance

36 Application to the hydraulic part of a wastewater treatment plant
Fault isolation

Fault index                        Reconstruction direction under the detection threshold
16, 17                             D_1
3, 10, 11, 14, 15, 19, 20, 21      D_3
7, 12                              D_{1,6}
6, 8, 13, 18, 22                   D_{3,9}
1, 2, 5                            D_{1,3,4}
4                                  D_{3,7,9}
9                                  D_{1,2,3,7}

Table: Summary of fault isolation
Faults 16 and 17 fall under the threshold when the first variable (H1(k)) is reconstructed, so variable H1(k) is faulty.

37 Conclusion
Robust PCA with respect to outliers -> directly applicable to data containing potential faults.
Joint use of the principle of reconstruction and of the projection of the reconstructed data -> outlier detection and isolation.
Reduction of the computational load -> determination of the detectable and isolable faults.
Application to the hydraulic part of a wastewater treatment plant.


More information

Dimension reduction, PCA & eigenanalysis Based in part on slides from textbook, slides of Susan Holmes. October 3, Statistics 202: Data Mining

Dimension reduction, PCA & eigenanalysis Based in part on slides from textbook, slides of Susan Holmes. October 3, Statistics 202: Data Mining Dimension reduction, PCA & eigenanalysis Based in part on slides from textbook, slides of Susan Holmes October 3, 2012 1 / 1 Combinations of features Given a data matrix X n p with p fairly large, it can

More information

AM205: Assignment 2. i=1

AM205: Assignment 2. i=1 AM05: Assignment Question 1 [10 points] (a) [4 points] For p 1, the p-norm for a vector x R n is defined as: ( n ) 1/p x p x i p ( ) i=1 This definition is in fact meaningful for p < 1 as well, although

More information

Principal Components Analysis

Principal Components Analysis Principal Components Analysis Santiago Paternain, Aryan Mokhtari and Alejandro Ribeiro March 29, 2018 At this point we have already seen how the Discrete Fourier Transform and the Discrete Cosine Transform

More information

(a)

(a) Chapter 8 Subspace Methods 8. Introduction Principal Component Analysis (PCA) is applied to the analysis of time series data. In this context we discuss measures of complexity and subspace methods for

More information

Eigenvalues, Eigenvectors, and an Intro to PCA

Eigenvalues, Eigenvectors, and an Intro to PCA Eigenvalues, Eigenvectors, and an Intro to PCA Eigenvalues, Eigenvectors, and an Intro to PCA Changing Basis We ve talked so far about re-writing our data using a new set of variables, or a new basis.

More information

ECE 661: Homework 10 Fall 2014

ECE 661: Homework 10 Fall 2014 ECE 661: Homework 10 Fall 2014 This homework consists of the following two parts: (1) Face recognition with PCA and LDA for dimensionality reduction and the nearest-neighborhood rule for classification;

More information

Kernel Principal Component Analysis

Kernel Principal Component Analysis Kernel Principal Component Analysis Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjin@postech.ac.kr

More information

Analytical redundancy for systems with unknown inputs. Application to faults detection

Analytical redundancy for systems with unknown inputs. Application to faults detection Analytical redundancy for systems with unknown inputs. Application to faults detection José Ragot, Didier Maquin, Frédéric Kratz To cite this version: José Ragot, Didier Maquin, Frédéric Kratz. Analytical

More information

7. Variable extraction and dimensionality reduction

7. Variable extraction and dimensionality reduction 7. Variable extraction and dimensionality reduction The goal of the variable selection in the preceding chapter was to find least useful variables so that it would be possible to reduce the dimensionality

More information

Machine Learning - MT & 14. PCA and MDS

Machine Learning - MT & 14. PCA and MDS Machine Learning - MT 2016 13 & 14. PCA and MDS Varun Kanade University of Oxford November 21 & 23, 2016 Announcements Sheet 4 due this Friday by noon Practical 3 this week (continue next week if necessary)

More information

SPEECH ENHANCEMENT USING PCA AND VARIANCE OF THE RECONSTRUCTION ERROR IN DISTRIBUTED SPEECH RECOGNITION

SPEECH ENHANCEMENT USING PCA AND VARIANCE OF THE RECONSTRUCTION ERROR IN DISTRIBUTED SPEECH RECOGNITION SPEECH ENHANCEMENT USING PCA AND VARIANCE OF THE RECONSTRUCTION ERROR IN DISTRIBUTED SPEECH RECOGNITION Amin Haji Abolhassani 1, Sid-Ahmed Selouani 2, Douglas O Shaughnessy 1 1 INRS-Energie-Matériaux-Télécommunications,

More information

3.1. The probabilistic view of the principal component analysis.

3.1. The probabilistic view of the principal component analysis. 301 Chapter 3 Principal Components and Statistical Factor Models This chapter of introduces the principal component analysis (PCA), briefly reviews statistical factor models PCA is among the most popular

More information

Dimension Reduction and Low-dimensional Embedding

Dimension Reduction and Low-dimensional Embedding Dimension Reduction and Low-dimensional Embedding Ying Wu Electrical Engineering and Computer Science Northwestern University Evanston, IL 60208 http://www.eecs.northwestern.edu/~yingwu 1/26 Dimension

More information

Fault tolerant tracking control for continuous Takagi-Sugeno systems with time varying faults

Fault tolerant tracking control for continuous Takagi-Sugeno systems with time varying faults Fault tolerant tracking control for continuous Takagi-Sugeno systems with time varying faults Tahar Bouarar, Benoît Marx, Didier Maquin, José Ragot Centre de Recherche en Automatique de Nancy (CRAN) Nancy,

More information

Observer design for nonlinear systems described by multiple models

Observer design for nonlinear systems described by multiple models Observer design for nonlinear systems described by multiple models Rodolfo Orjuela, Benoît Marx, José Ragot, Didier Maquin To cite this version: Rodolfo Orjuela, Benoît Marx, José Ragot, Didier Maquin.

More information

Simultaneous state and unknown inputs estimation with PI and PMI observers for Takagi Sugeno model with unmeasurable premise variables

Simultaneous state and unknown inputs estimation with PI and PMI observers for Takagi Sugeno model with unmeasurable premise variables Simultaneous state and unknown inputs estimation with PI and PMI observers for Takagi Sugeno model with unmeasurable premise variables Dalil Ichalal, Benoît Marx, José Ragot and Didier Maquin Centre de

More information

Sparse Principal Component Analysis Formulations And Algorithms

Sparse Principal Component Analysis Formulations And Algorithms Sparse Principal Component Analysis Formulations And Algorithms SLIDE 1 Outline 1 Background What Is Principal Component Analysis (PCA)? What Is Sparse Principal Component Analysis (spca)? 2 The Sparse

More information

Generation of parity equations for singular systems. Application to diagnosis

Generation of parity equations for singular systems. Application to diagnosis Generation of parity equations for singular systems Application to diagnosis Didier Maquin, Besma Gaddouna, José Ragot To cite this version: Didier Maquin, Besma Gaddouna, José Ragot Generation of parity

More information

Principal Component Analysis

Principal Component Analysis B: Chapter 1 HTF: Chapter 1.5 Principal Component Analysis Barnabás Póczos University of Alberta Nov, 009 Contents Motivation PCA algorithms Applications Face recognition Facial expression recognition

More information

Model Based Fault Detection and Diagnosis Using Structured Residual Approach in a Multi-Input Multi-Output System

Model Based Fault Detection and Diagnosis Using Structured Residual Approach in a Multi-Input Multi-Output System SERBIAN JOURNAL OF ELECTRICAL ENGINEERING Vol. 4, No. 2, November 2007, 133-145 Model Based Fault Detection and Diagnosis Using Structured Residual Approach in a Multi-Input Multi-Output System A. Asokan

More information

An application of the GAM-PCA-VAR model to respiratory disease and air pollution data

An application of the GAM-PCA-VAR model to respiratory disease and air pollution data An application of the GAM-PCA-VAR model to respiratory disease and air pollution data Márton Ispány 1 Faculty of Informatics, University of Debrecen Hungary Joint work with Juliana Bottoni de Souza, Valdério

More information

Probabilistic Latent Semantic Analysis

Probabilistic Latent Semantic Analysis Probabilistic Latent Semantic Analysis Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro, Nam-gu, Pohang 37673, Korea seungjin@postech.ac.kr

More information

18.S096 Problem Set 7 Fall 2013 Factor Models Due Date: 11/14/2013. [ ] variance: E[X] =, and Cov[X] = Σ = =

18.S096 Problem Set 7 Fall 2013 Factor Models Due Date: 11/14/2013. [ ] variance: E[X] =, and Cov[X] = Σ = = 18.S096 Problem Set 7 Fall 2013 Factor Models Due Date: 11/14/2013 1. Consider a bivariate random variable: [ ] X X = 1 X 2 with mean and co [ ] variance: [ ] [ α1 Σ 1,1 Σ 1,2 σ 2 ρσ 1 σ E[X] =, and Cov[X]

More information

MULTICHANNEL SIGNAL PROCESSING USING SPATIAL RANK COVARIANCE MATRICES

MULTICHANNEL SIGNAL PROCESSING USING SPATIAL RANK COVARIANCE MATRICES MULTICHANNEL SIGNAL PROCESSING USING SPATIAL RANK COVARIANCE MATRICES S. Visuri 1 H. Oja V. Koivunen 1 1 Signal Processing Lab. Dept. of Statistics Tampere Univ. of Technology University of Jyväskylä P.O.

More information

FAST CROSS-VALIDATION IN ROBUST PCA

FAST CROSS-VALIDATION IN ROBUST PCA COMPSTAT 2004 Symposium c Physica-Verlag/Springer 2004 FAST CROSS-VALIDATION IN ROBUST PCA Sanne Engelen, Mia Hubert Key words: Cross-Validation, Robustness, fast algorithm COMPSTAT 2004 section: Partial

More information

More Linear Algebra. Edps/Soc 584, Psych 594. Carolyn J. Anderson

More Linear Algebra. Edps/Soc 584, Psych 594. Carolyn J. Anderson More Linear Algebra Edps/Soc 584, Psych 594 Carolyn J. Anderson Department of Educational Psychology I L L I N O I S university of illinois at urbana-champaign c Board of Trustees, University of Illinois

More information

Course topics (tentative) The role of random effects

Course topics (tentative) The role of random effects Course topics (tentative) random effects linear mixed models analysis of variance frequentist likelihood-based inference (MLE and REML) prediction Bayesian inference The role of random effects Rasmus Waagepetersen

More information

Second-Order Inference for Gaussian Random Curves

Second-Order Inference for Gaussian Random Curves Second-Order Inference for Gaussian Random Curves With Application to DNA Minicircles Victor Panaretos David Kraus John Maddocks Ecole Polytechnique Fédérale de Lausanne Panaretos, Kraus, Maddocks (EPFL)

More information

Manifold Learning: Theory and Applications to HRI

Manifold Learning: Theory and Applications to HRI Manifold Learning: Theory and Applications to HRI Seungjin Choi Department of Computer Science Pohang University of Science and Technology, Korea seungjin@postech.ac.kr August 19, 2008 1 / 46 Greek Philosopher

More information

TMA4267 Linear Statistical Models V2017 [L3]

TMA4267 Linear Statistical Models V2017 [L3] TMA4267 Linear Statistical Models V2017 [L3] Part 1: Multivariate RVs and normal distribution (L3) Covariance and positive definiteness [H:2.2,2.3,3.3], Principal components [H11.1-11.3] Quiz with Kahoot!

More information

PCA and LDA. Man-Wai MAK

PCA and LDA. Man-Wai MAK PCA and LDA Man-Wai MAK Dept. of Electronic and Information Engineering, The Hong Kong Polytechnic University enmwmak@polyu.edu.hk http://www.eie.polyu.edu.hk/ mwmak References: S.J.D. Prince,Computer

More information

Observability analysis and sensor placement

Observability analysis and sensor placement Observability analysis and sensor placement Didier Maquin, Marie Luong, José Ragot To cite this version: Didier Maquin, Marie Luong, José Ragot. Observability analysis and sensor placement. IFAC/IMACS

More information

A New Robust Partial Least Squares Regression Method

A New Robust Partial Least Squares Regression Method A New Robust Partial Least Squares Regression Method 8th of September, 2005 Universidad Carlos III de Madrid Departamento de Estadistica Motivation Robust PLS Methods PLS Algorithm Computing Robust Variance

More information

GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis. Massimiliano Pontil

GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis. Massimiliano Pontil GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis Massimiliano Pontil 1 Today s plan SVD and principal component analysis (PCA) Connection

More information

Principal Component Analysis for Interval Data

Principal Component Analysis for Interval Data Outline Paula Brito Fac. Economia & LIAAD-INESC TEC, Universidade do Porto ECI 2015 - Buenos Aires T3: Symbolic Data Analysis: Taking Variability in Data into Account Outline Outline 1 Introduction to

More information

Benefits of Using the MEGA Statistical Process Control

Benefits of Using the MEGA Statistical Process Control Eindhoven EURANDOM November 005 Benefits of Using the MEGA Statistical Process Control Santiago Vidal Puig Eindhoven EURANDOM November 005 Statistical Process Control: Univariate Charts USPC, MSPC and

More information

Intelligent Data Analysis. Principal Component Analysis. School of Computer Science University of Birmingham

Intelligent Data Analysis. Principal Component Analysis. School of Computer Science University of Birmingham Intelligent Data Analysis Principal Component Analysis Peter Tiňo School of Computer Science University of Birmingham Discovering low-dimensional spatial layout in higher dimensional spaces - 1-D/3-D example

More information

Principal Component Analysis

Principal Component Analysis Principal Component Analysis Yingyu Liang yliang@cs.wisc.edu Computer Sciences Department University of Wisconsin, Madison [based on slides from Nina Balcan] slide 1 Goals for the lecture you should understand

More information

Application of Adaptive Thresholds in Robust Fault Detection of an Electro- Mechanical Single-Wheel Steering Actuator

Application of Adaptive Thresholds in Robust Fault Detection of an Electro- Mechanical Single-Wheel Steering Actuator Preprints of the 8th IFAC Symposium on Fault Detection, Supervision and Safety of Technical Processes (SAFEPROCESS) August 29-31, 212. Mexico City, Mexico Application of Adaptive Thresholds in Robust Fault

More information

Machine Learning. B. Unsupervised Learning B.2 Dimensionality Reduction. Lars Schmidt-Thieme, Nicolas Schilling

Machine Learning. B. Unsupervised Learning B.2 Dimensionality Reduction. Lars Schmidt-Thieme, Nicolas Schilling Machine Learning B. Unsupervised Learning B.2 Dimensionality Reduction Lars Schmidt-Thieme, Nicolas Schilling Information Systems and Machine Learning Lab (ISMLL) Institute for Computer Science University

More information

LEC 2: Principal Component Analysis (PCA) A First Dimensionality Reduction Approach

LEC 2: Principal Component Analysis (PCA) A First Dimensionality Reduction Approach LEC 2: Principal Component Analysis (PCA) A First Dimensionality Reduction Approach Dr. Guangliang Chen February 9, 2016 Outline Introduction Review of linear algebra Matrix SVD PCA Motivation The digits

More information

Statistical Data Analysis

Statistical Data Analysis DS-GA 0 Lecture notes 8 Fall 016 1 Descriptive statistics Statistical Data Analysis In this section we consider the problem of analyzing a set of data. We describe several techniques for visualizing the

More information

Principal Component Analysis (PCA)

Principal Component Analysis (PCA) Principal Component Analysis (PCA) Additional reading can be found from non-assessed exercises (week 8) in this course unit teaching page. Textbooks: Sect. 6.3 in [1] and Ch. 12 in [2] Outline Introduction

More information

Preprocessing & dimensionality reduction

Preprocessing & dimensionality reduction Introduction to Data Mining Preprocessing & dimensionality reduction CPSC/AMTH 445a/545a Guy Wolf guy.wolf@yale.edu Yale University Fall 2016 CPSC 445 (Guy Wolf) Dimensionality reduction Yale - Fall 2016

More information

Temporal-Difference Q-learning in Active Fault Diagnosis

Temporal-Difference Q-learning in Active Fault Diagnosis Temporal-Difference Q-learning in Active Fault Diagnosis Jan Škach 1 Ivo Punčochář 1 Frank L. Lewis 2 1 Identification and Decision Making Research Group (IDM) European Centre of Excellence - NTIS University

More information

Basics of Multivariate Modelling and Data Analysis

Basics of Multivariate Modelling and Data Analysis Basics of Multivariate Modelling and Data Analysis Kurt-Erik Häggblom 6. Principal component analysis (PCA) 6.1 Overview 6.2 Essentials of PCA 6.3 Numerical calculation of PCs 6.4 Effects of data preprocessing

More information

EE613 Machine Learning for Engineers. Kernel methods Support Vector Machines. jean-marc odobez 2015

EE613 Machine Learning for Engineers. Kernel methods Support Vector Machines. jean-marc odobez 2015 EE613 Machine Learning for Engineers Kernel methods Support Vector Machines jean-marc odobez 2015 overview Kernel methods introductions and main elements defining kernels Kernelization of k-nn, K-Means,

More information

LECTURE NOTE #11 PROF. ALAN YUILLE

LECTURE NOTE #11 PROF. ALAN YUILLE LECTURE NOTE #11 PROF. ALAN YUILLE 1. NonLinear Dimension Reduction Spectral Methods. The basic idea is to assume that the data lies on a manifold/surface in D-dimensional space, see figure (1) Perform

More information

Principal Components Analysis (PCA)

Principal Components Analysis (PCA) Principal Components Analysis (PCA) Principal Components Analysis (PCA) a technique for finding patterns in data of high dimension Outline:. Eigenvectors and eigenvalues. PCA: a) Getting the data b) Centering

More information