Concurrent Canonical Correlation Analysis Modeling for Quality-Relevant Monitoring
Preprint, 11th IFAC Symposium on Dynamics and Control of Process Systems, including Biosystems, June 6-8, 2016. NTNU, Trondheim, Norway

Concurrent Canonical Correlation Analysis Modeling for Quality-Relevant Monitoring

Qinqin Zhu, Qiang Liu, S. Joe Qin

Mork Family Department of Chemical Engineering and Materials Science, University of Southern California, CA 90089, USA
State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University, Shenyang, Liaoning 110819, PR China
School of Science and Engineering, the Chinese University of Hong Kong (Shenzhen), Shenzhen, Guangdong, China

Abstract: Canonical correlation analysis (CCA) is a well-known data analysis technique that extracts the multidimensional correlation structure between two groups of variables. Because of the advantages of CCA for quality prediction, CCA-based modeling and monitoring are discussed in this paper. To overcome the shortcoming of CCA that it focuses on correlation but ignores variance information, a new concurrent CCA (CCCA) modeling method is proposed that completely decomposes the input and output spaces into five subspaces. CCCA retains the efficiency of CCA in predicting the output while exploiting the variance structure for process monitoring, using subsequent principal component decompositions in the input and output spaces, respectively. The corresponding monitoring statistics and control limits are then developed in these subspaces. The Tennessee Eastman process is used to demonstrate the effectiveness of CCCA-based monitoring methods.

Keywords: Concurrent Canonical Correlation Analysis (CCCA), Quality-Relevant Monitoring

1. INTRODUCTION

Multivariate statistical process monitoring based on process variables and quality variables has been widely used in industrial processes, including chemicals, polymers, microelectronics manufacturing and pharmaceutical processes (Nomikos and MacGregor (1995); Wise and Gallagher (1996); Qin (2003); Chiang et al. (2000)).
Among them, principal component analysis (PCA), partial least squares (PLS) and canonical correlation analysis (CCA) are three basic multivariate statistical methods. When the quality measurements are expensive or difficult to obtain, PCA is widely used for process monitoring on account of its ability to handle high-dimensional, noisy, and highly correlated data by projecting the data onto a lower-dimensional subspace that contains most of the variance of the original data (Wise and Gallagher (1996)). However, PCA-based monitoring methods are only effective for monitoring variations in the process variables (X), and no information from the quality variables (Y) is extracted. In practical industrial processes, variations in the process variables may be compensated by feedback controllers and thus have no influence on the quality variables. Monitoring the process variables alone therefore increases the false alarm rate, making the models unreliable.

(This work was supported in part by the Natural Science Foundation of China under Grant , Grant , Grant 61573, the China Postdoctoral Science Foundation Grant 13M5414, the International Postdoctoral Exchange Fellowship Program under Grant 13, and the Fundamental Research Funds for the Central Universities (N13181).)

In order to include the information of the quality variables, PLS-based and CCA-based modeling methods are employed to find the covariances or correlations between process variables and quality variables (MacGregor et al. (1994); Khan et al. (2008)). PLS is a data decomposition method that maximizes the covariance between X and Y, and it has been studied intensively. Li et al. (2010) studied the effect of the quality variables on the X-space decomposition and the geometric properties of the PLS structure. In standard PLS methods, the original space is decomposed into two subspaces, which are monitored by the T² and Q statistics (Qin (2003)), respectively.
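To make the two monitored PLS subspaces concrete, the decomposition X = T P' + E can be computed with a NIPALS-style iteration. This is an illustrative sketch, not the authors' code; the function name and variable names are ours:

```python
import numpy as np

def pls_nipals(X, Y, A, tol=1e-10, max_iter=500):
    """Sketch of NIPALS PLS: returns scores T, loadings P and weights W
    for the decomposition X = T P' + E monitored by T^2 (scores) and Q (residual E)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    T, P, W = [], [], []
    for _ in range(A):
        u = Y[:, [0]]                       # initialize with a quality column
        for _ in range(max_iter):
            w = X.T @ u; w /= np.linalg.norm(w)
            t = X @ w
            q = Y.T @ t; q /= np.linalg.norm(q)
            u_new = Y @ q
            if np.linalg.norm(u_new - u) < tol:
                break
            u = u_new
        p = X.T @ t / (t.T @ t)
        X = X - t @ p.T                     # deflate X by the extracted component
        Y = Y - t @ (Y.T @ t / (t.T @ t)).T # deflate Y as well
        T.append(t); P.append(p); W.append(w)
    return np.hstack(T), np.hstack(P), np.hstack(W)
```

Because of the deflation step, successive score vectors are mutually orthogonal, which is what allows T² and Q to be assigned to complementary subspaces.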
Although this scheme works well in some cases, it has several problems. First, the PLS components that form T² may contain variations orthogonal to Y, which are useless for predicting Y. Moreover, PLS extracts components based on a maximum covariance criterion, so the residuals of X are not necessarily small; it is therefore inappropriate to monitor the residual space with the Q statistic. To improve the monitoring performance, Zhou et al. (2010) proposed total PLS (T-PLS), which performs a further decomposition and divides the X-space into four parts. The T-PLS-based monitoring, however, decomposes the X-space unnecessarily into four parts, which can be concisely divided into input-relevant and output-relevant
parts instead. Qin and Zheng (2013) put forward concurrent PLS (CPLS) to overcome the drawbacks of T-PLS and monitor input-relevant and output-relevant faults separately. CPLS reduces the false alarm rates and improves the monitoring performance. PLS-related methods are variants of affine transformations of the process and quality variables, and are efficient for quality prediction since they simultaneously exploit the input structure while predicting the output. CCA, in contrast, focuses only on extracting the multidimensional correlation structure between X and Y, which enables it to build a more efficient prediction. CCA is widely used in signal processing, computer vision and behavioral studies (Hardoon et al. (2004); Sherry and Henson (2005)). CCA-based monitoring, however, has not been studied, due to its lack of attention to the variance structure in the data. In this paper, we discuss the advantages of CCA over PLS for quality-relevant prediction and the advantages of PLS over CCA for variable structure modeling and monitoring. We then propose a concurrent canonical correlation analysis (CCCA) algorithm for quality-relevant fault detection. CCCA decomposes the original data space into five subspaces: 1) the correlation subspace (CRS), formed by the canonical variates that are directly relevant to the predictable variations of the output; 2) the output-principal subspace (OPS), the principal subspace of the unpredictable part of the output; 3) the output-residual subspace (ORS), the residual subspace of the unpredictable part of the output; 4) the input-principal subspace (IPS), the principal subspace of the input that is irrelevant to the prediction of the output; and 5) the input-residual subspace (IRS), the residual subspace of the input, which is useless for predicting the output. The corresponding fault indices and control limits for the five subspaces are also developed.
Moreover, some significant properties of CCCA are analyzed. The CCCA modeling and monitoring approach is inspired by the CPLS approach, while the efficiency of CCA in input-output prediction is incorporated in CCCA modeling. The remainder of this paper is organized as follows. Fault detection based on the CCA model is first presented in Section 2, together with a comparison between CCA and PLS. In Section 3, the concurrent CCA modeling algorithm and the corresponding fault monitoring scheme are developed. The Tennessee Eastman process is employed to illustrate the effectiveness of CCCA against PLS, CCA and CPLS in Section 4. Finally, conclusions are presented in the last section.

2. CCA FOR QUALITY-RELEVANT MONITORING

2.1 CCA Modeling

Hotelling (1936) studied the extension of PCA to extract the multidimensional correlation structure between two sets of variables, and developed CCA. Given an input matrix X ∈ R^{n×m} consisting of n samples of m process variables, and an output matrix Y ∈ R^{n×p} with p quality variables, CCA projects X and Y onto a lower-dimensional space defined by a small number of latent variables (t_1, ..., t_A), where A is the number of CCA factors. The mean-centered X and Y are decomposed as

    X = T P' + E
    Y = T Q' + F                                (1)

where T = [t_1, ..., t_A] contains the canonical variates for X, and P = [p_1, ..., p_A] and Q = [q_1, ..., q_A] are the loading matrices for X and Y, respectively. A brief algorithm for CCA is given in Appendix A. In this algorithm, the canonical variates T and U are calculated through the weighting matrices R and C, respectively,

    T = X R
    U = Y C                                     (2)

where U = [u_1, ..., u_A] contains the canonical variates for Y, R = [r_1, ..., r_A] and C = [c_1, ..., c_A].

2.2 Comparison of CCA and PLS for Process Modeling

Both CCA and PLS are used to extract the relation between process and quality variables.
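As a reference point for the comparison, the CCA algorithm of Appendix A (whitening both blocks and taking an SVD of the whitened cross-covariance) can be sketched in NumPy. This is an illustrative sketch under the assumption of full-rank data, not the authors' implementation:

```python
import numpy as np

def cca(X, Y, A):
    """Sketch of the Appendix A CCA algorithm: X is n x m, Y is n x p, A factors."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    # Inverse square-root factors of X'X and Y'Y via eigendecomposition
    dx, Vx = np.linalg.eigh(X.T @ X)
    dy, Vy = np.linalg.eigh(Y.T @ Y)
    Sxx_ih = Vx @ np.diag(dx ** -0.5) @ Vx.T
    Syy_ih = Vy @ np.diag(dy ** -0.5) @ Vy.T
    # SVD of the whitened cross-covariance gives the weighting matrices
    Uz, sz, VzT = np.linalg.svd(Sxx_ih @ X.T @ Y @ Syy_ih, full_matrices=False)
    R = (Sxx_ih @ Uz)[:, :A]
    C = (Syy_ih @ VzT.T)[:, :A]
    return X @ R, Y @ C, R, C        # canonical variates T, U and weights, Eq. (2)
```

With this normalization T'T = I, and the singular values sz are the canonical correlations between the paired variates.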
However, CCA maximizes the correlation between linear combinations of the process and quality variables, while PLS extracts the covariance between the two sets of variables. PLS has been widely studied and used in multivariate statistical quality control. The objective function of PLS is

    max_{t_PLS, u_PLS} J = t_PLS' u_PLS                       (3)

where t_PLS and u_PLS are the score vectors for X and Y in the PLS model. From the above equation we can see that the scaling of the variables affects the PLS solution, since it is based on a maximum covariance criterion. Apart from extracting the components that reflect the relation between input and output variables, PLS also considers the effects of the input structure. Thus, PLS exploits the variable structure well. In practical industrial processes, the input and output data are sampled from a set of different sensors, and the variance of the values from a given sensor may be affected by noise and is unrelated to the importance of the received information. There may be one pair of directions in the two spaces that has a high covariance due to large variable magnitudes but a high noise level, while another pair of directions has an almost perfect correlation but a small variable magnitude and therefore a low covariance. In such cases PLS is not appropriate for quality prediction and monitoring. In contrast, CCA maximizes the correlation between X and Y,

    max_{t_CCA, u_CCA} J = t_CCA' u_CCA / (||t_CCA|| ||u_CCA||)          (4)

where t_CCA and u_CCA are the score vectors for X and Y in the CCA model. The correlation between process and quality variables is invariant to the variable magnitudes (Borga et al. (1997)). Therefore, the magnitudes of the sensor variances have no impact on the result of CCA, and the components extracted by CCA yield a more efficient prediction.

2.3 CCA-based Monitoring

To perform process monitoring on a new data sample x and y, CCA projects x onto the input data space as x = x̂ +
x̃, where

    x̂ = P R' x ∈ Span{P}
    x̃ = (I − P R') x ∈ Span{R}^⊥

Similar to PCA and standard PLS (Qin (2003)), the T² and Q statistics are used to monitor the variations in the principal and residual subspaces of CCA. Under the assumption that the latent vectors are normally distributed with zero mean, the monitoring statistics and control limits of T² and Q are defined as

    T² = t' Λ^{-1} t ≤ [A(n² − 1) / (n(n − A))] F_{A, n−A, α}          (5)
    Q = ||x̃||² ≤ g χ²_{h}                                             (6)

where t = R'x and Λ = (1/(n−1)) diag{λ_1, λ_2, ..., λ_A}. F_{A, n−A, α} is the F-distribution with A and n−A degrees of freedom, where the confidence level (1−α)×100% is set by α. gχ²_h denotes the χ²-distribution with scaling factor g and h degrees of freedom; the calculation of g and h can be found in MacGregor et al. (1994). If T² exceeds its control limit, a quality-relevant fault is detected with (1−α)×100% confidence. If Q exceeds its control limit, a fault in the residual space is detected with (1−α)×100% confidence. This monitoring scheme works well in some cases. However, it suffers from the same problems as PLS: the orthogonal components in the principal component subspace and the potentially large variances in the residual subspace.

3. CCCA FOR PROCESS MONITORING

3.1 Concurrent Canonical Correlation Analysis

Inspired by the improvement of CPLS over PLS for fault monitoring in the work of Qin and Zheng (2013), a CCCA-based monitoring method is proposed. To realize a complete monitoring scheme for the quality variables and exclude unrelated variables, CCCA decomposes the data into five subspaces: the CRS, responsible for the predictable output; the IPS, monitoring input-relevant faults; the IRS, monitoring potentially output-relevant faults; and the OPS and ORS, monitoring abnormal variations that are output-relevant but unpredictable from the input. The CCCA algorithm for data with multiple inputs and multiple outputs is given in Table 1.
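The T² and Q indices of Eqs. (5)-(6) with their control limits can be sketched as follows. This is an illustrative sketch: the weighting matrix R and loading matrix P are assumed to come from an already-fitted latent-variable model, and g and h are obtained by the usual moment-matching approximation rather than the exact expressions cited in the text:

```python
import numpy as np
from scipy import stats

def t2_q_monitor(X, R, P, alpha=0.05):
    """T^2 and Q statistics (Eqs. 5-6) and control limits on training data X (n x m)."""
    n, A = X.shape[0], R.shape[1]
    T = X @ R                              # scores t = R'x for each sample
    lam = T.var(axis=0, ddof=1)            # diagonal of Lambda
    T2 = np.sum(T**2 / lam, axis=1)        # t' Lambda^{-1} t
    Q = np.sum((X - T @ P.T)**2, axis=1)   # ||(I - P R') x||^2
    T2_lim = A * (n**2 - 1) / (n * (n - A)) * stats.f.ppf(1 - alpha, A, n - A)
    g = Q.var(ddof=1) / (2 * Q.mean())     # moment matching for the g*chi2_h limit
    h = 2 * Q.mean() ** 2 / Q.var(ddof=1)
    Q_lim = g * stats.chi2.ppf(1 - alpha, h)
    return T2, Q, T2_lim, Q_lim
```

On normal training data, roughly a fraction (1 − α) of the samples should fall below each limit; exceedances on new data trigger the corresponding alarm.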
After performing CCCA, the matrices X and Y are decomposed as

    X = T_c R_c†' + T_x P_x' + X̃          (7)
    Y = T_c Q_c' + T_y P_y' + Ỹ           (8)

where the loadings R_c ∈ R^{m×A_c}, P_x ∈ R^{m×A_x}, Q_c ∈ R^{p×A_c} and P_y ∈ R^{p×A_y} characterize the CCCA model, and the scores T_c ∈ R^{n×A_c}, T_x ∈ R^{n×A_x} and T_y ∈ R^{n×A_y} represent, respectively, the correlations in X related to the predictable part of Y, the variations in X that are useless for predicting Y, and the variations in Y that are unpredictable from X.

Table 1. The Concurrent CCA Algorithm

1. Scale X and Y to zero mean. Perform CCA on the input matrix X and output matrix Y to give R, T, Q and P (Appendix A).
2. Perform a singular value decomposition (SVD) on the predictable output Ŷ = T Q',
       Ŷ = U_c D_c V_c' = T_c Q_c'
   where T_c = U_c comprises the left singular vectors, and Q_c = V_c D_c contains the nonzero singular values in descending order and the corresponding right singular vectors. The number of components is denoted A_c.¹
3. Obtain the unpredictable output Ỹ_c = Y − T_c Q_c', and perform PCA with A_y principal components,
       Ỹ_c = T_y P_y' + Ỹ
   This gives the output-principal scores T_y and the output residuals Ỹ.
4. Obtain the output-irrelevant input by projecting onto the orthogonal complement of Span{R_c}, X̃_c = X − T_c R_c†', where R_c = R Q' V_c D_c^{-1} and R_c† = R_c (R_c' R_c)^{-1}, and perform PCA with A_x principal components,
       X̃_c = T_x P_x' + X̃
   to yield the input-principal scores T_x and the input residuals X̃.

¹ Different from CPLS in Qin and Zheng (2013), A_c in CCCA does not equal the number of all nonzero singular values. Instead, A_c is the smallest number of leading singular values whose sum first exceeds 95% of the sum of all the singular values.

Given a new data sample x_new and y_new, CCCA projects them as follows:
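The steps of Table 1 can be sketched as follows. This is an illustrative sketch assuming a CCA model (T, Q, R) has already been fitted and the PCA orders A_x, A_y are given; the helper name `ccca_fit` is ours, not the paper's:

```python
import numpy as np

def ccca_fit(X, Y, T, Q, R, Ax, Ay):
    """Sketch of Table 1: decompose centered X, Y given CCA results T, Q, R."""
    # Step 2: SVD of the predictable output Y_hat = T Q'
    Uc, dc, VcT = np.linalg.svd(T @ Q.T, full_matrices=False)
    # A_c: smallest count of leading singular values reaching 95% of their sum
    Ac = int(np.searchsorted(np.cumsum(dc) / dc.sum(), 0.95)) + 1
    Tc = Uc[:, :Ac]
    Qc = VcT[:Ac].T * dc[:Ac]                     # Qc = Vc Dc
    # Step 3: PCA on the unpredictable output
    Ytil_c = Y - Tc @ Qc.T
    Py = np.linalg.svd(Ytil_c, full_matrices=False)[2][:Ay].T
    Ty = Ytil_c @ Py
    # Step 4: PCA on the output-irrelevant input, with Rc = R Q' Vc Dc^{-1}
    Rc = R @ Q.T @ VcT[:Ac].T / dc[:Ac]
    Rc_dag = Rc @ np.linalg.inv(Rc.T @ Rc)        # Rc(Rc'Rc)^{-1}
    Xtil_c = X - Tc @ Rc_dag.T
    Px = np.linalg.svd(Xtil_c, full_matrices=False)[2][:Ax].T
    Tx = Xtil_c @ Px
    return Tc, Qc, Rc, Rc_dag, Tx, Px, Ty, Py
```

The orthogonality claims of Lemma 1 (X̃_c R_c = 0 and P_x' R_c = 0) can be checked numerically on the returned matrices.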
    x_new = x̂_c + x̃_c = R_c† t_c + P_x t_x + x̃          (9)
    y_new = ŷ_c + ỹ_c = Q_c t_c + P_y t_y + ỹ            (10)

where

    x̂_c = R_c† t_c,   x̃_c = x_new − R_c† t_c = P_x t_x + x̃          (11)
    ŷ_c = Q_c t_c,   ỹ_c = y_new − Q_c t_c = P_y t_y + ỹ            (12)

and

    t_c = R_c' x_new          (13)
    t_x = P_x' x̃_c           (14)
    t_y = P_y' ỹ_c           (15)

For the residual parts x̃ and ỹ, we have

    x̃ = x̃_c − P_x t_x = (I − P_x P_x') x̃_c          (16)
    ỹ = ỹ_c − P_y t_y = (I − P_y P_y') ỹ_c          (17)

3.2 Properties of CCCA

Several important properties hold for the CCCA algorithm; they are convenient for calculating the monitoring statistics in Table 2, which are developed in the next subsection.

Lemma 1. X̃_c R_c = 0, P_x' R_c = 0 and X̃ R_c = 0.

Proof: From steps 2 and 4 of the CCCA algorithm in Table 1, we have

    T_c = U_c = Ŷ V_c D_c^{-1} = X R Q' V_c D_c^{-1} = X R_c,   X̃_c = X − T_c R_c†'
Thus, X̃_c R_c = (X − T_c R_c†') R_c = X R_c − T_c = 0, using R_c†' R_c = I. Also, since X̃_c = T_x P_x' + X̃ and X̃ = X̃_c (I − P_x P_x'), we have

    0 = X̃_c R_c = T_x P_x' R_c + X̃ R_c

Multiplying by T_x' and using T_x' X̃ = 0 together with T_x' T_x = (n−1) Λ_x = (n−1) diag{λ_x1, ..., λ_xAx}, where λ_xi is the variance of the i-th principal component, gives

    T_x' T_x P_x' R_c = (n−1) Λ_x P_x' R_c = 0

Therefore P_x' R_c = 0, and consequently X̃ R_c = X̃_c R_c − T_x P_x' R_c = 0.

Lemma 2. t_x = P_x' x_new, and x̃ = (I − P_x P_x') x_new.

Proof: From Lemma 1, we have

    t_x = P_x' x̃_c = P_x' (x_new − R_c† t_c) = P_x' x_new − P_x' R_c (R_c' R_c)^{-1} t_c = P_x' x_new

Similarly, for x̃,

    x̃ = (I − P_x P_x') x̃_c = (I − P_x P_x')(x_new − R_c† t_c)
      = (I − P_x P_x') x_new − (I − P_x P_x') R_c (R_c' R_c)^{-1} t_c
      = (I − P_x P_x') x_new

where the last equality is derived from X̃ R_c = 0.

3.3 CCCA-based Process Monitoring

The CCCA-based monitoring scheme works in a similar way to CCA. First, a CCCA model is built from normal data sets X and Y. Then, for a new sample, all scores and residuals are calculated through Eqs. 9–17. Finally, several control plots are constructed with the corresponding control limits, which are used for fault detection. In multivariate statistical process monitoring, the T² and Q statistics are widely used for monitoring the systematic and residual parts of the process variations, respectively (Zhou et al. (2010)). In CCCA, the variations in the CRS, IPS and OPS subspaces constitute the systematic part, for which the T² statistic is suitable. The residual parts in the IRS and ORS subspaces represent the residual variations, which should be monitored with the Q index instead (Jackson and Mudholkar (1979)). For a sample x and y, the T² statistic for the predictable part ŷ can be calculated as

    T²_c = t_c' Λ_c^{-1} t_c = x' R_c Λ_c^{-1} R_c' x          (18)

where Λ_c = diag{λ_c1, λ_c2, ..., λ_cAc} and λ_ci denotes the variance of the i-th principal component. The input-relevant scores in Eq. 14 and the residuals in Eq.
16 can be monitored, as in PCA, by T² and Q as follows:

    T²_x = t_x' Λ_x^{-1} t_x = x̃_c' P_x Λ_x^{-1} P_x' x̃_c          (19)
    Q_x = x̃' x̃ = x̃_c' (I − P_x P_x') x̃_c                          (20)

where (I − P_x P_x')² = (I − P_x P_x') and Λ_x = (1/(n−1)) T_x' T_x.

Table 2. Monitoring Statistics and Control Limits

    Statistic | Calculation      | Control Limit
    T²_c      | t_c' Λ_c^{-1} t_c | [A_c(n²−1)/(n(n−A_c))] F_{A_c, n−A_c, α}
    T²_x      | t_x' Λ_x^{-1} t_x | [A_x(n²−1)/(n(n−A_x))] F_{A_x, n−A_x, α}
    T²_y      | t_y' Λ_y^{-1} t_y | [A_y(n²−1)/(n(n−A_y))] F_{A_y, n−A_y, α}
    Q_x       | x̃' x̃            | g_x χ²_{h_x, α}
    Q_y       | ỹ' ỹ            | g_y χ²_{h_y, α}

Here n is the number of training samples; A_c, A_x and A_y are the numbers of principal components in the CRS, IPS and OPS subspaces, respectively; the calculation of g_x, h_x, g_y and h_y can be found in Qin (2003).

Similarly, the output-relevant scores in Eq. 15 and the residuals in Eq. 17 are monitored by

    T²_y = t_y' Λ_y^{-1} t_y = ỹ_c' P_y Λ_y^{-1} P_y' ỹ_c          (21)
    Q_y = ỹ' ỹ = ỹ_c' (I − P_y P_y') ỹ_c                           (22)

where (I − P_y P_y')² = (I − P_y P_y') and Λ_y = (1/(n−1)) T_y' T_y.

Assume the data are sampled from a multivariate normal distribution. Then the control limits of T²_c, T²_x and T²_y can be obtained from an F-distribution, and the control limits for Q_x and Q_y from a χ²-distribution (Nomikos and MacGregor (1995)). The control limits of these indices are listed in Table 2. According to the monitoring statistics and control limits in Table 2, the input-relevant and output-relevant faults can be monitored as follows:

(1) If T²_c exceeds its control limit, a quality-relevant fault is detected with (1−α)×100% confidence.
(2) If T²_x exceeds its control limit, the fault is identified as quality-irrelevant but process-relevant with (1−α)×100% confidence. Such faults can receive less attention if only the quality variables are considered significant.
(3) If Q_x exceeds its control limit, a potentially quality-relevant fault is detected with (1−α)×100% confidence, since it may contain variations that were not excited in the training dataset.
(4) If T²_y or Q_y exceeds its control limit, a quality-relevant fault is detected that is unpredictable from the input.

4. TENNESSEE EASTMAN PROCESS CASE STUDIES

The Tennessee Eastman process (TEP) (Downs and Vogel (1993)) is widely used as a benchmark for evaluating process monitoring methods such as PCA and PLS. The process produces two products (G and H) from four reactants (A, C, D and E). The reactions are:

    A(g) + C(g) + D(g) → G(l)
    A(g) + C(g) + E(g) → H(l)
    A(g) + E(g) → X(l)
    3D(g) → 2X(l)

where X(l) is a byproduct. There are 41 measurements and 12 manipulated variables in this process, which are listed in
Downs and Vogel (1993). Details of the whole process can be found in Chiang et al. (2001). In this case study, PLS, CCA, CPLS and CCCA are applied to the TEP. For all these monitoring schemes, XMEAS(1–36) and XMV(1–11) are regarded as the process variables, where XMEAS(1–36) are process measurements and XMV(1–11) are manipulated variables. XMEAS(37–41), the quality measurements, are selected as the quality variables. We use 500 normal samples to build the PLS, CCA, CPLS and CCCA models. All samples are centered to zero mean and scaled to unit variance. The numbers of factors for CCA and PLS are 4 and 34, determined by cross-validation. For CPLS, A_c = 5, A_y = 5 and A_x = 8; for CCCA, A_c = 5, A_y = 5 and A_x = 3. In this study, we use 13 faulty sample sets and one normal sample set for fault detection (Downs and Vogel (1993)). Each sample set consists of 960 samples. In order to compare the effectiveness of the four models, it is necessary to identify whether a fault is related to the output Y or not. Here, the strategy in Zhou et al. (2010) is employed to divide the 14 sample sets into a quality-relevant group and a quality-irrelevant group; their monitoring results are listed in Table 3 and Table 4, respectively.

Table 3. Fault Detection Rate for TEP Using PLS, CCA, CPLS and CCCA (%)

    Disturbances | PLS | CCA | CPLS | CCCA
    IDV(1)   |  |  |  |
    IDV(2)   |  |  |  |
    IDV(5)   |  |  |  |
    IDV(6)   |  |  |  |
    IDV(8)   |  |  |  |
    IDV(10)  |  |  |  |
    IDV(12)  |  |  |  |
    IDV(13)  |  |  |  |

Table 4. False Alarm Rate for TEP Using PLS, CCA, CPLS and CCCA (%)

    Disturbances | PLS | CCA | CPLS | CCCA
    IDV()    |  |  |  |
    IDV(3)   |  |  |  |
    IDV(4)   |  |  |  |
    IDV(9)   |  |  |  |
    IDV(11)  |  |  |  |
    IDV(15)  |  |  |  |

For PLS and CCA, the T² index is used to monitor the quality-relevant faults, while for CPLS and CCCA, T²_c and T²_y are employed to monitor the process. From Table 3 and Table 4, we can see that CCCA achieves higher fault detection rates and lower false alarm rates than PLS, CCA and CPLS in most cases.
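The rates compared in Tables 3 and 4 are exceedance fractions of the monitoring indices against their control limits; a minimal sketch (the helper names are ours, not the paper's):

```python
import numpy as np

def detection_rate(stat, limit):
    """Fraction of samples whose monitoring statistic exceeds its control limit.
    On a faulty data set this is the fault detection rate; on a normal data set,
    the false alarm rate."""
    return float(np.mean(np.asarray(stat) > limit))

def combined_rate(tc2, tc2_lim, ty2, ty2_lim):
    """Alarm if either quality-relevant index exceeds its limit, as done for
    CPLS/CCCA where Tc^2 and Ty^2 jointly monitor the quality-relevant faults."""
    return float(np.mean((np.asarray(tc2) > tc2_lim) | (np.asarray(ty2) > ty2_lim)))
```

Applying `detection_rate` to T² on the faulty sets and to the normal set reproduces the two kinds of entries in Tables 3 and 4.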
This is because: (1) compared with PLS and CCA, CCCA also monitors the variations in the OPS subspace, which affect the quality variables but are not predictable from the process variables; (2) compared with CPLS, CCCA maximizes the correlation between X and Y, so the extracted principal components are more relevant than those of CPLS. The fact that CCA has lower false alarm rates than PLS in Table 4 also supports the second reason.

Fig. 1. CCCA-based monitoring result (T²_c, T²_x, T²_y, Q_x) for IDV(2)

Fig. 2. CCA-based monitoring result (T², Q) for IDV(2)

Apart from the higher fault detection rates and lower false alarm rates, CCCA outperforms PLS and CCA because it decomposes the spaces completely and monitors the quality changes efficiently. For example, the monitoring results of CCCA and CCA for IDV(2) of Downs and Vogel (1993), a step disturbance in the B composition, are presented in Figs. 1 and 2. Although both CCA and CCCA detect this disturbance in all subspaces, T²_c of the CCCA-based model tends to return to normal, which reflects the effect of the feedback controllers in the process. Since T²_c shows the variations predictable from the input and T²_x presents the variations related to the input, the CCCA monitoring scheme successfully separates them and improves the accuracy of fault detection. For the step disturbance IDV(4), which comes from the reactor cooling water inlet temperature (Downs and Vogel (1993)), the monitoring results of CCCA and CCA are presented in Figs. 3 and 4. In this scenario, due to the cascade controller in the system, the variation of the reactor cooling water inlet temperature does not affect the output variables. For CCCA, the monitoring result shows that this fault is quality-irrelevant, and it should receive less attention during the process. CCA, however, detects the disturbance in both the T² and Q statistics, raising the alarm incorrectly.
Fig. 3. CCCA-based monitoring result (T²_c, T²_x, T²_y, Q_x) for IDV(4)

Fig. 4. CCA-based monitoring result (T², Q) for IDV(4)

5. CONCLUSIONS

In this article, concurrent CCA is proposed to monitor process-relevant and quality-relevant faults separately. The input and output spaces are concurrently projected into five subspaces, and the corresponding monitoring statistics and control limits are developed. The application results on the TEP show that CCCA achieves higher fault detection rates and lower false alarm rates than CCA, PLS and CPLS. CCCA also outperforms the other methods through its comprehensive decomposition. The comparison of CCA and PLS performed in this paper shows the advantages of CCA for quality prediction. Future work is to extend the CCCA model to nonlinear cases.

APPENDIX A: CCA ALGORITHM

1. Scale the data to zero mean.
2. Perform eigenvalue decompositions to calculate the inverse square-root factors,
       [V_x, D_x] = eig(X'X),   Σ_xx^{-1/2} = V_x D_x^{-1/2} V_x'
       [V_y, D_y] = eig(Y'Y),   Σ_yy^{-1/2} = V_y D_y^{-1/2} V_y'
3. Perform an SVD to calculate the weighting matrices R and C,
       [U_z, S_z, V_z] = svd(Σ_xx^{-1/2} X' Y Σ_yy^{-1/2})
       R = Σ_xx^{-1/2} U_z,   C = Σ_yy^{-1/2} V_z
4. Obtain the canonical variates T and U for X and Y,
       T = X R,   U = Y C

REFERENCES

Borga, M., Landelius, T., and Knutsson, H. (1997). A unified approach to PCA, PLS, MLR and CCA.
Chiang, L.H., Braatz, R.D., and Russell, E.L. (2001). Fault Detection and Diagnosis in Industrial Systems. Springer Science & Business Media.
Chiang, L.H., Russell, E.L., and Braatz, R.D. (2000). Fault diagnosis in chemical processes using Fisher discriminant analysis, discriminant partial least squares, and principal component analysis. Chemometrics and Intelligent Laboratory Systems, 50(2).
Downs, J.J. and Vogel, E.F. (1993). A plant-wide industrial process control problem. Computers & Chemical Engineering, 17(3).
Hardoon, D.R., Szedmak, S., and Shawe-Taylor, J. (2004).
Canonical correlation analysis: An overview with application to learning methods. Neural Computation, 16(12).
Hotelling, H. (1936). Relations between two sets of variates. Biometrika, 28, 321–377.
Jackson, J.E. and Mudholkar, G.S. (1979). Control procedures for residuals associated with principal component analysis. Technometrics, 21(3), 341–349.
Khan, A.A., Moyne, J.R., and Tilbury, D.M. (2008). Virtual metrology and feedback control for semiconductor manufacturing processes using recursive partial least squares. Journal of Process Control, 18(10).
Li, G., Qin, S.J., and Zhou, D. (2010). Geometric properties of partial least squares for process monitoring. Automatica, 46(1), 204–210.
MacGregor, J.F., Jaeckle, C., Kiparissides, C., and Koutoudi, M. (1994). Process monitoring and diagnosis by multiblock PLS methods. AIChE Journal, 40(5).
Nomikos, P. and MacGregor, J.F. (1995). Multivariate SPC charts for monitoring batch processes. Technometrics, 37(1), 41–59.
Qin, S.J. (2003). Statistical process monitoring: basics and beyond. Journal of Chemometrics, 17(8-9), 480–502.
Qin, S.J. and Zheng, Y. (2013). Quality-relevant and process-relevant fault monitoring with concurrent projection to latent structures. AIChE Journal, 59(2).
Sherry, A. and Henson, R.K. (2005). Conducting and interpreting canonical correlation analysis in personality research: A user-friendly primer. Journal of Personality Assessment, 84(1), 37–48.
Wise, B.M. and Gallagher, N.B. (1996). The process chemometrics approach to process monitoring and fault detection. Journal of Process Control, 6(6), 329–348.
Zhou, D., Li, G., and Qin, S.J. (2010). Total projection to latent structures for process monitoring. AIChE Journal, 56(1), 168–178.
More informationReducedPCR/PLSRmodelsbysubspaceprojections
ReducedPCR/PLSRmodelsbysubspaceprojections Rolf Ergon Telemark University College P.O.Box 2, N-9 Porsgrunn, Norway e-mail: rolf.ergon@hit.no Published in Chemometrics and Intelligent Laboratory Systems
More informationVector Space Models. wine_spectral.r
Vector Space Models 137 wine_spectral.r Latent Semantic Analysis Problem with words Even a small vocabulary as in wine example is challenging LSA Reduce number of columns of DTM by principal components
More informationWhen Fisher meets Fukunaga-Koontz: A New Look at Linear Discriminants
When Fisher meets Fukunaga-Koontz: A New Look at Linear Discriminants Sheng Zhang erence Sim School of Computing, National University of Singapore 3 Science Drive 2, Singapore 7543 {zhangshe, tsim}@comp.nus.edu.sg
More informationImproved multi-scale kernel principal component analysis and its application for fault detection
chemical engineering research and design 9 ( 2 1 2 ) 1271 128 Contents lists available at SciVerse ScienceDirect Chemical Engineering Research and Design j ourna l ho me page: www.elsevier.com/locate/cherd
More informationPCA & ICA. CE-717: Machine Learning Sharif University of Technology Spring Soleymani
PCA & ICA CE-717: Machine Learning Sharif University of Technology Spring 2015 Soleymani Dimensionality Reduction: Feature Selection vs. Feature Extraction Feature selection Select a subset of a given
More informationA Least Squares Formulation for Canonical Correlation Analysis
A Least Squares Formulation for Canonical Correlation Analysis Liang Sun, Shuiwang Ji, and Jieping Ye Department of Computer Science and Engineering Arizona State University Motivation Canonical Correlation
More informationSmall sample size in high dimensional space - minimum distance based classification.
Small sample size in high dimensional space - minimum distance based classification. Ewa Skubalska-Rafaj lowicz Institute of Computer Engineering, Automatics and Robotics, Department of Electronics, Wroc
More informationInstrumentation Design and Upgrade for Principal Components Analysis Monitoring
2150 Ind. Eng. Chem. Res. 2004, 43, 2150-2159 Instrumentation Design and Upgrade for Principal Components Analysis Monitoring Estanislao Musulin, Miguel Bagajewicz, José M. Nougués, and Luis Puigjaner*,
More information1 Principal Components Analysis
Lecture 3 and 4 Sept. 18 and Sept.20-2006 Data Visualization STAT 442 / 890, CM 462 Lecture: Ali Ghodsi 1 Principal Components Analysis Principal components analysis (PCA) is a very popular technique for
More informationMultivariate statistical monitoring of two-dimensional dynamic batch processes utilizing non-gaussian information
*Manuscript Click here to view linked References Multivariate statistical monitoring of two-dimensional dynamic batch processes utilizing non-gaussian information Yuan Yao a, ao Chen b and Furong Gao a*
More informationLinear Dimensionality Reduction
Outline Hong Chang Institute of Computing Technology, Chinese Academy of Sciences Machine Learning Methods (Fall 2012) Outline Outline I 1 Introduction 2 Principal Component Analysis 3 Factor Analysis
More informationPrincipal Component Analysis
Principal Component Analysis Yingyu Liang yliang@cs.wisc.edu Computer Sciences Department University of Wisconsin, Madison [based on slides from Nina Balcan] slide 1 Goals for the lecture you should understand
More informationPCA, Kernel PCA, ICA
PCA, Kernel PCA, ICA Learning Representations. Dimensionality Reduction. Maria-Florina Balcan 04/08/2015 Big & High-Dimensional Data High-Dimensions = Lot of Features Document classification Features per
More information(b) x 2 t 2. (a) x 2, max. Decoupled. Decoupled. x 2, min. x 1 t 1. x 1, min. x 1, max. Latent Space 1. Original Space. Latent Space.
A DYNAMIC PLS FRAMEWORK FOR CONSTRAINED MODEL PREDICTIVE CONTROL S. Lakshminarayanan Rohit S. Patwardhan Sirish L. Shah K. Nandakumar Department of Chemical and Materials Engineering, University of Alberta,
More informationSYSTEMATIC APPLICATIONS OF MULTIVARIATE ANALYSIS TO MONITORING OF EQUIPMENT HEALTH IN SEMICONDUCTOR MANUFACTURING. D.S.H. Wong S.S.
Proceedings of the 8 Winter Simulation Conference S. J. Mason, R. R. Hill, L. Mönch, O. Rose, T. Jefferson, J. W. Fowler eds. SYSTEMATIC APPLICATIONS OF MULTIVARIATE ANALYSIS TO MONITORING OF EQUIPMENT
More informationLearning based on kernel-pca for abnormal event detection using filtering EWMA-ED.
Trabalho apresentado no CNMAC, Gramado - RS, 2016. Proceeding Series of the Brazilian Society of Computational and Applied Mathematics Learning based on kernel-pca for abnormal event detection using filtering
More informationPrincipal Component Analysis (PCA)
Principal Component Analysis (PCA) Additional reading can be found from non-assessed exercises (week 8) in this course unit teaching page. Textbooks: Sect. 6.3 in [1] and Ch. 12 in [2] Outline Introduction
More informationOn-line multivariate statistical monitoring of batch processes using Gaussian mixture model
On-line multivariate statistical monitoring of batch processes using Gaussian mixture model Tao Chen,a, Jie Zhang b a School of Chemical and Biomedical Engineering, Nanyang Technological University, Singapore
More informationRecursive PCA for adaptive process monitoring
Journal of Process Control 0 (2000) 47±486 www.elsevier.com/locate/jprocont Recursive PCA for adaptive process monitoring Weihua Li, H. Henry Yue, Sergio Valle-Cervantes, S. Joe Qin * Department of Chemical
More informationDecember 20, MAA704, Multivariate analysis. Christopher Engström. Multivariate. analysis. Principal component analysis
.. December 20, 2013 Todays lecture. (PCA) (PLS-R) (LDA) . (PCA) is a method often used to reduce the dimension of a large dataset to one of a more manageble size. The new dataset can then be used to make
More informationOn-line monitoring of a sugar crystallization process
Computers and Chemical Engineering 29 (2005) 1411 1422 On-line monitoring of a sugar crystallization process A. Simoglou a, P. Georgieva c, E.B. Martin a, A.J. Morris a,,s.feyodeazevedo b a Center for
More informationDYNAMIC PRINCIPAL COMPONENT ANALYSIS USING INTEGRAL TRANSFORMS
AIChE Annual Meeting, Miami Beach, November 1999, Paper N o 232(c) DYNAMIC PRINCIPAL COMPONENT ANALYSIS USING INTEGRAL TRANSFORMS K.C. Chow, K-J. Tan, H. Tabe, J. Zhang and N.F. Thornhill Department of
More informationLecture 24: Principal Component Analysis. Aykut Erdem May 2016 Hacettepe University
Lecture 4: Principal Component Analysis Aykut Erdem May 016 Hacettepe University This week Motivation PCA algorithms Applications PCA shortcomings Autoencoders Kernel PCA PCA Applications Data Visualization
More informationImproved Anomaly Detection Using Multi-scale PLS and Generalized Likelihood Ratio Test
Improved Anomaly Detection Using Multi-scale PLS and Generalized Likelihood Ratio Test () Muddu Madakyaru, () Department of Chemical Engineering Manipal Institute of Technology Manipal University, India
More informationJust-In-Time Statistical Process Control: Adaptive Monitoring of Vinyl Acetate Monomer Process
Milano (Italy) August 28 - September 2, 211 Just-In-Time Statistical Process Control: Adaptive Monitoring of Vinyl Acetate Monomer Process Manabu Kano Takeaki Sakata Shinji Hasebe Kyoto University, Nishikyo-ku,
More informationDefining the structure of DPCA models and its impact on Process Monitoring and Prediction activities
Accepted Manuscript Defining the structure of DPCA models and its impact on Process Monitoring and Prediction activities Tiago J. Rato, Marco S. Reis PII: S0169-7439(13)00049-X DOI: doi: 10.1016/j.chemolab.2013.03.009
More informationCS4495/6495 Introduction to Computer Vision. 8B-L2 Principle Component Analysis (and its use in Computer Vision)
CS4495/6495 Introduction to Computer Vision 8B-L2 Principle Component Analysis (and its use in Computer Vision) Wavelength 2 Wavelength 2 Principal Components Principal components are all about the directions
More informationPCA and LDA. Man-Wai MAK
PCA and LDA Man-Wai MAK Dept. of Electronic and Information Engineering, The Hong Kong Polytechnic University enmwmak@polyu.edu.hk http://www.eie.polyu.edu.hk/ mwmak References: S.J.D. Prince,Computer
More informationStatistical Pattern Recognition
Statistical Pattern Recognition Feature Extraction Hamid R. Rabiee Jafar Muhammadi, Alireza Ghasemi, Payam Siyari Spring 2014 http://ce.sharif.edu/courses/92-93/2/ce725-2/ Agenda Dimensionality Reduction
More informationpairs. Such a system is a reinforcement learning system. In this paper we consider the case where we have a distribution of rewarded pairs of input an
Learning Canonical Correlations Hans Knutsson Magnus Borga Tomas Landelius knutte@isy.liu.se magnus@isy.liu.se tc@isy.liu.se Computer Vision Laboratory Department of Electrical Engineering Linkíoping University,
More informationMachine Learning. Principal Components Analysis. Le Song. CSE6740/CS7641/ISYE6740, Fall 2012
Machine Learning CSE6740/CS7641/ISYE6740, Fall 2012 Principal Components Analysis Le Song Lecture 22, Nov 13, 2012 Based on slides from Eric Xing, CMU Reading: Chap 12.1, CB book 1 2 Factor or Component
More informationAN ONLINE NIPALS ALGORITHM FOR PARTIAL LEAST SQUARES. Alexander E. Stott, Sithan Kanna, Danilo P. Mandic, William T. Pike
AN ONLINE NIPALS ALGORIHM FOR PARIAL LEAS SQUARES Alexander E. Stott, Sithan Kanna, Danilo P. Mandic, William. Pike Electrical and Electronic Engineering Department, Imperial College London, SW7 2AZ, UK
More informationA Modular NMF Matching Algorithm for Radiation Spectra
A Modular NMF Matching Algorithm for Radiation Spectra Melissa L. Koudelka Sensor Exploitation Applications Sandia National Laboratories mlkoude@sandia.gov Daniel J. Dorsey Systems Technologies Sandia
More informationRobust Method of Multiple Variation Sources Identification in Manufacturing Processes For Quality Improvement
Zhiguo Li Graduate Student Shiyu Zhou 1 Assistant Professor e-mail: szhou@engr.wisc.edu Department of Industrial and Systems Engineering, University of Wisconsin, Madison, WI 53706 Robust Method of Multiple
More informationFocus was on solving matrix inversion problems Now we look at other properties of matrices Useful when A represents a transformations.
Previously Focus was on solving matrix inversion problems Now we look at other properties of matrices Useful when A represents a transformations y = Ax Or A simply represents data Notion of eigenvectors,
More informationSoft Sensor Modelling based on Just-in-Time Learning and Bagging-PLS for Fermentation Processes
1435 A publication of CHEMICAL ENGINEERING TRANSACTIONS VOL. 70, 2018 Guest Editors: Timothy G. Walmsley, Petar S. Varbanov, Rongin Su, Jiří J. Klemeš Copyright 2018, AIDIC Servizi S.r.l. ISBN 978-88-95608-67-9;
More informationProcess-Quality Monitoring Using Semi-supervised Probability Latent Variable Regression Models
Preprints of the 9th World Congress he International Federation of Automatic Control Cape own, South Africa August 4-9, 4 Process-Quality Monitoring Using Semi-supervised Probability Latent Variable Regression
More informationResearch Article A Novel Data-Driven Fault Diagnosis Algorithm Using Multivariate Dynamic Time Warping Measure
Abstract and Applied Analysis, Article ID 625814, 8 pages http://dx.doi.org/10.1155/2014/625814 Research Article A Novel Data-Driven Fault Diagnosis Algorithm Using Multivariate Dynamic Time Warping Measure
More informationChemometrics. Matti Hotokka Physical chemistry Åbo Akademi University
Chemometrics Matti Hotokka Physical chemistry Åbo Akademi University Linear regression Experiment Consider spectrophotometry as an example Beer-Lamberts law: A = cå Experiment Make three known references
More informationData Mining Techniques
Data Mining Techniques CS 6220 - Section 3 - Fall 2016 Lecture 12 Jan-Willem van de Meent (credit: Yijun Zhao, Percy Liang) DIMENSIONALITY REDUCTION Borrowing from: Percy Liang (Stanford) Linear Dimensionality
More informationLecture 13. Principal Component Analysis. Brett Bernstein. April 25, CDS at NYU. Brett Bernstein (CDS at NYU) Lecture 13 April 25, / 26
Principal Component Analysis Brett Bernstein CDS at NYU April 25, 2017 Brett Bernstein (CDS at NYU) Lecture 13 April 25, 2017 1 / 26 Initial Question Intro Question Question Let S R n n be symmetric. 1
More informationPrincipal Component Analysis
CSci 5525: Machine Learning Dec 3, 2008 The Main Idea Given a dataset X = {x 1,..., x N } The Main Idea Given a dataset X = {x 1,..., x N } Find a low-dimensional linear projection The Main Idea Given
More informationEECS 275 Matrix Computation
EECS 275 Matrix Computation Ming-Hsuan Yang Electrical Engineering and Computer Science University of California at Merced Merced, CA 95344 http://faculty.ucmerced.edu/mhyang Lecture 6 1 / 22 Overview
More informationFoundations of Computer Vision
Foundations of Computer Vision Wesley. E. Snyder North Carolina State University Hairong Qi University of Tennessee, Knoxville Last Edited February 8, 2017 1 3.2. A BRIEF REVIEW OF LINEAR ALGEBRA Apply
More informationMutlivariate Statistical Process Performance Monitoring. Jie Zhang
Mutlivariate Statistical Process Performance Monitoring Jie Zhang School of Chemical Engineering & Advanced Materials Newcastle University Newcastle upon Tyne NE1 7RU, UK jie.zhang@newcastle.ac.uk Located
More informationTwo dimensional recursive least squares for batch processes system identification
Preprints of the 10th IFAC International Symposium on Dynamics and Control of Process Systems The International Federation of Automatic Control Two dimensional recursive least squares for batch processes
More informationProduct Quality Prediction by a Neural Soft-Sensor Based on MSA and PCA
International Journal of Automation and Computing 1 (2006) 17-22 Product Quality Prediction by a Neural Soft-Sensor Based on MSA and PCA Jian Shi, Xing-Gao Liu National Laboratory of Industrial Control
More informationDimension Reduction Techniques. Presented by Jie (Jerry) Yu
Dimension Reduction Techniques Presented by Jie (Jerry) Yu Outline Problem Modeling Review of PCA and MDS Isomap Local Linear Embedding (LLE) Charting Background Advances in data collection and storage
More informationBasics of Multivariate Modelling and Data Analysis
Basics of Multivariate Modelling and Data Analysis Kurt-Erik Häggblom 2. Overview of multivariate techniques 2.1 Different approaches to multivariate data analysis 2.2 Classification of multivariate techniques
More informationMaximum variance formulation
12.1. Principal Component Analysis 561 Figure 12.2 Principal component analysis seeks a space of lower dimensionality, known as the principal subspace and denoted by the magenta line, such that the orthogonal
More informationfor time-dependent, high-dimensional data
A review of PCA-based statistical process monitoring methods for time-dependent, high-dimensional data Bart De Ketelaere MeBioS - Department of Biosystems, KU Leuven, Kasteelpark Arenberg 30, 3001 Leuven
More informationStat 159/259: Linear Algebra Notes
Stat 159/259: Linear Algebra Notes Jarrod Millman November 16, 2015 Abstract These notes assume you ve taken a semester of undergraduate linear algebra. In particular, I assume you are familiar with the
More informationFactor Analysis and Kalman Filtering (11/2/04)
CS281A/Stat241A: Statistical Learning Theory Factor Analysis and Kalman Filtering (11/2/04) Lecturer: Michael I. Jordan Scribes: Byung-Gon Chun and Sunghoon Kim 1 Factor Analysis Factor analysis is used
More informationPrincipal Component Analysis
B: Chapter 1 HTF: Chapter 1.5 Principal Component Analysis Barnabás Póczos University of Alberta Nov, 009 Contents Motivation PCA algorithms Applications Face recognition Facial expression recognition
More informationCombination of analytical and statistical models for dynamic systems fault diagnosis
Annual Conference of the Prognostics and Health Management Society, 21 Combination of analytical and statistical models for dynamic systems fault diagnosis Anibal Bregon 1, Diego Garcia-Alvarez 2, Belarmino
More informationDesign of Adaptive PCA Controllers for SISO Systems
Preprints of the 8th IFAC World Congress Milano (Italy) August 28 - September 2, 2 Design of Adaptive PCA Controllers for SISO Systems L. Brito Palma*, F. Vieira Coito*, P. Sousa Gil*, R. Neves-Silva*
More informationDimension Reduction (PCA, ICA, CCA, FLD,
Dimension Reduction (PCA, ICA, CCA, FLD, Topic Models) Yi Zhang 10-701, Machine Learning, Spring 2011 April 6 th, 2011 Parts of the PCA slides are from previous 10-701 lectures 1 Outline Dimension reduction
More informationCh.3 Canonical correlation analysis (CCA) [Book, Sect. 2.4]
Ch.3 Canonical correlation analysis (CCA) [Book, Sect. 2.4] With 2 sets of variables {x i } and {y j }, canonical correlation analysis (CCA), first introduced by Hotelling (1936), finds the linear modes
More informationLarge Multistream Data Analytics for Monitoring and. Diagnostics in Manufacturing Systems
Large Multistream Data Analytics for Monitoring and Diagnostics in Manufacturing Systems Samaneh Ebrahimi, Chitta Ranjan, and Kamran Paynabar arxiv:1812.10430v1 [stat.ml] 26 Dec 2018 H. Milton Stewart
More informationRobust control for a multi-stage evaporation plant in the presence of uncertainties
Preprint 11th IFAC Symposium on Dynamics and Control of Process Systems including Biosystems June 6-8 16. NTNU Trondheim Norway Robust control for a multi-stage evaporation plant in the presence of uncertainties
More informationNonlinear process monitoring using kernel principal component analysis
Chemical Engineering Science 59 (24) 223 234 www.elsevier.com/locate/ces Nonlinear process monitoring using kernel principal component analysis Jong-Min Lee a, ChangKyoo Yoo b, Sang Wook Choi a, Peter
More informationGI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis. Massimiliano Pontil
GI07/COMPM012: Mathematical Programming and Research Methods (Part 2) 2. Least Squares and Principal Components Analysis Massimiliano Pontil 1 Today s plan SVD and principal component analysis (PCA) Connection
More informationLEC 2: Principal Component Analysis (PCA) A First Dimensionality Reduction Approach
LEC 2: Principal Component Analysis (PCA) A First Dimensionality Reduction Approach Dr. Guangliang Chen February 9, 2016 Outline Introduction Review of linear algebra Matrix SVD PCA Motivation The digits
More informationSynergy between Data Reconciliation and Principal Component Analysis.
Plant Monitoring and Fault Detection Synergy between Data Reconciliation and Principal Component Analysis. Th. Amand a, G. Heyen a, B. Kalitventzeff b Thierry.Amand@ulg.ac.be, G.Heyen@ulg.ac.be, B.Kalitventzeff@ulg.ac.be
More informationWavelet Transform And Principal Component Analysis Based Feature Extraction
Wavelet Transform And Principal Component Analysis Based Feature Extraction Keyun Tong June 3, 2010 As the amount of information grows rapidly and widely, feature extraction become an indispensable technique
More informationA MULTIVARIABLE STATISTICAL PROCESS MONITORING METHOD BASED ON MULTISCALE ANALYSIS AND PRINCIPAL CURVES. Received January 2012; revised June 2012
International Journal of Innovative Computing, Information and Control ICIC International c 2013 ISSN 1349-4198 Volume 9, Number 4, April 2013 pp. 1781 1800 A MULTIVARIABLE STATISTICAL PROCESS MONITORING
More informationNotes on Latent Semantic Analysis
Notes on Latent Semantic Analysis Costas Boulis 1 Introduction One of the most fundamental problems of information retrieval (IR) is to find all documents (and nothing but those) that are semantically
More informationWhat is Principal Component Analysis?
What is Principal Component Analysis? Principal component analysis (PCA) Reduce the dimensionality of a data set by finding a new set of variables, smaller than the original set of variables Retains most
More informationLecture: Face Recognition and Feature Reduction
Lecture: Face Recognition and Feature Reduction Juan Carlos Niebles and Ranjay Krishna Stanford Vision and Learning Lab Lecture 11-1 Recap - Curse of dimensionality Assume 5000 points uniformly distributed
More informationStatistical Machine Learning
Statistical Machine Learning Christoph Lampert Spring Semester 2015/2016 // Lecture 12 1 / 36 Unsupervised Learning Dimensionality Reduction 2 / 36 Dimensionality Reduction Given: data X = {x 1,..., x
More informationComputers & Industrial Engineering
Computers & Industrial Engineering 59 (2010) 145 156 Contents lists available at ScienceDirect Computers & Industrial Engineering journal homepage: www.elsevier.com/locate/caie Integrating independent
More informationTensor Methods for Feature Learning
Tensor Methods for Feature Learning Anima Anandkumar U.C. Irvine Feature Learning For Efficient Classification Find good transformations of input for improved classification Figures used attributed to
More informationIntroduction to Principal Component Analysis (PCA)
Introduction to Principal Component Analysis (PCA) NESAC/BIO NESAC/BIO Daniel J. Graham PhD University of Washington NESAC/BIO MVSA Website 2010 Multivariate Analysis Multivariate analysis (MVA) methods
More informationUnsupervised Machine Learning and Data Mining. DS 5230 / DS Fall Lecture 7. Jan-Willem van de Meent
Unsupervised Machine Learning and Data Mining DS 5230 / DS 4420 - Fall 2018 Lecture 7 Jan-Willem van de Meent DIMENSIONALITY REDUCTION Borrowing from: Percy Liang (Stanford) Dimensionality Reduction Goal:
More informationModule B1: Multivariate Process Control
Module B1: Multivariate Process Control Prof. Fugee Tsung Hong Kong University of Science and Technology Quality Lab: http://qlab.ielm.ust.hk I. Multivariate Shewhart chart WHY MULTIVARIATE PROCESS CONTROL
More informationDimensionality Reduction and Principal Components
Dimensionality Reduction and Principal Components Nuno Vasconcelos (Ken Kreutz-Delgado) UCSD Motivation Recall, in Bayesian decision theory we have: World: States Y in {1,..., M} and observations of X
More information