Online Mode Shape Estimation using Complex Principal Component Analysis and Clustering, Hallvar Haugdal (NTNU, Norway)


1 Online Mode Shape Estimation using Complex Principal Component Analysis and Clustering, Hallvar Haugdal (NTNU, Norway). Mr. Hallvar Haugdal finished his MSc in Electrical Engineering at NTNU, Norway, in 2016, specializing in numerical field calculations and electrical motor design. After his MSc he worked as a Scientific Assistant, also at NTNU, dealing with courses on electrical machines, power systems and field calculations. Since January 2018 he has been a PhD student working on Wide Area Monitoring and Control Systems.

2 Online Mode Shape Estimation using Complex Principal Component Analysis and Clustering Hallvar Haugdal

3 Contents: Motivation; Background theory ((C)PCA, Clustering); Proposed method; Application to the Kundur Two-Area system

4 Contents: Motivation; Background theory ((C)PCA, Clustering); Proposed method; Application to the Kundur Two-Area system

5 Motivation
- Modal analysis is a powerful tool: understanding of system dynamics, stability limits, design of controllers
- Difficult in practice due to inaccurate, load-dependent modelling
- Proposed: an empirical approach relying only on wide-area PMU measurements, correlation and statistical learning

6 Two-part method
Part 1: Provide point estimates of modes and mode shapes
- Moving window, ~5-10 s length
- Uses correlation and Complex Principal Component Analysis (CPCA)
Part 2: Compute averaged mode shapes
- Differentiates between noise and modes
- Clusters the points resulting from Part 1
- Averaged modes and mode shapes are computed as centroids of the clusters

7 Contents: Motivation; Background theory ((C)PCA, Clustering); Proposed method; Application to the Kundur Two-Area system

8 Principal Component Analysis
Matrix of $M$ series with $N$ samples each:
$$X = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_M \end{bmatrix} = \begin{bmatrix} x_1(t_1) & x_1(t_2) & \cdots & x_1(t_N) \\ x_2(t_1) & x_2(t_2) & \cdots & x_2(t_N) \\ \vdots & \vdots & & \vdots \\ x_M(t_1) & x_M(t_2) & \cdots & x_M(t_N) \end{bmatrix}$$
We want to transform the correlated series $x_1, x_2, \dots, x_M$ into uncorrelated series $s_1, s_2, \dots, s_M$. This is done by eigendecomposition of the covariance matrix $C = \frac{1}{N-1} X X^T$:
$$C u_i = \lambda_i u_i, \quad i = 1, \dots, M, \qquad U = [u_1, u_2, \dots, u_M], \qquad U^T C U = \Lambda = \operatorname{diag}(\lambda_1, \lambda_2, \dots, \lambda_M)$$
with the eigenvalues ordered so that $\lambda_1 > \lambda_2 > \dots > \lambda_M$. The scores are
$$S = \begin{bmatrix} s_1 \\ s_2 \\ \vdots \\ s_M \end{bmatrix} = U^T X, \qquad s_i = u_i^T X$$
Inversion recovers the time series from the scores: $X = U S$, i.e. $x_i = \sum_{j=1}^{M} u_{i,j}\, s_j$, so the contribution of $s_j$ to $x_i$ is given by the coefficient $u_{i,j}$.
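A minimal sketch of this decomposition in Python (illustrative only, not from the presentation; the function name and the mean-removal step are assumptions):

```python
import numpy as np

def pca_scores(X):
    """PCA of an M-by-N matrix X of time series via the covariance matrix."""
    M, N = X.shape
    X = X - X.mean(axis=1, keepdims=True)   # remove the mean of each series
    C = (X @ X.T) / (N - 1)                 # M-by-M covariance matrix
    lam, U = np.linalg.eigh(C)              # eigendecomposition (C is symmetric)
    order = np.argsort(lam)[::-1]           # sort so that lambda_1 > lambda_2 > ...
    lam, U = lam[order], U[:, order]
    S = U.T @ X                             # scores: uncorrelated series
    return U, S, lam

# Reconstruction: X is recovered as U @ S, and x_i = sum_j U[i, j] * S[j].
```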

9 Complex Principal Component Analysis
Similar to conventional PCA, with additional steps:
- Empirical Mode Decomposition (EMD) for detrending (decomposition into Intrinsic Mode Functions)
- Hilbert Transform, yielding complex (analytic) time series $z_i$
Complex covariance matrix: $C = \frac{1}{N-1} Z Z^*$, where $Z^*$ denotes the conjugate transpose.
The contribution of $s_j$ to $z_i$ is given by the complex coefficient $u_{i,j}$, which resembles the observability mode shape.
Slower than PCA, due to the EMD and Hilbert Transform steps. Works well with damped exponentials, but is noisy during steps in the signal [3].
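A sketch of the complex step under the same illustrative assumptions, using scipy's Hilbert transform to form the analytic signal; the EMD detrending is assumed to have been applied to the input already:

```python
import numpy as np
from scipy.signal import hilbert

def complex_pca(S):
    """Complex PCA of an M-by-N matrix of detrended (real) series S."""
    Z = hilbert(S, axis=1)                  # analytic signals z_i = s_i + j*H(s_i)
    Z = Z - Z.mean(axis=1, keepdims=True)
    C = (Z @ Z.conj().T) / (Z.shape[1] - 1) # complex (Hermitian) covariance matrix
    lam, U = np.linalg.eigh(C)              # real eigenvalues, complex eigenvectors
    order = np.argsort(lam)[::-1]
    U = U[:, order]
    # The complex coefficient U[i, j] carries amplitude and phase of score j
    # in signal i -- this is what resembles an observability mode shape.
    return U, U.conj().T @ Z                # coefficients and complex scores
```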

10 Clustering: various methods, e.g. K-means and Gaussian Mixture Models [4], [2]

11 Contents: Motivation; Background theory ((C)PCA, Clustering); Proposed method; Application to the Kundur Two-Area system

12 Proposed method, Part 1: Generating mode estimates
Input: a time window of measurements.
1. PCA: $X \rightarrow S$ (scores)
2. EMD detrending
3. Hilbert Transform
4. CPCA: $S \rightarrow Z$ (complex scores)
The complex PC scores resemble modes ($f$, $\xi$); the complex coefficients resemble mode shapes.
Output: a point in $2M + 1$ dimensions,
$$p = (f,\; \operatorname{Re} C_1, \operatorname{Im} C_1,\; \operatorname{Re} C_2, \operatorname{Im} C_2,\; \dots,\; \operatorname{Re} C_M, \operatorname{Im} C_M)$$
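How one point estimate could be assembled, reusing complex_pca from the sketch above; estimating the frequency from the instantaneous phase of the dominant complex score is an assumption, since the presentation does not spell this step out:

```python
import numpy as np

def point_estimate(S, fs):
    """Build p = (f, Re C_1, Im C_1, ..., Re C_M, Im C_M) for one window."""
    U, Z = complex_pca(S)                   # from the previous sketch
    phase = np.unwrap(np.angle(Z[0]))       # phase of the dominant complex score
    f = np.mean(np.diff(phase)) * fs / (2 * np.pi)  # mean instantaneous frequency
    c = U[:, 0]                             # mode-shape-like coefficients C_1..C_M
    return np.concatenate(([f], np.column_stack((c.real, c.imag)).ravel()))
```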

13 Proposed method, Part 2: Averaged mode estimates using clustering
The points resulting from Part 1 are assumed to populate the input space more densely near regions corresponding to modes.
Input: matrix of $Q$ point estimates,
$$P = \begin{bmatrix} p_1 \\ p_2 \\ \vdots \\ p_Q \end{bmatrix}, \qquad p_q = (f,\; \operatorname{Re} C_1, \operatorname{Im} C_1,\; \dots,\; \operatorname{Re} C_M, \operatorname{Im} C_M)$$
The points are clustered, with the number of clusters unknown.
Output: averaged modes and mode shapes, given by the centroids of the clusters.
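A sketch of the clustering step with scikit-learn. K-means with a fixed number of clusters is an assumption here (the presentation notes the number of clusters is unknown, so a Gaussian mixture with model selection could take its place), and any scaling of the frequency column against the shape coordinates is left out:

```python
import numpy as np
from sklearn.cluster import KMeans

def averaged_modes(P, k):
    """Cluster the Q-by-(2M+1) matrix of point estimates P into k clusters."""
    km = KMeans(n_clusters=k, n_init=10).fit(P)
    centroids = km.cluster_centers_         # averaged (f, Re C_i, Im C_i) rows
    f = centroids[:, 0]                     # averaged mode frequencies
    C = centroids[:, 1::2] + 1j * centroids[:, 2::2]  # averaged mode shapes
    return f, C, km.labels_
```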

14 Contents: Motivation; Background theory ((C)PCA, Clustering); Proposed method; Application to the Kundur Two-Area system

15 Application: Kundur Two-Area System. Simulated data from DIgSILENT PowerFactory, with short circuits applied near the generators [1].

16 Application: Kundur Two-Area System, Part 1. [Figure: moving window over the measurements, producing point estimates $p_1$, $p_2$, $p_3$.]

17 Application: Kundur Two-Area System, Part 2. [Figure: observations and clustering of the point estimates.]

18 Application: Kundur Two-Area System, Part 2. [Figure: resulting averaged mode shapes, compared with modal analysis from DIgSILENT PowerFactory.]

19 Concluding remarks
- The method appears to give reasonable results: it is possible to differentiate modes of similar frequency.
- Potential improvements: including damping; improving the clustering (noise removal); frequency estimation; rotation of mode shapes.
- Further thoughts: clustering on a «long term» moving window; using parallel windows of different lengths; tracking modes; adaptive controllers.
- Further testing: simulations with ambient noise; real PMU data.

20 References
[1] P. Kundur, Power System Stability and Control. New York: McGraw-Hill, 1994.
[2] T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning. New York, NY: Springer.
[3] J. D. Horel, "Complex Principal Component Analysis: Theory and Examples," Journal of Climate and Applied Meteorology, vol. 23, no. 12, pp. 1660-1673, 1984.
[4] Yu's Machine Learning Garden, retrieved 2018.
