Online Mode Shape Estimation using Complex Principal Component Analysis and Clustering, Hallvar Haugdal (NTNU Norway)

Mr. Hallvar Haugdal finished his MSc in Electrical Engineering at NTNU, Norway, in 2016, specializing in numerical field calculations and electric motor design. After his MSc he worked as a Scientific Assistant, also at NTNU, dealing with courses on electrical machines, power systems and field calculations. Since January 2018 he has been a PhD student working on Wide Area Monitoring and Control Systems.

Online Mode Shape Estimation using Complex Principal Component Analysis and Clustering Hallvar Haugdal

Contents Motivation Background theory (C)PCA Clustering Proposed method Application to Kundur Two-Area System

Motivation
- Modal analysis: a powerful tool for understanding system dynamics, stability limits and the design of controllers
- Difficult due to inaccurate modelling; dynamics are load dependent
- We propose an empirical approach relying only on wide-area PMU measurements, correlation and statistical learning

Two-part method
Part 1: Provide point estimates of modes and mode shapes
- Moving window, ~5-10 s length
- Uses correlation and Complex Principal Component Analysis (CPCA)
Part 2: Compute averaged mode shapes
- Differentiates between noise and modes by clustering the points resulting from Part 1
- Averaged modes and mode shapes are computed as centroids of the clusters

Contents Motivation Background theory (C)PCA Clustering Proposed method Application to Kundur Two-Area System

Principal Component Analysis
Matrix of $M$ series, $N$ samples:
$$X = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_M \end{bmatrix} = \begin{bmatrix} x_1(t_1) & x_1(t_2) & \cdots & x_1(t_N) \\ x_2(t_1) & x_2(t_2) & \cdots & x_2(t_N) \\ \vdots & \vdots & & \vdots \\ x_M(t_1) & x_M(t_2) & \cdots & x_M(t_N) \end{bmatrix}$$
We want to transform the correlated series $x_1, x_2, \dots, x_M$ into uncorrelated series $s_1, s_2, \dots, s_M$. This is done by eigendecomposition of the covariance matrix $C = \frac{1}{N-1} X X^T$. Eigenvalues ($\lambda_i$) and eigenvectors ($u_i$):
$$C u_i = \lambda_i u_i, \quad i = 1, \dots, M, \qquad U = [u_1, u_2, \dots, u_M]$$
$$U^T C U = \Lambda = \operatorname{diag}(\lambda_1, \lambda_2, \dots, \lambda_M), \qquad \lambda_1 > \lambda_2 > \lambda_3 > \dots > \lambda_M$$
The scores are
$$S = \begin{bmatrix} s_1 \\ s_2 \\ \vdots \\ s_M \end{bmatrix} = U^T X, \qquad s_i = u_i^T X$$
Inversion gives the time series back from the scores:
$$X = U S, \qquad x_i = \sum_{j=1}^{M} u_{i,j}\, s_j$$
The contribution of $s_j$ in $x_i$ is given by the coefficient $u_{i,j}$.
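As a concrete illustration, here is a minimal NumPy sketch of this eigendecomposition step (not the author's implementation; variable names mirror the slide):

```python
import numpy as np

def pca_scores(X):
    """PCA of an M x N matrix X (M series, N samples) via
    eigendecomposition of the covariance matrix C = XX^T/(N-1)."""
    X = X - X.mean(axis=1, keepdims=True)   # centre each series
    C = X @ X.T / (X.shape[1] - 1)          # covariance matrix
    lam, U = np.linalg.eigh(C)              # eigenpairs of symmetric C
    order = np.argsort(lam)[::-1]           # sort by decreasing eigenvalue
    lam, U = lam[order], U[:, order]
    S = U.T @ X                             # uncorrelated scores s_i = u_i^T X
    return lam, U, S                        # X can be recovered as U @ S
```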

Complex Principal Component Analysis
Similar to conventional PCA, with additional steps:
- Empirical Mode Decomposition (EMD) for detrending (decomposition into Intrinsic Mode Functions)
- Hilbert Transform, giving complex time series $z_i$
The complex covariance matrix is
$$C = \frac{1}{N-1} Z Z^{*}$$
The contribution of $s_j$ in $z_i$ is given by the complex coefficient $u_{i,j}$, which resembles the observability mode shape. CPCA is slower than PCA due to the EMD and Hilbert Transform steps; it works well with damped exponentials but is noisy during steps [3].
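A compact sketch of the complex step, assuming SciPy's `hilbert` for the analytic signal; the EMD detrending from the slide is assumed done beforehand, so this covers only the Hilbert/covariance part:

```python
import numpy as np
from scipy.signal import hilbert

def cpca(X):
    """Complex PCA sketch: analytic signals z_i = x_i + j*H(x_i), then
    eigendecomposition of the Hermitian covariance C = ZZ*/(N-1)."""
    Z = hilbert(X, axis=1)                  # complex time series z_i
    Z = Z - Z.mean(axis=1, keepdims=True)
    C = Z @ Z.conj().T / (Z.shape[1] - 1)   # complex covariance matrix
    lam, U = np.linalg.eigh(C)              # real eigenvalues, complex vectors
    order = np.argsort(lam)[::-1]
    lam, U = lam[order], U[:, order]
    S = U.conj().T @ Z                      # complex scores
    return lam, U, S                        # u_{i,j} are the complex coefficients
```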

Clustering
Various methods, e.g. K-means and Gaussian Mixture Models [4] [2]
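For reference, both methods are available off the shelf in scikit-learn; a minimal sketch (the point matrix `P` and the cluster count are placeholders, not values from the talk):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

P = np.random.default_rng(0).normal(size=(200, 5))   # placeholder point matrix

km = KMeans(n_clusters=3, n_init=10).fit(P)          # hard assignments
centroids = km.cluster_centers_

gmm = GaussianMixture(n_components=3, random_state=0).fit(P)
means = gmm.means_                                   # soft (probabilistic) alternative
```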

Contents Motivation Background theory (C)PCA Clustering Proposed method Application to Kundur Two-Area System

Proposed method, Part 1: Generating mode estimates
Input: a time window of measurements
1. PCA: $X \rightarrow S$ (scores)
2. EMD detrending
3. Hilbert Transform
4. CPCA: $S \rightarrow Z$ (complex scores)
The complex PC scores resemble modes $(f, \xi)$; the complex coefficients resemble mode shapes.
Output: a point in $2M + 1$ dimensions (see the sketch below):
$$p = [f,\ \operatorname{Re}(C_1),\ \operatorname{Im}(C_1),\ \operatorname{Re}(C_2),\ \operatorname{Im}(C_2),\ \dots,\ \operatorname{Re}(C_M),\ \operatorname{Im}(C_M)]$$
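A sketch of how such a point could be assembled from the CPCA outputs; `f` and `coeffs` are hypothetical names for the frequency estimate and the complex coefficients $C_1, \dots, C_M$:

```python
import numpy as np

def mode_point(f, coeffs):
    """Stack a frequency estimate and complex mode-shape coefficients
    C_1..C_M into the (2M+1)-dim point p = [f, Re C_1, Im C_1, ...]."""
    p = [f]
    for c in coeffs:
        p += [c.real, c.imag]             # Re(C_i), Im(C_i)
    return np.asarray(p)
```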

Proposed method, Part 2: Averaged mode estimates using clustering
The points resulting from Part 1 are assumed to populate the input space more densely in areas corresponding to modes.
Input: matrix of $Q$ point estimates,
$$P = \begin{bmatrix} p_1 \\ p_2 \\ \vdots \\ p_Q \end{bmatrix}, \qquad p_q = [f^{(q)},\ \operatorname{Re}(C_1^{(q)}),\ \operatorname{Im}(C_1^{(q)}),\ \dots,\ \operatorname{Re}(C_M^{(q)}),\ \operatorname{Im}(C_M^{(q)})]$$
The points are clustered, with the number of clusters unknown.
Output: averaged mode shapes, given by the centroids of the clusters (see the sketch below).
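Conversely, each centroid can be unpacked into an averaged mode; a sketch assuming k-means with a guessed cluster count (the talk leaves the number of clusters open):

```python
import numpy as np
from sklearn.cluster import KMeans

def averaged_modes(P, n_clusters):
    """Cluster the Q x (2M+1) point matrix and read an averaged mode
    (frequency + complex mode shape) out of each centroid."""
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(P)
    modes = []
    for c in km.cluster_centers_:
        f = c[0]                          # averaged frequency
        shape = c[1::2] + 1j * c[2::2]    # Re/Im pairs -> complex C_1..C_M
        modes.append((f, shape))
    return modes
```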

Contents Motivation Background theory (C)PCA Clustering Proposed method Application to Kundur Two-Area System

Application: Kundur Two-Area System
Simulated data from DIgSILENT PowerFactory; short circuits near generators [1]

Application: Kundur Two-Area System, Part 1
[Figure: moving window producing point estimates $p_1, p_2, p_3, \dots$]

Application: Kundur Two-Area System, Part 2
[Figure: observations and their clustering]

Application: Kundur Two-Area System, Part 2
[Figure: resulting averaged mode shapes compared with DIgSILENT PowerFactory modal analysis]

Concluding remarks
- The method appears to give reasonable results; it is possible to differentiate modes of similar frequency
- Potential improvements: including damping; clustering (to remove noise); frequency estimation; rotation of mode shapes
- Further thoughts: clustering on a "long-term" moving window; using parallel windows of different lengths; tracking modes; adaptive controllers
- Further testing: simulations with ambient noise; real PMU data

References
[1] P. Kundur, Power System Stability and Control. New York: McGraw-Hill, 1994.
[2] T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning. New York, NY: Springer, 2009.
[3] J. D. Horel, "Complex Principal Component Analysis: Theory and Examples," Journal of Climate and Applied Meteorology, vol. 23, no. 12, pp. 1660-1673, 1984.
[4] Yu's Machine Learning Garden, http://yulearning.blogspot.com/2014/11/einsteins-most-famous-equation-isemc2.html (retrieved 2018).