Eigenvalues, Eigenvectors, and an Intro to PCA


Changing Basis We've talked so far about rewriting our data using a new set of variables, or a new basis. How should we choose this new basis?

PCA One (very popular) method: start by choosing the basis vectors as the directions in which the variance of the data is maximal. [Scatterplot of weight (w) vs. height (h), with the first such direction v1 drawn through the cloud of points.]

PCA Then, choose subsequent directions that are orthogonal to the first and have the next largest variance. [Same scatterplot with the second direction v2, orthogonal to v1.]

PCA As it turns out, these directions are given by the eigenvectors of either: the Covariance Matrix (if the data is only centered) or the Correlation Matrix (if the data is completely standardized). Great. What the heck is an eigenvector?
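As a sketch of this idea (the height/weight-style data below is synthetic, not from the slides), numpy can recover the maximal-variance direction as the top eigenvector of the covariance matrix:

```python
import numpy as np

# Synthetic two-variable data with strong correlation (hypothetical numbers).
rng = np.random.default_rng(0)
x = rng.normal(size=200)
data = np.column_stack([x, 2 * x + rng.normal(scale=0.5, size=200)])

# Center the data, then eigendecompose its covariance matrix.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh: routine for symmetric matrices

# The eigenvector paired with the largest eigenvalue is the direction
# of maximal variance -- the first principal direction.
v1 = eigvecs[:, np.argmax(eigvals)]
```

Since the second variable is roughly twice the first, v1 should point (up to sign) close to (1, 2)/√5.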

Eigenvalues and Eigenvectors

Effect of Matrix Multiplication When we multiply a matrix by a vector, what do we get? Ax = b: another vector. When the matrix A is square (n × n), then x and b are both vectors in R^n. We can draw (or imagine) them in the same space.
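A quick numerical check of this, using the 2 × 2 matrix A that appears throughout the following slides:

```python
import numpy as np

# A square (2 x 2) matrix times a vector in R^2 gives another vector in R^2.
A = np.array([[1.0, 1.0],
              [6.0, 0.0]])
x = np.array([-1.0, 5.0])
b = A @ x  # b = (4, -6): a new vector we can draw in the same plane as x
```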

Linear Transformation In general, multiplying a vector by a matrix changes both its magnitude and direction. A = [1 1; 6 0], x = (−1, 5), Ax = (4, −6). [Plot showing x and Ax pointing in different directions, with span(e1) and span(e2) as axes.]

Eigenvectors However, a matrix may act on certain vectors by changing only their magnitude, not their direction (or possibly reversing it). A = [1 1; 6 0], x = (1, −3), Ax = (−2, 6). [Plot showing x and Ax lying along the same line through the origin.] The matrix A acts like a scalar!

Eigenvalues and Eigenvectors For a square matrix, A, a non-zero vector x is called an eigenvector if multiplication by A results in a scalar multiple of x. Ax = λx The scalar λ is called the eigenvalue associated with the eigenvector.

Previous Example A = [1 1; 6 0], x = (1, −3). Ax = (−2, 6) = −2(1, −3) = −2x ⇒ λ = −2 is the eigenvalue. [Plot of x and Ax along the same line.]

Example 2 Show that x is an eigenvector of A and find the corresponding eigenvalue. A = [1 1; 6 0], x = (1, 2). Ax = (3, 6) = 3(1, 2) = 3x ⇒ λ = 3 is the eigenvalue corresponding to the eigenvector x.

Example 2 A = [1 1; 6 0], x = (1, 2), Ax = 3x. [Plot showing x and Ax = 3x along the same line, with span(e1) and span(e2) as axes.]
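Both eigenpairs of this example matrix can be checked numerically; `np.linalg.eig` recovers eigenvalues 3 and −2 together with eigenvectors proportional to (1, 2) and (1, −3):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [6.0, 0.0]])

# eig returns eigenvalues (in no guaranteed order) and eigenvectors as columns.
eigvals, eigvecs = np.linalg.eig(A)

# Sort descending so the largest eigenvalue comes first.
order = np.argsort(eigvals)[::-1]
eigvals = eigvals[order]
eigvecs = eigvecs[:, order]
```

Each column v of `eigvecs` satisfies Av = λv for its matching λ, which the test below confirms in one shot.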

Let's Practice 1 Show that v is an eigenvector of A and find the corresponding eigenvalue: (a) A = [1 2; 2 1], v = (3, 3); (b) A = [1 1; 6 0], v = (1, 2). 2 Can a rectangular matrix have eigenvalues/eigenvectors?

Eigenspaces Fact: Any scalar multiple of an eigenvector is also an eigenvector with the same eigenvalue: A = [1 1; 6 0], x = (1, −3), λ = −2. Try v = (2, −6) or u = (−1, 3) or z = (3, −9). In general (#proof): Let Ax = λx. If c is some constant, then: A(cx) = c(Ax) = c(λx) = λ(cx)

Eigenspaces There are infinitely many eigenvectors associated with any one eigenvalue; any scalar multiple (positive or negative) can be used. The collection is referred to as the eigenspace associated with the eigenvalue. In the previous example, the eigenspace of λ = −2 was span{x = (1, −3)}. With this in mind, what should you expect from software?
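A short check of both points: any scalar multiple of an eigenvector is still an eigenvector, and software such as numpy simply returns one unit-length representative from each eigenspace:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [6.0, 0.0]])

# x is an eigenvector for lambda = -2; so is any nonzero scalar multiple of x.
x = np.array([1.0, -3.0])
for c in (2.0, -1.0, 0.5):
    assert np.allclose(A @ (c * x), -2.0 * (c * x))

# numpy picks the unit-norm representative of each eigenspace.
_, eigvecs = np.linalg.eig(A)
norms = np.linalg.norm(eigvecs, axis=0)
```

So you should expect software output that is proportional to, but not identical to, any hand-computed eigenvector (and the sign can differ too).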

Zero Eigenvalues What if λ = 0 is an eigenvalue for some matrix A? Ax = 0x = 0, where x ≠ 0 is an eigenvector. This means that some linear combination of the columns of A equals zero! ⇒ Columns of A are linearly dependent. ⇒ A is not full rank. ⇒ Perfect Multicollinearity.
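A small illustration of this chain of implications, with a hypothetical matrix whose third column is the sum of the first two:

```python
import numpy as np

# Linearly dependent columns: column 3 = column 1 + column 2.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0]])

eigvals = np.linalg.eigvals(A)

# One eigenvalue is (numerically) zero, so A is rank-deficient:
has_zero = np.any(np.isclose(eigvals, 0.0, atol=1e-9))
rank = np.linalg.matrix_rank(A)  # 2, not 3
```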

Eigenpairs For a square (n × n) matrix A, eigenvalues and eigenvectors come in pairs. Normally, the pairs are ordered by magnitude of eigenvalue: |λ1| ≥ |λ2| ≥ |λ3| ≥ ... ≥ |λn|, so λ1 is the largest eigenvalue. Eigenvector vi corresponds to eigenvalue λi.

Let's Practice 1 For the following matrix, determine the eigenvalue associated with the given eigenvector: A = [1 −1 0; 0 1 −1; −1 0 1], v = (1, 1, 1). From this eigenvalue, what can you conclude about the matrix A? 2 The matrix M has eigenvectors u and v. What is λ1, the first eigenvalue for the matrix M? M = [1 1; 6 0], u = (1, −3), v = (1, 2). 3 For the previous problem, is the specific eigenvector (u or v) the only eigenvector associated with λ1?

Diagonalization What happens if I put all the eigenvectors of A into a matrix as columns: V = [v1 v2 v3 ... vn], and then compute AV? AV = A[v1 v2 v3 ... vn] = [Av1 Av2 Av3 ... Avn] = [λ1v1 λ2v2 λ3v3 ... λnvn]. The effect is that the columns are all scaled by the corresponding eigenvalue.

Diagonalization AV = [λ1v1 λ2v2 λ3v3 ... λnvn]. The same thing would happen if we multiplied V by a diagonal matrix of eigenvalues on the right: VD = [v1 v2 v3 ... vn] diag(λ1, λ2, ..., λn) = [λ1v1 λ2v2 λ3v3 ... λnvn] ⇒ AV = VD.

Diagonalization AV = VD. If the matrix A has n linearly independent eigenvectors, then the matrix V has an inverse ⇒ V⁻¹AV = D. This is called the diagonalization of A. It is only possible when A has n linearly independent eigenvectors (so that the matrix V has an inverse).
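The whole diagonalization identity can be verified numerically for the running example matrix:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [6.0, 0.0]])

# Columns of V are the eigenvectors of A; D holds the matching eigenvalues.
eigvals, V = np.linalg.eig(A)
D = np.diag(eigvals)

# A has 2 linearly independent eigenvectors, so V is invertible and
# V^{-1} A V = D  (equivalently, A = V D V^{-1}).
recovered_D = np.linalg.inv(V) @ A @ V
```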

Let's Practice For the matrix A = [0 4; −1 5], a. The pairs λ1 = 4, v1 = (1, 1) and λ2 = 1, v2 = (4, 1) are eigenpairs of A. For each pair, provide another eigenvector associated with that eigenvalue. b. Based on the eigenvectors listed in part a, can the matrix A be diagonalized? Why or why not? If diagonalization is possible, explain how it would be done.

Symmetric Matrices - Properties Symmetric matrices (like the covariance matrix, the correlation matrix, and X^T X) have some very nice properties: (1) They are always diagonalizable (they always have n linearly independent eigenvectors). (2) Their eigenvectors are all mutually orthogonal. (3) Thus, if you normalize the eigenvectors (software does this automatically), they will form an orthogonal matrix V. ⇒ AV = VD ⇒ V^T AV = D.
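These properties can be seen with a small symmetric example (a hypothetical covariance-style matrix, not one from the slides). numpy's `eigh` routine is built for symmetric matrices and returns orthonormal eigenvectors, so no matrix inverse is needed:

```python
import numpy as np

# A small symmetric matrix (e.g. a toy covariance matrix).
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh: symmetric-matrix eigendecomposition; columns of V are orthonormal.
eigvals, V = np.linalg.eigh(S)

# V is orthogonal (V^T V = I), so diagonalization uses the transpose:
# V^T S V = D, with D = diag(eigenvalues).
D = V.T @ S @ V
```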

Eigenvectors of Symmetric Matrices [Figure: two mutually orthogonal eigenvectors plotted on axes running from −5 to 5.]