Linear Algebra Methods for Data Mining
1 Linear Algebra Methods for Data Mining
Saara Hyvönen, Spring 2007, University of Helsinki
Basic Linear Algebra, continued
2 Covered so far:
- matrix-vector multiplication, matrix-matrix multiplication
- vector norms, matrix norms
- distances between vectors
- eigenvalues, eigenvectors
- linear independence
- basis
- orthogonality
3 Eigenvalues and eigenvectors
Let A be an n × n matrix. A vector v ≠ 0 that satisfies Av = λv for some scalar λ is called an eigenvector of A, and λ is the eigenvalue corresponding to the eigenvector v.
4 Example
Let A = ( 2 1; 1 3 ). We seek Av = λv:
det(A − λI) = (2 − λ)(3 − λ) − 1 = 0
λ1 ≈ 3.62, λ2 ≈ 1.38
v1 ≈ ( 0.53, 0.85 )^T, v2 ≈ ( 0.85, −0.53 )^T
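This can be verified numerically; a minimal Matlab sketch (eig returns unit-length eigenvectors, possibly with opposite signs):

A = [2 1; 1 3];
[V, D] = eig(A)   % diag(D): the eigenvalues 1.38 and 3.62
                  % columns of V: the corresponding unit eigenvectors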
6 Orthogonality
Two vectors x and y are orthogonal if x^T y = 0.
Let q_j, j = 1, ..., n be orthogonal, i.e. q_i^T q_j = 0 for i ≠ j. Then they are linearly independent.
Let the set of orthogonal vectors q_j, j = 1, ..., m in R^m be normalized, ‖q_j‖ = 1. Then they are orthonormal, and constitute an orthonormal basis of R^m.
A matrix Q = [ q_1 q_2 ... q_m ] ∈ R^{m×m} with orthonormal columns is called an orthogonal matrix.
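A one-line Matlab illustration (a sketch; the QR factorization of a random matrix is just a convenient way to produce an orthogonal Q):

[Q, R] = qr(randn(3));
Q' * Q    % the 3-by-3 identity (up to rounding): the columns of Q are orthonormal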
7 Example
In the previous example we determined the eigenvectors of the matrix A = ( 2 1; 1 3 ) to be v1 ≈ ( 0.53, 0.85 )^T and v2 ≈ ( 0.85, −0.53 )^T.
The vectors v1 and v2 are orthogonal: v1^T v2 = 0.53 · 0.85 + 0.85 · (−0.53) = 0.
This is no coincidence!
8 Eigenvalues and eigenvectors of a symmetric matrix
The eigenvectors of a symmetric matrix are mutually orthogonal and its eigenvalues are real. (A ∈ R^{n×n} is a symmetric matrix if A = A^T.)
9 Eigendecomposition
A symmetric matrix A ∈ R^{n×n} can be written in the form A = UΛU^T, where the columns of U are the eigenvectors of A and Λ is a diagonal matrix whose diagonal elements are the corresponding eigenvalues of A. Note that U is orthogonal.
This is called the eigendecomposition or symmetric Schur decomposition of A.
10 In Matlab, with the values from the example above:

U = [0.53 0.85; 0.85 -0.53]
Lambda = [3.62 0; 0 1.38]
U' * U    % approximately the identity (the entries are rounded to two decimals)
11 U * Lambda * U'   % ans ≈ [2 1; 1 3]: the product reproduces A up to rounding
12 Example of symmetric matrices: graphs
The adjacency matrix of an undirected graph is a symmetric matrix.
(figure: a small undirected graph with numbered nodes and its adjacency matrix A)
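The graph of the original slide is not reproduced here; as a stand-in, a minimal Matlab sketch with a hypothetical 4-node graph:

A = [0 1 1 0;
     1 0 1 0;
     1 1 0 1;
     0 0 1 0];    % hypothetical undirected graph with edges 1-2, 1-3, 2-3, 3-4
isequal(A, A')    % true: the adjacency matrix of an undirected graph is symmetric
eig(A)            % its eigenvalues are therefore real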
13 Example: virus propagation
Question 1: How does a virus spread across an arbitrary network?
Question 2: Will it create an epidemic?
Question 3: What can we do about it?
14 Model: the Susceptible-Infected-Susceptible (SIS) model
- cured nodes immediately become susceptible
- virus birth rate β: probability that an infected neighbor attacks
- virus death rate δ: probability that an infected node heals
- virus strength: s = β/δ
15 (figure: SIS state diagram; a node moves between the states Healthy and Infected)
16 Epidemic threshold τ
The epidemic threshold of a graph is the largest value of τ for which the following holds: if the virus strength s = β/δ < τ, an epidemic cannot happen.
Problem: given a graph G, compute the epidemic threshold τ.
The epidemic threshold depends only on the graph! But on which properties of the graph?
18 Answer (to question 2)
Theorem (Wang et al., 2003). Consider a graph with adjacency matrix A. There is no epidemic if
β/δ < τ = 1/λ_A,
where λ_A is the largest eigenvalue of A, β is the probability that an infected neighbor attacks, and δ is the probability that an infected node heals.
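As a minimal Matlab sketch (reusing the hypothetical adjacency matrix A from the graph example above):

lambda_A = max(eig(A));   % largest eigenvalue of the symmetric adjacency matrix
tau = 1 / lambda_A        % no epidemic as long as beta/delta < tau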
19 What can we do with this information?
Q: Who is the best person/computer to immunize against the virus?
A: The one whose removal makes the largest difference in λ_A.
Note: eigenvalues are strongly related to graph topology!
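One brute-force way to find that node, as a Matlab sketch (again on the hypothetical A; remove each node in turn and see how far the largest eigenvalue drops):

n = size(A, 1);
lam = zeros(n, 1);
for k = 1:n
    idx = setdiff(1:n, k);           % the graph with node k removed
    lam(k) = max(eig(A(idx, idx)));  % its largest eigenvalue
end
[lmin, best] = min(lam);
best                                 % node whose removal lowers lambda_A the most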
20 Other questions that can be answered in a similar way:
- who is the best customer to advertise to?
- who originated a raging rumor?
- in general: how important is a node in a network?
21 What if the matrix is not symmetric?
If A is symmetric, we can write A = UΛU^T, where U and Λ contain the eigenvectors and corresponding eigenvalues of A.
What if A ∈ R^{m×n}, m ≠ n? Then A is definitely not symmetric!
In that case we can use the singular value decomposition (SVD): A = UΣV^T, where U and V are orthogonal and Σ is diagonal.
22 Example: a customer-by-day data matrix, with one row per customer (ABC Inc, CDE Co, FGH Ltd, NOP Inc, Smith, Brown, Johnson) and one column per day (Wed, Thu, Fri, Sat, Sun).
23 (numerical SVD of the example matrix: the factors U, Σ and V^T were displayed here)
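The numeric entries above are not reproduced here, but the idea survives in a sketch with made-up data of the same shape: companies buy on weekdays, individuals on weekends (hypothetical values):

X = [2 2 2 0 0;
     3 3 3 0 0;
     1 1 1 0 0;
     2 2 2 0 0;
     0 0 0 3 3;
     0 0 0 2 2;
     0 0 0 1 1];   % rows: the 7 customers, columns: Wed..Sun
[U, S, V] = svd(X);
diag(S)'           % exactly two nonzero singular values:
                   % the data mixes two underlying shopping patterns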
24 How to compute matrix decompositions?
We have had a brief glimpse of the eigenvalue decomposition and the singular value decomposition. Before taking a closer look: how can we compute these?
Answer 1: use LAPACK, Matlab, Mathematica, ... they are implemented everywhere!
Answer 2: we need more linear algebra tools! Back to basics...
25 Givens (plane) rotations
G = ( c s; −s c ),  c² + s² = 1.

theta = pi/8; c = cos(theta); s = sin(theta);
G = [c s; -s c]
G' * G    % the identity: G is orthogonal
26 Givens (plane) rotations
G = ( c s; −s c ),  c² + s² = 1.
We can choose c and s so that c = cos(θ), s = sin(θ) for some θ. Multiplying a vector x by G then rotates x in R² through the angle θ:
27 (figure: the vector x and its rotated image Gx, separated by the angle θ)
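A small Matlab sketch of the rotation (with this sign convention G rotates clockwise through θ; the length is preserved):

theta = pi/8; c = cos(theta); s = sin(theta);
G = [c s; -s c];
x = [1; 0];
y = G * x            % the rotated vector
[norm(x) norm(y)]    % equal: rotations preserve Euclidean length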
28 Embed a 2-D rotation in a larger unit matrix, e.g.
( 1 0 0 0
  0 c 0 s
  0 0 1 0
  0 −s 0 c )
29 In Matlab, forming such a G numerically and computing G' * G again returns the identity matrix: the embedded rotation is still orthogonal.
30 Givens rotations can be used to zero elements in vectors and matrices.
Rotation matrix G = ( c s; −s c ),  c² + s² = 1.
Choose c (and s) so that Gv = ( α, 0 )^T:
31 Choose c (and s) so that Gv = ( α, 0 )^T:
c = v1 / sqrt(v1² + v2²),  s = v2 / sqrt(v1² + v2²).
(Check that this does what it is supposed to do!)
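The check, as a minimal Matlab sketch (any nonzero v will do; here a hypothetical one):

v = [3; 4];
r = sqrt(v(1)^2 + v(2)^2);   % = norm(v) = 5
c = v(1)/r;  s = v(2)/r;
G = [c s; -s c];
G * v                        % = [5; 0] = [norm(v); 0], as intended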
32 We can choose c and s in
( 1 0 0 0
  0 c 0 s
  0 0 1 0
  0 −s 0 c )
so that we can zero element 4 in a vector by a rotation in the plane (2,4).
33 x = [1; 2; 3; 4];
sq = sqrt(x(2)^2 + x(4)^2);
c = x(2)/sq; s = x(4)/sq;
G = [1 0 0 0; 0 c 0 s; 0 0 1 0; 0 -s 0 c];
y = G * x

y =
    1.0000
    4.4721
    3.0000
         0
34 Reminder
- The inverse of an orthogonal matrix Q is Q⁻¹ = Q^T.
- The Euclidean length of a vector is invariant under an orthogonal transformation Q: ‖Qx‖² = (Qx)^T (Qx) = x^T x = ‖x‖².
- The product of two orthogonal matrices P and Q is orthogonal: (PQ)^T (PQ) = Q^T P^T P Q = Q^T Q = I.
35 Transforming a vector v to αe_1
36 Summarize:
αe_1 = G3 (G2 (G1 v)) = (G3 G2 G1) v.
Denote P = G3 G2 G1. P is orthogonal, and Pv = αe_1.
Since P is orthogonal, the Euclidean length is preserved, so
α = ‖v‖ = sqrt( v1² + v2² + ... + vn² ).
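The whole reduction, as a Matlab sketch (a hypothetical 4-vector; each pass zeroes one element by a rotation in the plane (1, k)):

v = [1; 2; 3; 4];
P = eye(4);
for k = 2:4
    r = sqrt(v(1)^2 + v(k)^2);
    c = v(1)/r;  s = v(k)/r;
    G = eye(4);
    G(1,1) = c;  G(1,k) = s;   % rotation in plane (1,k)
    G(k,1) = -s; G(k,k) = c;
    v = G * v;                 % element k is now zero
    P = G * P;                 % accumulate P = G3*G2*G1
end
v'                             % = [sqrt(30) 0 0 0]: alpha = norm of the original vector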
37 Number of floating point operations
What is the number of flops needed to transform the first column of an m × n matrix to αe_1, a multiple of the first unit vector?
The computation of
( c s; −s c ) ( x; y ) = ( cx + sy; −sx + cy )
requires 4 multiplications and 2 additions, i.e. 6 flops.
Applying such a transformation to an m × n matrix touches two rows of n elements each, and so requires 6n flops.
In order to zero all elements but one in the first column of the matrix, we must apply m − 1 rotations.
38 Overall flop count: 6(m − 1)n ≈ 6mn.
The so-called Householder transformation, with which one can do the same thing, is slightly cheaper: 4mn flops.
39 What was that all about?
We can use very simple orthogonal transformations (e.g. Givens rotations) to zero elements in a matrix.
In principle, this is what is done when matrix decompositions are computed.
40 Matrix decompositions
We wish to decompose the matrix A by writing it as a product of two or more matrices:
A (m × n) = B (m × k) C (k × n),  or  A (m × n) = B (m × k) C (k × r) D (r × n), ...
This is done in such a way that the right-hand side yields some useful information or insight into the nature of the data matrix A, or is in other ways useful for solving the problem at hand.
41 Matrix decompositions
There are numerous examples of useful matrix decompositions:
- Eigendecomposition: A = UΛU^T, A symmetric, U and Λ its eigenvectors and eigenvalues.
- Singular value decomposition: A = UΣV^T, U and V orthogonal, Σ diagonal.
Matrix factorization is the same thing as matrix decomposition (e.g. NMF = nonnegative matrix factorization: V = WH, all elements nonnegative).
42 How to compute these?
Roughly (attn: in reality there is more to it than this!):
(1) Manipulate A by multiplying it by intelligently chosen, fairly simple (orthogonal) matrices from both sides:
V_k ... V_2 V_1 A W_1 ... W_s = B,
until B is nice.
(2) Denote V = (V_k ... V_2 V_1)^T and W = W_1 ... W_s. Now A = V B W^T.
But how to choose V_1, ..., V_k and W_1, ..., W_s?
Needed: linear algebra tools for transforming matrices in an orderly fashion. Givens rotations!
43 By applying several Givens rotations in succession, we can transform a vector x to αe_1 = ( α, 0, ..., 0 )^T.
44 QR transformation
We transform A → Q^T A = R, where Q is orthogonal and R is upper triangular.
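A minimal Matlab sketch (hypothetical matrix; Matlab's qr happens to use Householder reflections rather than Givens rotations, but it produces the same kind of factorization):

A = [1 2; 3 4; 5 6];
[Q, R] = qr(A)       % Q is 3-by-3 orthogonal, R is 3-by-2 upper triangular
norm(Q*R - A)        % essentially zero: Q*R reproduces A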
45 Note! Any matrix A ∈ R^{m×n}, m ≥ n, can be transformed to upper triangular form by an orthogonal matrix. If the columns of A are linearly independent, then R is nonsingular.
46 QR decomposition
A = Q ( R; 0 ),
where, for A ∈ R^{m×n}, Q ∈ R^{m×m} is orthogonal and R ∈ R^{n×n} is upper triangular.
47 References
[1] L. Eldén: Matrix Methods in Data Mining and Pattern Recognition, SIAM, 2007.
[2] G. H. Golub and C. F. Van Loan: Matrix Computations, 3rd ed., Johns Hopkins University Press, Baltimore, MD, 1996.
[3] Y. Wang, D. Chakrabarti, C. Wang and C. Faloutsos: Epidemic Spreading in Real Networks: an Eigenvalue Viewpoint, SRDS 2003, Florence, Italy.