Bindel, Fall 2016 Matrix Computations (CS 6210)

1 Logistics

General announcement: we are switching from weekly to bi-weekly homeworks (mostly because the course is much bigger than planned). If you want to do HW but are not formally enrolled, send me a note so that I can add you to CMS. Piazza will be set up today (it is not up yet).

Most of mathematics is best learned by doing. Linear algebra is no exception. You have had a previous class in which you learned the basics of linear algebra, and you will have plenty of practice with these concepts over the semester. This brief refresher lecture is supposed to remind you of what you've already learned and introduce a few things you may not have seen. It also serves to set notation that will be used throughout the class. In addition to these notes, you may find it useful to go back to a good linear algebra text (there are several listed on the course syllabus) or to look at the linear algebra review chapters in the book.

2 Vectors

A vector space (or linear space) is a set of vectors that can be added or scaled in a sensible way; that is, addition is associative and commutative, and scaling is distributive. We will generally denote vector spaces by script letters (e.g. $\mathcal{V}, \mathcal{W}$), vectors by lower case Roman letters (e.g. $v, w$), and scalars by lower case Greek letters (e.g. $\alpha, \beta$). But we feel free to violate these conventions according to the dictates of our conscience or in deference to other conflicting conventions.

There are many types of vector spaces. Apart from the ubiquitous spaces $\mathbb{R}^n$ and $\mathbb{C}^n$, the most common vector spaces in applied mathematics are different types of function spaces. These include

- $\mathcal{P}_d = \{\text{polynomials of degree at most } d\}$;
- $\mathcal{V}^* = \{\text{linear functions } \mathcal{V} \to \mathbb{R} \text{ (or } \mathbb{C})\}$;
- $L(\mathcal{V}, \mathcal{W}) = \{\text{linear maps } \mathcal{V} \to \mathcal{W}\}$;
- $C^k(\Omega) = \{k\text{-times differentiable functions on a set } \Omega\}$;

and many more. We compute with vectors in $\mathbb{R}^n$ and $\mathbb{C}^n$, which we represent concretely by tuples of numbers in memory, usually stored in sequence. To keep a broader perspective, though, we will also frequently describe examples involving the polynomial spaces $\mathcal{P}_d$.

2.1 Spanning sets and bases

We often think of a matrix as a set of vectors
$$V = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}.$$
The range space $\mathcal{R}(V)$ or the span $\operatorname{sp}\{v_j\}_{j=1}^n$ is the set of vectors $\{Vc : c \in \mathbb{R}^n\}$ (or $\mathbb{C}^n$ for complex spaces). The vectors are linearly independent if any vector in the span has a unique representation as a linear combination of the spanning vectors; equivalently, the vectors are linearly independent if there is no linear combination of the spanning vectors that equals zero except the one in which all coefficients are zero.

A linearly independent set of vectors is a basis for a space $\mathcal{V}$ if $\operatorname{sp}\{v_j\}_{j=1}^n = \mathcal{V}$; the number of basis vectors $n$ is the dimension of the space. Spaces like $C^k(\Omega)$ do not generally have a finite basis; they are infinite-dimensional. We will focus on finite-dimensional spaces in the class, but it is useful to know that there are interesting infinite-dimensional spaces in the broader world.

The standard basis in $\mathbb{R}^n$ is the set of column vectors $e_j$ with a one in the $j$th position and zeros elsewhere:
$$I = \begin{bmatrix} e_1 & e_2 & \cdots & e_n \end{bmatrix}.$$
Of course, this is not the only basis. Any other set of $n$ linearly independent vectors in $\mathbb{R}^n$ is also a basis
$$V = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix}.$$
The matrix $V$ formed in this way is invertible: multiplication by $V$ corresponds to a change from the $\{v_j\}$ basis into the standard basis, and multiplication by $V^{-1}$ corresponds to a map from the standard basis back to the $\{v_j\}$ basis.
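To make the change of basis concrete, here is a minimal NumPy sketch (an addition to these notes, not part of the original; the basis matrix V below is an arbitrary example). Note that we solve a linear system rather than forming $V^{-1}$ explicitly:

```python
import numpy as np

# An example basis for R^3: the columns of V are the basis vectors v_1, v_2, v_3.
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# A vector expressed in the standard basis.
x = np.array([2.0, 3.0, 4.0])

# Coefficients of x with respect to the {v_j} basis: solve V c = x.
c = np.linalg.solve(V, x)

# Multiplication by V maps {v_j}-basis coefficients back to the standard basis.
assert np.allclose(V @ c, x)
print(c)
```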

We will use matrix-like notation to describe bases and spanning sets even when we deal with spaces other than $\mathbb{R}^n$. For example, a common basis for $\mathcal{P}_d$ is the power basis
$$X = \begin{bmatrix} 1 & x & x^2 & \cdots & x^d \end{bmatrix}.$$
Each "column" in $X$ is really a function of one variable ($x$), and matrix-vector multiplication with $X$ represents a map from a coefficient vector in $\mathbb{R}^{d+1}$ to a polynomial in $\mathcal{P}_d$. That is, we write polynomials $p \in \mathcal{P}_d$ in terms of the basis as
$$p = Xc = \sum_{j=0}^d c_j x^j,$$
and we think of computing the coefficients from the abstract polynomial via a formal inverse: $c = X^{-1} p$.

We typically think of a map like $Y = X^{-1}$ in terms of rows
$$Y = \begin{bmatrix} y_0^* \\ y_1^* \\ \vdots \\ y_d^* \end{bmatrix},$$
where each row $y_j^*$ is a linear functional or dual vector (i.e. a linear mapping from the vector space to the real or complex numbers). Collectively, $\{y_0^*, \ldots, y_d^*\}$ are the dual basis to $\{1, x, \ldots, x^d\}$.

The power basis is not the only basis for $\mathcal{P}_d$. Other common choices include Newton or Lagrange polynomials with respect to a set of points, which you may have seen in another class. In this class, we will sometimes use the Chebyshev¹ polynomial basis $\{T_j(x)\}$ given by the recurrence
$$T_0(x) = 1, \quad T_1(x) = x, \quad T_{j+1}(x) = 2x\,T_j(x) - T_{j-1}(x), \quad j \geq 1,$$
and the Legendre polynomial basis $\{P_j(x)\}$, given by the recurrence
$$P_0(x) = 1, \quad P_1(x) = x, \quad (j+1)\,P_{j+1}(x) = (2j+1)\,x\,P_j(x) - j\,P_{j-1}(x).$$

¹ Pafnuty Chebyshev was a nineteenth century Russian mathematician, and his name has been transliterated from the Cyrillic alphabet into the Latin alphabet in several different ways. We inherit our usual spelling from one of the French transliterations, but the symbol $T$ for the polynomials comes from the German transliteration Tschebyscheff.
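The three-term recurrences translate directly into code. Here is a small sketch (my addition, not from the original notes) that evaluates the Chebyshev and Legendre polynomials by running the recurrences:

```python
import numpy as np

def cheb_eval(d, x):
    """Evaluate T_0(x), ..., T_d(x) via T_{j+1} = 2x T_j - T_{j-1}."""
    T = [np.ones_like(x), x]
    for j in range(1, d):
        T.append(2 * x * T[j] - T[j-1])
    return T[:d+1]

def legendre_eval(d, x):
    """Evaluate P_0(x), ..., P_d(x) via (j+1) P_{j+1} = (2j+1) x P_j - j P_{j-1}."""
    P = [np.ones_like(x), x]
    for j in range(1, d):
        P.append(((2*j + 1) * x * P[j] - j * P[j-1]) / (j + 1))
    return P[:d+1]

x = np.linspace(-1, 1, 5)
print(cheb_eval(3, x)[3])      # T_3(x) = 4x^3 - 3x
print(legendre_eval(2, x)[2])  # P_2(x) = (3x^2 - 1)/2
```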

As we will see over the course of the semester, sometimes the obvious choice of basis (e.g. the standard basis in $\mathbb{R}^n$ or the power basis in $\mathcal{P}_d$) is not the best choice for numerical computations.

2.2 Vector norms

A norm measures vector lengths. It is positive definite, homogeneous, and sub-additive:
$$\|v\| \geq 0 \text{ and } \|v\| = 0 \text{ iff } v = 0,$$
$$\|\alpha v\| = |\alpha| \, \|v\|,$$
$$\|u + v\| \leq \|u\| + \|v\|.$$
The three most common vector norms we work with in $\mathbb{R}^n$ are the Euclidean norm (aka the 2-norm), the $\infty$-norm (or max norm), and the 1-norm:
$$\|v\|_2 = \sqrt{\sum_j |v_j|^2}, \quad \|v\|_\infty = \max_j |v_j|, \quad \|v\|_1 = \sum_j |v_j|.$$

Many other norms can be related to one of these three norms. In particular, a natural norm in an abstract vector space will often look strange in the corresponding concrete representation with respect to some basis. For example, consider the vector space of polynomials with degree at most 2 on $[-1,1]$. This space also has a natural Euclidean norm, max norm, and 1-norm; for a given polynomial $p(x)$ these are
$$\|p\|_2 = \sqrt{\int_{-1}^1 |p(x)|^2 \, dx}, \quad \|p\|_\infty = \max_{x \in [-1,1]} |p(x)|, \quad \|p\|_1 = \int_{-1}^1 |p(x)| \, dx.$$
But when we write $p(x)$ in terms of the coefficient vector with respect to the power basis (for example), the max norm of the polynomial is not the same as the max norm of the coefficient vector. In fact, if we consider a polynomial $p(x) = c_0 + c_1 x$, then the max norm of the polynomial $p$ is the same as the one-norm of the coefficient vector; the proof is left as a useful exercise for the reader.

In a finite-dimensional vector space, all norms are equivalent: that is, if $\|\cdot\|$ and $\|\cdot\|'$ are two norms on the same finite-dimensional vector space, then there exist constants $c$ and $C$ such that for any $v$ in the space,
$$c \|v\| \leq \|v\|' \leq C \|v\|.$$
Of course, there is no guarantee that the constants are small!

An isometry is a mapping that preserves vector norms. For $\mathbb{R}^n$, the only isometries for the 1-norm and the $\infty$-norm are signed permutations. For Euclidean space, though, there is a much richer set of isometries, represented by the orthogonal matrices (matrices such that $Q^* Q = I$).
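A quick numerical sketch (my addition) of two points from this section: NumPy computes all three vector norms directly, and sampling a polynomial on $[-1,1]$ illustrates that the max norm of $p = c_0 + c_1 x$ matches the 1-norm, not the max norm, of the coefficient vector:

```python
import numpy as np

v = np.array([3.0, -4.0, 1.0])
print(np.linalg.norm(v, 2), np.linalg.norm(v, np.inf), np.linalg.norm(v, 1))

# Max norm of p(x) = c0 + c1*x on [-1,1] vs norms of the coefficient vector.
c = np.array([1.0, -2.0])            # p(x) = 1 - 2x
xs = np.linspace(-1.0, 1.0, 10001)   # fine sample of [-1,1]
p_max = np.max(np.abs(c[0] + c[1] * xs))
print(p_max)                          # ~3.0 = |c0| + |c1| = ||c||_1
print(np.linalg.norm(c, np.inf))      # 2.0: not the same as ||p||_inf
```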

2.3 Inner products

An inner product $\langle \cdot, \cdot \rangle$ is a function from two vectors into the real numbers (or complex numbers for a complex vector space). It is positive definite, linear in the first slot, and symmetric (or Hermitian in the case of complex vectors); that is:
$$\langle v, v \rangle \geq 0 \text{ and } \langle v, v \rangle = 0 \text{ iff } v = 0,$$
$$\langle \alpha u, w \rangle = \alpha \langle u, w \rangle \text{ and } \langle u + v, w \rangle = \langle u, w \rangle + \langle v, w \rangle,$$
$$\langle u, v \rangle = \overline{\langle v, u \rangle},$$
where the overbar in the latter case corresponds to complex conjugation. A vector space with an inner product is sometimes called an inner product space or a Euclidean space.

Every inner product defines a corresponding norm
$$\|v\| = \sqrt{\langle v, v \rangle}.$$
The inner product and the associated norm satisfy the Cauchy-Schwarz inequality
$$|\langle u, v \rangle| \leq \|u\| \, \|v\|.$$

The standard inner product on $\mathbb{C}^n$ is
$$x \cdot y = y^* x = \sum_{j=1}^n \bar{y}_j x_j.$$
But the standard inner product is not the only inner product, just as the standard Euclidean norm is not the only norm.

Just as norms allow us to reason about size, inner products let us reason about angles. In particular, we define the cosine of the angle $\theta$ between nonzero vectors $v$ and $w$ as
$$\cos \theta = \frac{\langle v, w \rangle}{\|v\| \, \|w\|}.$$

Returning to our example of a vector space of polynomials, the standard $L^2([-1,1])$ inner product is
$$\langle p, q \rangle_{L^2([-1,1])} = \int_{-1}^1 p(x) \overline{q(x)} \, dx.$$
If we express $p$ and $q$ with respect to a basis (e.g. the power basis), we find that we can represent this inner product via a symmetric positive definite matrix. For example, let $p(x) = c_0 + c_1 x + c_2 x^2$ and let $q(x) = d_0 + d_1 x + d_2 x^2$. Then
$$\langle p, q \rangle_{L^2([-1,1])} = \begin{bmatrix} d_0 \\ d_1 \\ d_2 \end{bmatrix}^* \begin{bmatrix} a_{00} & a_{01} & a_{02} \\ a_{10} & a_{11} & a_{12} \\ a_{20} & a_{21} & a_{22} \end{bmatrix} \begin{bmatrix} c_0 \\ c_1 \\ c_2 \end{bmatrix} = d^* A c = \langle c, d \rangle_A,$$
where
$$a_{ij} = \int_{-1}^1 x^i x^j \, dx = \begin{cases} 2/(i+j+1), & i + j \text{ even}, \\ 0, & \text{otherwise}. \end{cases}$$
The symmetric positive definite matrix $A$ is what is sometimes called the Gram matrix for the basis $\{1, x, x^2\}$.
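Here is a short sketch (an illustration I am adding, not from the notes) that builds this Gram matrix and checks that $d^T A c$ reproduces the $L^2([-1,1])$ inner product of two real quadratics computed by quadrature:

```python
import numpy as np

# Gram matrix for the power basis {1, x, x^2} on [-1, 1]:
# A[i, j] = integral of x^(i+j) over [-1, 1].
A = np.array([[2/(i + j + 1) if (i + j) % 2 == 0 else 0.0
               for j in range(3)] for i in range(3)])

c = np.array([1.0, 2.0, 3.0])   # p(x) = 1 + 2x + 3x^2
d = np.array([0.0, 1.0, -1.0])  # q(x) = x - x^2

# <p, q> via the Gram matrix ...
ip_gram = d @ A @ c

# ... and via the trapezoid rule on a fine grid.
xs = np.linspace(-1, 1, 100001)
dx = xs[1] - xs[0]
p = c[0] + c[1]*xs + c[2]*xs**2
q = d[0] + d[1]*xs + d[2]*xs**2
pq = p * q
ip_quad = dx * (np.sum(pq) - 0.5 * (pq[0] + pq[-1]))

print(ip_gram, ip_quad)  # both should be close to -8/15
```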

We say two vectors $u$ and $v$ are orthogonal with respect to an inner product if $\langle u, v \rangle = 0$. If $u$ and $v$ are orthogonal, we have the Pythagorean theorem:
$$\|u + v\|^2 = \langle u + v, u + v \rangle = \langle u, u \rangle + \langle v, u \rangle + \langle u, v \rangle + \langle v, v \rangle = \|u\|^2 + \|v\|^2.$$

Two vectors $u$ and $v$ are orthonormal if they are orthogonal with respect to the inner product and have unit length in the associated norm. When we work in an inner product space, we often use an orthonormal basis, i.e. a basis in which all the vectors are orthonormal. For example, the normalized Legendre polynomials $\sqrt{(2j+1)/2} \, P_j(x)$ form an orthonormal basis for $\mathcal{P}_d$ with respect to the $L^2([-1,1])$ inner product.

3 Matrices and mappings

A matrix represents a mapping between two vector spaces. That is, if $L : \mathcal{V} \to \mathcal{W}$ is a linear map, then the associated matrix $A$ with respect to bases $V$ and $W$ satisfies $A = W^{-1} L V$. The same linear mapping corresponds to different matrices depending on the choices of basis. But matrices can represent several other types of mappings as well. Over the course of this class, we will see several interpretations of matrices:

- Linear maps. A map $L : \mathcal{V} \to \mathcal{W}$ is linear if $L(x + y) = Lx + Ly$ and $L(\alpha x) = \alpha L x$. The corresponding matrix is $A = W^{-1} L V$.

- Linear operators. A linear map from a space to itself ($L : \mathcal{V} \to \mathcal{V}$) is a linear operator. The corresponding (square) matrix is $A = V^{-1} L V$.

- Bilinear forms. A map $a : \mathcal{V} \times \mathcal{W} \to \mathbb{R}$ (or $\mathbb{C}$ for complex spaces) is bilinear if it is linear in both slots: $a(\alpha u + v, w) = \alpha a(u, w) + a(v, w)$ and $a(v, \alpha u + w) = \alpha a(v, u) + a(v, w)$. The corresponding matrix has elements $A_{ij} = a(v_i, w_j)$; if $v = Vc$ and $w = Wd$ then $a(v, w) = d^T A c$.

  We call a bilinear form on $\mathcal{V} \times \mathcal{V}$ symmetric if $a(v, w) = a(w, v)$; in this case, the corresponding matrix $A$ is also symmetric ($A = A^T$). A symmetric form and the corresponding matrix are called positive semidefinite if $a(v, v) \geq 0$ for all $v$. The form and matrix are positive definite if $a(v, v) > 0$ for any $v \neq 0$.

  A skew-symmetric matrix ($A = -A^T$) corresponds to a skew-symmetric or anti-symmetric bilinear form, i.e. $a(v, w) = -a(w, v)$.

- Sesquilinear forms. A map $a : \mathcal{V} \times \mathcal{W} \to \mathbb{C}$ (where $\mathcal{V}$ and $\mathcal{W}$ are complex vector spaces) is sesquilinear if it is linear in the first slot and conjugate-linear in the second slot: $a(\alpha u + v, w) = \alpha a(u, w) + a(v, w)$ and $a(v, \alpha u + w) = \bar{\alpha} a(v, u) + a(v, w)$. The matrix has elements $A_{ij} = a(v_i, w_j)$; if $v = Vc$ and $w = Wd$ then $a(v, w) = d^* A c$.

  We call a sesquilinear form on $\mathcal{V} \times \mathcal{V}$ Hermitian if $a(v, w) = \overline{a(w, v)}$; in this case, the corresponding matrix $A$ is also Hermitian ($A = A^*$). A Hermitian form and the corresponding matrix are called positive semidefinite if $a(v, v) \geq 0$ for all $v$. The form and matrix are positive definite if $a(v, v) > 0$ for any $v \neq 0$.

  A skew-Hermitian matrix ($A = -A^*$) corresponds to a skew-Hermitian or anti-Hermitian sesquilinear form, i.e. $a(v, w) = -\overline{a(w, v)}$.

- Quadratic forms. A quadratic form $\varphi : \mathcal{V} \to \mathbb{R}$ (or $\mathbb{C}$) is a homogeneous quadratic function on $\mathcal{V}$, i.e. $\varphi(\alpha v) = |\alpha|^2 \varphi(v)$, for which the map $b(v, w) = \varphi(v + w) - \varphi(v) - \varphi(w)$ is bilinear. Any quadratic form on a finite-dimensional space can be represented as $c^* A c$, where $c$ is the coefficient vector, for some Hermitian matrix $A$. The formula for the elements of $A$ given $\varphi$ is left as an exercise.

We care about linear maps and linear operators almost everywhere, and most students come out of a first linear algebra class with some notion that these are important. But apart from very standard examples (inner products and norms), many students have only a vague notion of what a bilinear form, sesquilinear form, or quadratic form might be. Bilinear forms and sesquilinear forms show up when we discuss large-scale solvers based on projection methods. Quadratic forms are important in optimization, physics (where they often represent energy), and statistics (e.g. for understanding variance and covariance).
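To make the quadratic form definitions concrete, here is a small numerical check (my addition; the random symmetric matrix is just an example, and this does not substitute for the exercise above):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random symmetric matrix defines a quadratic form phi(v) = v^T A v.
A = rng.standard_normal((4, 4))
A = (A + A.T) / 2

def phi(v):
    return v @ A @ v

v, w = rng.standard_normal(4), rng.standard_normal(4)
alpha = 2.5

# Homogeneity: phi(alpha v) = alpha^2 phi(v) for real alpha.
assert np.isclose(phi(alpha * v), alpha**2 * phi(v))

# b(v, w) = phi(v+w) - phi(v) - phi(w) is the associated bilinear form;
# for symmetric A it equals 2 v^T A w, hence is linear in each slot.
b = phi(v + w) - phi(v) - phi(w)
assert np.isclose(b, 2 * v @ A @ w)
print(b)
```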

3.1 Matrix norms

The space of matrices forms a vector space; and, as with other vector spaces, it makes sense to talk about norms. In particular, we frequently want norms that are consistent with vector norms on the range and domain spaces; that is, for any $w$ and $v$, we want
$$w = Av \implies \|w\| \leq \|A\| \, \|v\|.$$
One obvious consistent norm is the Frobenius norm,
$$\|A\|_F = \sqrt{\sum_{i,j} |a_{ij}|^2}.$$
Even more useful are induced norms (or operator norms)
$$\|A\| = \sup_{v \neq 0} \frac{\|Av\|}{\|v\|} = \sup_{\|v\| = 1} \|Av\|.$$
The induced norms corresponding to the vector 1-norm and $\infty$-norm are
$$\|A\|_1 = \max_j \sum_i |a_{ij}| \quad \text{(max absolute column sum)},$$
$$\|A\|_\infty = \max_i \sum_j |a_{ij}| \quad \text{(max absolute row sum)}.$$
The norm induced by the vector Euclidean norm (variously called the matrix 2-norm or the spectral norm) is more complicated.

The Frobenius norm and the matrix 2-norm are both orthogonally invariant (or unitarily invariant in a complex vector space). That is, if $Q$ is a square matrix with $Q^* = Q^{-1}$ (an orthogonal or unitary matrix) of the appropriate dimensions,
$$\|QA\|_F = \|A\|_F, \quad \|AQ\|_F = \|A\|_F, \quad \|QA\|_2 = \|A\|_2, \quad \|AQ\|_2 = \|A\|_2.$$
This property will turn out to be frequently useful throughout the course.
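A short sketch (my addition) verifying the column-sum and row-sum formulas against NumPy's built-in norms, and checking orthogonal invariance with a $Q$ drawn from a QR factorization:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))

# Induced 1-norm and inf-norm via column/row sums of absolute values.
norm1 = np.abs(A).sum(axis=0).max()      # max absolute column sum
norminf = np.abs(A).sum(axis=1).max()    # max absolute row sum
assert np.isclose(norm1, np.linalg.norm(A, 1))
assert np.isclose(norminf, np.linalg.norm(A, np.inf))

# Orthogonal invariance of the 2-norm and Frobenius norm:
# get an orthogonal Q from a QR factorization of a random square matrix.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
for p in (2, 'fro'):
    assert np.isclose(np.linalg.norm(Q @ A, p), np.linalg.norm(A, p))
print(norm1, norminf, np.linalg.norm(A, 2))
```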

3.2 Decompositions and canonical forms

Matrix decompositions (also known as matrix factorizations) are central to numerical linear algebra. We will get to know six such factorizations well:

- $PA = LU$ (a.k.a. Gaussian elimination). Here $L$ is unit lower triangular (triangular with 1 along the main diagonal), $U$ is upper triangular, and $P$ is a permutation matrix.
- $A = LL^*$ (a.k.a. Cholesky factorization). Here $A$ is Hermitian and positive definite, and $L$ is a lower triangular matrix.
- $A = QR$ (a.k.a. QR decomposition). Here $Q$ has orthonormal columns and $R$ is upper triangular. If we think of the columns of $A$ as a basis, QR decomposition corresponds to the Gram-Schmidt orthogonalization process you have likely seen in the past (though we rarely compute with Gram-Schmidt).
- $A = U \Sigma V^*$ (a.k.a. the singular value decomposition or SVD). Here $U$ and $V$ have orthonormal columns and $\Sigma$ is diagonal with non-negative entries.
- $A = Q \Lambda Q^*$ (a.k.a. symmetric eigendecomposition). Here $A$ is Hermitian (symmetric in the real case), $Q$ is orthogonal or unitary, and $\Lambda$ is a diagonal matrix with real numbers on the diagonal.
- $A = Q T Q^*$ (a.k.a. Schur form). Here $A$ is a square matrix, $Q$ is orthogonal or unitary, and $T$ is upper triangular (or nearly so).
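As a quick sketch of what we will be computing (my addition; SciPy is assumed available for the LU, Cholesky, and Schur routines, and note that SciPy's `lu` returns factors with the permutation on the right-hand side, $A = PLU$):

```python
import numpy as np
import scipy.linalg as la

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
H = A @ A.T + 4 * np.eye(4)        # Hermitian positive definite example

P, L, U = la.lu(A)                  # A = P L U
C = la.cholesky(H, lower=True)      # H = C C^T
Q, R = np.linalg.qr(A)              # A = Q R
U2, sigma, Vt = np.linalg.svd(A)    # A = U2 diag(sigma) Vt
lam, Q2 = np.linalg.eigh(H)         # H = Q2 diag(lam) Q2^T
T, Z = la.schur(A)                  # A = Z T Z^T (real Schur: T quasi-triangular)

for name, err in [
    ("LU", np.linalg.norm(P @ L @ U - A)),
    ("Cholesky", np.linalg.norm(C @ C.T - H)),
    ("QR", np.linalg.norm(Q @ R - A)),
    ("SVD", np.linalg.norm((U2 * sigma) @ Vt - A)),
    ("eigh", np.linalg.norm((Q2 * lam) @ Q2.T - H)),
    ("Schur", np.linalg.norm(Z @ T @ Z.T - A)),
]:
    print(name, err)   # all residuals should be near machine precision
```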

The last three of these decompositions correspond to canonical forms for abstract operators. That is, we can view these decompositions as finding bases in which the matrix representation of some operator or form is particularly simple. More particularly:

- SVD: For any linear mapping $L : \mathcal{V} \to \mathcal{W}$, there are orthonormal bases for the two spaces such that the corresponding matrix is diagonal.
- Symmetric eigendecomposition: For any Hermitian sesquilinear form on an inner product space, there is an orthonormal basis for the space such that the matrix representation is diagonal.
- Schur form: For any linear operator $L : \mathcal{V} \to \mathcal{V}$, there is an orthonormal basis for the space such that the matrix representation is upper triangular. Equivalently, if $\{u_1, \ldots, u_n\}$ is the basis in question, then $\operatorname{sp}(\{u_j\}_{j=1}^k)$ is an invariant subspace for each $1 \leq k \leq n$.

The Schur form turns out to be better for numerical work than the Jordan canonical form that you should have seen in an earlier class. We will discuss this in more detail when we discuss eigenvalue problems.

3.3 The SVD and the 2-norm

The singular value decomposition is useful for a variety of reasons; we close off the lecture by showing one such use. Suppose $A = U \Sigma V^*$ is the singular value decomposition of some matrix. Using orthogonal invariance (unitary invariance) of the 2-norm, we have
$$\|A\|_2 = \|U^* A V\|_2 = \|\Sigma\|_2,$$
i.e.
$$\|A\|_2^2 = \max_{\|v\|_2 = 1} \frac{\sum_j \sigma_j^2 |v_j|^2}{\sum_j |v_j|^2}.$$
That is, the squared spectral norm is the largest weighted average of the squared singular values, which is attained by putting all the weight on the largest one; so the spectral norm is just the largest singular value.

The small singular values also have a meaning. If $A$ is a square, invertible matrix, then
$$\|A^{-1}\|_2 = \|V \Sigma^{-1} U^*\|_2 = \|\Sigma^{-1}\|_2,$$
i.e. $\|A^{-1}\|_2$ is the inverse of the smallest singular value of $A$.

The smallest singular value of a nonsingular matrix $A$ can also be interpreted as the "distance to singularity": if $\sigma_n$ is the smallest singular value of $A$, then there is a matrix $E$ such that $\|E\|_2 = \sigma_n$ and $A + E$ is singular; and there is no such matrix with smaller norm.

These facts about the singular value decomposition are worth pondering, as they will be particularly useful in the next lecture when we ponder sensitivity and conditioning.
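A closing sketch (my addition) that checks these three facts numerically: the 2-norm is the largest singular value, the 2-norm of the inverse is the reciprocal of the smallest, and the rank-one perturbation $E = -\sigma_n u_n v_n^*$ built from the last singular triple has norm $\sigma_n$ and makes $A + E$ singular:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((5, 5))

U, sigma, Vt = np.linalg.svd(A)   # sigma is sorted in decreasing order

# ||A||_2 is the largest singular value; ||A^{-1}||_2 the inverse of the smallest.
assert np.isclose(np.linalg.norm(A, 2), sigma[0])
assert np.isclose(np.linalg.norm(np.linalg.inv(A), 2), 1 / sigma[-1])

# Distance to singularity: E = -sigma_n u_n v_n^T has ||E||_2 = sigma_n,
# and A + E is exactly singular (its smallest singular value drops to 0).
E = -sigma[-1] * np.outer(U[:, -1], Vt[-1, :])
print(np.linalg.norm(E, 2), np.linalg.svd(A + E)[1][-1])
```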
