Math 407: Linear Optimization


Lecture 16: The Linear Least Squares Problem II
Math Dept, University of Washington, February 28, 2018

Linear Least Squares

A linear least squares problem is one of the form

$\mathrm{LLS}:\quad \underset{x\in\mathbb{R}^n}{\text{minimize}}\ \tfrac{1}{2}\|Ax-b\|_2^2,$

where $A \in \mathbb{R}^{m\times n}$, $b \in \mathbb{R}^m$, and $\|y\|_2^2 := y_1^2 + y_2^2 + \cdots + y_m^2$.

Theorem: Consider the linear least squares problem LLS.
1. A solution to the normal equations $A^TAx = A^Tb$ always exists.
2. A solution to LLS always exists.
3. The linear least squares problem LLS has a unique solution if and only if $\mathrm{Null}(A) = \{0\}$, in which case $(A^TA)^{-1}$ exists and the unique solution is given by $\bar{x} = (A^TA)^{-1}A^Tb$.
4. If $\mathrm{Ran}(A) = \mathbb{R}^m$, then $(AA^T)^{-1}$ exists and $\bar{x} = A^T(AA^T)^{-1}b$ solves LLS; indeed, $A\bar{x} = b$.
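Part 3 of the theorem can be checked numerically. Below is a minimal NumPy sketch (not part of the lecture, on made-up data): for a full-column-rank $A$, solving the normal equations reproduces the library least squares solution.

```python
import numpy as np

# Made-up overdetermined system (m > n); a random A has Null(A) = {0}
# with probability one.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

# Unique LLS solution via the normal equations A^T A x = A^T b.
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# Cross-check against the library least squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_normal, x_lstsq)
```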

Distance to a Subspace

Let $S \subset \mathbb{R}^m$ be a subspace and suppose $b \in \mathbb{R}^m$ is not in $S$. Find the point $\bar{z} \in S$ such that

$\|\bar{z} - b\|_2 \le \|z - b\|_2 \quad \forall\, z \in S,$

or equivalently, solve

$\mathrm{D}:\quad \min_{z\in S}\ \tfrac{1}{2}\|z-b\|_2^2.$

Least Squares Connection: Suppose $S = \mathrm{Ran}(A)$. Then $\bar{z} \in \mathbb{R}^m$ solves D if and only if there is an $\bar{x} \in \mathbb{R}^n$ with $\bar{z} = A\bar{x}$ such that $\bar{x}$ solves LLS.

Orthogonal Projections and Orthonormal Bases

Let $Q \in \mathbb{R}^{n\times k}$ be a matrix whose columns form an orthonormal basis for the subspace $S$, so that $k = \dim S$. Set $P = QQ^T$ and note that $Q^TQ = I_k$, the $k\times k$ identity matrix. Then

$P^2 = QQ^TQQ^T = QI_kQ^T = QQ^T = P \quad\text{and}\quad P^T = (QQ^T)^T = QQ^T = P,$

so that $P = P_S$, the orthogonal projection onto $S$!

Lemma:
(1) The projection $P \in \mathbb{R}^{n\times n}$ is orthogonal if and only if $P = P^T$.
(2) If the columns of the matrix $Q \in \mathbb{R}^{n\times k}$ form an orthonormal basis for the subspace $S \subset \mathbb{R}^n$, then $P := QQ^T$ is the orthogonal projection onto $S$.
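The two defining identities $P^2 = P$ and $P^T = P$ are easy to verify in code. A small sketch (made-up subspace, with the orthonormal basis obtained from a QR factorization):

```python
import numpy as np

# Orthonormal basis for a 2-dimensional subspace S of R^4: take Q from the
# QR factorization of a random 4x2 matrix.
rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))

P = Q @ Q.T  # candidate orthogonal projection onto S

assert np.allclose(P @ P, P)   # idempotent: P^2 = P
assert np.allclose(P, P.T)     # symmetric: P^T = P
assert np.allclose(P @ Q, Q)   # P fixes S (the columns of Q)
```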

Orthogonal Projections and Distance to a Subspace

Theorem: Let $S \subset \mathbb{R}^m$ be a subspace and let $b \in \mathbb{R}^m \setminus S$. Then the unique solution to the least distance problem

$\mathrm{D}:\quad \underset{z\in S}{\text{minimize}}\ \|z - b\|_2$

is $\bar{z} := P_S b$, where $P_S$ is the orthogonal projector onto $S$.

Proposition: Let $A \in \mathbb{R}^{m\times n}$ with $m \ge n$ and $\mathrm{Null}(A) = \{0\}$. Then the orthogonal projector onto $\mathrm{Ran}(A)$ is given by $P_{\mathrm{Ran}(A)} = A(A^TA)^{-1}A^T$.
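A quick numerical illustration of the proposition (made-up data, not from the lecture): the projector $A(A^TA)^{-1}A^T$ sends $b$ to the closest point of $\mathrm{Ran}(A)$, and the residual is orthogonal to $\mathrm{Ran}(A)$.

```python
import numpy as np

# Made-up full-column-rank A (so Null(A) = {0}) and a generic b.
rng = np.random.default_rng(2)
A = rng.standard_normal((5, 2))
b = rng.standard_normal(5)

# Orthogonal projector onto Ran(A) from the proposition.
P = A @ np.linalg.solve(A.T @ A, A.T)

z = P @ b  # closest point in Ran(A) to b
# The residual b - z is orthogonal to Ran(A): A^T (b - z) = 0.
assert np.allclose(A.T @ (b - z), 0)
# z agrees with A x_ls, where x_ls solves the least squares problem.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(z, A @ x_ls)
```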

Minimal Norm Solutions to Ax = b

Let $A \in \mathbb{R}^{m\times n}$ and suppose that $m < n$. Then $A$ is short and fat, so $A$ most likely has rank $m$, or equivalently, $\mathrm{Ran}(A) = \mathbb{R}^m$. For the purposes of this discussion we assume that $\mathrm{rank}(A) = m$.

Since $m < n$, the set of solutions to $Ax = b$ is infinite, since the nullity of $A$ is $n - m$. Indeed, if $x_0$ is any particular solution to $Ax = b$, then the set of solutions is given by

$x_0 + \mathrm{Null}(A) := \{\, x_0 + z \mid z \in \mathrm{Null}(A) \,\}.$

In this setting, one might prefer the solution having least norm:

$\min_{z\in \mathrm{Null}(A)}\ \tfrac{1}{2}\|z + x_0\|_2^2.$

This problem is of the form $\mathrm{D}: \min\{\, \|z - b\|_2 \mid z \in S \,\}$, whose solution is $\bar{z} = P_S b$ when $S$ is a subspace (here with $b = -x_0$). Consequently, the solution is given by $\bar{z} = P_S(-x_0)$, where $P_S$ is now the orthogonal projection onto $S := \mathrm{Null}(A)$.

A Formula for P_Null(A)

Observe that

$P_{\mathrm{Null}(A)} = P_{\mathrm{Ran}(A^T)^\perp} = I - P_{\mathrm{Ran}(A^T)}.$

We have already shown that if $M \in \mathbb{R}^{p\times q}$ satisfies $\mathrm{Null}(M) = \{0\}$, then $P_{\mathrm{Ran}(M)} = M(M^TM)^{-1}M^T$. If we take $M = A^T$, then our assumption that $\mathrm{Ran}(A) = \mathbb{R}^m$ gives

$\mathrm{Null}(M) = \mathrm{Null}(A^T) = \mathrm{Ran}(A)^\perp = (\mathbb{R}^m)^\perp = \{0\}.$

Hence $P_{\mathrm{Ran}(A^T)} = A^T(AA^T)^{-1}A$ and

$P_{\mathrm{Null}(A)} = P_{\mathrm{Ran}(A^T)^\perp} = I - P_{\mathrm{Ran}(A^T)} = I - A^T(AA^T)^{-1}A.$
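The formula $P_{\mathrm{Null}(A)} = I - A^T(AA^T)^{-1}A$ can be sanity-checked numerically. A minimal sketch on a made-up short, fat $A$ with full row rank:

```python
import numpy as np

# Made-up short, fat A; a random 2x5 matrix has rank 2, i.e. Ran(A) = R^2.
rng = np.random.default_rng(3)
A = rng.standard_normal((2, 5))

n = A.shape[1]
P_null = np.eye(n) - A.T @ np.linalg.solve(A @ A.T, A)

# P_null is an orthogonal projector, and its range lies in Null(A).
assert np.allclose(P_null @ P_null, P_null)  # idempotent
assert np.allclose(P_null, P_null.T)         # symmetric
assert np.allclose(A @ P_null, 0)            # A (P_null x) = 0 for all x
```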

Solution to D: $\min\{\, \|z - b\|_2 \mid z \in S \,\}$

We can take $x_0 := A^T(AA^T)^{-1}b$ as our particular solution to $Ax = b$ in D, since

$Ax_0 = AA^T(AA^T)^{-1}b = b.$

Hence, our solution to D is

$\begin{aligned}
\bar{x} &= x_0 + P_{\mathrm{Null}(A)}(-x_0) \\
&= x_0 + (I - A^T(AA^T)^{-1}A)(-x_0) \\
&= A^T(AA^T)^{-1}Ax_0 \\
&= A^T(AA^T)^{-1}AA^T(AA^T)^{-1}b \\
&= A^T(AA^T)^{-1}b = x_0.
\end{aligned}$

That is, $x_0 = A^T(AA^T)^{-1}b$ is the least norm solution to $Ax = b$.
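A numerical sketch of this conclusion (made-up data): the formula $A^T(AA^T)^{-1}b$ solves $Ax = b$ and agrees with the Moore-Penrose pseudoinverse solution, which is known to be the least norm solution.

```python
import numpy as np

# Made-up underdetermined system (m < n) with full row rank.
rng = np.random.default_rng(4)
A = rng.standard_normal((2, 5))
b = rng.standard_normal(2)

# Least norm solution x0 = A^T (A A^T)^{-1} b.
x0 = A.T @ np.linalg.solve(A @ A.T, b)

assert np.allclose(A @ x0, b)      # x0 solves Ax = b
x_pinv = np.linalg.pinv(A) @ b     # the pseudoinverse gives the same vector
assert np.allclose(x0, x_pinv)
```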

Theorem: Let $A \in \mathbb{R}^{m\times n}$ be such that $m \le n$ and $\mathrm{Ran}(A) = \mathbb{R}^m$.
(1) The matrix $AA^T$ is invertible.
(2) The orthogonal projection onto $\mathrm{Null}(A)$ is given by $P_{\mathrm{Null}(A)} = I - A^T(AA^T)^{-1}A$.
(3) For every $b \in \mathbb{R}^m$, the system $Ax = b$ is consistent, and the least norm solution to this system is uniquely given by $\bar{x} = A^T(AA^T)^{-1}b$.

Gram-Schmidt Orthogonalization and the QR-Factorization

We now define the Gram-Schmidt orthogonalization process for a set of linearly independent vectors $a_1, \ldots, a_n \in \mathbb{R}^m$. (This implies that $n \le m$; why?) The orthogonal vectors $q_1, \ldots, q_n$ are defined inductively, as follows:

$p_1 = a_1, \qquad q_1 = p_1/\|p_1\|,$

$p_j = a_j - \sum_{i=1}^{j-1} \langle a_j, q_i\rangle\, q_i \quad\text{and}\quad q_j = p_j/\|p_j\| \quad\text{for } 2 \le j \le n.$

For $1 \le j \le n$, $q_j \in \mathrm{Span}\{a_1, \ldots, a_j\}$, and $p_j \ne 0$ by the linear independence of $a_1, \ldots, a_j$. An elementary induction argument shows that the $q_j$'s form an orthonormal basis for $\mathrm{Span}(a_1, \ldots, a_n)$.
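The inductive definition above translates directly into code. A minimal sketch of classical Gram-Schmidt (the helper `gram_schmidt` and the test data are ours, not from the lecture):

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt on the columns of A (assumed linearly
    independent). Returns Q with orthonormal columns and upper
    triangular R with A = Q R."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        p = A[:, j].copy()
        for i in range(j):
            R[i, j] = A[:, j] @ Q[:, i]   # r_ij = <a_j, q_i>
            p -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(p)       # r_jj = ||p_j|| > 0
        Q[:, j] = p / R[j, j]
    return Q, R

# Sanity check on made-up data.
rng = np.random.default_rng(5)
A = rng.standard_normal((6, 3))
Q, R = gram_schmidt(A)
assert np.allclose(Q.T @ Q, np.eye(3))   # orthonormal columns
assert np.allclose(Q @ R, A)             # A = QR
```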

If we now define $r_{jj} = \|p_j\| \ne 0$ and $r_{ij} = \langle a_j, q_i\rangle$ for $1 \le i < j \le n$, then

$\begin{aligned}
a_1 &= r_{11}q_1,\\
a_2 &= r_{12}q_1 + r_{22}q_2,\\
a_3 &= r_{13}q_1 + r_{23}q_2 + r_{33}q_3,\\
&\ \ \vdots\\
a_n &= \sum_{i=1}^{n} r_{in}\, q_i.
\end{aligned}$

Again suppose $a_1, \ldots, a_n \in \mathbb{R}^m$ are linearly independent, and set $A = [a_1\ a_2\ \cdots\ a_n] \in \mathbb{R}^{m\times n}$, $R = [r_{ij}] \in \mathbb{R}^{n\times n}$ with $r_{ij} = 0$ for $i > j$, and $Q = [q_1\ q_2\ \cdots\ q_n] \in \mathbb{R}^{m\times n}$. Then $A = QR$, where $Q$ has orthonormal columns and $R$ is an upper triangular $n\times n$ matrix. In addition, $R$ is invertible since the diagonal entries $r_{jj}$ are non-zero. This is called the QR factorization of the matrix $A$.
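In practice one rarely forms this factorization by Gram-Schmidt; libraries compute it by Householder reflections, but the resulting object is the same. A sketch using NumPy's built-in routine (made-up data):

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((6, 3))
Q, R = np.linalg.qr(A)          # "reduced" QR: Q is 6x3, R is 3x3

assert np.allclose(Q.T @ Q, np.eye(3))      # orthonormal columns
assert np.allclose(np.triu(R), R)           # R is upper triangular
assert np.allclose(Q @ R, A)                # A = QR
assert not np.isclose(np.linalg.det(R), 0)  # R invertible (A has full rank)
```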

Full QR-Factorization

Suppose $A \in \mathbb{R}^{m\times n}$ with $m \ge n$. Then there exist a permutation matrix $P \in \mathbb{R}^{n\times n}$, an orthogonal matrix $Q \in \mathbb{R}^{m\times m}$, and an upper triangular matrix $R \in \mathbb{R}^{m\times n}$ such that $AP = QR$. Let $Q_1 \in \mathbb{R}^{m\times n}$ denote the first $n$ columns of $Q$, $Q_2$ the remaining $(m-n)$ columns of $Q$, and $R_1 \in \mathbb{R}^{n\times n}$ the first $n$ rows of $R$. Then

$AP = QR = [Q_1\ Q_2]\begin{bmatrix} R_1 \\ 0 \end{bmatrix} = Q_1 R_1. \qquad (1)$
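The block structure in (1) can be seen with NumPy's "complete" QR mode. A sketch on made-up data; note NumPy's `qr` does not do column pivoting, so we simply take $P = I$ here, which suffices for a full-column-rank $A$:

```python
import numpy as np

rng = np.random.default_rng(7)
m, n = 6, 3
A = rng.standard_normal((m, n))
Q, R = np.linalg.qr(A, mode='complete')  # Q is 6x6 orthogonal, R is 6x3

Q1, Q2 = Q[:, :n], Q[:, n:]
R1 = R[:n, :]

assert np.allclose(Q.T @ Q, np.eye(m))   # Q is orthogonal
assert np.allclose(R[n:, :], 0)          # bottom (m - n) x n block of R is zero
assert np.allclose(A, Q1 @ R1)           # condensed factorization A = Q1 R1
```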

Moreover, we have the following:
(a) We may choose $R$ to have nonnegative diagonal entries.
(b) If $A$ is of full rank, then we can choose $R$ with positive diagonal entries, in which case we obtain the condensed factorization $A = Q_1R_1$, where $R_1 \in \mathbb{R}^{n\times n}$ is invertible and the columns of $Q_1$ form an orthonormal basis for $\mathrm{Ran}(A)$.
(c) If $\mathrm{rank}(A) = k < n$, then

$R_1 = \begin{bmatrix} R_{11} & R_{12} \\ 0 & 0 \end{bmatrix},$

where $R_{11}$ is a $k\times k$ invertible upper triangular matrix and $R_{12} \in \mathbb{R}^{k\times(n-k)}$. In particular, this implies that $AP = Q_{11}[R_{11}\ R_{12}]$, where $Q_{11}$ consists of the first $k$ columns of $Q$. In this case, the columns of $Q_{11}$ form an orthonormal basis for the range of $A$ and $R_{11}$ is invertible.

Condensed QR-Factorization

Let $A \in \mathbb{R}^{m\times n}$ have rank $k \le \min\{m, n\}$. Then there exist $Q \in \mathbb{R}^{m\times k}$ with orthonormal columns, a full rank upper triangular $R \in \mathbb{R}^{k\times n}$, and a permutation matrix $P \in \mathbb{R}^{n\times n}$ such that $AP = QR$. In particular, the columns of the matrix $Q$ form a basis for the range of $A$. Moreover, the matrix $R$ can be written in the form

$R = [R_1\ R_2],$

where $R_1 \in \mathbb{R}^{k\times k}$ is nonsingular.

Orthogonal Projections onto the Four Fundamental Subspaces

Let $A \in \mathbb{R}^{m\times n}$ have rank $k \le \min\{m, n\}$, and let $A$ and $A^T$ have generalized QR factorizations

$AP = Q[R_1\ R_2] \quad\text{and}\quad A^T\tilde{P} = \tilde{Q}[\tilde{R}_1\ \tilde{R}_2].$

Since row rank equals column rank, $P \in \mathbb{R}^{n\times n}$ and $\tilde{P} \in \mathbb{R}^{m\times m}$ are permutation matrices, $Q \in \mathbb{R}^{m\times k}$ and $\tilde{Q} \in \mathbb{R}^{n\times k}$ have orthonormal columns, $R_1, \tilde{R}_1 \in \mathbb{R}^{k\times k}$ are both upper triangular nonsingular matrices, $R_2 \in \mathbb{R}^{k\times(n-k)}$, and $\tilde{R}_2 \in \mathbb{R}^{k\times(m-k)}$. Moreover,

$QQ^T$ is the orthogonal projection onto $\mathrm{Ran}(A)$,
$I - QQ^T$ is the orthogonal projection onto $\mathrm{Null}(A^T)$,
$\tilde{Q}\tilde{Q}^T$ is the orthogonal projection onto $\mathrm{Ran}(A^T)$, and
$I - \tilde{Q}\tilde{Q}^T$ is the orthogonal projection onto $\mathrm{Null}(A)$.
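These four projectors are easy to build from QR factorizations of $A$ and $A^T$. A sketch on a made-up full-column-rank $A$ (so $k = n$ and no column pivoting is needed; NumPy's `qr` does not pivot):

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((5, 3))   # full column rank with probability one

Q, _ = np.linalg.qr(A)      # orthonormal basis for Ran(A)
Qt, _ = np.linalg.qr(A.T)   # orthonormal basis for Ran(A^T)

P_ran = Q @ Q.T                    # projection onto Ran(A)
P_null_AT = np.eye(5) - P_ran      # projection onto Null(A^T)
P_ran_AT = Qt @ Qt.T               # projection onto Ran(A^T)
P_null = np.eye(3) - P_ran_AT      # projection onto Null(A)

assert np.allclose(P_ran @ A, A)       # P_ran fixes Ran(A)
assert np.allclose(A.T @ P_null_AT, 0) # Null(A^T) = Ran(A)^perp
assert np.allclose(A @ P_null, 0)      # A annihilates Null(A)
```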

Solving the Normal Equations with the QR Factorization

Let $A \in \mathbb{R}^{m\times n}$ and $b \in \mathbb{R}^m$ with $k := \mathrm{rank}(A) \le \min\{m, n\}$. The normal equations $A^TAx = A^Tb$ yield solutions to the LLS problem $\min \tfrac{1}{2}\|Ax - b\|_2^2$. We show how the QR factorization of $A$ is used to solve these equations.

Suppose $A$ has condensed QR factorization $AP = Q[R_1\ R_2]$, where $P \in \mathbb{R}^{n\times n}$ is a permutation matrix, $Q \in \mathbb{R}^{m\times k}$ has orthonormal columns, $R_1 \in \mathbb{R}^{k\times k}$ is nonsingular and upper triangular, and $R_2 \in \mathbb{R}^{k\times(n-k)}$.

Since $A = Q[R_1\ R_2]P^T$, the normal equations $A^TAx = A^Tb$ become

$P\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} Q^TQ\, [R_1\ R_2]\, P^T x \;=\; A^TAx \;=\; A^Tb \;=\; P\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} Q^Tb,$

or equivalently, since $Q^TQ = I_k$,

$P\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} [R_1\ R_2]\, P^T x \;=\; P\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} Q^Tb.$

Multiply

$P\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} [R_1\ R_2]\, P^T x \;=\; P\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} Q^Tb$

on the left by $P^T$ to obtain

$\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} [R_1\ R_2]\, P^T x \;=\; \begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} Q^Tb.$

Defining $w := [R_1\ R_2]P^Tx$ and setting $\hat{b} := Q^Tb$ gives

$\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} w \;=\; \begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} \hat{b}.$

Take $w = \hat{b}$ and define $\hat{x} := P^Tx$. Then $[R_1\ R_2]\hat{x} = \hat{b}$.

We now need to solve the following equation for $\hat{x}$:

$[R_1\ R_2]\hat{x} = \hat{b}.$

To do this, write

$\hat{x} = \begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \end{pmatrix} \quad\text{with } \hat{x}_1 \in \mathbb{R}^k.$

If we assume that $\hat{x}_2 = 0$, then

$R_1\hat{x}_1 = [R_1\ R_2]\begin{pmatrix} \hat{x}_1 \\ 0 \end{pmatrix} = \hat{b},$

which we can solve by taking $\hat{x}_1 = R_1^{-1}\hat{b}$, since $R_1$ is an invertible upper triangular $k\times k$ matrix by construction.

Then

$x = P\hat{x} = P\begin{pmatrix} R_1^{-1}\hat{b} \\ 0 \end{pmatrix}.$

Does this $x$ solve the normal equations?

$\begin{aligned}
A^TAx &= A^TAP\begin{pmatrix} R_1^{-1}\hat{b} \\ 0 \end{pmatrix} \\
&= A^TQ[R_1\ R_2]P^TP\begin{pmatrix} R_1^{-1}\hat{b} \\ 0 \end{pmatrix} && (\text{since } P^TP = I)\\
&= A^TQR_1R_1^{-1}\hat{b} \\
&= A^TQ\hat{b} = A^TQQ^Tb \\
&= P\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} Q^TQQ^Tb && \left(\text{since } A^T = P\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} Q^T\right)\\
&= P\begin{bmatrix} R_1^T \\ R_2^T \end{bmatrix} Q^Tb && (\text{since } Q^TQ = I)\\
&= A^Tb. \quad\checkmark
\end{aligned}$

This derivation yields the following recipe for solving the normal equations:

$AP = Q[R_1\ R_2]$ — the general condensed QR factorization, $O(m^2n)$;
$\hat{b} = Q^Tb$ — a matrix-vector product, $O(km)$;
$\hat{x}_1 = R_1^{-1}\hat{b}$ — a back solve, $O(k^2)$;
$x = P\begin{pmatrix} \hat{x}_1 \\ 0 \end{pmatrix}$ — a matrix-vector product, $O(kn)$.
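The recipe above can be sketched in a few lines of NumPy. This sketch assumes a full-column-rank $A$ (so $k = n$, $R_2$ is empty, and we may take $P = I$; NumPy's `qr` does not pivot), with made-up data:

```python
import numpy as np

rng = np.random.default_rng(9)
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)

Q, R1 = np.linalg.qr(A)              # AP = Q[R1 R2] with P = I, R2 empty
b_hat = Q.T @ b                      # b_hat = Q^T b, a matrix-vector product
x_hat1 = np.linalg.solve(R1, b_hat)  # back solve with the triangular R1
x = x_hat1                           # x = P (x_hat1; 0), and here P = I

# x solves the normal equations A^T A x = A^T b.
assert np.allclose(A.T @ A @ x, A.T @ b)
```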


j=1 u 1jv 1j. 1/ 2 Lemma 1. An orthogonal set of vectors must be linearly independent. Lecture Notes: Orthogonal and Symmetric Matrices Yufei Tao Department of Computer Science and Engineering Chinese University of Hong Kong taoyf@cse.cuhk.edu.hk Orthogonal Matrix Definition. Let u = [u

More information

Lecture 6, Sci. Comp. for DPhil Students

Lecture 6, Sci. Comp. for DPhil Students Lecture 6, Sci. Comp. for DPhil Students Nick Trefethen, Thursday 1.11.18 Today II.3 QR factorization II.4 Computation of the QR factorization II.5 Linear least-squares Handouts Quiz 4 Householder s 4-page

More information

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB

Glossary of Linear Algebra Terms. Prepared by Vince Zaccone For Campus Learning Assistance Services at UCSB Glossary of Linear Algebra Terms Basis (for a subspace) A linearly independent set of vectors that spans the space Basic Variable A variable in a linear system that corresponds to a pivot column in the

More information

Linear Algebra Formulas. Ben Lee

Linear Algebra Formulas. Ben Lee Linear Algebra Formulas Ben Lee January 27, 2016 Definitions and Terms Diagonal: Diagonal of matrix A is a collection of entries A ij where i = j. Diagonal Matrix: A matrix (usually square), where entries

More information

Miderm II Solutions To find the inverse we row-reduce the augumented matrix [I A]. In our case, we row reduce

Miderm II Solutions To find the inverse we row-reduce the augumented matrix [I A]. In our case, we row reduce Miderm II Solutions Problem. [8 points] (i) [4] Find the inverse of the matrix A = To find the inverse we row-reduce the augumented matrix [I A]. In our case, we row reduce We have A = 2 2 (ii) [2] Possibly

More information

Householder reflectors are matrices of the form. P = I 2ww T, where w is a unit vector (a vector of 2-norm unity)

Householder reflectors are matrices of the form. P = I 2ww T, where w is a unit vector (a vector of 2-norm unity) Householder QR Householder reflectors are matrices of the form P = I 2ww T, where w is a unit vector (a vector of 2-norm unity) w Px x Geometrically, P x represents a mirror image of x with respect to

More information

Problem set 5: SVD, Orthogonal projections, etc.

Problem set 5: SVD, Orthogonal projections, etc. Problem set 5: SVD, Orthogonal projections, etc. February 21, 2017 1 SVD 1. Work out again the SVD theorem done in the class: If A is a real m n matrix then here exist orthogonal matrices such that where

More information

Review problems for MA 54, Fall 2004.

Review problems for MA 54, Fall 2004. Review problems for MA 54, Fall 2004. Below are the review problems for the final. They are mostly homework problems, or very similar. If you are comfortable doing these problems, you should be fine on

More information

Chapter 6. Orthogonality

Chapter 6. Orthogonality 6.4 The Projection Matrix 1 Chapter 6. Orthogonality 6.4 The Projection Matrix Note. In Section 6.1 (Projections), we projected a vector b R n onto a subspace W of R n. We did so by finding a basis for

More information

Row Space and Column Space of a Matrix

Row Space and Column Space of a Matrix Row Space and Column Space of a Matrix 1/18 Summary: To a m n matrix A = (a ij ), we can naturally associate subspaces of K n and of K m, called the row space of A and the column space of A, respectively.

More information

Matrix decompositions

Matrix decompositions Matrix decompositions Zdeněk Dvořák May 19, 2015 Lemma 1 (Schur decomposition). If A is a symmetric real matrix, then there exists an orthogonal matrix Q and a diagonal matrix D such that A = QDQ T. The

More information

MATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL

MATH 31 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MATH 3 - ADDITIONAL PRACTICE PROBLEMS FOR FINAL MAIN TOPICS FOR THE FINAL EXAM:. Vectors. Dot product. Cross product. Geometric applications. 2. Row reduction. Null space, column space, row space, left

More information

Dot product and linear least squares problems

Dot product and linear least squares problems Dot product and linear least squares problems Dot Product For vectors u,v R n we define the dot product Note that we can also write this as u v = u,,u n u v = u v + + u n v n v v n = u v + + u n v n The

More information

CSL361 Problem set 4: Basic linear algebra

CSL361 Problem set 4: Basic linear algebra CSL361 Problem set 4: Basic linear algebra February 21, 2017 [Note:] If the numerical matrix computations turn out to be tedious, you may use the function rref in Matlab. 1 Row-reduced echelon matrices

More information

Review of Matrices and Block Structures

Review of Matrices and Block Structures CHAPTER 2 Review of Matrices and Block Structures Numerical linear algebra lies at the heart of modern scientific computing and computational science. Today it is not uncommon to perform numerical computations

More information

MATH. 20F SAMPLE FINAL (WINTER 2010)

MATH. 20F SAMPLE FINAL (WINTER 2010) MATH. 20F SAMPLE FINAL (WINTER 2010) You have 3 hours for this exam. Please write legibly and show all working. No calculators are allowed. Write your name, ID number and your TA s name below. The total

More information

MAT Linear Algebra Collection of sample exams

MAT Linear Algebra Collection of sample exams MAT 342 - Linear Algebra Collection of sample exams A-x. (0 pts Give the precise definition of the row echelon form. 2. ( 0 pts After performing row reductions on the augmented matrix for a certain system

More information

Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 2015

Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 2015 Chapters 5 & 6: Theory Review: Solutions Math 308 F Spring 205. If A is a 3 3 triangular matrix, explain why det(a) is equal to the product of entries on the diagonal. If A is a lower triangular or diagonal

More information

Introduction to Numerical Linear Algebra II

Introduction to Numerical Linear Algebra II Introduction to Numerical Linear Algebra II Petros Drineas These slides were prepared by Ilse Ipsen for the 2015 Gene Golub SIAM Summer School on RandNLA 1 / 49 Overview We will cover this material in

More information

Elementary Linear Algebra Review for Exam 2 Exam is Monday, November 16th.

Elementary Linear Algebra Review for Exam 2 Exam is Monday, November 16th. Elementary Linear Algebra Review for Exam Exam is Monday, November 6th. The exam will cover sections:.4,..4, 5. 5., 7., the class notes on Markov Models. You must be able to do each of the following. Section.4

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

1. General Vector Spaces

1. General Vector Spaces 1.1. Vector space axioms. 1. General Vector Spaces Definition 1.1. Let V be a nonempty set of objects on which the operations of addition and scalar multiplication are defined. By addition we mean a rule

More information

MATH 304 Linear Algebra Lecture 34: Review for Test 2.

MATH 304 Linear Algebra Lecture 34: Review for Test 2. MATH 304 Linear Algebra Lecture 34: Review for Test 2. Topics for Test 2 Linear transformations (Leon 4.1 4.3) Matrix transformations Matrix of a linear mapping Similar matrices Orthogonality (Leon 5.1

More information

Math 2174: Practice Midterm 1

Math 2174: Practice Midterm 1 Math 74: Practice Midterm Show your work and explain your reasoning as appropriate. No calculators. One page of handwritten notes is allowed for the exam, as well as one blank page of scratch paper.. Consider

More information

Matrix invertibility. Rank-Nullity Theorem: For any n-column matrix A, nullity A +ranka = n

Matrix invertibility. Rank-Nullity Theorem: For any n-column matrix A, nullity A +ranka = n Matrix invertibility Rank-Nullity Theorem: For any n-column matrix A, nullity A +ranka = n Corollary: Let A be an R C matrix. Then A is invertible if and only if R = C and the columns of A are linearly

More information

AMS526: Numerical Analysis I (Numerical Linear Algebra)

AMS526: Numerical Analysis I (Numerical Linear Algebra) AMS526: Numerical Analysis I (Numerical Linear Algebra) Lecture 1: Course Overview & Matrix-Vector Multiplication Xiangmin Jiao SUNY Stony Brook Xiangmin Jiao Numerical Analysis I 1 / 20 Outline 1 Course

More information

Math 353, Practice Midterm 1

Math 353, Practice Midterm 1 Math 353, Practice Midterm Name: This exam consists of 8 pages including this front page Ground Rules No calculator is allowed 2 Show your work for every problem unless otherwise stated Score 2 2 3 5 4

More information

MATH 167: APPLIED LINEAR ALGEBRA Least-Squares

MATH 167: APPLIED LINEAR ALGEBRA Least-Squares MATH 167: APPLIED LINEAR ALGEBRA Least-Squares October 30, 2014 Least Squares We do a series of experiments, collecting data. We wish to see patterns!! We expect the output b to be a linear function of

More information

EXAM. Exam 1. Math 5316, Fall December 2, 2012

EXAM. Exam 1. Math 5316, Fall December 2, 2012 EXAM Exam Math 536, Fall 22 December 2, 22 Write all of your answers on separate sheets of paper. You can keep the exam questions. This is a takehome exam, to be worked individually. You can use your notes.

More information

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET

IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET IMPORTANT DEFINITIONS AND THEOREMS REFERENCE SHEET This is a (not quite comprehensive) list of definitions and theorems given in Math 1553. Pay particular attention to the ones in red. Study Tip For each

More information

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det

1. What is the determinant of the following matrix? a 1 a 2 4a 3 2a 2 b 1 b 2 4b 3 2b c 1. = 4, then det What is the determinant of the following matrix? 3 4 3 4 3 4 4 3 A 0 B 8 C 55 D 0 E 60 If det a a a 3 b b b 3 c c c 3 = 4, then det a a 4a 3 a b b 4b 3 b c c c 3 c = A 8 B 6 C 4 D E 3 Let A be an n n matrix

More information

MATH 2210Q MIDTERM EXAM I PRACTICE PROBLEMS

MATH 2210Q MIDTERM EXAM I PRACTICE PROBLEMS MATH Q MIDTERM EXAM I PRACTICE PROBLEMS Date and place: Thursday, November, 8, in-class exam Section : : :5pm at MONT Section : 9: :5pm at MONT 5 Material: Sections,, 7 Lecture 9 8, Quiz, Worksheet 9 8,

More information

(a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? Solution: dim N(A) 1, since rank(a) 3. Ax =

(a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? Solution: dim N(A) 1, since rank(a) 3. Ax = . (5 points) (a) If A is a 3 by 4 matrix, what does this tell us about its nullspace? dim N(A), since rank(a) 3. (b) If we also know that Ax = has no solution, what do we know about the rank of A? C(A)

More information

Problem Set (T) If A is an m n matrix, B is an n p matrix and D is a p s matrix, then show

Problem Set (T) If A is an m n matrix, B is an n p matrix and D is a p s matrix, then show MTH 0: Linear Algebra Department of Mathematics and Statistics Indian Institute of Technology - Kanpur Problem Set Problems marked (T) are for discussions in Tutorial sessions (T) If A is an m n matrix,

More information

No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question.

No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question. Math 304 Final Exam (May 8) Spring 206 No books, no notes, no calculators. You must show work, unless the question is a true/false, yes/no, or fill-in-the-blank question. Name: Section: Question Points

More information

Chapter 3 Transformations

Chapter 3 Transformations Chapter 3 Transformations An Introduction to Optimization Spring, 2014 Wei-Ta Chu 1 Linear Transformations A function is called a linear transformation if 1. for every and 2. for every If we fix the bases

More information

Math 308 Practice Test for Final Exam Winter 2015

Math 308 Practice Test for Final Exam Winter 2015 Math 38 Practice Test for Final Exam Winter 25 No books are allowed during the exam. But you are allowed one sheet ( x 8) of handwritten notes (back and front). You may use a calculator. For TRUE/FALSE

More information

Numerical Linear Algebra Chap. 2: Least Squares Problems

Numerical Linear Algebra Chap. 2: Least Squares Problems Numerical Linear Algebra Chap. 2: Least Squares Problems Heinrich Voss voss@tu-harburg.de Hamburg University of Technology Institute of Numerical Simulation TUHH Heinrich Voss Numerical Linear Algebra

More information

MATH 2360 REVIEW PROBLEMS

MATH 2360 REVIEW PROBLEMS MATH 2360 REVIEW PROBLEMS Problem 1: In (a) (d) below, either compute the matrix product or indicate why it does not exist: ( )( ) 1 2 2 1 (a) 0 1 1 2 ( ) 0 1 2 (b) 0 3 1 4 3 4 5 2 5 (c) 0 3 ) 1 4 ( 1

More information

Math 291-2: Lecture Notes Northwestern University, Winter 2016

Math 291-2: Lecture Notes Northwestern University, Winter 2016 Math 291-2: Lecture Notes Northwestern University, Winter 2016 Written by Santiago Cañez These are lecture notes for Math 291-2, the second quarter of MENU: Intensive Linear Algebra and Multivariable Calculus,

More information

EE263: Introduction to Linear Dynamical Systems Review Session 2

EE263: Introduction to Linear Dynamical Systems Review Session 2 EE263: Introduction to Linear Dynamical Systems Review Session 2 Basic concepts from linear algebra nullspace range rank and conservation of dimension EE263 RS2 1 Prerequisites We assume that you are familiar

More information

The Singular Value Decomposition

The Singular Value Decomposition The Singular Value Decomposition An Important topic in NLA Radu Tiberiu Trîmbiţaş Babeş-Bolyai University February 23, 2009 Radu Tiberiu Trîmbiţaş ( Babeş-Bolyai University)The Singular Value Decomposition

More information

Solution of Linear Equations

Solution of Linear Equations Solution of Linear Equations (Com S 477/577 Notes) Yan-Bin Jia Sep 7, 07 We have discussed general methods for solving arbitrary equations, and looked at the special class of polynomial equations A subclass

More information

Least squares problems Linear Algebra with Computer Science Application

Least squares problems Linear Algebra with Computer Science Application Linear Algebra with Computer Science Application April 8, 018 1 Least Squares Problems 11 Least Squares Problems What do you do when Ax = b has no solution? Inconsistent systems arise often in applications

More information

Linear Algebra. Paul Yiu. Department of Mathematics Florida Atlantic University. Fall A: Inner products

Linear Algebra. Paul Yiu. Department of Mathematics Florida Atlantic University. Fall A: Inner products Linear Algebra Paul Yiu Department of Mathematics Florida Atlantic University Fall 2011 6A: Inner products In this chapter, the field F = R or C. We regard F equipped with a conjugation χ : F F. If F =

More information

Lecture notes on Quantum Computing. Chapter 1 Mathematical Background

Lecture notes on Quantum Computing. Chapter 1 Mathematical Background Lecture notes on Quantum Computing Chapter 1 Mathematical Background Vector states of a quantum system with n physical states are represented by unique vectors in C n, the set of n 1 column vectors 1 For

More information

0 2 0, it is diagonal, hence diagonalizable)

0 2 0, it is diagonal, hence diagonalizable) MATH 54 TRUE/FALSE QUESTIONS FOR MIDTERM 2 SOLUTIONS PEYAM RYAN TABRIZIAN 1. (a) TRUE If A is diagonalizable, then A 3 is diagonalizable. (A = P DP 1, so A 3 = P D 3 P = P D P 1, where P = P and D = D

More information