Lecture notes: Applied linear algebra Part 1. Version 2


Michael Karow
Berlin University of Technology
karow@math.tu-berlin.de
October 2, 2008

1 Notation, basic notions and facts

1.1 Subspaces, range and kernel of a matrix

In the following, $\mathbb{F}^m$ denotes the set of column vectors of length $m$ with entries in $\mathbb{F} = \mathbb{R}$ (real numbers) or $\mathbb{F} = \mathbb{C}$ (complex numbers). A subset $\mathcal{V} \subseteq \mathbb{F}^m$ is called a subspace if $v_1, v_2 \in \mathcal{V}$ implies $\lambda_1 v_1 + \lambda_2 v_2 \in \mathcal{V}$ for all $\lambda_1, \lambda_2 \in \mathbb{F}$. Each subspace contains the zero vector. The sets $\{0\}$ (trivial space) and $\mathbb{F}^m$ are subspaces.

A subset $\{v_1, \ldots, v_p\}$ of a subspace $\mathcal{V}$ is said to be a basis of $\mathcal{V}$ if each $v \in \mathcal{V}$ can be written in the form $v = v_1 x_1 + v_2 x_2 + \cdots + v_p x_p$ with unique coefficients $x_k \in \mathbb{F}$. If this is the case then in particular the relation $0 = v_1 x_1 + v_2 x_2 + \cdots + v_p x_p$ only holds if $x_1 = x_2 = \ldots = x_p = 0$. Hence the elements of a basis are linearly independent. Each basis of $\mathcal{V}$ has the same number of elements. This number is called the dimension of the subspace.

Let $A = [a_{jk}] \in \mathbb{F}^{m \times n}$ be a matrix with $m$ rows and $n$ columns whose entries are elements of $\mathbb{F}$. Then the sets
$$N_{\mathbb{F}}(A) = \{x \in \mathbb{F}^n : Ax = 0\}, \qquad R_{\mathbb{F}}(A) = \{Ax : x \in \mathbb{F}^n\}$$
are subspaces. $N_{\mathbb{F}}(A)$ is called the nullspace (synonym: kernel) of $A$. $R_{\mathbb{F}}(A)$ is called the range (synonym: image) of $A$. In the sequel we omit the subscript $\mathbb{F}$.

Let $A = [a_1, \ldots, a_n] \in \mathbb{F}^{m \times n}$, $a_k \in \mathbb{F}^m$, and $x = [x_1, \ldots, x_n]^T$, $x_k \in \mathbb{F}$. Then
$$Ax = a_1 x_1 + a_2 x_2 + \cdots + a_n x_n \in \mathbb{F}^m.$$
Thus $Ax$ is a linear combination of the columns of $A$ with coefficients $x_k$, and the vector space $R(A)$ is the set of all these linear combinations. One says that $R(A)$ is the vector space spanned by the columns of $A$. Therefore $R(A)$ is sometimes denoted as $\operatorname{span}(a_1, \ldots, a_n) := R(A)$.
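
Numerical aside (my own illustration, not from the notes): NumPy can compute the rank and a nullspace basis directly. Using singular vectors for the nullspace anticipates Section 6; the rank tolerance is NumPy's default.

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])          # A in R^{2x3}

r = np.linalg.matrix_rank(A)          # dim R(A)
_, s, Vh = np.linalg.svd(A)
N = Vh[r:].conj().T                   # columns span N(A)

print("rank:", r)                     # 2, so dim N(A) = 3 - 2 = 1
print("A @ N ~ 0:", np.allclose(A @ N, 0))
```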

The maximum number of linearly independent columns of $A$ is said to be the rank of $A$. The rank of $A$ equals the dimension of $R(A)$. It can be shown that
$$\dim(R(A)) + \dim(N(A)) = n \qquad (n = \text{number of columns of } A).$$
In particular, we have the equivalences
$$\operatorname{rank} A = n \iff \dim(R(A)) = n \iff N(A) = \{0\}$$
$$\iff \text{the columns of } A \text{ are linearly independent} \iff \text{the columns of } A \text{ form a basis of } R(A)$$
$$\overset{m=n}{\iff} R(A) = \mathbb{F}^n \overset{m=n}{\iff} A \text{ is invertible (synonym: nonsingular), i.e. } A^{-1} \text{ exists.}$$

Notation: In the following $\mathcal{V} \oplus \mathcal{U}$ denotes the direct sum of the subspaces $\mathcal{V}, \mathcal{U}$. Recall: a sum of subspaces is direct if $\mathcal{V} \cap \mathcal{U} = \{0\}$.

1.2 Scalar product, adjoint, unitary and Hermitian matrices

Let $V$ be a vector space over the field $\mathbb{F}$, $\mathbb{F} = \mathbb{R}$ or $\mathbb{C}$. A scalar product on $V$ is an $\mathbb{F}$-valued function $\langle \cdot, \cdot \rangle : V \times V \to \mathbb{F}$ with the following properties. For all $u, v, w \in V$, $\lambda \in \mathbb{F}$:
1. $\langle v, v \rangle > 0$ for $v \neq 0$ (positive definiteness)
2. $\langle v, w \rangle = \overline{\langle w, v \rangle}$ (conjugate symmetry)
3. $\langle \lambda v, w \rangle = \bar{\lambda}\, \langle v, w \rangle$
4. $\langle v, \lambda w \rangle = \langle v, w \rangle\, \lambda$
5. $\langle u + v, w \rangle = \langle u, w \rangle + \langle v, w \rangle$
6. $\langle u, v + w \rangle = \langle u, v \rangle + \langle u, w \rangle$

Vectors $v, w \in V$ are orthogonal (synonym: perpendicular) if $\langle v, w \rangle = 0$. The associated norm (also called the Euclidean length) is defined as $\|v\| = \sqrt{\langle v, v \rangle}$. If $\|v\| = 1$ then $v$ is called a unit vector.

If not otherwise stated, in this lecture the scalar product is the standard scalar product on $V = \mathbb{C}^n$:
$$\langle v, w \rangle := v^* w = \sum_{k=1}^n \bar{v}_k w_k, \qquad \text{where } v = [v_1 \ldots v_n]^T,\ w = [w_1 \ldots w_n]^T.$$
Here $v^*$ denotes the conjugate transpose of the column vector $v$. More generally, $A^* = \bar{A}^T \in \mathbb{C}^{n \times m}$ denotes the adjoint (= conjugate transpose) of the matrix $A = [a_{jk}] \in \mathbb{C}^{m \times n}$. If $A \in \mathbb{R}^{m \times n}$ is a matrix with real entries then $A^* = A^T$.
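
A quick check (my own) that NumPy's `vdot` realizes the standard scalar product above, including conjugate-linearity in the first argument; the sample vectors and scalar are arbitrary.

```python
import numpy as np

v = np.array([1 + 2j, 3 - 1j])
w = np.array([2 - 1j, 1 + 1j])
lam = 0.5 - 2j

assert np.isclose(np.vdot(v, w), v.conj() @ w)                        # <v,w> = v* w
assert np.isclose(np.vdot(lam * v, w), np.conj(lam) * np.vdot(v, w))  # axiom 3
assert np.isclose(np.vdot(v, lam * w), np.vdot(v, w) * lam)           # axiom 4
assert np.isclose(np.linalg.norm(v), np.sqrt(np.vdot(v, v).real))     # ||v||
```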

We have:
1. $\langle Av, w \rangle = \langle v, A^* w \rangle$ for all $A \in \mathbb{C}^{n \times n}$, $v, w \in \mathbb{C}^n$.
2. $(AB)^* = B^* A^*$, $(A^*)^* = A$, $(\lambda A)^* = \bar{\lambda} A^*$ for all $A \in \mathbb{C}^{m \times n}$, $B \in \mathbb{C}^{n \times p}$, $\lambda \in \mathbb{C}$.

Let $a_1, \ldots, a_n \in \mathbb{C}^m$ be the columns of $A \in \mathbb{C}^{m \times n}$, i.e. $A = [a_1, \ldots, a_n]$. Analogously $B = [b_1, \ldots, b_n] \in \mathbb{C}^{m \times n}$. It is easily verified that
$$A^* B = \begin{bmatrix} a_1^* \\ \vdots \\ a_n^* \end{bmatrix} [b_1, \ldots, b_n]
= \begin{bmatrix} a_1^* b_1 & a_1^* b_2 & \cdots & a_1^* b_n \\ a_2^* b_1 & & & \vdots \\ \vdots & & & \vdots \\ a_n^* b_1 & \cdots & \cdots & a_n^* b_n \end{bmatrix}
= \begin{bmatrix} \langle a_1, b_1 \rangle & \cdots & \langle a_1, b_n \rangle \\ \vdots & & \vdots \\ \langle a_n, b_1 \rangle & \cdots & \langle a_n, b_n \rangle \end{bmatrix}.$$
Thus $A^* B$ is the matrix of scalar products of the columns of $A$ with the columns of $B$. The matrix $A^* A = [\langle a_j, a_k \rangle] \in \mathbb{C}^{n \times n}$ is called the Gramian of the columns of $A$. The relation $A^* A = I$ states that
$$\langle a_j, a_k \rangle = \begin{cases} 1 & \text{if } j = k, \\ 0 & \text{otherwise.} \end{cases}$$
In words, we have $A^* A = I$ if and only if the columns of $A$ are pairwise orthogonal unit vectors. A square matrix $A \in \mathbb{C}^{n \times n}$ with $A^* A = I$ is said to be unitary. For a unitary matrix we have $A^* = A^{-1}$.

Exercise 1.1 (2+2 points) Let $A = [a_1, \ldots, a_n] \in \mathbb{C}^{m \times n}$, $B = [b_1, \ldots, b_p] \in \mathbb{C}^{q \times p}$ and $X = [x_{jk}] \in \mathbb{C}^{n \times p}$.
(a) Verify that
$$A X B^* = \sum_{j=1}^n \sum_{k=1}^p x_{jk}\, a_j b_k^*. \qquad (1)$$
In particular $A B^* = \sum_{k=1}^n a_k b_k^*$ if $n = p$.
(b) Suppose $A^* A = I$. Show that
$$v = \sum_{k=1}^n a_k \langle a_k, v \rangle \quad \text{for all } v \in R(A).$$

A square matrix $A \in \mathbb{C}^{n \times n}$ is said to be Hermitian if $A^* = A$.

Exercise 1.2 (1 point) Let $A \in \mathbb{C}^{n \times n}$ be Hermitian. Show that $x^* A x \in \mathbb{R}$ for any $x \in \mathbb{C}^n$.

A Hermitian matrix $A$ is said to be positive semi-definite if $x^* A x \geq 0$ for all $x \in \mathbb{C}^n$, and
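
A small sketch (my own) of the Gramian viewpoint: $A^*A$ collects the pairwise scalar products of the columns, and equals $I$ exactly when the columns are orthonormal. `np.linalg.qr` is used here only as a convenient source of orthonormal columns.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 3)) + 1j * rng.standard_normal((6, 3))

G = A.conj().T @ A                                   # Gramian [<a_j, a_k>]
assert np.isclose(G[0, 1], np.vdot(A[:, 0], A[:, 1]))

Q, _ = np.linalg.qr(A)                               # orthonormal columns
assert np.allclose(Q.conj().T @ Q, np.eye(3))        # Gramian of Q is I
```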

positive definite if $x^* A x > 0$ for all $x \in \mathbb{C}^n$, $x \neq 0$.

Exercise 1.3 (4 points) Verify the following facts for $A \in \mathbb{C}^{m \times n}$:
1. $\|Ax\|^2 = \langle x, A^* A x \rangle$ for all $x \in \mathbb{C}^n$.
2. $A^* A$ is positive semi-definite.
3. $A^* A$ is positive definite $\iff N(A) = \{0\} \iff A^* A$ is invertible (nonsingular).
4. $N(A^* A) = N(A)$.
5. $\operatorname{rank}(A^* A) = \operatorname{rank}(A)$.

For a subspace $\mathcal{V} \subseteq \mathbb{F}^m$ we define its orthogonal complement as
$$\mathcal{V}^\perp := \{u \in \mathbb{F}^m : \langle u, v \rangle = 0 \text{ for all } v \in \mathcal{V}\}.$$
We always have $\mathbb{F}^m = \mathcal{V} \oplus \mathcal{V}^\perp$.

2 The least squares problem

Let $A \in \mathbb{C}^{m \times n}$, $b \in \mathbb{C}^m$ and let $\mathcal{V}$ be a subspace of $\mathbb{C}^m$. We consider the functions
$$f : \mathcal{V} \to \mathbb{R}, \quad f(v) := \|v - b\|, \qquad g : \mathbb{C}^n \to \mathbb{R}, \quad g(x) := \|Ax - b\|.$$
The least squares problem is to find the minimizers of $f$ and $g$.

Lemma 2.1 Let $v_0 \in \mathcal{V}$. Then $v_0$ is the unique minimizer of $f$ if and only if $\langle v_0 - b, h \rangle = 0$ for all $h \in \mathcal{V}$.

Proof: For any $v_0, h \in \mathcal{V}$, $\lambda \in \mathbb{C}$ we have
$$f(v_0 + \lambda h)^2 = \langle v_0 + \lambda h - b,\ v_0 + \lambda h - b \rangle = \langle v_0 - b, v_0 - b \rangle + \langle v_0 - b, \lambda h \rangle + \langle \lambda h, v_0 - b \rangle + \langle \lambda h, \lambda h \rangle = f(v_0)^2 + 2 \operatorname{Re}\big(\lambda \underbrace{\langle v_0 - b, h \rangle}_{(*)}\big) + |\lambda|^2 \|h\|^2.$$
If $(*) \neq 0$ for some $h$ then we can find a $\lambda \in \mathbb{C}$ of sufficiently small modulus and suitable phase angle such that $f(v_0 + \lambda h) < f(v_0)$. Hence in this case $v_0$ is not a minimizer. If on the other hand $(*) = 0$ for all $h \in \mathcal{V}$ then it follows that $f(v_0 + h) > f(v_0)$ for $h \neq 0$. So $v_0$ is the unique minimizer of $f$. □
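
Numerical companion to Lemma 2.1 with $\mathcal{V} = R(A)$ (my own sketch): the least squares residual is orthogonal to the range of $A$. `np.linalg.lstsq` is NumPy's least squares solver; the comparison with the normal equation anticipates Corollary 2.2 below.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)

x0, *_ = np.linalg.lstsq(A, b, rcond=None)        # minimizer of ||Ax - b||

# Residual is orthogonal to R(A), i.e. A*(A x0 - b) = 0 ...
assert np.allclose(A.T @ (A @ x0 - b), 0, atol=1e-10)
# ... equivalently, x0 solves the normal equation A*A x = A*b:
assert np.allclose(x0, np.linalg.solve(A.T @ A, A.T @ b))
```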

Corollary 2.2 A vector $x_0 \in \mathbb{C}^n$ is a minimizer of $g$ if and only if
$$(A^* A)\, x_0 = A^* b \qquad \text{(this is called the normal equation).}$$

Proof: Applying Lemma 2.1 to $\mathcal{V} = R(A)$ we conclude that $x_0$ is a minimizer if and only if
$$0 = \langle Ax_0 - b,\ A\xi \rangle = \langle A^*(Ax_0 - b),\ \xi \rangle \quad \text{for all } \xi \in \mathbb{C}^n.$$
The latter holds if and only if $0 = A^*(Ax_0 - b) = A^* A x_0 - A^* b$. □

Proposition 2.3 Suppose $A \in \mathbb{C}^{m \times n}$ has linearly independent columns and $R(A) = \mathcal{V}$ (i.e. the columns of $A$ form a basis of $\mathcal{V}$). Then $A^* A$ is nonsingular, and the unique minimizer of $g$ is $x_0 = (A^* A)^{-1} A^* b$. The unique minimizer of $f$ is $Pb$, where $P := A (A^* A)^{-1} A^*$. The matrix $P$ is called the orthogonal projector onto $\mathcal{V}$.

Proof: The nonsingularity of $A^* A$ is shown in Exercise 1.3. The rest is obvious. □

Note that the orthogonal projector $P$ is Hermitian: $P = P^*$.

Special cases:
- If $A$ has only one column, $A = a \in \mathbb{C}^m \setminus \{0\}$, then
$$Pv = \frac{a a^*}{\|a\|^2}\, v = \frac{\langle a, v \rangle}{\|a\|^2}\, a.$$
- Suppose the columns of $A = [a_1, \ldots, a_n]$ form an orthonormal basis of $R(A)$ (i.e. $A^* A = I$). Then
$$Pv = A A^* v = \sum_{k=1}^n a_k a_k^* v = \sum_{k=1}^n \langle a_k, v \rangle\, a_k. \qquad (2)$$

Further remarks: Lemma 2.1 and its proof also hold in infinite dimensional Hilbert spaces. However, a minimizer $v_0$ may not exist if $\mathcal{V}$ is not a closed subspace. Since every subspace of $\mathbb{C}^m$ has a finite basis, the problem of finding the minimizers of $f$ is (in principle) solved by Proposition 2.3: just find a basis and compute the unique minimizer $Pb$. If $A$ does not have full column rank then the minimizer of $g$ is not unique. The set of minimizers is the affine space $\hat{x}_0 + N(A^* A)$, where $\hat{x}_0$ is any solution of the normal equation.
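
A check (my own) of Proposition 2.3: $P = A(A^*A)^{-1}A^*$ is idempotent, Hermitian, and fixes $R(A)$. The example matrix is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 2))            # full column rank (generically)

P = A @ np.linalg.solve(A.T @ A, A.T)      # orthogonal projector onto R(A)

assert np.allclose(P @ P, P)               # idempotent
assert np.allclose(P, P.T)                 # Hermitian (symmetric, real case)
assert np.allclose(P @ A, A)               # fixes R(A)
```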

Exercise 2.4 (8 points) Definition: Let $\mathcal{V}$ and $\mathcal{U}$ be subspaces of $\mathbb{C}^m$ such that $\mathbb{C}^m = \mathcal{V} \oplus \mathcal{U}$. A matrix $P \in \mathbb{C}^{m \times m}$ is said to be a projector onto $\mathcal{V}$ along $\mathcal{U}$ if $Pv = v$ for all $v \in \mathcal{V}$ and $Pu = 0$ for all $u \in \mathcal{U}$. A projector is said to be orthogonal if $\mathcal{U} = \mathcal{V}^\perp$. Prove the statements (a)-(d) below.
(a) The following assertions are equivalent for $P \in \mathbb{C}^{m \times m}$:
(1) $P^2 = P$.
(2) $\mathbb{C}^m = R(P) \oplus N(P)$ and $P$ is the projector onto $R(P)$ along $N(P)$.
(b) A projector $P$ is orthogonal if and only if it is Hermitian ($P = P^*$).
(c) Let $V = [v_1, \ldots, v_r] \in \mathbb{C}^{m \times r}$ and $W = [w_1, \ldots, w_r] \in \mathbb{C}^{m \times r}$ be such that
$$\langle w_j, v_k \rangle = \begin{cases} 1 & \text{if } j = k, \\ 0 & \text{otherwise} \end{cases} \qquad \text{(this is called biorthogonality).}$$
Then the matrix $P := V W^*$ is a projector onto $R(V)$ along $R(W)^\perp$.
(d) Let $\mathbb{C}^m = \mathcal{V} \oplus \mathcal{U}$. Suppose the columns of $V \in \mathbb{C}^{m \times r}$ form a basis of $\mathcal{V}$ and the columns of $Z \in \mathbb{C}^{m \times r}$ form a basis of $\mathcal{U}^\perp$. Then $Z^* V$ is nonsingular and $P := V (Z^* V)^{-1} Z^*$ is the projector onto $\mathcal{V}$ along $\mathcal{U}$.

3 The QR-decomposition

We are going to show the following result.

Theorem 3.1 Let $A \in \mathbb{F}^{m \times n}$, $m \geq n$. There exists a unitary matrix $Q \in \mathbb{C}^{m \times m}$ and an upper triangular matrix $R = [r_{jk}] \in \mathbb{C}^{n \times n}$ such that
$$A = Q \begin{bmatrix} R \\ 0 \end{bmatrix} \qquad \text{(QR-decomposition of } A\text{).} \qquad (3)$$
(If $m = n$ then the $0$ block below $R$ is not present and we have $A = QR$.)

Let $Q = [q_1, \ldots, q_m] \in \mathbb{F}^{m \times m}$. Then the identity (3) states
$$a_1 = q_1 r_{11}, \qquad a_2 = q_1 r_{12} + q_2 r_{22}, \qquad a_3 = q_1 r_{13} + q_2 r_{23} + q_3 r_{33}, \qquad \ldots$$
$$a_k = \sum_{j=1}^k q_j r_{jk}, \qquad k = 1, \ldots, n.$$
If the columns of $A$ are linearly independent then the first $n$ columns of $Q$ can be found by Gram-Schmidt orthogonalization:
$$q_1 = \frac{a_1}{\|a_1\|}, \qquad q_2 = \frac{a_2 - q_1 \langle q_1, a_2 \rangle}{\|a_2 - q_1 \langle q_1, a_2 \rangle\|}, \qquad q_3 = \frac{a_3 - q_1 \langle q_1, a_3 \rangle - q_2 \langle q_2, a_3 \rangle}{\|a_3 - q_1 \langle q_1, a_3 \rangle - q_2 \langle q_2, a_3 \rangle\|}, \qquad \ldots$$
$$q_k = \frac{a_k - \sum_{j=1}^{k-1} q_j \langle q_j, a_k \rangle}{\left\| a_k - \sum_{j=1}^{k-1} q_j \langle q_j, a_k \rangle \right\|}, \qquad k = 1, \ldots, n.$$
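
The recursion above translates directly into code. A minimal sketch (my own, classical Gram-Schmidt; as remarked below, it is not numerically stable and breaks down for rank-deficient $A$):

```python
import numpy as np

def gram_schmidt(A):
    """Orthonormalize the columns of A (assumed linearly independent)."""
    m, n = A.shape
    Q = np.zeros((m, n), dtype=complex)
    for k in range(n):
        v = A[:, k].astype(complex)
        for j in range(k):                        # subtract q_j <q_j, a_k>
            v = v - Q[:, j] * np.vdot(Q[:, j], A[:, k])
        Q[:, k] = v / np.linalg.norm(v)           # division by zero if dependent
    return Q

A = np.array([[1., 1.], [1., 0.], [0., 1.]])
Q = gram_schmidt(A)
assert np.allclose(Q.conj().T @ Q, np.eye(2))     # orthonormal columns
```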

The remaining columns of $Q$ could then be computed by Gram-Schmidt orthogonalization of any basis of $R(A)^\perp$. However, this method for constructing $Q$ is numerically not stable. Moreover, the method fails if $A$ does not have full column rank (since then a division by zero occurs). We will use an alternative method that uses Householder matrices.

Definition: A matrix $H \in \mathbb{C}^{m \times m}$ of the form
$$H = I - 2\, \frac{a a^*}{\|a\|^2}, \qquad a \in \mathbb{C}^m \setminus \{0\},$$
is called a Householder matrix. We have $Ha = -a$ and $Hv = v$ if $\langle v, a \rangle = 0$. Thus, multiplication with $H$ is a reflection at the subspace $(\mathbb{C}\, a)^\perp$.

Exercise 3.2 (2 points) Show that $H$ is both Hermitian and unitary.

Lemma 3.3 Let $x, y \in \mathbb{C}^m$. If $x \neq y$, $\|x\| = \|y\|$ and $x^* y \in \mathbb{R}$, then the Householder matrix
$$H = I - 2\, \frac{(x - y)(x - y)^*}{\|x - y\|^2}$$
satisfies $Hx = y$ and $Hy = x$.

Proof: We have
$$\|x - y\|^2 = (x - y)^*(x - y) = x^* x - x^* y - y^* x + y^* y = 2(x^* x - y^* x) = 2 (x - y)^* x$$
(using $y^* y = x^* x$ and $x^* y = y^* x$ since $x^* y \in \mathbb{R}$), and analogously $\|x - y\|^2 = 2(y^* y - x^* y) = 2 (y - x)^* y$. Hence $Hx = x - (x - y) = y$ and $Hy = y + (x - y) = x$. This implies the claim. □

Proof of Theorem 3.1: We proceed by induction on $m$. The case $m = 1$ is trivial. Let $m \geq 2$ and $e_1 = [1, 0, \ldots, 0]^T \in \mathbb{C}^m$. If the first column $a_1$ of $A$ satisfies $a_1 = r_{11} e_1$ for some $r_{11} \in \mathbb{C}$, let $H = I$. Otherwise choose the factor $r_{11} \in \mathbb{C}$ such that $(r_{11} e_1)^* a_1 \in \mathbb{R}$ and $|r_{11}| = \|a_1\|$. Then there exists a Householder matrix $H$ with $H a_1 = r_{11} e_1$. In both cases we have
$$HA = \begin{bmatrix} r_{11} & * \\ 0 & \hat{A} \end{bmatrix}, \qquad \hat{A} \in \mathbb{C}^{(m-1) \times (n-1)}.$$
By the induction assumption we have $\hat{A} = \hat{Q} \begin{bmatrix} \hat{R} \\ 0 \end{bmatrix}$ with a unitary matrix $\hat{Q}$ and an upper triangular matrix $\hat{R}$. Thus
$$A = H \begin{bmatrix} r_{11} & * \\ 0 & \hat{Q} \begin{bmatrix} \hat{R} \\ 0 \end{bmatrix} \end{bmatrix} = \underbrace{H \begin{bmatrix} 1 & 0 \\ 0 & \hat{Q} \end{bmatrix}}_{=:Q} \begin{bmatrix} r_{11} & * \\ 0 & \hat{R} \\ 0 & 0 \end{bmatrix}. \qquad \Box$$
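
A minimal Householder QR along the lines of the induction proof (my own sketch, real case for simplicity; choosing the sign of $r_{11}$ opposite to the leading entry avoids cancellation and is a standard numerical convention, not something the proof requires):

```python
import numpy as np

def householder_qr(A):
    """Return Q unitary (here: real orthogonal) and R with A = Q R."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(min(n, m - 1)):
        a = R[k:, k]
        r11 = -np.sign(a[0]) * np.linalg.norm(a) if a[0] != 0 else np.linalg.norm(a)
        v = a.copy()
        v[0] -= r11                               # v = a - r11 e1
        if np.linalg.norm(v) > 0:
            H = np.eye(m - k) - 2.0 * np.outer(v, v) / (v @ v)
            R[k:, :] = H @ R[k:, :]               # H a = r11 e1 (Lemma 3.3)
            Q[:, k:] = Q[:, k:] @ H               # accumulate Q
    return Q, R

A = np.random.default_rng(4).standard_normal((5, 3))
Q, R = householder_qr(A)
assert np.allclose(Q @ R, A) and np.allclose(Q.T @ Q, np.eye(5))
assert np.allclose(np.tril(R, -1), 0)             # R is upper triangular
```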

Exercise 3.4 (3 points) Suppose $A$ has linearly independent columns and has the QR-factorization (3). Write $Q$ in the form $Q = [Q_1, Q_2]$ with $Q_1 \in \mathbb{C}^{m \times n}$, $Q_2 \in \mathbb{C}^{m \times (m-n)}$. Show that the unique solutions of the least squares problems in Section 2 are given by
$$v_0 = Q_1 Q_1^* b, \qquad x_0 = R^{-1} Q_1^* b.$$

4 The Schur decomposition

Theorem 4.1 To any $A \in \mathbb{C}^{n \times n}$ there exists a unitary matrix $V \in \mathbb{C}^{n \times n}$ and an upper triangular matrix $T$ such that
$$A = V T V^* \qquad \text{(Schur decomposition).}$$

Proof: The proof is by induction. The statement is trivial for $1 \times 1$ matrices. Let $n \geq 2$ and let $v$ be an eigenvector of $A$ (normalized so that $\|v\| = 1$) such that $Av = \lambda v$. Choose an orthonormal basis $v_1, v_2, \ldots, v_n$ of $\mathbb{C}^n$ such that $v_1 = v$. We have $A v_k = \xi_k v_1 + \sum_{j=2}^n x_{jk} v_j$ for some $\xi_k, x_{jk} \in \mathbb{C}$. Let $\xi = [\xi_2, \ldots, \xi_n]$, $X = [x_{jk}]$. Then
$$A \underbrace{[v_1, v_2, \ldots, v_n]}_{=: V_1} = [A v_1, A v_2, \ldots, A v_n] = [\lambda v_1, A v_2, \ldots, A v_n] = [v_1, v_2, \ldots, v_n] \begin{bmatrix} \lambda & \xi \\ 0 & X \end{bmatrix}.$$
By the induction assumption there exists a unitary matrix $V_2$ and an upper triangular matrix $T_2$ such that $X = V_2 T_2 V_2^*$. Hence,
$$A = V_1 \begin{bmatrix} \lambda & \xi \\ 0 & X \end{bmatrix} V_1^* = V_1 \begin{bmatrix} \lambda & \xi \\ 0 & V_2 T_2 V_2^* \end{bmatrix} V_1^* = \underbrace{V_1 \begin{bmatrix} 1 & 0 \\ 0 & V_2 \end{bmatrix}}_{=:V} \underbrace{\begin{bmatrix} \lambda & \xi V_2 \\ 0 & T_2 \end{bmatrix}}_{=:T} \underbrace{\begin{bmatrix} 1 & 0 \\ 0 & V_2^* \end{bmatrix} V_1^*}_{=V^*}.$$
It is easily verified that $V$ is unitary. □

5 Normal matrices

A matrix $A \in \mathbb{C}^{n \times n}$ is said to be normal if it commutes with its adjoint, i.e. if the identity $A A^* = A^* A$ holds. Hermitian and unitary matrices are normal.

Proposition 5.1 A matrix $A \in \mathbb{C}^{n \times n}$ is normal if and only if there exists a unitary matrix $V \in \mathbb{C}^{n \times n}$ and a diagonal matrix $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_n) \in \mathbb{C}^{n \times n}$ such that
$$A = V \Lambda V^*. \qquad (4)$$

Exercise 5.2 (2 points) Prove Proposition 5.1. Hint: First show that if $A = V B V^*$ with unitary $V$, then $A$ is normal if and only if $B$ is normal. Then use the Schur decomposition and show that a triangular matrix $T$ is normal if and only if it is diagonal.
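
A numerical companion (my own) to Theorem 4.1 and Exercise 5.2, assuming SciPy is available: `scipy.linalg.schur` with `output='complex'` returns $T$ upper triangular and $V$ unitary with $A = V T V^*$.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))
T, V = schur(A, output='complex')

assert np.allclose(V @ T @ V.conj().T, A)          # A = V T V*
assert np.allclose(np.tril(T, -1), 0)              # T is upper triangular
assert np.allclose(V.conj().T @ V, np.eye(4))      # V is unitary

# For a normal matrix the triangular factor is diagonal (cf. Exercise 5.2):
N = A + A.T                                        # symmetric, hence normal
TN, _ = schur(N, output='complex')
assert np.allclose(np.tril(TN, -1) + np.triu(TN, 1), 0, atol=1e-10)
```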

Recall that in the decomposition (4) the diagonal elements of $\Lambda$ are the eigenvalues of $A$ and the columns of $V$ are the associated eigenvectors. Thus Proposition 5.1 states that a matrix $A$ is normal if and only if there exists an orthonormal basis of eigenvectors of $A$. The eigenvalues can be arbitrary complex numbers. However, a normal matrix $A$ is Hermitian (unitary) if and only if all its eigenvalues are real (have modulus 1). Finally, note that (4) can be written in the form $A = \sum_{k=1}^n \lambda_k\, v_k v_k^*$. This follows from (1).

6 The singular value decomposition (SVD)

Proposition 6.1 Let $A \in \mathbb{C}^{m \times n}$, $\operatorname{rank} A = r$. Then there exist unitary matrices $V \in \mathbb{C}^{n \times n}$, $U \in \mathbb{C}^{m \times m}$ and positive numbers $\sigma_1 \geq \sigma_2 \geq \ldots \geq \sigma_r > 0$ such that
$$A = U \underbrace{\begin{bmatrix} \hat{\Sigma} & 0 \\ 0 & 0 \end{bmatrix}}_{=: \Sigma} V^*, \qquad \hat{\Sigma} = \operatorname{diag}(\sigma_1, \ldots, \sigma_r). \qquad (5)$$
In this factorization the numbers $\sigma_k$ are unique. They are called the singular values of $A$. Let $\lambda_1(\cdot) \geq \lambda_2(\cdot) \geq \ldots$ denote the eigenvalues of a Hermitian matrix in decreasing order. Then
$$\sigma_k = \sqrt{\lambda_k(A^* A)} = \sqrt{\lambda_k(A A^*)} \qquad \text{for } k = 1, \ldots, r.$$
Convention: we define the singular values $\sigma_k$ of $A$ to be $0$ for $k > \operatorname{rank} A$.

Proof: The positive semidefinite Hermitian matrix $A^* A$ has nonnegative eigenvalues $\lambda_k \geq 0$. Define $\sigma_k = \sqrt{\lambda_k}$. Let $V = [v_1, \ldots, v_n]$ be a unitary matrix whose columns form a basis of eigenvectors such that $A^* A v_k = \sigma_k^2 v_k$ and $\sigma_1 \geq \sigma_2 \geq \ldots \geq \sigma_n$. Since $\operatorname{rank} A^* A = \operatorname{rank} A = r$ we have $\sigma_k > 0$ for $k \leq r$ and $\sigma_k = 0$ for $k > r$. For $k \leq r$ define $u_k := A v_k / \sigma_k$. Then
$$u_j^* u_k = \frac{v_j^* (A^* A) v_k}{\sigma_j \sigma_k} = \begin{cases} 1 & \text{if } j = k, \\ 0 & \text{otherwise.} \end{cases}$$
Thus, the vectors $u_1, \ldots, u_r$ are pairwise orthogonal unit vectors. Now choose $u_{r+1}, \ldots, u_m \in \mathbb{C}^m$ such that $U = [u_1, \ldots, u_m] \in \mathbb{C}^{m \times m}$ is unitary. Then we have $AV = U\Sigma$. This implies (5). □
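
A check (my own) of Proposition 6.1: the singular values are the square roots of the eigenvalues of $A^*A$, and $A$ is recovered from the rank-one sum. The rank cutoff 1e-12 is an assumption.

```python
import numpy as np

A = np.random.default_rng(6).standard_normal((5, 3))
U, s, Vh = np.linalg.svd(A)

# sigma_k = sqrt(lambda_k(A*A)), eigenvalues taken in decreasing order:
eigs = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(s, np.sqrt(np.clip(eigs, 0.0, None)))

# A = sum over k <= r of sigma_k u_k v_k*:
r = int(np.sum(s > 1e-12))
A_sum = sum(s[k] * np.outer(U[:, k], Vh[k, :]) for k in range(r))
assert np.allclose(A_sum, A)
```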

Remark: Suppose the unitary matrices
$$V = [\underbrace{v_1, \ldots, v_r}_{=: V_1}, \underbrace{v_{r+1}, \ldots, v_n}_{=: V_2}], \qquad (6)$$
$$U = [\underbrace{u_1, \ldots, u_r}_{=: U_1}, \underbrace{u_{r+1}, \ldots, u_m}_{=: U_2}] \qquad (7)$$
satisfy (5). Then
$$R(V_1) = N(A)^\perp, \qquad R(V_2) = N(A), \qquad R(U_1) = R(A), \qquad R(U_2) = R(A)^\perp.$$
Furthermore, we have
$$A = U_1 \hat{\Sigma} V_1^* = \sum_{k=1}^r \sigma_k\, u_k v_k^*.$$
(The second equation is a special case of (1).)

Exercise 6.2 (2 points) Show that the singular values of a normal matrix $A$ are the absolute values of the eigenvalues of $A$.

7 The Moore-Penrose generalized inverse

For any matrix $A \in \mathbb{C}^{m \times n}$ the linear map
$$\ell : N(A)^\perp \to R(A), \qquad x \mapsto Ax,$$
is bijective (i.e. one-to-one and onto). Hence the map
$$\operatorname{pinv} : \mathbb{C}^m \to \mathbb{C}^n, \qquad \operatorname{pinv}(y_1 + y_2) := \ell^{-1}(y_1), \quad y_1 \in R(A),\ y_2 \in R(A)^\perp,$$
is well defined. It is easily seen that pinv is linear. By elementary linear algebra there is a unique matrix $A^+ \in \mathbb{C}^{n \times m}$ such that $\operatorname{pinv}(y) = A^+ y$ for all $y \in \mathbb{C}^m$. This matrix is called the Moore-Penrose generalized inverse of $A$. It can be computed via a singular value decomposition. Precisely (with the notation of (5), (6) and (7)) we have
$$A^+ = V \begin{bmatrix} \hat{\Sigma}^{-1} & 0 \\ 0 & 0 \end{bmatrix} U^* = V_1 \hat{\Sigma}^{-1} U_1^* = \sum_{k=1}^r \sigma_k^{-1}\, v_k u_k^*.$$
The Moore-Penrose inverse yields the solution of the least squares problem even if $A$ does not have full column rank:

Proposition 7.1 Let $A \in \mathbb{C}^{m \times n}$, $b \in \mathbb{C}^m$. The set of minimizers of the function $g(x) = \|Ax - b\|$, $x \in \mathbb{C}^n$, is the affine space $A^+ b + N(A)$. Furthermore, $A^+ b$ is the minimizer with the smallest (Euclidean) norm.

Exercise 7.2 (4+1+1 points)
(a) Prove Proposition 7.1.
(b) Show that $A^+ = (A^* A)^{-1} A^*$ if the columns of $A$ are linearly independent.
(c) Show that $A^+ A$ is the orthogonal projector onto $N(A)^\perp$, and that $A A^+$ is the orthogonal projector onto $R(A)$.
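
A sketch (my own) of $A^+$ built from the SVD as above, compared against NumPy's `np.linalg.pinv`; the test matrix is deliberately rank-deficient, and the cutoff 1e-12 is an assumption.

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((6, 4)) @ np.diag([1., 1., 1., 0.])   # rank 3
b = rng.standard_normal(6)

U, s, Vh = np.linalg.svd(A)
r = int(np.sum(s > 1e-12))
A_plus = Vh[:r].T @ np.diag(1.0 / s[:r]) @ U[:, :r].T         # V1 Sigma^-1 U1*

assert np.allclose(A_plus, np.linalg.pinv(A))
x0 = A_plus @ b
assert np.allclose(A.T @ A @ x0, A.T @ b)     # x0 solves the normal equation
```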

Exercise 7.3 (6 points) Prove the following statement. Let $A \in \mathbb{C}^{m \times n}$, $B \in \mathbb{C}^{p \times q}$, $C \in \mathbb{C}^{m \times q}$. Then the matrix equation $AXB = C$ has a solution $X \in \mathbb{C}^{n \times p}$ if and only if
$$A A^+ C B^+ B = C.$$
In this case the general solution is
$$X = A^+ C B^+ + Y - A^+ A\, Y\, B B^+, \qquad Y \in \mathbb{C}^{n \times p}.$$

8 The spectral norm and the condition number of a matrix

We consider the quantities
$$\|A\| := \max_{x \in \mathbb{C}^n,\, x \neq 0} \frac{\|Ax\|}{\|x\|}, \qquad \inf(A) := \min_{x \in \mathbb{C}^n,\, x \neq 0} \frac{\|Ax\|}{\|x\|},$$
where the norms $\|x\|$ and $\|Ax\|$ are the Euclidean vector norms. From the definition it follows that
$$\inf(A)\, \|x\| \leq \|Ax\| \leq \|A\|\, \|x\| \qquad \text{for all } x \in \mathbb{C}^n.$$
The inequalities are sharp. The quantity $\|A\|$ is called the spectral norm of $A$.

Proposition 8.1 For any $A \in \mathbb{C}^{m \times n}$,
$$\|A\| = \sigma_1, \qquad \inf(A) = \sigma_n,$$
where $\sigma_1$ and $\sigma_n$ denote the maximum and the minimum (the $n$th) singular value of $A$.

Proof: Multiplication of a vector with a unitary matrix does not change its Euclidean norm. Thus
$$\frac{\|Ax\|}{\|x\|} = \frac{\|U \Sigma V^* x\|}{\|x\|} = \frac{\|\Sigma V^* x\|}{\|V^* x\|} = \frac{\|\Sigma y\|}{\|y\|} = \sqrt{\frac{\sum_{k=1}^n \sigma_k^2\, |y_k|^2}{\sum_{k=1}^n |y_k|^2}}, \qquad (*)$$
where $y = [y_1, \ldots, y_n]^T := V^* x$. The quotient $(*)$ is obviously bounded from below by the smallest singular value. It is bounded from above by the largest singular value. The bounds are attained for the vectors $y = [0, \ldots, 0, 1]^T = V^* v_n$ and $y = [1, 0, \ldots, 0]^T = V^* v_1$, respectively. □

Exercise 8.2 (2 points) Let $A \in \mathbb{C}^{n \times n}$ be nonsingular. Prove that
$$\inf(A) = \|A^{-1}\|^{-1}. \qquad (8)$$
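
Numerical illustration (my own) of Proposition 8.1: every quotient $\|Ax\|/\|x\|$ lies between $\sigma_n$ and $\sigma_1$, and NumPy's 2-norm equals $\sigma_1$.

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((4, 4))
s = np.linalg.svd(A, compute_uv=False)

quotients = [np.linalg.norm(A @ x) / np.linalg.norm(x)
             for x in rng.standard_normal((1000, 4))]

assert max(quotients) <= s[0] + 1e-12          # ||A||  = sigma_1
assert min(quotients) >= s[-1] - 1e-12         # inf(A) = sigma_n
assert np.isclose(np.linalg.norm(A, 2), s[0])  # NumPy's spectral norm
```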

For a nonsingular square matrix $A \in \mathbb{C}^{n \times n}$ the quantity
$$\kappa(A) := \|A\|\, \|A^{-1}\| = \frac{\|A\|}{\inf(A)} = \frac{\sigma_1}{\sigma_n}$$
is called its condition number. The condition number occurs in the following error bound for the solution of linear equations.

Proposition 8.3 Let $A, \hat{A} \in \mathbb{C}^{n \times n}$ be nonsingular, and let $x, \hat{x}, b \neq 0$ be such that $Ax = b$, $\hat{A}\hat{x} = b$. Then
$$\frac{\|x - \hat{x}\|}{\|\hat{x}\|} \leq \kappa(A)\, \frac{\|A - \hat{A}\|}{\|A\|}.$$

Proof: We have
$$x - \hat{x} = A^{-1} b - \hat{A}^{-1} b = (A^{-1} - \hat{A}^{-1})\, b = A^{-1} (\hat{A} - A)\, \hat{A}^{-1} b = A^{-1} (\hat{A} - A)\, \hat{x}.$$
Thus,
$$\|x - \hat{x}\| \leq \|A^{-1}\|\, \|\hat{A} - A\|\, \|\hat{x}\|.$$
This yields the result. □

The larger the condition number, the less reliable is the numerical solution of a linear equation.
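
Numerical illustration (my own) of Proposition 8.3: the relative error of the perturbed solution is bounded by $\kappa(A)\, \|A - \hat{A}\| / \|A\|$. The perturbation size 1e-6 is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(9)
A = rng.standard_normal((4, 4))
A_hat = A + 1e-6 * rng.standard_normal((4, 4))
b = rng.standard_normal(4)

x = np.linalg.solve(A, b)
x_hat = np.linalg.solve(A_hat, b)

kappa = np.linalg.cond(A, 2)                  # sigma_1 / sigma_n
lhs = np.linalg.norm(x - x_hat) / np.linalg.norm(x_hat)
rhs = kappa * np.linalg.norm(A - A_hat, 2) / np.linalg.norm(A, 2)
assert lhs <= rhs + 1e-12                     # error bound of Prop. 8.3
```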
