NORMS ON SPACE OF MATRICES

1. Operator Norms on the Space of Linear Maps

Let $A$ be an $n \times n$ real matrix and $x_0$ be a vector in $\mathbb{R}^n$. We would like to use the Picard iteration method to solve the following system of linear ordinary differential equations:
\[
(1.1) \qquad x'(t) = Ax(t), \qquad x(0) = x_0.
\]
By integrating the differential equation (1.1), we obtain that
\[
x(t) = x_0 + \int_0^t Ax(s)\,ds.
\]
Let $x_0(t) = x_0$ for $t \in [0, b]$ be the constant function and define $x_n : [0, b] \to \mathbb{R}^n$ recursively by
\[
x_{n+1}(t) = x_0 + \int_0^t Ax_n(s)\,ds.
\]
By induction, we obtain that for any $k \in \mathbb{N}$,
\[
x_k(t) = x_0 + \frac{tA}{1!}x_0 + \cdots + \frac{(tA)^k}{k!}x_0.
\]
We can rewrite $x_k(t)$ as
\[
x_k(t) = \left( I_n + \frac{tA}{1!} + \cdots + \frac{(tA)^k}{k!} \right) x_0, \qquad \text{for any } k \ge 1.
\]
Here $I_n$ denotes the $n \times n$ identity matrix. It is natural for us to ask: can we define
\[
I_n + \sum_{k=1}^{\infty} \frac{(tA)^k}{k!}\,?
\]
To define an infinite series of matrices, we need to define a norm on $M_n(\mathbb{R})$, where $M_n(\mathbb{R})$ is the space of $n \times n$ real matrices. Let us define a norm on the space $M_{m\times n}(\mathbb{R})$ of $m \times n$ matrices, where $m$ and $n$ are not necessarily equal.

Let $A$ be an $m \times n$ matrix. We define a function $L_A : \mathbb{R}^n \to \mathbb{R}^m$ by $L_A(x) = Ax$. It is routine to check that $L_A$ is a linear map. Conversely, given a linear map $T : \mathbb{R}^n \to \mathbb{R}^m$, can we find a unique $m \times n$ matrix $A$ such that $T = L_A$?

Let $T : \mathbb{R}^n \to \mathbb{R}^m$ be a linear map. For each $x \in \mathbb{R}^n$, we write $T(x) = (T_1(x), \dots, T_m(x))$. Since $T$ is linear, $T_i : \mathbb{R}^n \to \mathbb{R}$ is linear for each $1 \le i \le m$.

Lemma 1.1. Let $S : \mathbb{R}^n \to \mathbb{R}$ be a linear map. Then there exists a vector $a \in \mathbb{R}^n$ such that $S(x) = \langle a, x\rangle$ for any $x \in \mathbb{R}^n$.

Proof. Let $\beta = \{e_1, \dots, e_n\}$ be the standard basis for $\mathbb{R}^n$. We write $x = \sum_{i=1}^n x_i e_i$. By linearity of $S$, $S(x) = x_1 S(e_1) + \cdots + x_n S(e_n)$. Let $a_i = S(e_i)$ for $1 \le i \le n$. For $a = (a_1, \dots, a_n)$, we have $S(x) = a_1 x_1 + \cdots + a_n x_n = \langle a, x\rangle$.
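For readers who wish to experiment, here is a minimal numerical sketch of the Picard recursion and the induction formula above, assuming only NumPy; the matrix $A$, the initial vector $x_0$, and the values of $t$ and $k$ are illustrative choices, not taken from the notes.

```python
import numpy as np
from math import factorial

def picard_iterate(A, x0, k):
    """Return the k-th Picard iterate as a list of vector coefficients c_j,
    so that x_k(t) = sum_j c_j * t**j.

    The recursion x_{n+1}(t) = x0 + int_0^t A x_n(s) ds sends the coefficient
    c_j of t**j to A @ c_j / (j + 1), the coefficient of t**(j+1).
    """
    coeffs = [x0]                        # x_0(t) = x0, the constant function
    for _ in range(k):
        coeffs = [x0] + [A @ c / (j + 1) for j, c in enumerate(coeffs)]
    return coeffs

A = np.array([[0.0, 1.0], [-1.0, 0.0]])
x0 = np.array([1.0, 0.0])
t, k = 0.7, 8

coeffs = picard_iterate(A, x0, k)
x_k = sum(c * t**j for j, c in enumerate(coeffs))

# Induction claim: x_k(t) = (I_n + tA/1! + ... + (tA)^k / k!) x0.
partial_sum = sum(np.linalg.matrix_power(t * A, j) / factorial(j)
                  for j in range(k + 1)) @ x0
print(np.allclose(x_k, partial_sum))     # expected: True
```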

Since $T_i : \mathbb{R}^n \to \mathbb{R}$ is linear for each $1 \le i \le m$, we can choose $b_i = (a_{i1}, \dots, a_{in}) \in \mathbb{R}^n$ such that $T_i(x) = \langle b_i, x\rangle$ for any $x \in \mathbb{R}^n$. Let $A$ be the $m \times n$ matrix
\[
A = \begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{pmatrix},
\]
i.e. $A$ is the matrix whose $i$-th row vector is $b_i$. By definition, $T = L_A$.

Proposition 1.1. Let $L(\mathbb{R}^n, \mathbb{R}^m)$ be the space of linear maps from $\mathbb{R}^n$ to $\mathbb{R}^m$. Then $L(\mathbb{R}^n, \mathbb{R}^m)$ is a vector subspace of the space $F(\mathbb{R}^n, \mathbb{R}^m)$ of functions from $\mathbb{R}^n$ to $\mathbb{R}^m$.

Theorem 1.1. Define $\varphi : M_{m \times n}(\mathbb{R}) \to L(\mathbb{R}^n, \mathbb{R}^m)$ by $A \mapsto L_A$. Then $\varphi$ is a linear isomorphism.

Proposition 1.2. Let $T : \mathbb{R}^n \to \mathbb{R}^m$ be a linear map. Then there exists $M > 0$ such that
\[
\|T(x)\|_{\mathbb{R}^m} \le M\|x\|_{\mathbb{R}^n} \qquad \text{for any } x \in \mathbb{R}^n.
\]
Hence $T$ is a Lipschitz function.

Proof. Let us write $T = (T_1, \dots, T_m)$. For each $1 \le i \le m$, we choose $b_i$ so that $T_i(x) = \langle b_i, x\rangle$ for all $x \in \mathbb{R}^n$. For any $x \in \mathbb{R}^n$,
\[
\|T(x)\|_{\mathbb{R}^m}^2 = T_1(x)^2 + \cdots + T_m(x)^2 = \langle b_1, x\rangle^2 + \cdots + \langle b_m, x\rangle^2.
\]
By the Cauchy-Schwarz inequality, $|\langle b_i, x\rangle| \le \|b_i\|_{\mathbb{R}^n}\|x\|_{\mathbb{R}^n}$. Hence for any $x \in \mathbb{R}^n$,
\[
\|T(x)\|_{\mathbb{R}^m}^2 \le \left( \|b_1\|_{\mathbb{R}^n}^2 + \cdots + \|b_m\|_{\mathbb{R}^n}^2 \right) \|x\|_{\mathbb{R}^n}^2.
\]
Let $M = \left( \|b_1\|_{\mathbb{R}^n}^2 + \cdots + \|b_m\|_{\mathbb{R}^n}^2 \right)^{1/2}$. This proves the assertion.

Let $T : \mathbb{R}^n \to \mathbb{R}^m$ be a linear map and $M > 0$ be as above. For $\|x\|_{\mathbb{R}^n} = 1$, we find $\|T(x)\|_{\mathbb{R}^m} \le M$. This allows us to define the operator norm of $T$ by
\[
\|T\|_{op} = \sup_{\|x\|_{\mathbb{R}^n} = 1} \|T(x)\|_{\mathbb{R}^m}.
\]

Theorem 1.2. The space $L(\mathbb{R}^n, \mathbb{R}^m)$ of linear maps from $\mathbb{R}^n$ to $\mathbb{R}^m$ together with $\|\cdot\|_{op}$ is a real Banach space.

Proof. We leave it to the reader to verify that $\|\cdot\|_{op}$ is a norm on $L(\mathbb{R}^n, \mathbb{R}^m)$. The completeness of $L(\mathbb{R}^n, \mathbb{R}^m)$ with respect to $\|\cdot\|_{op}$ will be proved later.

It is not difficult for us to prove the following properties.

Proposition 1.3. Let $T \in L(\mathbb{R}^n, \mathbb{R}^m)$ and $S \in L(\mathbb{R}^m, \mathbb{R}^p)$. Then
(1) $\|T(x)\|_{\mathbb{R}^m} \le \|T\|_{op}\|x\|_{\mathbb{R}^n}$, and
(2) $\|S \circ T\|_{op} \le \|S\|_{op}\|T\|_{op}$.

Proof. Let us prove (1). When $x = 0$, the statement is obvious. When $x \ne 0$, let $y = x/\|x\|_{\mathbb{R}^n}$. Then $\|T(y)\|_{\mathbb{R}^m} \le \|T\|_{op}$. By linearity of $T$,
\[
\|T(y)\|_{\mathbb{R}^m} = \left\| T\!\left( \frac{x}{\|x\|_{\mathbb{R}^n}} \right) \right\|_{\mathbb{R}^m} = \frac{\|T(x)\|_{\mathbb{R}^m}}{\|x\|_{\mathbb{R}^n}} \le \|T\|_{op}.
\]
Multiplying both sides of the above inequality by $\|x\|_{\mathbb{R}^n}$, we obtain (1).

For (2), we use (1):
\[
\|(S \circ T)(x)\|_{\mathbb{R}^p} \le \|S\|_{op}\|T(x)\|_{\mathbb{R}^m} \le \|S\|_{op}\|T\|_{op}\|x\|_{\mathbb{R}^n}.
\]
Hence $\|S \circ T\|_{op} \le \|S\|_{op}\|T\|_{op}$.

Corollary 1.1. If $T : \mathbb{R}^n \to \mathbb{R}^n$ is linear, then $\|T^k\|_{op} \le \|T\|_{op}^k$ for any $k \in \mathbb{N}$.

Proof. This can be proved by induction.

For $A \in M_{m \times n}(\mathbb{R})$, we define the matrix norm of $A$ to be the operator norm of $L_A$, i.e. $\|A\| = \|L_A\|_{op}$.

Theorem 1.3. The space $M_{m \times n}(\mathbb{R})$ of $m \times n$ real matrices together with the matrix norm is a real Banach space.

Proof. For each $A \in M_{m \times n}(\mathbb{R})$, let us denote
\[
\|A\|_1 = \sum_{i=1}^m \sum_{j=1}^n |a_{ij}|, \qquad \|A\|_\infty = \max\{|a_{ij}| : 1 \le i \le m,\ 1 \le j \le n\}.
\]
For any $A \in M_{m \times n}(\mathbb{R})$,
\[
\|A\|_\infty \le \|A\| \le \|A\|_1.
\]
(See the class notes for the details of the proof of this inequality.)

Let $(A_k)$ be a Cauchy sequence in $M_{m \times n}(\mathbb{R})$. For any $\epsilon > 0$, there exists $K_\epsilon \in \mathbb{N}$ such that $\|A_k - A_l\| < \epsilon/4mn$ whenever $k, l \ge K_\epsilon$. Denote $A_k$ by $[a_{ij}(k)]$. For any $k, l \ge K_\epsilon$,
\[
|a_{ij}(k) - a_{ij}(l)| \le \|A_k - A_l\| < \frac{\epsilon}{4mn}.
\]
This implies that $(a_{ij}(k))$ is a Cauchy sequence in $\mathbb{R}$ for $1 \le i \le m$ and $1 \le j \le n$. By completeness of $\mathbb{R}$, $(a_{ij}(k))$ is convergent in $\mathbb{R}$. Denote $a_{ij} = \lim_{k\to\infty} a_{ij}(k)$ and $A = [a_{ij}]$. When $k \ge K_\epsilon$,
\[
|a_{ij}(k) - a_{ij}| = \lim_{l\to\infty} |a_{ij}(k) - a_{ij}(l)| \le \frac{\epsilon}{4mn}.
\]
For $k \ge K_\epsilon$,
\[
\|A_k - A\| \le \|A_k - A\|_1 = \sum_{i,j} |a_{ij}(k) - a_{ij}| \le \frac{\epsilon}{4} < \epsilon.
\]
This shows that $\lim_{k\to\infty} A_k = A$ in $M_{m \times n}(\mathbb{R})$. Hence $M_{m \times n}(\mathbb{R})$ is a real Banach space.
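The entrywise bounds used in this proof are easy to check numerically. The sketch below is a minimal illustration, assuming NumPy; it uses np.linalg.norm(A, 2), which returns the spectral norm of $A$, relying on the fact (only justified in Section 2 below) that this coincides with the operator norm $\|A\|$ defined here, and the matrix itself is an arbitrary random example.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))

entry_max = np.max(np.abs(A))       # the quantity written ||A||_infinity in the proof
entry_sum = np.sum(np.abs(A))       # the quantity written ||A||_1 in the proof
op_norm = np.linalg.norm(A, 2)      # operator norm, computed as the spectral norm

print(entry_max <= op_norm <= entry_sum)    # expected: True
```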

This theorem implies that $L(\mathbb{R}^n, \mathbb{R}^m)$ forms a Banach space with respect to the operator norm. Given an $n \times n$ real matrix $A$, it makes sense for us to ask if the limit of $(s_k)$ exists in $M_n(\mathbb{R})$, where
\[
(1.2) \qquad s_k = I_n + \frac{A}{1!} + \cdots + \frac{A^k}{k!}, \qquad k \ge 1.
\]

Theorem 1.4. Let $(V, \|\cdot\|)$ be a real Banach space and $(v_n)$ be a sequence of vectors in $V$. Suppose that $\sum_{n=1}^\infty \|v_n\|$ is convergent in $\mathbb{R}$. Then $\sum_{n=1}^\infty v_n$ is convergent in $(V, \|\cdot\|)$.

Proof. Exercise.

Theorem 1.5 (Comparison Test). Let $(a_n)$ and $(b_n)$ be sequences of nonnegative real numbers. Suppose that
(1) $0 \le a_n \le b_n$ for any $n$, and
(2) $\sum_{n=1}^\infty b_n$ is convergent in $\mathbb{R}$.
Then $\sum_{n=1}^\infty a_n$ is convergent.

Proof. Exercise.

Corollary 1.2 (Comparison Test in a Banach Space). Let $(V, \|\cdot\|)$ be a real Banach space and $(v_n)$ be a sequence of vectors in $V$. Suppose that there exists a sequence of nonnegative real numbers $(b_n)$ such that
(1) $\|v_n\| \le b_n$ for any $n$, and
(2) $\sum_{n=1}^\infty b_n$ is convergent in $\mathbb{R}$.
Then $\sum_{n=1}^\infty v_n$ is convergent in $(V, \|\cdot\|)$.

Let $v_1 = I_n$ and $v_k = A^{k-1}/(k-1)!$ for $k \ge 2$, and let $b_1 = 1$ and $b_k = \|A\|^{k-1}/(k-1)!$ for $k \ge 2$. Since $\|A^k\| \le \|A\|^k$ for any $k$, we find $\|v_k\| \le b_k$ for any $k$. Using elementary calculus, we know $\sum_{k=1}^\infty b_k$ is convergent in $\mathbb{R}$ (to $e^{\|A\|}$). By the comparison test in a Banach space, $\sum_{k=1}^\infty v_k$ is convergent in $(M_n(\mathbb{R}), \|\cdot\|)$, i.e. $(s_k)$ is convergent in $(M_n(\mathbb{R}), \|\cdot\|)$. We define $e^A$ to be the limit of $(s_k)$ in $(M_n(\mathbb{R}), \|\cdot\|)$, i.e.
\[
e^A = \lim_{k\to\infty} s_k = I_n + \sum_{k=1}^\infty \frac{A^k}{k!} = \sum_{k=0}^\infty \frac{A^k}{k!}.
\]
Here the right hand side is the expression for $\lim_{k\to\infty} s_k$. Similarly, we can define $\cos A$ and $\sin A$ for any $n \times n$ real matrix $A$ by
\[
\cos A = \sum_{k=0}^\infty \frac{(-1)^k}{(2k)!} A^{2k}, \qquad \sin A = \sum_{k=0}^\infty \frac{(-1)^k}{(2k+1)!} A^{2k+1}.
\]
We leave it to the reader to verify that $\cos A$ and $\sin A$ are well defined. (The infinite series of $n \times n$ matrices are convergent in $M_n(\mathbb{R})$.)

Theorem 1.6. Let $x(t) = e^{tA} x_0$ for $t \in [0, b]$. Then $x : [0, b] \to \mathbb{R}^n$ is the unique differentiable function which solves (1.1).

To prove this theorem, we need to introduce the space of vector valued continuous functions. This theorem will be proved later.

Let us study more about the space $M_n(\mathbb{R})$. In calculus, we have seen that for each $x \in \mathbb{R}$ with $|x| < 1$,
\[
\frac{1}{1-x} = \sum_{k=0}^\infty x^k.
\]
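As a numerical illustration of the partial sums $s_k$ in (1.2), the following sketch (assuming NumPy; the generator $A$ and the value of $t$ are illustrative) compares $s_k$ evaluated at $tA$ with the closed form of $e^{tA}$ for the rotation generator $A$ with $A^2 = -I$, namely $e^{tA} = (\cos t)I + (\sin t)A$.

```python
import numpy as np
from math import cos, sin, factorial

def exp_partial_sum(A, k):
    """Return s_k = I + A/1! + ... + A^k/k! for a square matrix A."""
    return sum(np.linalg.matrix_power(A, j) / factorial(j) for j in range(k + 1))

A = np.array([[0.0, 1.0], [-1.0, 0.0]])   # rotation generator: A @ A = -I
t = 1.3
exact = np.array([[cos(t), sin(t)], [-sin(t), cos(t)]])   # e^{tA} in closed form

for k in (2, 5, 10, 20):
    err = np.linalg.norm(exp_partial_sum(t * A, k) - exact, 2)
    print(k, err)    # the operator-norm error decreases rapidly in k
```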

We may ask ourselves whether the analogous identity holds for any $n \times n$ matrix $A$ with $\|A\| < 1$. Since $\|A\| < 1$, $\sum_{k=0}^\infty \|A\|^k$ is convergent in $\mathbb{R}$. Here $A^0 = I_n$, the identity matrix. Since $M_n(\mathbb{R})$ is a real Banach space, $\sum_{k=0}^\infty A^k$ is convergent in $M_n(\mathbb{R})$.

Proposition 1.4. Let $A \in M_n(\mathbb{R})$ with $\|A\| < 1$ and $B = \sum_{k=0}^\infty A^k$. Then
\[
B(I_n - A) = (I_n - A)B = I_n.
\]
This implies that $I_n - A$ is invertible with $B = (I_n - A)^{-1}$.

Proof. Let $s_k = I_n + A + \cdots + A^k$ for each $k \ge 1$. For each $k$,
\[
A s_k = A + \cdots + A^{k+1} = s_{k+1} - I_n = s_k A.
\]
Using the inequalities
\[
\|A s_k - AB\| = \|A(s_k - B)\| \le \|A\|\,\|s_k - B\|, \qquad \|s_k A - BA\| = \|(s_k - B)A\| \le \|s_k - B\|\,\|A\|,
\]
we know $\lim_{k\to\infty} A s_k = AB$ and $\lim_{k\to\infty} s_k A = BA$. On the other hand, $\lim_{k\to\infty}(s_{k+1} - I_n) = B - I_n$. We find that
\[
AB = B - I_n = BA.
\]
This implies that $(I_n - A)B = B(I_n - A) = I_n$. This proves our assertion.

As an application of this proposition, let us prove the following.

Theorem 1.7. Let $GL_n(\mathbb{R})$ be the set of all real $n \times n$ invertible matrices. Then $GL_n(\mathbb{R})$ forms an open subset of $M_n(\mathbb{R})$.

Proof. Let $A \in GL_n(\mathbb{R})$. Choose $\epsilon = 1/\|A^{-1}\|$. We claim that $B(A, \epsilon) \subseteq GL_n(\mathbb{R})$. This is equivalent to saying that if $\|B - A\| < \epsilon$, i.e. $B \in B(A, \epsilon)$, then $B$ is invertible. Observe that
\[
B = B - A + A = \big( (B - A)A^{-1} + I \big) A.
\]
Let $C = (B - A)A^{-1}$. Then $B = (I + C)A$ and
\[
\|C\| \le \|B - A\|\,\|A^{-1}\| < \frac{1}{\|A^{-1}\|}\,\|A^{-1}\| = 1.
\]
By the previous proposition, $I + C$ is invertible. Since $A$ is invertible and the product of any two invertible matrices is again invertible, $B = (I + C)A$ is invertible. Hence $B \in GL_n(\mathbb{R})$. We have proved that $B(A, \epsilon) \subseteq GL_n(\mathbb{R})$, and hence $A$ is an interior point of $GL_n(\mathbb{R})$. Since $A$ is arbitrary in $GL_n(\mathbb{R})$, $GL_n(\mathbb{R})$ is open.

Theorem 1.8. Let $\varphi : GL_n(\mathbb{R}) \to GL_n(\mathbb{R})$ be the map $\varphi(A) = A^{-1}$ for $A \in GL_n(\mathbb{R})$. Then $\varphi$ is continuous.

Proof. Let $A \in GL_n(\mathbb{R})$ and choose $d = 1/(2\|A^{-1}\|)$. For any $B \in B(A, d)$, $\|B - A\| < d < 1/\|A^{-1}\|$. Hence $B$ is invertible, and hence $B(A, d) \subseteq GL_n(\mathbb{R})$. Observe that for $B \in B(A, d)$,
\[
\varphi(B) - \varphi(A) = B^{-1}(A - B)A^{-1},
\]
and hence
\[
\|\varphi(B) - \varphi(A)\| \le \|B^{-1}\|\,\|A - B\|\,\|A^{-1}\|.
\]

Let us estimate $\|B^{-1}\|$ when $B \in B(A, d)$. For each $y \in \mathbb{R}^n$,
\[
\|y\|_{\mathbb{R}^n} = \|A^{-1}(Ay)\|_{\mathbb{R}^n} \le \|A^{-1}\|\,\|Ay\|_{\mathbb{R}^n}.
\]
Hence $\|A^{-1}\|^{-1}\|y\|_{\mathbb{R}^n} \le \|Ay\|_{\mathbb{R}^n}$. By the triangle inequality and the norm inequality,
\[
\|Ay\|_{\mathbb{R}^n} = \|(A - B)y + By\|_{\mathbb{R}^n} \le \|(A - B)y\|_{\mathbb{R}^n} + \|By\|_{\mathbb{R}^n} \le \|A - B\|\,\|y\|_{\mathbb{R}^n} + \|By\|_{\mathbb{R}^n}.
\]
This shows that for any $y \in \mathbb{R}^n$,
\[
\big( \|A^{-1}\|^{-1} - \|A - B\| \big)\|y\|_{\mathbb{R}^n} \le \|By\|_{\mathbb{R}^n}.
\]
Since $\|A - B\| < d = \|A^{-1}\|^{-1}/2$, we have $\|A^{-1}\|^{-1} - \|A - B\| > \|A^{-1}\|^{-1}/2$. This shows that for any $y \in \mathbb{R}^n$,
\[
\frac{1}{2\|A^{-1}\|}\,\|y\|_{\mathbb{R}^n} \le \|By\|_{\mathbb{R}^n}.
\]
Since $B$ is invertible, for any $x \in \mathbb{R}^n$ there exists a unique $y \in \mathbb{R}^n$ so that $By = x$. We find that $y = B^{-1}x$ and hence
\[
\|B^{-1}x\|_{\mathbb{R}^n} \le 2\|A^{-1}\|\,\|x\|_{\mathbb{R}^n}.
\]
We find that $\|B^{-1}\| \le 2\|A^{-1}\|$. Thus for any $B \in B(A, d)$,
\[
\|\varphi(B) - \varphi(A)\| \le 2\|A^{-1}\|^2\,\|A - B\|.
\]
For any $\epsilon > 0$, we choose
\[
\delta_{A,\epsilon} = \min\left\{ \frac{\epsilon}{2\|A^{-1}\|^2},\ d \right\}.
\]
Then for any $B \in B(A, \delta_{A,\epsilon})$,
\[
\|\varphi(B) - \varphi(A)\| \le 2\|A^{-1}\|^2\,\delta_{A,\epsilon} \le \epsilon.
\]
This proves that $\varphi$ is continuous.
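Both Proposition 1.4 and the estimate $\|\varphi(B) - \varphi(A)\| \le 2\|A^{-1}\|^2\|A - B\|$ obtained in this proof can be illustrated numerically. The following sketch assumes NumPy; the random matrices, the seed, and the truncation of the series at 60 terms are illustrative choices, and all norms are operator norms computed as spectral norms.

```python
import numpy as np

rng = np.random.default_rng(1)

def op_norm(M):
    """Operator norm of M, computed as the spectral norm."""
    return np.linalg.norm(M, 2)

# Proposition 1.4: if ||C|| < 1, then (I - C)^{-1} = sum_{k>=0} C^k.
n = 4
C = rng.standard_normal((n, n))
C *= 0.5 / op_norm(C)                        # rescale so that ||C|| = 0.5 < 1
B = sum(np.linalg.matrix_power(C, k) for k in range(60))    # truncated series
print(np.allclose(B @ (np.eye(n) - C), np.eye(n)))          # expected: True

# Theorem 1.8: if ||B - A|| < d = 1/(2 ||A^{-1}||), then
#     ||B^{-1} - A^{-1}|| <= 2 ||A^{-1}||^2 ||A - B||.
A = rng.standard_normal((n, n)) + 6.0 * np.eye(n)           # safely invertible
A_inv = np.linalg.inv(A)
d = 1.0 / (2.0 * op_norm(A_inv))
E = rng.standard_normal((n, n))
E *= 0.5 * d / op_norm(E)                    # perturbation with ||E|| = d/2 < d
B_mat = A + E
lhs = op_norm(np.linalg.inv(B_mat) - A_inv)
rhs = 2.0 * op_norm(A_inv) ** 2 * op_norm(E)
print(lhs <= rhs)                                            # expected: True
```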

2. Spectral Theorem for Symmetric Matrices with an Application to the Computation of Operator Norms

Let $A : \mathbb{R}^n \to \mathbb{R}^m$ be a linear map. The operator norm of $A$ is defined to be
\[
\|A\| = \sup_{x \in S^{n-1}} \|Ax\|,
\]
where $S^{n-1} = \{x \in \mathbb{R}^n : \|x\| = 1\}$. Then
\[
\|A\|^2 = \sup_{x \in S^{n-1}} \|Ax\|^2 = \sup_{x \in S^{n-1}} \langle Ax, Ax\rangle = \sup_{x \in S^{n-1}} \langle A^t A x, x\rangle.
\]
Denote $T = A^t A$. Then $T : \mathbb{R}^n \to \mathbb{R}^n$ is a linear map such that
(1) $T$ is symmetric, i.e. $T^t = T$, and
(2) $\langle Tx, x\rangle = \|Ax\|^2 \ge 0$ for any $x \in \mathbb{R}^n$.

Definition 2.1. A linear map $T : \mathbb{R}^n \to \mathbb{R}^n$ is nonnegative definite if it satisfies (1) and (2).

For any nonnegative definite linear map $T : \mathbb{R}^n \to \mathbb{R}^n$, we define a function $Q_T : \mathbb{R}^n \to \mathbb{R}$ by
\[
Q_T(x) = \langle Tx, x\rangle, \qquad x \in \mathbb{R}^n.
\]
If we denote $T$ by $[T_{ij}]$ and $x = (x_1, \dots, x_n)$, then
\[
Q_T(x) = \sum_{i,j=1}^n T_{ij} x_i x_j.
\]
We see that $Q_T$ is a real valued continuous function. Since $S^{n-1}$ is closed and bounded, by the Bolzano-Weierstrass Theorem, $S^{n-1}$ is sequentially compact. By the Extreme Value Theorem, $Q_T$ attains its maximum (and also its minimum) on $S^{n-1}$. Let $\lambda_1$ be the maximum of $Q_T$ on $S^{n-1}$ and $v_1 \in S^{n-1}$ be such that $Q_T(v_1) = \lambda_1$. Since $Q_T(x) \ge 0$ for any $x \in \mathbb{R}^n$, $\lambda_1 \ge 0$. Since $\langle Tv_1, v_1\rangle = \lambda_1$,
\[
\langle (\lambda_1 I - T)v_1, v_1\rangle = 0.
\]
Let us prove that, in fact, we have $\langle (\lambda_1 I - T)v_1, w\rangle = 0$ for any $w \in \mathbb{R}^n$. If this statement is true, then $Tv_1 = \lambda_1 v_1$, i.e. $\lambda_1$ is an eigenvalue of $T$ and $v_1$ is an eigenvector of $T$ corresponding to the eigenvalue $\lambda_1$.

Let $B = \lambda_1 I - T$. Then $\langle Bv_1, v_1\rangle = 0$. We want to show that $\langle Bv_1, w\rangle = 0$ for any $w \in \mathbb{R}^n$. Since any $w \in \mathbb{R}^n$ can be written uniquely as $w = a v_1 + y$ for $a \in \mathbb{R}$ and $y \in \{v_1\}^\perp$,
\[
\langle Bv_1, w\rangle = a\langle Bv_1, v_1\rangle + \langle Bv_1, y\rangle = \langle Bv_1, y\rangle.
\]
Hence if we can show that $\langle Bv_1, y\rangle = 0$ for any $y \in \{v_1\}^\perp$, then $\langle Bv_1, w\rangle = 0$ for any $w \in \mathbb{R}^n$. Thus we only need to prove that $\langle Bv_1, w\rangle = 0$ for $w \in \{v_1\}^\perp$.

Since $Q_T(v) \le \lambda_1$ for any $v \in S^{n-1}$, $Q_T(x) \le \lambda_1\|x\|^2$ for any $x \in \mathbb{R}^n$. This implies that $\langle Bx, x\rangle \ge 0$ for any $x \in \mathbb{R}^n$. For any $t \in \mathbb{R}$ and any $w \in \mathbb{R}^n$,
\[
0 \le \langle B(v_1 + tw), v_1 + tw\rangle = \langle Bv_1, v_1\rangle + 2\langle Bv_1, w\rangle t + \langle Bw, w\rangle t^2 = 2\langle Bv_1, w\rangle t + \langle Bw, w\rangle t^2.
\]

Dividing by $t > 0$ and letting $t \to 0^+$ shows that $\langle Bv_1, w\rangle \ge 0$; applying the same argument to $-w$ shows that $\langle Bv_1, w\rangle \le 0$. This shows that $\langle Bv_1, w\rangle = 0$.

Theorem 2.1. Let $T : \mathbb{R}^n \to \mathbb{R}^n$ be a nonnegative definite linear map. The number $\lambda_1 = \max_{x \in S^{n-1}} \langle Tx, x\rangle$ is an eigenvalue of $T$, and any $v \in S^{n-1}$ with $\langle Tv, v\rangle = \lambda_1$ is an eigenvector of $T$ corresponding to $\lambda_1$.

Lemma 2.1. Let $T$, $\lambda_1$ and $v_1$ be as above. Let $V_1 = \operatorname{span}\{v_1\}$ and $W_1 = V_1^\perp$. Then $\mathbb{R}^n = V_1 \oplus W_1$ and $T(W_1) \subseteq W_1$.

Proof. If $w \in W_1$, then $\langle v_1, w\rangle = 0$ and hence
\[
\langle v_1, Tw\rangle = \langle Tv_1, w\rangle = \lambda_1\langle v_1, w\rangle = 0.
\]
This shows that $Tw \in V_1^\perp = W_1$. For any $x \in \mathbb{R}^n$, we can write $x = a v_1 + w$ for some $a \in \mathbb{R}$ and $w \in W_1$. Hence $T(x) = a\lambda_1 v_1 + T(w)$.

Let $T_1 : W_1 \to W_1$ be the map defined by $T_1(w) = T(w)$. Since $T$ is linear, $T_1$ is linear. Furthermore, since $T$ is symmetric, for any $w, w' \in W_1$,
\[
\langle T_1 w, w'\rangle = \langle Tw, w'\rangle = \langle w, Tw'\rangle = \langle w, T_1(w')\rangle.
\]
This shows that $T_1$ is also symmetric. For any $w \in W_1$, $\langle T_1 w, w\rangle = \langle Tw, w\rangle \ge 0$. This shows that $T_1$ is nonnegative definite. Set
\[
\lambda_2 = \max_{\{w \in W_1 : \|w\| = 1\}} \langle T_1 w, w\rangle.
\]

Remark. The set $\{w \in W_1 : \|w\| = 1\}$ is the intersection of $W_1$ and $S^{n-1}$. If we denote $v_1$ by $(a_1, \dots, a_n)$, then $W_1 = \{(x_1, \dots, x_n) \in \mathbb{R}^n : a_1 x_1 + \cdots + a_n x_n = 0\}$. Hence $W_1$ is a closed subset of $\mathbb{R}^n$. Therefore $W_1 \cap S^{n-1}$ is closed. Since $S^{n-1}$ is bounded, $W_1 \cap S^{n-1}$ is bounded. The intersection $\{w \in W_1 : \|w\| = 1\}$ is thus closed and bounded. By the Bolzano-Weierstrass Theorem, $\{w \in W_1 : \|w\| = 1\}$ is sequentially compact. We apply the Extreme Value Theorem to find $\lambda_2$.

Then $\lambda_1 \ge \lambda_2 \ge 0$. Choose $v_2 \in W_1$ so that $\|v_2\| = 1$ and $\lambda_2 = \langle T_1 v_2, v_2\rangle$. Then $v_2 \perp v_1$ and $T_1 v_2 = \lambda_2 v_2$. This shows that $Tv_2 = \lambda_2 v_2$.

By induction, we can find nonincreasing nonnegative real numbers $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n \ge 0$ and an orthonormal basis $\{v_i : 1 \le i \le n\}$ for $\mathbb{R}^n$ such that $Tv_i = \lambda_i v_i$ for $1 \le i \le n$. Since $\{v_i : 1 \le i \le n\}$ is an orthonormal basis for $\mathbb{R}^n$, for any $x \in \mathbb{R}^n$ we can write $x = \sum_{i=1}^n \langle x, v_i\rangle v_i$. By linearity of $T$, we obtain
\[
Tx = \sum_{i=1}^n \lambda_i \langle x, v_i\rangle v_i.
\]
Let $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ and $V$ be the $n \times n$ matrix whose $i$-th column vector is $v_i$. Then $TV = V\Lambda$, which implies that $T = V\Lambda V^t$ since $V^tV = I$.
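Here is a short numerical sketch of this construction, assuming NumPy: for $T = A^tA$, np.linalg.eigh returns an orthonormal eigenbasis, which (after reordering so that the eigenvalues are nonincreasing) gives the factorization $T = V\Lambda V^t$, and the largest eigenvalue dominates $Q_T$ on the unit sphere. The particular matrix and the sample size are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
T = A.T @ A                                  # symmetric, nonnegative definite

eigvals, V = np.linalg.eigh(T)               # ascending eigenvalues, orthonormal columns
order = np.argsort(eigvals)[::-1]            # reorder as lambda_1 >= ... >= lambda_n
lam, V = eigvals[order], V[:, order]

print(np.allclose(T, V @ np.diag(lam) @ V.T))    # T = V Lambda V^t, expected: True

# Q_T(x) = <Tx, x> on randomly sampled unit vectors never exceeds lambda_1,
# and the best sample comes close to it.
X = rng.standard_normal((3, 20000))
X /= np.linalg.norm(X, axis=0)               # columns are unit vectors
Q = np.sum(X * (T @ X), axis=0)              # Q_T evaluated at each column
print(Q.max() <= lam[0] + 1e-12)             # expected: True
print(Q.max(), lam[0])                       # Q.max() is slightly below lambda_1
```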

Theorem 2.2 (Spectral Theorem for nonnegative definite linear maps). Let $T : \mathbb{R}^n \to \mathbb{R}^n$ be a nonnegative definite linear map. Then there exist nonincreasing nonnegative real numbers $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n \ge 0$ and an orthonormal basis $\{v_i : 1 \le i \le n\}$ for $\mathbb{R}^n$ such that
(1) $Tv_i = \lambda_i v_i$ for $1 \le i \le n$,
(2) $Tx = \sum_{i=1}^n \lambda_i \langle x, v_i\rangle v_i$ for any $x \in \mathbb{R}^n$,
(3) $T = V\Lambda V^t$, where $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ and $V$ is the $n \times n$ matrix whose $i$-th column vector is $v_i$.

Let $S : \mathbb{R}^n \to \mathbb{R}^n$ be any symmetric linear map. We consider $m = \min_{x \in S^{n-1}} \langle Sx, x\rangle$. Set $T = S - mI$. Then $T$ is nonnegative definite. By the spectral theorem for $T$, we choose $\lambda_i \in \mathbb{R}$ and $v_i \in \mathbb{R}^n$ such that $Tv_i = \lambda_i v_i$ as above. Then $Sv_i = (\lambda_i + m)v_i$ for $1 \le i \le n$. Let $\mu_i = \lambda_i + m$. Then $Sv_i = \mu_i v_i$ for $1 \le i \le n$. We see that $v_i$ is also an eigenvector of $S$, with eigenvalue $\mu_i$. Furthermore,
\[
Sx = \sum_{i=1}^n \mu_i \langle x, v_i\rangle v_i.
\]
Thus we prove that:

Corollary 2.1. The Spectral Theorem holds for any symmetric linear map.

Let us go back to the computation of $\|A\|$. It follows from the definition that $\|A\|^2$ is the largest eigenvalue of $A^tA$. Let us denote by $\lambda_1(A^tA)$ the largest eigenvalue of $A^tA$. Then $\|A\| = \sqrt{\lambda_1(A^tA)}$.

Definition 2.2. Let $\lambda_i(A^tA)$ be the eigenvalues of $A^tA$, arranged so that
\[
\lambda_1(A^tA) \ge \cdots \ge \lambda_n(A^tA) \ge 0.
\]
The $i$-th singular value of $A$ is defined to be $s_i(A) = \sqrt{\lambda_i(A^tA)}$.

Corollary 2.2. Let $A : \mathbb{R}^n \to \mathbb{R}^m$ be a linear map. Then $\|A\| = s_1(A)$.

Now let us introduce the singular value decomposition for any linear map. Let us recall a basic theorem in linear algebra.

Proposition 2.1. Let $A : \mathbb{R}^n \to \mathbb{R}^m$ be a linear map. Then
(1) $(\operatorname{Im} A)^\perp = \ker A^t$,
(2) $(\operatorname{Im} A^t)^\perp = \ker A$.

Proof. Let $z \in (\operatorname{Im} A)^\perp$. Then $\langle Ax, z\rangle = 0$ for any $x \in \mathbb{R}^n$. Hence $\langle x, A^tz\rangle = \langle Ax, z\rangle = 0$ for any $x \in \mathbb{R}^n$. Therefore $A^tz = 0$, and we see that $z \in \ker A^t$. Conversely, if $z \in \ker A^t$, then $A^tz = 0$. Hence $\langle x, A^tz\rangle = 0$ for any $x \in \mathbb{R}^n$, which implies that $\langle Ax, z\rangle = 0$ for any $x \in \mathbb{R}^n$. Therefore $z \in (\operatorname{Im} A)^\perp$. This proves (1); statement (2) follows by applying (1) to $A^t$.
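Corollary 2.2 is easy to check numerically. The sketch below assumes NumPy; the rectangular matrix is an arbitrary illustrative example, and np.linalg.svd is used only to produce the singular values in nonincreasing order.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))

top_eig = np.linalg.eigvalsh(A.T @ A).max()    # lambda_1(A^t A)
s = np.linalg.svd(A, compute_uv=False)         # singular values, s_1 >= s_2 >= ...

print(np.isclose(np.sqrt(top_eig), s[0]))      # s_1(A) = sqrt(lambda_1(A^t A)), expected: True
print(np.isclose(np.linalg.norm(A, 2), s[0]))  # ||A|| = s_1(A), expected: True
```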

Choose an orthonormal basis $\{v_i : 1 \le i \le n\}$ such that $A^tAv_i = s_i^2 v_i$ for $1 \le i \le n$, where $s_i = s_i(A)$. Let us assume that $s_i = 0$ for $i > k$ and $s_k \ne 0$. Let $z_i = Av_i$ for $1 \le i \le k$. We find $A^tz_i = s_i^2 v_i$ for $1 \le i \le k$. Let us compute $\langle z_i, z_j\rangle$ for any $i, j \le k$:
\[
\langle z_i, z_j\rangle = \langle Av_i, Av_j\rangle = \langle A^tAv_i, v_j\rangle = s_i^2\langle v_i, v_j\rangle = s_i^2\,\delta_{ij}.
\]
Hence $\{z_i : 1 \le i \le k\}$ forms an orthogonal subset of $\operatorname{Im} A$. Since $\|Av_j\|^2 = \langle A^tAv_j, v_j\rangle = s_j^2 = 0$ for $k+1 \le j \le n$, we have $Av_j = 0$ for $k+1 \le j \le n$, so $\operatorname{nullity} A = \dim\ker A = n - k$. By the rank-nullity lemma, $\operatorname{rank} A + \operatorname{nullity} A = n$, which shows that $\operatorname{rank} A = k$. Since $\{z_i : 1 \le i \le k\}$ is orthogonal, it is linearly independent. We see that $\{z_i : 1 \le i \le k\}$ forms an orthogonal basis for $\operatorname{Im} A$. Let $u_i = z_i/s_i$ for $1 \le i \le k$. Then $\{u_i : 1 \le i \le k\}$ forms an orthonormal basis for $\operatorname{Im} A$. We write
\[
Ax = \sum_{i=1}^k \langle Ax, u_i\rangle u_i = \sum_{i=1}^k \langle x, A^tu_i\rangle u_i.
\]
Since $A^tu_i = A^t(z_i/s_i) = s_i v_i$, we find
\[
Ax = \sum_{i=1}^k s_i\langle x, v_i\rangle u_i.
\]
We can extend $\{u_i : 1 \le i \le k\}$ to an orthonormal basis $\{u_i : 1 \le i \le m\}$ for $\mathbb{R}^m$. Let $U$ be the $m \times m$ matrix whose $i$-th column vector is $u_i$ and $V$ be the $n \times n$ matrix whose $j$-th column vector is $v_j$. Then $AV = U\Sigma$. Since $V^tV = I_n$ ($V$ is an orthogonal matrix), we obtain the matrix form of the singular value decomposition
\[
A = U\Sigma V^t.
\]
Here $\Sigma = \operatorname{diag}(s_1, \dots, s_n)$ is the $m \times n$ matrix whose $(i,i)$-entry is $s_i$ and whose other entries are zero.

Theorem 2.3 (Singular Value Decomposition). Let $A : \mathbb{R}^n \to \mathbb{R}^m$ be any linear map and suppose $k = \operatorname{rank} A$. There exist an orthonormal basis $\{v_i : 1 \le i \le n\}$ for $\mathbb{R}^n$, an orthonormal basis $\{u_i : 1 \le i \le k\}$ for $\operatorname{Im} A$, and a finite nonincreasing sequence of nonnegative real numbers $s_1 \ge \cdots \ge s_k > 0$ such that
(1) $A^tAv_i = s_i^2 v_i$ for $1 \le i \le k$ and $A^tAv_j = 0$ for $k+1 \le j \le n$,
(2) $Av_i = s_iu_i$ for $1 \le i \le k$, $Av_j = 0$ for $k+1 \le j \le n$, and $A^tu_i = s_iv_i$ for $1 \le i \le k$,
(3) $Ax = \sum_{i=1}^k s_i\langle x, v_i\rangle u_i$ for any $x \in \mathbb{R}^n$,
(4) $A = U\Sigma V^t$, where $U$ is the $m \times m$ matrix whose $i$-th column vector is $u_i$ (after extending $\{u_i : 1 \le i \le k\}$ to an orthonormal basis for $\mathbb{R}^m$), $V$ is the $n \times n$ matrix whose $j$-th column vector is $v_j$, and $\Sigma = \operatorname{diag}(s_1, \dots, s_n)$ with $s_j = 0$ for $j > k$.

Example 2.1. Let
\[
A = \begin{pmatrix} 1 & 1 \\ 2 & 2 \end{pmatrix}.
\]
(1) Find $\|A\|$. (2) Find the singular value decomposition of $A$.

The rank of $A$ is one and the image of $A$ is spanned by $\begin{pmatrix} 1 \\ 2 \end{pmatrix}$. Take $u_1 = \frac{1}{\sqrt{5}}\begin{pmatrix} 1 \\ 2 \end{pmatrix}$ and $v_1 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix}$. Then $Av_1 = \sqrt{10}\,u_1$. Take $v_2 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ -1 \end{pmatrix}$ and $u_2 = \frac{1}{\sqrt{5}}\begin{pmatrix} 2 \\ -1 \end{pmatrix}$. Then $Av_2 = 0$.

The largest singular value of $A$ is $\sqrt{10}$, and hence $\|A\| = \sqrt{10}$. We see that
\[
A \cdot \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}
= \frac{1}{\sqrt{5}}\begin{pmatrix} 1 & 2 \\ 2 & -1 \end{pmatrix}
\begin{pmatrix} \sqrt{10} & 0 \\ 0 & 0 \end{pmatrix},
\]
i.e. $AV = U\Sigma$. Hence we obtain the singular value decomposition of $A$ (in matrix form):
\[
A = \frac{1}{\sqrt{5}}\begin{pmatrix} 1 & 2 \\ 2 & -1 \end{pmatrix}
\begin{pmatrix} \sqrt{10} & 0 \\ 0 & 0 \end{pmatrix}
\left( \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} \right)^{\!t}
= U\Sigma V^t.
\]
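The computation in Example 2.1 can be confirmed numerically. This is a minimal sketch assuming NumPy; np.linalg.svd may return $U$ and $V$ with different sign choices than those made above, but the singular values and the product $U\Sigma V^t$ agree.

```python
import numpy as np

A = np.array([[1.0, 1.0], [2.0, 2.0]])
U, s, Vt = np.linalg.svd(A)

print(s)                                    # approximately [sqrt(10), 0]
print(np.isclose(s[0], np.sqrt(10)))        # largest singular value, so ||A|| = sqrt(10)
print(np.allclose(A, U @ np.diag(s) @ Vt))  # A = U Sigma V^t, expected: True
```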
