ENGG 5781: Matrix Analysis and Computations
Lecture 1: Basic Concepts
First Term
Instructor: Wing-Kin Ma

This note is a supplementary material for the main slides. I will write notes such as this one when certain concepts cannot be put on slides. This time, the aim is to elaborate upon subspace concepts at a more fundamental level.

1 Subspace and Linear Independence

1.1 Subspace

A nonempty subset $\mathcal{S}$ of $\mathbb{R}^m$ is called a subspace of $\mathbb{R}^m$ if, for any $\alpha, \beta \in \mathbb{R}$,
$$x, y \in \mathcal{S} \implies \alpha x + \beta y \in \mathcal{S}.$$
It can be verified that if $\mathcal{S} \subseteq \mathbb{R}^m$ is a subspace and $a_1, \ldots, a_n \in \mathcal{S}$, then any linear combination of $a_1, \ldots, a_n$, i.e., $\sum_{i=1}^n \alpha_i a_i$ where $\alpha \in \mathbb{R}^n$, also lies in $\mathcal{S}$.

Given a collection of vectors $a_1, \ldots, a_n \in \mathbb{R}^m$, the span of $\{a_1, \ldots, a_n\}$ is defined as
$$\operatorname{span}\{a_1, \ldots, a_n\} = \Big\{\, y = \sum_{i=1}^n \alpha_i a_i \;\Big|\; \alpha \in \mathbb{R}^n \,\Big\}.$$
In words, $\operatorname{span}\{a_1, \ldots, a_n\}$ is the set of all possible linear combinations of the vectors $a_1, \ldots, a_n$. It is easy to verify that $\operatorname{span}\{a_1, \ldots, a_n\}$ is a subspace for any given $a_1, \ldots, a_n$. In the literature, a span is commonly used to represent a subspace. For example, we can represent $\mathbb{R}^m$ by
$$\mathbb{R}^m = \operatorname{span}\{e_1, \ldots, e_m\}.$$
In fact, any subspace can be written as a span:

Theorem 1.1 For every subspace $\mathcal{S} \subseteq \mathbb{R}^m$, there exist a positive integer $n$ and a collection of vectors $a_1, \ldots, a_n \in \mathbb{R}^m$ such that $\mathcal{S} = \operatorname{span}\{a_1, \ldots, a_n\}$.

In the literature you would notice that the result in Theorem 1.1 is taken for granted, almost as common sense and without elaboration. There is an easy way to prove Theorem 1.1, but the proof requires some results on linear independence. We relegate the proof to a later part of this note.

1.2 Linear Independence

A set of vectors $\{a_1, \ldots, a_n\} \subseteq \mathbb{R}^m$ is said to be linearly independent if
$$\sum_{i=1}^n \alpha_i a_i \neq 0 \quad \text{for all } \alpha \in \mathbb{R}^n \text{ with } \alpha \neq 0.$$
Otherwise, it is called linearly dependent.
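Both definitions are easy to probe numerically. The following sketch is my own illustration, not part of the original note; it assumes NumPy, and the example vectors and tolerance are arbitrary. It uses two standard facts: $\{a_1, \ldots, a_n\}$ is linearly independent exactly when the matrix $A = [a_1 \; \cdots \; a_n]$ has rank $n$, and $y \in \operatorname{span}\{a_1, \ldots, a_n\}$ exactly when the least-squares problem $\min_\alpha \|A\alpha - y\|_2$ attains a zero residual.

```python
import numpy as np

# Stack the vectors a_1, ..., a_n as columns of A (example vectors, m = 3, n = 2).
A = np.column_stack([np.array([1.0, 0.0, 1.0]),
                     np.array([0.0, 1.0, 1.0])])

# {a_1, ..., a_n} is linearly independent  <=>  rank(A) = n.
n = A.shape[1]
independent = np.linalg.matrix_rank(A) == n

# y lies in span{a_1, ..., a_n}  <=>  the least-squares residual is zero.
def in_span(A, y, tol=1e-10):
    alpha, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.linalg.norm(A @ alpha - y) <= tol

y_in = np.array([2.0, 3.0, 5.0])    # = 2 a_1 + 3 a_2, so in the span
y_out = np.array([0.0, 0.0, 1.0])   # not a combination of a_1, a_2

print(independent, in_span(A, y_in), in_span(A, y_out))   # True True False
```

The rank test mirrors the definition directly: a nonzero $\alpha$ with $\sum_i \alpha_i a_i = 0$ exists if and only if $A$ has a nontrivial null space, i.e., rank less than $n$.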
A subset $\{a_{i_1}, \ldots, a_{i_k}\}$, where $\{i_1, \ldots, i_k\} \subseteq \{1, \ldots, n\}$ with $i_j \neq i_l$ for any $j \neq l$, and $1 \leq k \leq n$, is called a maximal linearly independent subset of $\{a_1, \ldots, a_n\}$ if it is linearly independent and is not contained in any other linearly independent subset of $\{a_1, \ldots, a_n\}$. From the above definitions, we see that
1. if $\{a_1, \ldots, a_n\}$ is linearly independent, then no $a_j$ can be a linear combination of the set of the other vectors $\{a_i\}_{i \in \{1, \ldots, n\},\, i \neq j}$;

2. if $\{a_1, \ldots, a_n\}$ is linearly dependent, then there exists a vector $a_j$ that is a linear combination of the other vectors;

3. $\{a_{i_1}, \ldots, a_{i_k}\}$ is a maximal linearly independent subset of $\{a_1, \ldots, a_n\}$ if and only if $\{a_{i_1}, \ldots, a_{i_k}, a_j\}$ is linearly dependent for any $j \in \{1, \ldots, n\} \setminus \{i_1, \ldots, i_k\}$.

Thus, roughly speaking, we may see a linearly independent $\{a_1, \ldots, a_n\}$ as a non-redundant, or sufficiently different, set of vectors, and a maximal linearly independent $\{a_{i_1}, \ldots, a_{i_k}\}$ as an irreducibly non-redundant set of vectors for representing the whole vector set $\{a_1, \ldots, a_n\}$. It can be easily shown that for any maximal linearly independent subset $\{a_{i_1}, \ldots, a_{i_k}\}$ of $\{a_1, \ldots, a_n\}$, we have
$$\operatorname{span}\{a_1, \ldots, a_n\} = \operatorname{span}\{a_{i_1}, \ldots, a_{i_k}\}.$$
We also have the following results, which are very basic and which we again take almost for granted:

1. Let $\{a_1, \ldots, a_n\} \subseteq \mathbb{R}^m$ be a linearly independent vector set, and suppose $y \in \operatorname{span}\{a_1, \ldots, a_n\}$. Then the coefficient vector $\alpha$ in the representation $y = \sum_{i=1}^n \alpha_i a_i$ is unique; i.e., there does not exist a $\beta \in \mathbb{R}^n$, $\beta \neq \alpha$, such that $y = \sum_{i=1}^n \beta_i a_i$.

2. If $\{a_1, \ldots, a_n\} \subseteq \mathbb{R}^m$ is linearly independent, then $n \leq m$ must hold.

The first result is simple: if there were a $\beta \neq \alpha$ such that $y = \sum_{i=1}^n \beta_i a_i$, then we would have
$$\sum_{i=1}^n (\alpha_i - \beta_i) a_i = y - y = 0,$$
which contradicts the linear independence of $\{a_1, \ldots, a_n\}$. For the second result, I show you the proof in [2]. The proof is done by induction. Suppose $m = 1$. Then it is easy to see that $n \leq 1$ must hold. Next suppose $m \geq 2$, and assume the result holds for $m - 1$. We need to show that $n \leq m$. Partition the $a_i$'s as
$$a_i = \begin{bmatrix} b_i \\ c_i \end{bmatrix}, \quad i = 1, \ldots, n,$$
where $b_i \in \mathbb{R}^{m-1}$, $c_i \in \mathbb{R}$. If $c_1 = \cdots = c_n = 0$, then the linear independence of $\{a_1, \ldots, a_n\}$ implies that $\{b_1, \ldots, b_n\}$ is also linearly independent.
By induction, we know that for any collection of $n$ linearly independent vectors in $\mathbb{R}^{m-1}$, it must hold that $n \leq m - 1$. It follows that $\{b_1, \ldots, b_n\}$ must satisfy $n \leq m - 1$ too; hence, we get $n \leq m$. If $c_i \neq 0$ for some $i$, we need more work. Assume w.l.o.g. that $c_n \neq 0$ (we can always reshuffle the ordering of $a_1, \ldots, a_n$). For any $\alpha_1, \ldots, \alpha_{n-1} \in \mathbb{R}$, choose $\alpha_n$ as follows:
$$\alpha_n = -\frac{1}{c_n} \sum_{i=1}^{n-1} \alpha_i c_i.$$
Then we see that
$$\sum_{i=1}^n \alpha_i a_i = \begin{bmatrix} \sum_{i=1}^{n-1} \alpha_i b_i - \Big( \sum_{i=1}^{n-1} \alpha_i c_i \Big) \dfrac{b_n}{c_n} \\ 0 \end{bmatrix} = \begin{bmatrix} \sum_{i=1}^{n-1} \alpha_i \Big( b_i - \dfrac{c_i}{c_n} b_n \Big) \\ 0 \end{bmatrix}.$$
Let $\tilde{b}_i = b_i - (c_i / c_n) b_n$ for ease of explanation. From the above equation, we see that the linear independence of $\{a_1, \ldots, a_n\}$ implies that $\{\tilde{b}_1, \ldots, \tilde{b}_{n-1}\}$ is linearly independent. By induction, $\{\tilde{b}_1, \ldots, \tilde{b}_{n-1}\}$ must satisfy $n - 1 \leq m - 1$; hence, we get $n \leq m$. The proof is complete.
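A maximal linearly independent subset, as defined in Section 1.2, can be extracted greedily: scan the vectors in order and keep $a_i$ only if it is not a linear combination of the vectors kept so far. This is a hedged sketch of that idea (NumPy assumed; the function name, example vectors, and tolerance are my own, not the note's):

```python
import numpy as np

def maximal_independent_subset(vectors, tol=1e-10):
    """Greedily pick indices i_1, ..., i_k so that {a_{i_1}, ..., a_{i_k}} is a
    maximal linearly independent subset of the input list of vectors."""
    chosen = []
    for i, a in enumerate(vectors):
        candidate = np.column_stack([vectors[j] for j in chosen] + [a])
        # keep a_i only if it is not a linear combination of the kept vectors
        if np.linalg.matrix_rank(candidate, tol=tol) == len(chosen) + 1:
            chosen.append(i)
    return chosen

# a_3 = a_1 + a_2 and a_4 = 2 a_1, so {a_1, a_2} is a maximal independent subset
a1, a2 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
vectors = [a1, a2, a1 + a2, 2 * a1]
idx = maximal_independent_subset(vectors)
print(idx)   # [0, 1]

# span{a_1, ..., a_n} = span{a_{i_1}, ..., a_{i_k}}: the kept columns have the
# same rank as the full set, and they are a subset of it
full = np.column_stack(vectors)
print(np.linalg.matrix_rank(full[:, idx]) == np.linalg.matrix_rank(full))   # True
```

The final check illustrates the span equality stated above: since the kept vectors lie in the full span and have the same rank, the two spans coincide.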
1.3 Orthogonality and Orthonormality

A collection of vectors $a_1, \ldots, a_n \in \mathbb{R}^m$ is said to be orthogonal if $a_i^T a_j = 0$ for all $i, j$ with $i \neq j$, and orthonormal if $a_i^T a_j = 0$ for all $i, j$ with $i \neq j$ and $\|a_i\|_2 = 1$ for all $i$. Some basic results arising from such a definition are as follows.

1. An orthogonal collection of nonzero vectors (and hence any orthonormal collection) is also linearly independent.

2. Let $\{a_1, \ldots, a_n\} \subseteq \mathbb{R}^m$ be an orthonormal set of vectors, and suppose $y \in \operatorname{span}\{a_1, \ldots, a_n\}$. Then the coefficient vector $\alpha$ in the representation $y = \sum_{i=1}^n \alpha_i a_i$ is uniquely given by $\alpha_i = a_i^T y$, $i = 1, \ldots, n$.

3. Let $\{a_1, \ldots, a_n\} \subseteq \mathbb{R}^m$ be a linearly independent set of vectors. There exists an orthonormal set of vectors $\{q_1, \ldots, q_n\}$ such that $\operatorname{span}\{a_1, \ldots, a_n\} = \operatorname{span}\{q_1, \ldots, q_n\}$.

The first and second results above are easy to verify. The proof of the third result is constructive and is a consequence of the Gram-Schmidt procedure; see the main slides for the proof.

1.4 Proof of Theorem 1.1

We now prove Theorem 1.1. Suppose that the subspace $\mathcal{S}$ does not equal $\{0\}$; the case of $\mathcal{S} = \{0\}$ is trivial. First, observe that for any positive integer $n$ and any $a_1, \ldots, a_n \in \mathcal{S}$, we have $\operatorname{span}\{a_1, \ldots, a_n\} \subseteq \mathcal{S}$; the reason is that any linear combination of vectors in $\mathcal{S}$ also lies in $\mathcal{S}$. It therefore remains to show that there exists a linearly independent collection $a_1, \ldots, a_n \in \mathcal{S}$ such that $\mathcal{S} \subseteq \operatorname{span}\{a_1, \ldots, a_n\}$. We adopt a constructive proof. First, pick any nonzero vector $a_1 \in \mathcal{S}$. If every $y \in \mathcal{S}$ can be written as $y = \alpha_1 a_1$ for some $\alpha_1$, then we finish. If not, consider the following recursive process. Suppose that we have previously picked a linearly independent collection of vectors $a_1, \ldots, a_{k-1} \in \mathcal{S}$, but there exists $y \in \mathcal{S}$ such that $y \neq \sum_{i=1}^{k-1} \alpha_i a_i$ for any choice of the $\alpha_i$'s (the situation just described is this case with $k - 1 = 1$). Then we pick $a_k$ as any $y \in \mathcal{S}$ that satisfies $y \neq \sum_{i=1}^{k-1} \alpha_i a_i$ for any $\alpha_i$'s. Clearly, $a_1, \ldots, a_k$ are linearly independent. If every $y \in \mathcal{S}$ can be written as $y = \sum_{i=1}^{k} \alpha_i a_i$ for some $\alpha_i$'s, then $\mathcal{S} \subseteq \operatorname{span}\{a_1, \ldots, a_k\}$ and we finish.
Otherwise, we increase $k$ by one and repeat the above steps. The question is whether this recursive process will stop; if yes, our proof is complete. Suppose that the process has reached an iteration number $k > m$. On one hand, the process says that $\{a_1, \ldots, a_k\}$ is linearly independent. On the other hand, if $\{a_1, \ldots, a_k\}$ is linearly independent, then $k \leq m$ must hold (the take-it-for-granted fact on linear independence shown in Section 1.2). Hence, by contradiction, the process must stop; $k = m$ is the largest possible iteration number. The proof is complete.
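Section 1.3 invoked the Gram-Schmidt procedure without spelling it out (the proof is on the main slides). The following is a textbook-style sketch of classical Gram-Schmidt, not necessarily the exact form on the slides; NumPy and the example vectors are my own additions. It also illustrates result 2 of Section 1.3: for an orthonormal set, the coefficients of $y$ are recovered as $\alpha_i = q_i^T y$.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize a linearly independent list,
    preserving span{a_1, ..., a_k} = span{q_1, ..., q_k} at every step."""
    qs = []
    for a in vectors:
        # subtract the components of a along the q_i found so far
        v = a - sum((q @ a) * q for q in qs)
        qs.append(v / np.linalg.norm(v))
    return qs

a1 = np.array([1.0, 1.0, 0.0])
a2 = np.array([1.0, 0.0, 1.0])
q1, q2 = gram_schmidt([a1, a2])

# orthonormality: q_i^T q_j = 0 for i != j, and ||q_i||_2 = 1
print(abs(q1 @ q2) < 1e-12, abs(q1 @ q1 - 1.0) < 1e-12)

# result 2 of Section 1.3: the coefficients of y = 2 q_1 + 3 q_2 are q_i^T y
y = 2.0 * q1 + 3.0 * q2
print(round(q1 @ y, 6), round(q2 @ y, 6))   # 2.0 3.0
```

This classical variant is fine for illustration; in floating-point practice, modified Gram-Schmidt or a QR factorization is preferred for numerical stability.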
2 Basis and Dimension

Let $\mathcal{S} \subseteq \mathbb{R}^m$ be a subspace with $\mathcal{S} \neq \{0\}$. A set of vectors $\{b_1, \ldots, b_k\} \subseteq \mathbb{R}^m$ is called a basis for $\mathcal{S}$ if $\{b_1, \ldots, b_k\}$ is linearly independent and $\mathcal{S} = \operatorname{span}\{b_1, \ldots, b_k\}$. Taking $\mathcal{S} = \operatorname{span}\{a_1, \ldots, a_n\}$ as an example, any maximal linearly independent subset of $\{a_1, \ldots, a_n\}$ is a basis for $\mathcal{S}$. From the definition of bases, the following facts can be shown:

1. A subspace may have more than one basis.

2. A subspace always has an orthonormal basis (if it does not equal $\{0\}$).

3. All bases for a subspace $\mathcal{S}$ have the same number of elements; i.e., if $\{b_1, \ldots, b_k\}$ and $\{c_1, \ldots, c_l\}$ are both bases for $\mathcal{S}$, then $k = l$.

Given a subspace $\mathcal{S} \subseteq \mathbb{R}^m$ with $\mathcal{S} \neq \{0\}$, the dimension of $\mathcal{S}$ is defined as the number of elements of a basis for $\mathcal{S}$. Also, by convention, the dimension of the subspace $\{0\}$ is defined as zero. The notation $\dim \mathcal{S}$ is used to denote the dimension of $\mathcal{S}$. Some examples are as follows:

- We have $\dim \mathbb{R}^m = \dim \operatorname{span}\{e_1, \ldots, e_m\} = m$.
- If $\{a_{i_1}, \ldots, a_{i_k}\}$ is a maximal linearly independent subset of $\{a_1, \ldots, a_n\}$, then $\dim \operatorname{span}\{a_1, \ldots, a_n\} = k$.

We have the following properties for subspace dimension.

1. Let $\mathcal{S}_1, \mathcal{S}_2 \subseteq \mathbb{R}^m$ be subspaces. If $\mathcal{S}_1 \subseteq \mathcal{S}_2$, then $\dim \mathcal{S}_1 \leq \dim \mathcal{S}_2$.

2. Let $\mathcal{S}$ be a subspace of $\mathbb{R}^m$. We have $\dim \mathcal{S} = m$ if and only if $\mathcal{S} = \mathbb{R}^m$.

3. Let $\mathcal{S}_1, \mathcal{S}_2 \subseteq \mathbb{R}^m$ be subspaces. We have $\dim(\mathcal{S}_1 + \mathcal{S}_2) \leq \dim \mathcal{S}_1 + \dim \mathcal{S}_2$; here the notation $\mathcal{X} + \mathcal{Y} = \{ x + y \mid x \in \mathcal{X},\, y \in \mathcal{Y} \}$ denotes the sum of two subsets $\mathcal{X}, \mathcal{Y} \subseteq \mathbb{R}^m$.

3 Projections onto Subspaces

Let $\mathcal{S} \subseteq \mathbb{R}^m$ be a nonempty closed set, and let $y \in \mathbb{R}^m$ be any given vector. A projection of $y$ onto $\mathcal{S}$ is any solution to the following problem:
$$\min_{z \in \mathcal{S}} \; \|z - y\|_2^2.$$
In words, a projection of $y$ onto $\mathcal{S}$ is a point in $\mathcal{S}$ that is closest to $y$ in the Euclidean sense. In general, there may be more than one such closest point. If every $y \in \mathbb{R}^m$ has only one projection onto $\mathcal{S}$, we will use the notation
$$\Pi_{\mathcal{S}}(y) = \arg \min_{z \in \mathcal{S}} \|z - y\|_2^2$$
to denote the projection of $y$ onto $\mathcal{S}$. We are interested in projections onto subspaces.
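When the subspace is given as the span of the columns of a matrix $A$, a projection of $y$ onto $\mathcal{S}$ can be computed by least squares: any minimizer $\alpha$ of $\|A\alpha - y\|_2$ yields the closest point $A\alpha$. A minimal sketch of this (my own illustration, assuming NumPy; the matrix and vector are arbitrary examples):

```python
import numpy as np

def project(A, y):
    """Projection of y onto span of the columns of A, computed by solving
    min_alpha ||A alpha - y||_2 and returning A alpha."""
    alpha, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ alpha

# S = span of two columns in R^3 (here the x-y plane); project y onto S
A = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
y = np.array([3.0, 4.0, 5.0])
ys = project(A, y)
print(ys)   # closest point in S to y: [3. 4. 0.]
```

Here the closest point simply drops the third coordinate, which matches the geometric picture of projecting onto the x-y plane.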
Such concepts play a crucial role in linear algebra and matrix analysis. Consider the following theorem:

Theorem 1.2 Let $\mathcal{S}$ be a subspace of $\mathbb{R}^m$.

1. For every $y \in \mathbb{R}^m$, there exists a unique vector $y_s \in \mathcal{S}$ that minimizes $\|z - y\|_2$ over all $z \in \mathcal{S}$. Thus, we can write $\Pi_{\mathcal{S}}(y) = \arg\min_{z \in \mathcal{S}} \|z - y\|_2^2$.
2. Given $y \in \mathbb{R}^m$, we have $y_s = \Pi_{\mathcal{S}}(y)$ if and only if
$$y_s \in \mathcal{S}, \qquad z^T (y_s - y) = 0 \quad \text{for all } z \in \mathcal{S}. \tag{1}$$

The above theorem is a special case of the projection theorem in convex analysis and optimization [1, Proposition B.11], which deals with projections onto closed convex sets. In the following we provide a proof that suffices for the subspace case.

Proof: First, we should mention that there always exists a vector in $\mathcal{S}$ at which the minimum of $\|z - y\|_2^2$ over all $z \in \mathcal{S}$ is attained; for this claim we only need $\mathcal{S}$ to be closed. This result is shown by applying the Weierstrass theorem, and readers are referred to [1, proof of Proposition B.11] for details.

Second, we show the sufficiency of Statement 2. Let $y_s \in \mathcal{S}$ be a vector that minimizes $\|z - y\|_2^2$ over all $z \in \mathcal{S}$. For any $z \in \mathcal{S}$ we have
$$\|z - y\|_2^2 = \|z - y_s + y_s - y\|_2^2 = \|z - y_s\|_2^2 + 2 (z - y_s)^T (y_s - y) + \|y_s - y\|_2^2.$$
Since $\|z - y\|_2^2 \geq \|y_s - y\|_2^2$ for all $z \in \mathcal{S}$, the above equation implies
$$\|z - y_s\|_2^2 + 2 (z - y_s)^T (y_s - y) \geq 0, \quad \text{for all } z \in \mathcal{S},$$
which is equivalent to
$$\|z\|_2^2 + 2 z^T (y_s - y) \geq 0, \quad \text{for all } z \in \mathcal{S};$$
the reason is that $z \in \mathcal{S}$ implies $z - y_s \in \mathcal{S}$, and the converse is also true. Now, suppose that there exists a point $\bar{z} \in \mathcal{S}$ such that $\bar{z}^T (y_s - y) \neq 0$. Then, by choosing $z = \alpha \bar{z}$, where $\alpha = -\bar{z}^T (y_s - y) / \|\bar{z}\|_2^2$, one can verify that $\|z\|_2^2 + 2 z^T (y_s - y) < 0$ and yet $z \in \mathcal{S}$. Thus, by contradiction, we must have $z^T (y_s - y) = 0$ for all $z \in \mathcal{S}$.

Third, we show the necessity of Statement 2. Suppose that there exists a vector $\bar{y}_s \in \mathcal{S}$ such that $z^T (\bar{y}_s - y) = 0$ for all $z \in \mathcal{S}$. This condition can be rewritten as
$$(z - \bar{y}_s)^T (\bar{y}_s - y) = 0, \quad \text{for all } z \in \mathcal{S},$$
where we have used the equivalence $z - \bar{y}_s \in \mathcal{S} \iff z \in \mathcal{S}$. Now, for any $z \in \mathcal{S}$, we have
$$\|z - y\|_2^2 = \|z - \bar{y}_s\|_2^2 + 2 (z - \bar{y}_s)^T (\bar{y}_s - y) + \|\bar{y}_s - y\|_2^2 = \|z - \bar{y}_s\|_2^2 + \|\bar{y}_s - y\|_2^2 \geq \|\bar{y}_s - y\|_2^2. \tag{2}$$
The above inequality implies that $\bar{y}_s$ minimizes $\|z - y\|_2^2$ over all $z \in \mathcal{S}$. This, together with the previous sufficiency proof, completes the proof of Statement 2. In addition, we note that the equality in (2) holds if and only if $z = \bar{y}_s$. This implies that $\bar{y}_s$ is the only minimizer of $\|z - y\|_2^2$ over all $z \in \mathcal{S}$.
Thus, we also obtain the uniqueness claim in Statement 1. $\blacksquare$

Theorem 1.2 has many implications, e.g., in least squares and orthogonal projections, which we will learn about in later lectures. In the next section we will consider an application of Theorem 1.2 to subspaces.
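Condition (1) of Theorem 1.2 lends itself to a quick numerical sanity check. With an orthonormal basis $Q$ for $\mathcal{S}$ (e.g., obtained from a QR factorization), the projection is $y_s = Q Q^T y$, and by linearity it suffices to verify $z^T (y_s - y) = 0$ on the basis vectors alone. The sketch below is my own illustration (NumPy assumed; the matrix, vector, and comparison point are arbitrary):

```python
import numpy as np

# S = span of the columns of A in R^4 (A has full column rank)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])
Q, _ = np.linalg.qr(A)          # orthonormal basis for S
y = np.array([1.0, 2.0, 3.0, 4.0])

ys = Q @ (Q.T @ y)              # Pi_S(y)

# condition (1): z^T (y_s - y) = 0 for all z in S -- enough to check the basis
print(np.allclose(Q.T @ (ys - y), 0.0))                  # True

# y_s is at least as close to y as any other point of S
z = A @ np.array([0.5, -2.0])                            # arbitrary point in S
print(np.linalg.norm(ys - y) <= np.linalg.norm(z - y))   # True
```

The first check is exactly the normal-equation property of the projection; the second is the minimization claim of Statement 1 tested against one competing point.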
4 Orthogonal Complements

Let $\mathcal{S}$ be a nonempty subset of $\mathbb{R}^m$. The orthogonal complement of $\mathcal{S}$ is defined as the set
$$\mathcal{S}^\perp = \{ y \in \mathbb{R}^m \mid z^T y = 0 \text{ for all } z \in \mathcal{S} \}.$$
It is easy to verify that $\mathcal{S}^\perp$ is always a subspace, even if $\mathcal{S}$ is not. From the definition, we see that

1. any $z \in \mathcal{S}$ and $y \in \mathcal{S}^\perp$ are orthogonal; i.e., $\mathcal{S}^\perp$ consists of all vectors that are orthogonal to all vectors of $\mathcal{S}$;

2. we have either $\mathcal{S} \cap \mathcal{S}^\perp = \{0\}$ or $\mathcal{S} \cap \mathcal{S}^\perp = \emptyset$; i.e., if we exclude $0$, the sets $\mathcal{S}$ and $\mathcal{S}^\perp$ are non-intersecting.

The following theorem is a direct consequence of the projection theorem in Theorem 1.2.

Theorem 1.3 Let $\mathcal{S}$ be a subspace of $\mathbb{R}^m$.

1. For every $y \in \mathbb{R}^m$, there exists a unique 2-tuple $(y_s, y_c) \in \mathcal{S} \times \mathcal{S}^\perp$ such that $y = y_s + y_c$. Also, such a $(y_s, y_c)$ is given by $y_s = \Pi_{\mathcal{S}}(y)$, $y_c = y - \Pi_{\mathcal{S}}(y)$.

2. The projection of $y$ onto $\mathcal{S}^\perp$ is given by $\Pi_{\mathcal{S}^\perp}(y) = y - \Pi_{\mathcal{S}}(y)$.

Proof: Let us rephrase the problem in Statement 1 as follows: given a vector $y \in \mathbb{R}^m$, find a vector $y_s$ such that $y_s \in \mathcal{S}$ and $y - y_s \in \mathcal{S}^\perp$. This problem is exactly the same as (1) in Theorem 1.2. Thus, by Theorem 1.2, the solution is uniquely given by $y_s = \Pi_{\mathcal{S}}(y)$; also, we have $y_c = y - \Pi_{\mathcal{S}}(y) \in \mathcal{S}^\perp$.

To show Statement 2, let us consider the following problem: given a vector $y \in \mathbb{R}^m$, find a vector $\bar{y}_c$ such that $\bar{y}_c \in \mathcal{S}^\perp$ and $y - \bar{y}_c \in (\mathcal{S}^\perp)^\perp$, or equivalently,
$$\bar{y}_c \in \mathcal{S}^\perp, \qquad z^T (y - \bar{y}_c) = 0 \quad \text{for all } z \in \mathcal{S}^\perp.$$
By Theorem 1.2, the solution is uniquely given by $\bar{y}_c = \Pi_{\mathcal{S}^\perp}(y)$. On the other hand, the above conditions are seen to be satisfied if we choose $\bar{y}_c = y - \Pi_{\mathcal{S}}(y) \in \mathcal{S}^\perp$. It follows that $\Pi_{\mathcal{S}^\perp}(y) = y - \Pi_{\mathcal{S}}(y)$. $\blacksquare$

Armed with Theorem 1.3, we can easily prove the following results.

Property 1.1 The following properties hold for any subspace $\mathcal{S} \subseteq \mathbb{R}^m$:

1. $\mathcal{S} + \mathcal{S}^\perp = \mathbb{R}^m$;
2. $\dim \mathcal{S} + \dim \mathcal{S}^\perp = m$;
3. $(\mathcal{S}^\perp)^\perp = \mathcal{S}$.

(We also have the following more general result: let $\mathcal{S}$ be any subset, not necessarily a subspace, of $\mathbb{R}^m$. Then $(\mathcal{S}^\perp)^\perp = \operatorname{span} \mathcal{S}$, where $\operatorname{span} \mathcal{S}$ is defined as the set of all finite linear combinations of points in $\mathcal{S}$.)
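Before turning to the proof of Property 1.1, both Theorem 1.3 and Property 1.1 can be checked numerically. For $\mathcal{S}$ spanned by the columns of a matrix $A$, $\mathcal{S}^\perp$ is the null space of $A^T$, and an orthonormal basis for it can be read off an SVD. This is a hedged sketch (NumPy assumed; the helper name and example data are my own):

```python
import numpy as np

def orth_complement_basis(A):
    """Orthonormal basis for the orthogonal complement of span of A's columns:
    the right singular vectors of A^T beyond its rank span the null space."""
    r = np.linalg.matrix_rank(A)
    _, _, Vt = np.linalg.svd(A.T)   # full SVD; Vt is m x m
    return Vt[r:].T                 # m x (m - r), orthonormal columns

A = np.column_stack([[1.0, 0.0, 0.0], [1.0, 1.0, 0.0]])   # S has dim 2 in R^3
C = orth_complement_basis(A)

# Property 1.1, statement 2: dim S + dim S_perp = m
print(np.linalg.matrix_rank(A) + C.shape[1])   # 3

# Theorem 1.3: y splits as y_s + y_c with y_s in S and y_c in S_perp
y = np.array([1.0, 2.0, 3.0])
alpha, *_ = np.linalg.lstsq(A, y, rcond=None)
ys = A @ alpha            # Pi_S(y)
yc = y - ys               # Pi_{S_perp}(y) by statement 2 of Theorem 1.3
print(np.allclose(A.T @ yc, 0), np.allclose(C.T @ ys, 0))   # True True
```

The two boolean checks confirm that $y_c$ is orthogonal to all of $\mathcal{S}$ (tested on its spanning vectors) and that $y_s$ is orthogonal to all of $\mathcal{S}^\perp$ (tested on the computed complement basis).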
Proof: Statement 1 is merely a consequence of the first statement in Theorem 1.3.

For Statement 2, let us assume that $\mathcal{S}$ equals neither $\{0\}$ nor $\mathbb{R}^m$; the cases of $\{0\}$ and $\mathbb{R}^m$ are trivial. Let $\{u_1, \ldots, u_k\}$ and $\{v_1, \ldots, v_l\}$ be orthonormal bases of $\mathcal{S}$ and $\mathcal{S}^\perp$, respectively; here note that $k = \dim \mathcal{S}$ and $l = \dim \mathcal{S}^\perp$. We can write
$$\mathcal{S} + \mathcal{S}^\perp = \operatorname{span}\{u_1, \ldots, u_k, v_1, \ldots, v_l\}.$$
Also, from the definition of orthogonal complements, it is immediate that $u_i^T v_j = 0$ for all $i, j$. Hence, $\{u_1, \ldots, u_k, v_1, \ldots, v_l\}$ is an orthonormal basis for $\mathcal{S} + \mathcal{S}^\perp$, and consequently we have $\dim(\mathcal{S} + \mathcal{S}^\perp) = k + l$. Moreover, since $\mathcal{S} + \mathcal{S}^\perp = \mathbb{R}^m$ by Statement 1, we also have $\dim(\mathcal{S} + \mathcal{S}^\perp) = m$. The result $m = \dim \mathcal{S} + \dim \mathcal{S}^\perp$ therefore holds.

The proof of Statement 3 is as follows. One can verify from the orthogonal complement definition that $y \in \mathcal{S}$ implies $y \in (\mathcal{S}^\perp)^\perp$. On the other hand, any $y \in (\mathcal{S}^\perp)^\perp$ satisfies
$$y = \Pi_{(\mathcal{S}^\perp)^\perp}(y) = y - \Pi_{\mathcal{S}^\perp}(y) = y - (y - \Pi_{\mathcal{S}}(y)) = \Pi_{\mathcal{S}}(y),$$
where the first equation above is due to the equivalence $y \in \mathcal{S}' \iff y = \Pi_{\mathcal{S}'}(y)$ for a subspace $\mathcal{S}'$ (which is easy to verify), and the second equation is due to the second statement of Theorem 1.3 applied to the subspace $\mathcal{S}^\perp$. The equality $y = \Pi_{\mathcal{S}}(y)$ shown above implies that $y \in \mathcal{S}$. $\blacksquare$

Let us give an example of the application of Property 1.1. For $A \in \mathbb{R}^{m \times n}$, it is known that
$$\dim \mathcal{N}(A) = n - \operatorname{rank}(A).$$
The above result can be shown by Property 1.1. First, we use $\mathcal{N}(A) = \mathcal{R}(A^T)^\perp$. Second, from the second result in Property 1.1 we have $n = \dim \mathcal{R}(A^T) + \dim \mathcal{R}(A^T)^\perp$. Third, it is known that $\dim \mathcal{R}(A^T) = \operatorname{rank}(A^T) = \operatorname{rank}(A)$. Combining these results gives $\dim \mathcal{N}(A) = n - \operatorname{rank}(A)$.

References

[1] D. P. Bertsekas. Nonlinear Programming. Athena Scientific, Belmont, Mass., U.S.A., 2nd edition.

[2] S. Boyd and L. Vandenberghe. Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares. Available online.
More informationMAT Linear Algebra Collection of sample exams
MAT 342 - Linear Algebra Collection of sample exams A-x. (0 pts Give the precise definition of the row echelon form. 2. ( 0 pts After performing row reductions on the augmented matrix for a certain system
More information1 Shortest Vector Problem
Lattices in Cryptography University of Michigan, Fall 25 Lecture 2 SVP, Gram-Schmidt, LLL Instructor: Chris Peikert Scribe: Hank Carter Shortest Vector Problem Last time we defined the minimum distance
More informationLINEAR ALGEBRA 1, 2012-I PARTIAL EXAM 3 SOLUTIONS TO PRACTICE PROBLEMS
LINEAR ALGEBRA, -I PARTIAL EXAM SOLUTIONS TO PRACTICE PROBLEMS Problem (a) For each of the two matrices below, (i) determine whether it is diagonalizable, (ii) determine whether it is orthogonally diagonalizable,
More informationAlgorithms to Compute Bases and the Rank of a Matrix
Algorithms to Compute Bases and the Rank of a Matrix Subspaces associated to a matrix Suppose that A is an m n matrix The row space of A is the subspace of R n spanned by the rows of A The column space
More informationThe Hilbert Space of Random Variables
The Hilbert Space of Random Variables Electrical Engineering 126 (UC Berkeley) Spring 2018 1 Outline Fix a probability space and consider the set H := {X : X is a real-valued random variable with E[X 2
More informationSpanning and Independence Properties of Finite Frames
Chapter 1 Spanning and Independence Properties of Finite Frames Peter G. Casazza and Darrin Speegle Abstract The fundamental notion of frame theory is redundancy. It is this property which makes frames
More information47-831: Advanced Integer Programming Lecturer: Amitabh Basu Lecture 2 Date: 03/18/2010
47-831: Advanced Integer Programming Lecturer: Amitabh Basu Lecture Date: 03/18/010 We saw in the previous lecture that a lattice Λ can have many bases. In fact, if Λ is a lattice of a subspace L with
More informationCover Page. The handle holds various files of this Leiden University dissertation
Cover Page The handle http://hdl.handle.net/1887/57796 holds various files of this Leiden University dissertation Author: Mirandola, Diego Title: On products of linear error correcting codes Date: 2017-12-06
More informationMath 321: Linear Algebra
Math 32: Linear Algebra T. Kapitula Department of Mathematics and Statistics University of New Mexico September 8, 24 Textbook: Linear Algebra,by J. Hefferon E-mail: kapitula@math.unm.edu Prof. Kapitula,
More informationAnalysis II Lecture notes
Analysis II Lecture notes Christoph Thiele (lectures 11,12 by Roland Donninger lecture 22 by Diogo Oliveira e Silva) Summer term 2015 Universität Bonn July 5, 2016 Contents 1 Analysis in several variables
More informationHomework 11 Solutions. Math 110, Fall 2013.
Homework 11 Solutions Math 110, Fall 2013 1 a) Suppose that T were self-adjoint Then, the Spectral Theorem tells us that there would exist an orthonormal basis of P 2 (R), (p 1, p 2, p 3 ), consisting
More informationLinear Algebra. Paul Yiu. Department of Mathematics Florida Atlantic University. Fall A: Inner products
Linear Algebra Paul Yiu Department of Mathematics Florida Atlantic University Fall 2011 6A: Inner products In this chapter, the field F = R or C. We regard F equipped with a conjugation χ : F F. If F =
More informationVector Spaces, Orthogonality, and Linear Least Squares
Week Vector Spaces, Orthogonality, and Linear Least Squares. Opening Remarks.. Visualizing Planes, Lines, and Solutions Consider the following system of linear equations from the opener for Week 9: χ χ
More informationBASES. Throughout this note V is a vector space over a scalar field F. N denotes the set of positive integers and i,j,k,l,m,n,p N.
BASES BRANKO ĆURGUS Throughout this note V is a vector space over a scalar field F. N denotes the set of positive integers and i,j,k,l,m,n,p N. 1. Linear independence Definition 1.1. If m N, α 1,...,α
More informationLecture notes: Applied linear algebra Part 1. Version 2
Lecture notes: Applied linear algebra Part 1. Version 2 Michael Karow Berlin University of Technology karow@math.tu-berlin.de October 2, 2008 1 Notation, basic notions and facts 1.1 Subspaces, range and
More informationLecture Summaries for Linear Algebra M51A
These lecture summaries may also be viewed online by clicking the L icon at the top right of any lecture screen. Lecture Summaries for Linear Algebra M51A refers to the section in the textbook. Lecture
More informationVector Spaces and Linear Transformations
Vector Spaces and Linear Transformations Wei Shi, Jinan University 2017.11.1 1 / 18 Definition (Field) A field F = {F, +, } is an algebraic structure formed by a set F, and closed under binary operations
More informationRecall the convention that, for us, all vectors are column vectors.
Some linear algebra Recall the convention that, for us, all vectors are column vectors. 1. Symmetric matrices Let A be a real matrix. Recall that a complex number λ is an eigenvalue of A if there exists
More information5 Compact linear operators
5 Compact linear operators One of the most important results of Linear Algebra is that for every selfadjoint linear map A on a finite-dimensional space, there exists a basis consisting of eigenvectors.
More informationExercise Solutions to Functional Analysis
Exercise Solutions to Functional Analysis Note: References refer to M. Schechter, Principles of Functional Analysis Exersize that. Let φ,..., φ n be an orthonormal set in a Hilbert space H. Show n f n
More informationEE263: Introduction to Linear Dynamical Systems Review Session 2
EE263: Introduction to Linear Dynamical Systems Review Session 2 Basic concepts from linear algebra nullspace range rank and conservation of dimension EE263 RS2 1 Prerequisites We assume that you are familiar
More informationLinear Algebra (part 1) : Vector Spaces (by Evan Dummit, 2017, v. 1.07) 1.1 The Formal Denition of a Vector Space
Linear Algebra (part 1) : Vector Spaces (by Evan Dummit, 2017, v. 1.07) Contents 1 Vector Spaces 1 1.1 The Formal Denition of a Vector Space.................................. 1 1.2 Subspaces...................................................
More information1 Overview. 2 Extreme Points. AM 221: Advanced Optimization Spring 2016
AM 22: Advanced Optimization Spring 206 Prof. Yaron Singer Lecture 7 February 7th Overview In the previous lectures we saw applications of duality to game theory and later to learning theory. In this lecture
More informationAnalysis-3 lecture schemes
Analysis-3 lecture schemes (with Homeworks) 1 Csörgő István November, 2015 1 A jegyzet az ELTE Informatikai Kar 2015. évi Jegyzetpályázatának támogatásával készült Contents 1. Lesson 1 4 1.1. The Space
More informationwhich arises when we compute the orthogonal projection of a vector y in a subspace with an orthogonal basis. Hence assume that P y = A ij = x j, x i
MODULE 6 Topics: Gram-Schmidt orthogonalization process We begin by observing that if the vectors {x j } N are mutually orthogonal in an inner product space V then they are necessarily linearly independent.
More informationAssignment 1 Math 5341 Linear Algebra Review. Give complete answers to each of the following questions. Show all of your work.
Assignment 1 Math 5341 Linear Algebra Review Give complete answers to each of the following questions Show all of your work Note: You might struggle with some of these questions, either because it has
More information2.4 Hilbert Spaces. Outline
2.4 Hilbert Spaces Tom Lewis Spring Semester 2017 Outline Hilbert spaces L 2 ([a, b]) Orthogonality Approximations Definition A Hilbert space is an inner product space which is complete in the norm defined
More information1: Introduction to Lattices
CSE 206A: Lattice Algorithms and Applications Winter 2012 Instructor: Daniele Micciancio 1: Introduction to Lattices UCSD CSE Lattices are regular arrangements of points in Euclidean space. The simplest
More informationDesigning Information Devices and Systems I Spring 2018 Lecture Notes Note 25
EECS 6 Designing Information Devices and Systems I Spring 8 Lecture Notes Note 5 5. Speeding up OMP In the last lecture note, we introduced orthogonal matching pursuit OMP, an algorithm that can extract
More information4 Hilbert spaces. The proof of the Hilbert basis theorem is not mathematics, it is theology. Camille Jordan
The proof of the Hilbert basis theorem is not mathematics, it is theology. Camille Jordan Wir müssen wissen, wir werden wissen. David Hilbert We now continue to study a special class of Banach spaces,
More informationLinear algebra review
EE263 Autumn 2015 S. Boyd and S. Lall Linear algebra review vector space, subspaces independence, basis, dimension nullspace and range left and right invertibility 1 Vector spaces a vector space or linear
More informationJim Lambers MAT 610 Summer Session Lecture 1 Notes
Jim Lambers MAT 60 Summer Session 2009-0 Lecture Notes Introduction This course is about numerical linear algebra, which is the study of the approximate solution of fundamental problems from linear algebra
More information