Orthogonal complement


Daniel Chan (UNSW), Lecture 33: Orthogonal complements & projections

Aim lecture: Inner products give a special way of constructing vector space complements.

As usual, in this lecture $F = \mathbb{R}$ or $\mathbb{C}$. We also let $V$ be an $F$-space equipped with an inner product $(\cdot|\cdot)$.

Defn Let $S \subseteq V$. We define the orthogonal complement to $S$ to be
$$S^\perp = \{v \in V : v \perp S\} = \bigcap_{w \in S} \ker(w|\cdot).$$
Hence $S^\perp$ is a subspace orthogonal to $S$ &, in particular, is closed under addition.

Proof. Clear.

E.g. This concept is easily understood in $\mathbb{R}^3$.
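In coordinates, checking that a vector lies in $S^\perp$ just means checking that its inner product with every element of $S$ vanishes. A minimal sketch (my own illustration, not from the lecture), using the dot product on $\mathbb{R}^3$:

```python
import numpy as np

def in_orthogonal_complement(v, S, tol=1e-12):
    """Check whether v lies in S-perp, i.e. (w|v) = 0 for every w in S."""
    return all(abs(np.dot(w, v)) < tol for w in S)

S = [np.array([1.0, 1.0, 0.0]), np.array([0.0, 1.0, 1.0])]
print(in_orthogonal_complement(np.array([1.0, -1.0, 1.0]), S))  # True
print(in_orthogonal_complement(np.array([1.0, 0.0, 0.0]), S))   # False
```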

Orthogonal complements of spans

Lemma Let $S \subseteq V$. Then $\operatorname{Span}(S)^\perp = S^\perp$.

Proof.
$$w \in S^\perp \iff v \perp w \text{ for all } v \in S \iff w \perp v \text{ for all } v \in S \iff S \subseteq \ker(w|\cdot) \iff \operatorname{Span}(S) \subseteq \ker(w|\cdot) \iff w \in \operatorname{Span}(S)^\perp.$$
This completes the proof.

E.g. The orthogonal complement to $S = \operatorname{Span}((1,1,0)^T, (0,1,1)^T)$ in $\mathbb{R}^3$ is $S^\perp = \operatorname{Span}((1,-1,1)^T)$; the computation is carried out below.
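By the lemma, $S^\perp$ only depends on a spanning set, so over $\mathbb{R}$ (with the dot product) it is the null space of the matrix whose rows are those spanning vectors. A short numpy/scipy sketch for the example above:

```python
import numpy as np
from scipy.linalg import null_space

# Rows are the spanning vectors of S; their common kernel is S-perp.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
basis = null_space(A)      # columns form an orthonormal basis of S-perp
print(basis.flatten())     # proportional to (1, -1, 1), up to sign and scale
```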

Orthogonal (internal) direct sums

Prop-Defn Let $W_1, \ldots, W_r \leq V$ be mutually orthogonal subspaces, i.e. $W_i \perp W_j$ whenever $i \neq j$. Then the sum $\sum_{i=1}^r W_i$ is direct & we say the internal direct sum $\bigoplus_i W_i$ is orthogonal.

Proof. The lemma ensures that $W_r$ is orthogonal to $W_{<r} = \sum_{i=1}^{r-1} W_i$, so by induction it suffices to show that any $w \in W_r \cap W_{<r} \subseteq W_r \cap W_r^\perp$ must be $0$. But $w \perp w$, so $(w|w) = 0$ & $w = 0$. This completes the proof.

E.g. A small numerical illustration is given below.
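As a sanity check (my own illustration, not from the lecture): for mutually orthogonal subspaces, directness of the sum shows up concretely as linear independence of the stacked bases.

```python
import numpy as np

# Three mutually orthogonal subspaces of R^3 (chosen for illustration):
# W1 = span{(1,0,0)}, W2 = span{(0,1,1)}, W3 = span{(0,1,-1)}.
W1 = np.array([[1.0, 0.0, 0.0]])
W2 = np.array([[0.0, 1.0, 1.0]])
W3 = np.array([[0.0, 1.0, -1.0]])

stacked = np.vstack([W1, W2, W3])
# Directness of W1 + W2 + W3 appears as full rank of the stacked basis vectors.
print(np.linalg.matrix_rank(stacked))  # 3 = dim W1 + dim W2 + dim W3
```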

Vector space complement

Prop Let $W \leq V$ & $\dim W < \infty$. Then $W + W^\perp = V$, so $V = W \oplus W^\perp$.

Proof. We prove this here only in the case where $\dim V < \infty$. The propn on orthog direct sums ensures the sum $W + W^\perp$ is direct. Pick a basis $w_1, \ldots, w_r$ of $W$. For $i = 1, \ldots, r$ we have $l_i = (w_i|\cdot) \in L(V, F)$, so we may form the $r \times 1$ matrix $T$ whose $i$-th entry is $l_i$. Note $T : V \to F^r : v \mapsto (l_1(v), \ldots, l_r(v))^T$. By the lemma, $W^\perp = \bigcap_i \ker l_i = \ker T$. Since $\operatorname{im} T \leq F^r$, rank-nullity ensures that
$$\dim W^\perp = \dim V - \dim \operatorname{im} T \geq \dim V - r = \dim V - \dim W.$$
However, the sum $W + W^\perp$ is direct, so we must have $\dim(W + W^\perp) = \dim W + \dim W^\perp \geq \dim V$, which ensures $V = W + W^\perp$ as desired.
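The map $T$ in the proof is easy to realise numerically: for column vectors with the standard inner product, its matrix has $i$-th row $w_i^*$, and $W^\perp = \ker T$. A hedged sketch (random example, assuming scipy is available) confirming the dimension count $\dim W + \dim W^\perp = \dim V$:

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)
n, r = 6, 2
# Columns give a (generically independent) basis of a subspace W of C^6.
W_basis = rng.standard_normal((n, r)) + 1j * rng.standard_normal((n, r))

# The map T from the proof: v -> ((w_1|v), ..., (w_r|v)); its i-th row is w_i*.
T = W_basis.conj().T
W_perp = null_space(T)          # columns span W-perp = ker T

dim_W = np.linalg.matrix_rank(W_basis)
dim_W_perp = W_perp.shape[1]
print(dim_W + dim_W_perp == n)  # True: dim W + dim W-perp = dim V
```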

Examples

E.g. Let $V = \mathbb{C}[x]_1$ with inner product $(f|g) = \int_0^1 \overline{f(t)}\, g(t)\, dt$. Find the orthogonal complement to $W = \mathbb{C}(1 + ix)$.
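One way to carry out this computation symbolically (my own sketch, assuming the convention above that the inner product is conjugate-linear in the first slot): write a general element $a + bx$ of $V$ and solve the single linear condition $((1+ix)\,|\,a+bx) = 0$.

```python
import sympy as sp

t = sp.symbols('t', real=True)
a, b = sp.symbols('a b')

# Assumed convention: (f|g) = integral_0^1 conj(f(t)) g(t) dt.
def ip(f, g):
    return sp.integrate(sp.conjugate(f) * g, (t, 0, 1))

w = 1 + sp.I * t          # spans W = C(1 + ix)
g = a + b * t             # general element of C[x]_1

# g lies in W-perp exactly when (w|g) = 0; solve this linear condition for a.
sol = sp.solve(sp.Eq(ip(w, g), 0), a)[0]
print(sp.simplify(sol))                          # a = b*(i - 8)/15

# So W-perp should be spanned by (i - 8) + 15x; verify it is orthogonal to w.
print(sp.simplify(ip(w, (sp.I - 8) + 15 * t)))   # 0
```

This suggests $W^\perp = \mathbb{C}\big((i-8) + 15x\big)$, a 1-dimensional complement as the proposition predicts.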

Orthogonal projections, bases

Cor-Defn
1. Suppose that $V = W \oplus W^\perp$ (e.g. when $\dim W < \infty$). The orthogonal projection onto $W$ is the linear map $\operatorname{proj}_W : V \to V : w + w' \mapsto w$ (for $w \in W$, $w' \in W^\perp$).
2. For $v \in V$ we have $v - \operatorname{proj}_W v \in W^\perp$.
3. If $V = W_1 \oplus \cdots \oplus W_r$ is an orthogonal direct sum then $W_i^\perp = \bigoplus_{j \neq i} W_j$. In particular, if $V = W \oplus W^\perp$ then $(W^\perp)^\perp = W$.
4. We say a set $S = \{w_1, \ldots, w_r\} \subseteq V$ of non-zero vectors is orthogonal if $w_i \perp w_j$ for $i \neq j$. Equivalently, the sum $\sum_i F w_i$ is an orthogonal direct sum. In particular, $S$ is lin indep in this case.
5. An orthogonal set $S \subseteq V$ is orthonormal if, furthermore, $\|w_i\| = 1$ for all $i$.

Proof. 2), 4) follow from propns. We prove 3), first noting that $W_i' = \bigoplus_{j \neq i} W_j$ is orthogonal to $W_i$. It thus suffices to show $W_i^\perp \subseteq W_i'$, so suppose $w \in W_i^\perp$. We may write $w = w_i + w_i'$ with $w_i \in W_i$, $w_i' \in W_i'$. Then $0 = (w_i|w) = (w_i|w_i + w_i') = (w_i|w_i) + (w_i|w_i') = (w_i|w_i)$, so $w_i = 0$ & $w = w_i' \in W_i'$.
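A small numerical check of points 4) and 5) (my own illustration): pairwise orthogonality of non-zero vectors forces linear independence, and dividing by the norms produces an orthonormal set.

```python
import numpy as np

# An orthogonal (but not orthonormal) set in R^3, chosen for illustration.
ws = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, -1.0, 0.0]),
      np.array([0.0, 0.0, 2.0])]

# Pairwise orthogonality ...
print(all(abs(np.dot(ws[i], ws[j])) < 1e-12
          for i in range(3) for j in range(3) if i != j))     # True

# ... gives linear independence (point 4) ...
print(np.linalg.matrix_rank(np.column_stack(ws)))             # 3

# ... and rescaling by the norms gives an orthonormal set (point 5).
us = np.array([w / np.linalg.norm(w) for w in ws])
print(np.allclose(us @ us.T, np.eye(3)))                      # True: Gram matrix is I
```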

Existence of orthonormal bases

Theorem Let $V$ be fin dim. Then
1. $V$ is the orthogonal direct sum of 1-dimensional vector spaces.
2. $V$ has an orthonormal basis.

Proof. 1) We argue by induction on $d = \dim V$, the cases $d = 0, 1$ being clear, so suppose that $d > 1$. We may thus pick a non-zero proper subspace $W \lneq V$, e.g. $Fw$ for any non-zero $w \in V$. Now $V = W \oplus W^\perp$ & $\dim W, \dim W^\perp < d$. By induction, each of $W$ & $W^\perp$ is an orthogonal direct sum of 1-dimensional $F$-spaces, say $W = \bigoplus_i W_i$, $W^\perp = \bigoplus_j V_j$. Clearly, the subspaces $\{W_i, V_j\}$ are still mutually orthog, so $V$ is the orthogonal direct sum of them.
2) By 1), it suffices to find an orthonormal basis for a 1-dim $F$-space $Fv$. Just pick $v / \|v\|$.
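The proof above is an induction via orthogonal complements; the standard constructive alternative is the Gram-Schmidt process (not the argument used on this slide). A minimal sketch:

```python
import numpy as np

def gram_schmidt(vectors, tol=1e-12):
    """Return an orthonormal basis of span(vectors) via classical Gram-Schmidt."""
    basis = []
    for v in vectors:
        # Remove the component lying in the span of the vectors found so far.
        w = v - sum(np.dot(u, v) * u for u in basis)
        if np.linalg.norm(w) > tol:        # skip vectors dependent on earlier ones
            basis.append(w / np.linalg.norm(w))
    return basis

B = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0]),
                  np.array([0.0, 1.0, 1.0])])
print(np.allclose(np.array(B) @ np.array(B).T, np.eye(len(B))))  # True
```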

Orthogonal projection formula

Prop
1. Given a 1-dim $F$-space $W = Fw$, we have $V = W \oplus W^\perp$ &
$$\operatorname{proj}_W v = \frac{(w|v)}{\|w\|^2}\, w \quad \text{for any } v \in V.$$
2. Suppose that $V = W_i \oplus W_i^\perp$ for $i = 1, \ldots, r$, so we have orthog projn maps $\operatorname{proj}_{W_i}$. Suppose further that the $W_i$ are mutually orthog, so we may consider the orthogonal direct sum $W = \bigoplus_i W_i$. Then $V = W \oplus W^\perp$ & $\operatorname{proj}_W = \sum_i \operatorname{proj}_{W_i}$.
3. (Fourier decomposition) In particular, if $W$ is spanned by the orthog set $\{w_1, \ldots, w_r\}$, then
$$\operatorname{proj}_W v = \sum_{i=1}^r \frac{(w_i|v)}{\|w_i\|^2}\, w_i.$$

Rem This gives a proof of the propn on vector space complements in general.

Proof. Note 3) follows immediately from 1) & 2).
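The Fourier decomposition translates directly into code. A short sketch (my own, using the dot product on $\mathbb{R}^3$) that also checks point 2) of the Cor-Defn, namely $v - \operatorname{proj}_W v \in W^\perp$:

```python
import numpy as np

def proj(v, orthogonal_set):
    """Fourier decomposition: proj_W v = sum_i (w_i|v)/||w_i||^2 * w_i,
    valid when the w_i form an orthogonal spanning set of W."""
    return sum((np.dot(w, v) / np.dot(w, w)) * w for w in orthogonal_set)

W = [np.array([1.0, 1.0, 0.0]), np.array([1.0, -1.0, 0.0])]   # orthogonal spanning set
v = np.array([3.0, 4.0, 5.0])

p = proj(v, W)
print(p)                                               # [3. 4. 0.]: projection onto the xy-plane
print(all(abs(np.dot(w, v - p)) < 1e-12 for w in W))   # True: v - proj_W v lies in W-perp
```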

Proof of propn

1) We need only show $v' = v - \frac{(w|v)}{\|w\|^2}\, w \in W^\perp$, for then we see $v = \frac{(w|v)}{\|w\|^2}\, w + v' \in W + W^\perp$. But
$$(w|v') = \Big(w \,\Big|\, v - \frac{(w|v)}{\|w\|^2}\, w\Big) = (w|v) - \frac{(w|v)}{\|w\|^2}\,(w|w) = 0.$$
2) As above, $V = W \oplus W^\perp$ follows from showing $v - \sum_i \operatorname{proj}_{W_i} v \in W^\perp$. In this case we may write $V = W_1 \oplus \cdots \oplus W_r \oplus W^\perp$ &, writing $v = (w_1, \ldots, w_r, w')^T$ for $w_i \in W_i$, $w' \in W^\perp$, we see
$$\operatorname{proj}_W v = \begin{pmatrix} w_1 \\ \vdots \\ w_r \\ 0 \end{pmatrix} = \sum_i \operatorname{proj}_{W_i} v.$$

Example

E.g. Consider the orthonormal basis $f_1(x) = 1$, $f_2(x) = 2\sqrt{3}\,x - \sqrt{3}$ for $W = \mathbb{R}[x]_1$ (wrt $(f|g) = \int_0^1 f(t) g(t)\, dt$). Find $\operatorname{proj}_W x^2$.
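A symbolic version of this computation (my own sketch; the stated basis is assumed orthonormal and this is re-checked first):

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)

def ip(f, g):
    # Inner product (f|g) = integral_0^1 f(t) g(t) dt on real polynomials.
    return sp.integrate(f.subs(x, t) * g.subs(x, t), (t, 0, 1))

f1 = sp.Integer(1)
f2 = 2 * sp.sqrt(3) * x - sp.sqrt(3)

# Sanity check that {f1, f2} really is orthonormal.
print(sp.simplify(ip(f1, f2)), sp.simplify(ip(f1, f1)), sp.simplify(ip(f2, f2)))  # 0 1 1

# Fourier decomposition with an orthonormal basis: proj_W v = (f1|v) f1 + (f2|v) f2.
v = x**2
proj_v = ip(f1, v) * f1 + ip(f2, v) * f2
print(sp.expand(proj_v))   # x - 1/6
```

So the projection of $x^2$ onto $W = \mathbb{R}[x]_1$ works out to $x - \tfrac{1}{6}$.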
