Math 220 - Linear Algebra (Summer 2018)

Solutions to Homework #7

Exercise 6.1.20

(a) TRUE. u · v − v · u = 0 is equivalent to u · v = v · u. The latter identity holds due to the commutative property of the inner product (Theorem 1(a) on page 333). Therefore, the original identity is also true.

(b) FALSE. The identity ||cv|| = c ||v|| fails for any c < 0 and v ≠ 0. If v ≠ 0 and c < 0, then cv ≠ 0. By definition, ||v|| ≥ 0 for any v; moreover, it is easy to see from the definition that the norm is strictly positive for any non-zero vector. Therefore ||v|| > 0 and ||cv|| > 0. Given the above, the identity ||cv|| = c ||v|| cannot hold: its left-hand side is positive, but its right-hand side is negative. (The correct identity is ||cv|| = |c| ||v||.)

(c) TRUE. This follows directly from the definition of an orthogonal complement on page 336: "The set of all vectors z that are orthogonal to W is called the orthogonal complement of W and is denoted by W^⊥."

(d) TRUE. This follows directly from the Pythagorean Theorem (Theorem 2 on page 336): "Two vectors u and v are orthogonal if and only if ||u + v||^2 = ||u||^2 + ||v||^2."
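The claims in (b) and (d) are easy to check numerically. The following sketch uses NumPy (an illustrative assumption; the homework itself involves no software) to verify that ||cv|| = |c| ||v|| for a negative scalar c, and that ||u + v||^2 = ||u||^2 + ||v||^2 holds precisely when u · v = 0.

```python
import numpy as np

v = np.array([3.0, 4.0])
c = -2.0

# (b): ||cv|| equals |c|*||v||, not c*||v||, when c is negative.
lhs = np.linalg.norm(c * v)                           # ||cv|| = 10
assert np.isclose(lhs, abs(c) * np.linalg.norm(v))    # |c|*||v|| = 10
assert not np.isclose(lhs, c * np.linalg.norm(v))     # c*||v|| = -10

# (d): the Pythagorean identity holds for an orthogonal pair...
u = np.array([1.0, 0.0])
w = np.array([0.0, 5.0])
assert np.isclose(u @ w, 0.0)  # u and w are orthogonal
assert np.isclose(np.linalg.norm(u + w)**2,
                  np.linalg.norm(u)**2 + np.linalg.norm(w)**2)

# ...and fails for a non-orthogonal pair.
w2 = np.array([1.0, 5.0])
assert not np.isclose(np.linalg.norm(u + w2)**2,
                      np.linalg.norm(u)**2 + np.linalg.norm(w2)**2)
print("checks for 6.1.20 (b) and (d) pass")
```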

Exercise 6.2.23

(a) TRUE. For example, the set {[1, 1], [1, 0]} is linearly independent, since neither vector is a scalar multiple of the other, but it is not orthogonal, since [1, 1] · [1, 0] = 1 + 0 = 1 ≠ 0.

(c) FALSE. The book says: "When the vectors in an orthogonal set of nonzero vectors are normalized to have unit length, the new vectors will still be orthogonal, and hence the new set will be an orthonormal set." This can easily be shown as follows. Let S = {v_1, ..., v_n} be an orthogonal set of non-zero vectors, i.e.

v_i ≠ 0 for any 1 ≤ i ≤ n, (1)

and

v_i · v_j = 0 for any 1 ≤ i ≤ n, 1 ≤ j ≤ n such that i ≠ j. (2)

Let also

c_i = ||v_i|| for 1 ≤ i ≤ n, (3)

i.e. c_1, ..., c_n are the norms of the vectors in S. Normalizing a set means normalizing each vector in the set by scaling it by its own norm (so that each vector becomes a unit vector). Therefore, we normalize S by constructing a new set S' = {w_1, ..., w_n}, where

w_i = v_i / ||v_i|| for all 1 ≤ i ≤ n, (4)

i.e. the original vectors scaled by their norms. Now, to check that S' is also orthogonal, we need to show that

w_i · w_j = 0 for any 1 ≤ i ≤ n, 1 ≤ j ≤ n such that i ≠ j. (5)

This can be done using (2) and (4). Let w_i, w_j ∈ S' with i ≠ j. Then

w_i · w_j = (v_i / ||v_i||) · (v_j / ||v_j||) = (v_i · v_j) / (||v_i|| ||v_j||) = 0 / (||v_i|| ||v_j||) = 0, (6)

where the first equality uses (4) and the third uses (2).

(d) FALSE. This is not always true. Consider the counterexample

A = [ 1 0 ]
    [ 0 1 ]
    [ 0 0 ]

The columns of A are orthonormal (orthogonal and of unit length), yet A is not an orthogonal matrix. By definition, for A to be an orthogonal matrix its columns should form an orthonormal basis for R^n (n = 3 in our example). Clearly, the columns of A cannot form a basis for R^3, since there are only 2 of them, and any basis of R^3 must contain 3 vectors. Note that A is not square. However, it is always true that any square matrix with orthonormal columns is orthogonal. Also note that the book gives a different definition of an orthogonal matrix on page 346 (one which explicitly requires the matrix to be square): "An orthogonal matrix is a square invertible matrix U such that U^(-1) = U^T." This definition is equivalent to the one given in class (an orthogonal matrix is a matrix whose columns form an orthonormal basis for R^n) due to Theorem 6 on page 345: "An m × n matrix U has orthonormal columns if and only if U^T U = I."

(e) FALSE. The distance from y to L is ||y − ŷ||, not ||ŷ|| (of course, the two values may be equal in some cases, but this is not true in general). See Example 4 on page 343: it considers specific vectors in R^2, but the argument made there is quite general and applies to this question.

The distance from y to L is the length of the perpendicular line segment from y to the orthogonal projection ŷ. This length equals the length of y − ŷ. Thus the distance is ||y − ŷ||. Figure 3 on page 343 gives a good illustration.

Exercise 6.3.24

(a) The set {w_1, ..., w_p, v_1, ..., v_q} is orthogonal if the inner product of any two distinct vectors in it is 0, i.e. all of the following must hold:

w_i · w_j = 0 for any 1 ≤ i ≤ p, 1 ≤ j ≤ p such that i ≠ j, (1)

v_i · v_j = 0 for any 1 ≤ i ≤ q, 1 ≤ j ≤ q such that i ≠ j, (2)

w_i · v_j = 0 for any 1 ≤ i ≤ p, 1 ≤ j ≤ q. (3)

(1) and (2) follow from the facts that {w_1, ..., w_p} is an orthogonal basis for W and {v_1, ..., v_q} is an orthogonal basis for W^⊥, respectively. (3) follows from the definition of an orthogonal complement (page 336). Since (1)–(3) all hold, {w_1, ..., w_p, v_1, ..., v_q} is orthogonal.
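As a concrete sanity check of part (a) of Exercise 6.3.24, here is a NumPy sketch with an illustrative choice of subspace (the exercise itself concerns arbitrary W): take W = span{w_1, w_2} in R^3 with the orthogonal basis w_1 = (1, 1, 0), w_2 = (1, −1, 0), so that W^⊥ = span{v_1} with v_1 = (0, 0, 1), and verify that every pair of distinct vectors in the combined set is orthogonal.

```python
import numpy as np

# Illustrative orthogonal basis of a plane W in R^3...
w1 = np.array([1.0, 1.0, 0.0])
w2 = np.array([1.0, -1.0, 0.0])
# ...and an orthogonal basis of its orthogonal complement W⊥.
v1 = np.array([0.0, 0.0, 1.0])

combined = [w1, w2, v1]

# Every pair of distinct vectors in the combined set is orthogonal,
# matching conditions (1)-(3) in part (a).
for i in range(len(combined)):
    for j in range(len(combined)):
        if i != j:
            assert np.isclose(combined[i] @ combined[j], 0.0)

print("combined set {w1, w2, v1} is orthogonal")
```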

(b) By the Orthogonal Decomposition Theorem (Theorem 8 on page 350), any vector y ∈ R^n can be written uniquely in the form

y = ŷ + z (1)

where ŷ is in W and z is in W^⊥. Since ŷ is in W and {w_1, ..., w_p} is a basis for W, we can represent ŷ as a linear combination of w_1, ..., w_p, i.e.

ŷ = a_1 w_1 + ... + a_p w_p (2)

where a_1, ..., a_p are scalars. Similarly, since z is in W^⊥ and {v_1, ..., v_q} is a basis for W^⊥, we can represent z as a linear combination of v_1, ..., v_q, i.e.

z = b_1 v_1 + ... + b_q v_q (3)

where b_1, ..., b_q are scalars. Now, plugging (2) and (3) into (1), we get

y = ŷ + z = a_1 w_1 + ... + a_p w_p + b_1 v_1 + ... + b_q v_q. (4)

This means that y can be represented as a linear combination of w_1, ..., w_p, v_1, ..., v_q. Since y is an arbitrary vector in R^n, by definition the set {w_1, ..., w_p, v_1, ..., v_q} spans R^n.

(c) We showed in (a) that {w_1, ..., w_p, v_1, ..., v_q} is orthogonal. By Theorem 4 on page 340, any orthogonal set of nonzero vectors is linearly independent. Also, in (b) we showed that {w_1, ..., w_p, v_1, ..., v_q} spans R^n. It follows from these two facts (by definition) that {w_1, ..., w_p, v_1, ..., v_q} is a basis for R^n. The size of this set is p + q, or, alternatively, dim W + dim W^⊥. Since this set is a basis for R^n, its size equals n, the dimension of R^n, i.e.

dim W + dim W^⊥ = n.
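The decomposition in (b) can also be computed explicitly. The NumPy sketch below (the vectors are an arbitrary illustration, not taken from the exercise) projects y onto an orthogonal basis of W via the standard formula ŷ = Σ_i ((y · w_i) / (w_i · w_i)) w_i, sets z = y − ŷ, and confirms that z is orthogonal to W and that p + q = n.

```python
import numpy as np

# Orthogonal basis of W (p = 2) and of W⊥ (q = 1) in R^3 (n = 3);
# an illustrative choice, not taken from the exercise.
W_basis = [np.array([1.0, 1.0, 0.0]), np.array([1.0, -1.0, 0.0])]
Wperp_basis = [np.array([0.0, 0.0, 1.0])]

y = np.array([2.0, 3.0, 5.0])

# Orthogonal projection of y onto W, using the formula valid for an
# orthogonal basis: y_hat = sum_i ((y . w_i) / (w_i . w_i)) * w_i
y_hat = sum(((y @ w) / (w @ w)) * w for w in W_basis)
z = y - y_hat                      # component of y in W⊥

# y = y_hat + z, with y_hat in W and z orthogonal to W.
assert np.allclose(y_hat + z, y)
for w in W_basis:
    assert np.isclose(z @ w, 0.0)

# dim W + dim W⊥ = n
assert len(W_basis) + len(Wperp_basis) == y.size

print("y_hat =", y_hat, " z =", z)  # y_hat = [2. 3. 0.], z = [0. 0. 5.]
```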