MATH 304 Linear Algebra Lecture 20: The Gram-Schmidt process (continued). Eigenvalues and eigenvectors.


Orthogonal sets

Let V be a vector space with an inner product.

Definition. Nonzero vectors v_1, v_2, ..., v_k ∈ V form an orthogonal set if they are orthogonal to each other: ⟨v_i, v_j⟩ = 0 for i ≠ j. If, in addition, all vectors are of unit norm, ‖v_i‖ = 1, then v_1, v_2, ..., v_k is called an orthonormal set.

Theorem. Any orthogonal set is linearly independent.

Orthogonal projection

Theorem. Let V be a finite-dimensional inner product space and V_0 a subspace of V. Then any vector x ∈ V is uniquely represented as x = p + o, where p ∈ V_0 and o ⊥ V_0. The component p is the orthogonal projection of the vector x onto the subspace V_0. The distance from x to the subspace V_0 is ‖o‖.

If v_1, v_2, ..., v_n is an orthogonal basis for V_0, then

p = (⟨x, v_1⟩/⟨v_1, v_1⟩) v_1 + (⟨x, v_2⟩/⟨v_2, v_2⟩) v_2 + ... + (⟨x, v_n⟩/⟨v_n, v_n⟩) v_n.

[Figure: the decomposition x = p + o, with p lying in the subspace V_0 and o orthogonal to V_0.]
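The projection formula can be sketched numerically. A minimal example, assuming NumPy is available (the helper name `project` is chosen here for illustration, not from the lecture):

```python
import numpy as np

def project(x, basis):
    """Orthogonal projection of x onto the span of an orthogonal basis."""
    p = np.zeros_like(x, dtype=float)
    for v in basis:
        p += (np.dot(x, v) / np.dot(v, v)) * v   # add (<x,v>/<v,v>) v
    return p

# Decompose x = p + o relative to V_0 = span of the basis vectors.
x = np.array([1.0, 2.0, 3.0])
basis = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
p = project(x, basis)
o = x - p
# o is orthogonal to V_0, and ||o|| is the distance from x to V_0.
```

Here V_0 is the xy-plane, so p keeps the first two coordinates and o = (0, 0, 3).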

The Gram-Schmidt orthogonalization process

Let V be a vector space with an inner product. Suppose x_1, x_2, ..., x_n is a basis for V. Let

v_1 = x_1,
v_2 = x_2 − (⟨x_2, v_1⟩/⟨v_1, v_1⟩) v_1,
v_3 = x_3 − (⟨x_3, v_1⟩/⟨v_1, v_1⟩) v_1 − (⟨x_3, v_2⟩/⟨v_2, v_2⟩) v_2,
...
v_n = x_n − (⟨x_n, v_1⟩/⟨v_1, v_1⟩) v_1 − ... − (⟨x_n, v_{n−1}⟩/⟨v_{n−1}, v_{n−1}⟩) v_{n−1}.

Then v_1, v_2, ..., v_n is an orthogonal basis for V.
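The recursion above translates directly into code. A minimal sketch, assuming NumPy (the function name `gram_schmidt` is chosen here, not from the lecture):

```python
import numpy as np

def gram_schmidt(xs):
    """Turn a basis x_1, ..., x_n into an orthogonal basis v_1, ..., v_n."""
    vs = []
    for x in xs:
        v = np.asarray(x, dtype=float)
        for u in vs:
            v = v - (np.dot(x, u) / np.dot(u, u)) * u   # subtract projection onto u
        vs.append(v)
    return vs

# Small R^2 example: v_1 = x_1, v_2 = x_2 minus its projection onto v_1.
v1, v2 = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])
```

For these inputs v_2 = (2, 2) − (8/10)(3, 1) = (−0.4, 1.2), which is orthogonal to v_1.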

[Figure: v_3 = x_3 − p_3, where p_3 is the orthogonal projection of x_3 onto Span(v_1, v_2) = Span(x_1, x_2).]

Any basis x_1, x_2, ..., x_n → Orthogonal basis v_1, v_2, ..., v_n

Properties of the Gram-Schmidt process:
- v_k = x_k − (α_1 x_1 + ... + α_{k−1} x_{k−1}), 1 ≤ k ≤ n;
- the span of v_1, ..., v_k is the same as the span of x_1, ..., x_k;
- v_k is orthogonal to x_1, ..., x_{k−1};
- v_k = x_k − p_k, where p_k is the orthogonal projection of the vector x_k onto the subspace spanned by x_1, ..., x_{k−1};
- ‖v_k‖ is the distance from x_k to the subspace spanned by x_1, ..., x_{k−1}.

Normalization

Let V be a vector space with an inner product. Suppose v_1, v_2, ..., v_n is an orthogonal basis for V. Let w_1 = v_1/‖v_1‖, w_2 = v_2/‖v_2‖, ..., w_n = v_n/‖v_n‖. Then w_1, w_2, ..., w_n is an orthonormal basis for V.

Theorem. Any finite-dimensional vector space with an inner product has an orthonormal basis.

Remark. An infinite-dimensional vector space with an inner product may or may not have an orthonormal basis.

Orthogonalization / Normalization

An alternative form of the Gram-Schmidt process combines orthogonalization with normalization. Suppose x_1, x_2, ..., x_n is a basis for an inner product space V. Let

v_1 = x_1, w_1 = v_1/‖v_1‖,
v_2 = x_2 − ⟨x_2, w_1⟩ w_1, w_2 = v_2/‖v_2‖,
v_3 = x_3 − ⟨x_3, w_1⟩ w_1 − ⟨x_3, w_2⟩ w_2, w_3 = v_3/‖v_3‖,
...
v_n = x_n − ⟨x_n, w_1⟩ w_1 − ... − ⟨x_n, w_{n−1}⟩ w_{n−1}, w_n = v_n/‖v_n‖.

Then w_1, w_2, ..., w_n is an orthonormal basis for V.
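This combined form is what numerical QR factorization computes: the columns of Q are an orthonormal basis for the column span. A sketch, assuming NumPy; `np.linalg.qr` may flip the signs of columns, so the comparison below is only up to sign:

```python
import numpy as np

def gram_schmidt_orthonormal(xs):
    """Combined orthogonalization + normalization: returns matrix with columns w_k."""
    X = np.column_stack(xs).astype(float)
    ws = []
    for k in range(X.shape[1]):
        v = X[:, k].copy()
        for w in ws:
            v -= np.dot(X[:, k], w) * w    # v_k = x_k - sum of <x_k, w_j> w_j
        ws.append(v / np.linalg.norm(v))   # w_k = v_k / ||v_k||
    return np.column_stack(ws)

xs = [np.array([1, 2, 2]), np.array([-1, 0, 2]), np.array([0, 0, 1])]
W = gram_schmidt_orthonormal(xs)
Q, R = np.linalg.qr(np.column_stack(xs).astype(float))
# W has orthonormal columns; Q's columns agree with W up to sign.
```

The inputs are the basis from the worked example below, so the first column of W is (1/3, 2/3, 2/3).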

Problem. Let Π be the plane in R^3 spanned by the vectors x_1 = (1, 2, 2) and x_2 = (−1, 0, 2). (i) Find an orthonormal basis for Π. (ii) Extend it to an orthonormal basis for R^3.

x_1, x_2 is a basis for the plane Π. We can extend it to a basis for R^3 by adding one vector from the standard basis. For instance, the vectors x_1, x_2, and x_3 = (0, 0, 1) form a basis for R^3 because, expanding along the third row,

det [[1, 2, 2], [−1, 0, 2], [0, 0, 1]] = det [[1, 2], [−1, 0]] = 2 ≠ 0.

Using the Gram-Schmidt process, we orthogonalize the basis x_1 = (1, 2, 2), x_2 = (−1, 0, 2), x_3 = (0, 0, 1):

v_1 = x_1 = (1, 2, 2),

v_2 = x_2 − (⟨x_2, v_1⟩/⟨v_1, v_1⟩) v_1 = (−1, 0, 2) − (3/9)(1, 2, 2) = (−4/3, −2/3, 4/3),

v_3 = x_3 − (⟨x_3, v_1⟩/⟨v_1, v_1⟩) v_1 − (⟨x_3, v_2⟩/⟨v_2, v_2⟩) v_2 = (0, 0, 1) − (2/9)(1, 2, 2) − ((4/3)/4)(−4/3, −2/3, 4/3) = (2/9, −2/9, 1/9).

Now v_1 = (1, 2, 2), v_2 = (−4/3, −2/3, 4/3), v_3 = (2/9, −2/9, 1/9) is an orthogonal basis for R^3 while v_1, v_2 is an orthogonal basis for Π. It remains to normalize these vectors.

⟨v_1, v_1⟩ = 9 ⟹ ‖v_1‖ = 3
⟨v_2, v_2⟩ = 4 ⟹ ‖v_2‖ = 2
⟨v_3, v_3⟩ = 1/9 ⟹ ‖v_3‖ = 1/3

w_1 = v_1/‖v_1‖ = (1/3, 2/3, 2/3) = (1/3)(1, 2, 2),
w_2 = v_2/‖v_2‖ = (−2/3, −1/3, 2/3) = (1/3)(−2, −1, 2),
w_3 = v_3/‖v_3‖ = (2/3, −2/3, 1/3) = (1/3)(2, −2, 1).

w_1, w_2 is an orthonormal basis for Π. w_1, w_2, w_3 is an orthonormal basis for R^3.
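The arithmetic in this example is easy to double-check numerically. A small verification sketch, assuming NumPy:

```python
import numpy as np

# Orthogonal basis obtained in the example above.
v1 = np.array([1.0, 2.0, 2.0])
v2 = np.array([-4/3, -2/3, 4/3])
v3 = np.array([2/9, -2/9, 1/9])

# Normalize each vector to get the orthonormal basis w_1, w_2, w_3.
w1, w2, w3 = (v / np.linalg.norm(v) for v in (v1, v2, v3))
```

The pairwise inner products of w_1, w_2, w_3 should vanish and each norm should be 1.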

Problem. Find the distance from the point y = (0, 0, 0, 1) to the subspace Π ⊂ R^4 spanned by the vectors x_1 = (1, −1, 1, −1), x_2 = (1, 1, 3, −1), and x_3 = (−3, 7, 1, 3).

Let us apply the Gram-Schmidt process to the vectors x_1, x_2, x_3, y. We should obtain an orthogonal system v_1, v_2, v_3, v_4. The desired distance will be ‖v_4‖.

x_1 = (1, −1, 1, −1), x_2 = (1, 1, 3, −1), x_3 = (−3, 7, 1, 3), y = (0, 0, 0, 1).

v_1 = x_1 = (1, −1, 1, −1),

v_2 = x_2 − (⟨x_2, v_1⟩/⟨v_1, v_1⟩) v_1 = (1, 1, 3, −1) − (4/4)(1, −1, 1, −1) = (0, 2, 2, 0),

v_3 = x_3 − (⟨x_3, v_1⟩/⟨v_1, v_1⟩) v_1 − (⟨x_3, v_2⟩/⟨v_2, v_2⟩) v_2 = (−3, 7, 1, 3) − (−12/4)(1, −1, 1, −1) − (16/8)(0, 2, 2, 0) = (0, 0, 0, 0).

The Gram-Schmidt process can be used to check linear independence of vectors! The vector x_3 is a linear combination of x_1 and x_2. Π is a plane, not a 3-dimensional subspace. We should orthogonalize the vectors x_1, x_2, y.

v_4 = y − (⟨y, v_1⟩/⟨v_1, v_1⟩) v_1 − (⟨y, v_2⟩/⟨v_2, v_2⟩) v_2 = (0, 0, 0, 1) − (−1/4)(1, −1, 1, −1) − (0/8)(0, 2, 2, 0) = (1/4, −1/4, 1/4, 3/4).

‖v_4‖ = ‖(1/4, −1/4, 1/4, 3/4)‖ = (1/4)‖(1, −1, 1, 3)‖ = √12/4 = √3/2.
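The same distance can be computed with least squares: the residual of the best approximation of y from the column space of [x_1 x_2 x_3] is exactly the distance to Π. A sketch, assuming NumPy (note `lstsq` also reports the rank, confirming that x_3 is dependent):

```python
import numpy as np

# Columns are x_1, x_2, x_3; their span is the plane Π (x_3 is dependent).
A = np.column_stack([[1, -1, 1, -1], [1, 1, 3, -1], [-3, 7, 1, 3]]).astype(float)
y = np.array([0.0, 0.0, 0.0, 1.0])

# Least-squares residual norm = distance from y to the column space of A.
coef, _, rank, _ = np.linalg.lstsq(A, y, rcond=None)
dist = np.linalg.norm(A @ coef - y)
```

The computed rank is 2 (a plane, not a 3-dimensional subspace), and dist agrees with √3/2.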

Problem. Find the distance from the point z = (0, 0, 1, 0) to the plane Π that passes through the point x_0 = (1, 0, 0, 0) and is parallel to the vectors v_1 = (1, −1, 1, −1) and v_2 = (0, 2, 2, 0).

The plane Π is not a subspace of R^4 as it does not pass through the origin. Let Π_0 = Span(v_1, v_2). Then Π = Π_0 + x_0. Hence the distance from the point z to the plane Π is the same as the distance from the point z − x_0 to the plane Π_0. We shall apply the Gram-Schmidt process to the vectors v_1, v_2, z − x_0. This will yield an orthogonal system w_1, w_2, w_3. The desired distance will be ‖w_3‖.

v_1 = (1, −1, 1, −1), v_2 = (0, 2, 2, 0), z − x_0 = (−1, 0, 1, 0).

w_1 = v_1 = (1, −1, 1, −1),

w_2 = v_2 − (⟨v_2, w_1⟩/⟨w_1, w_1⟩) w_1 = v_2 = (0, 2, 2, 0), since v_2 ⊥ v_1.

w_3 = (z − x_0) − (⟨z − x_0, w_1⟩/⟨w_1, w_1⟩) w_1 − (⟨z − x_0, w_2⟩/⟨w_2, w_2⟩) w_2 = (−1, 0, 1, 0) − (0/4)(1, −1, 1, −1) − (2/8)(0, 2, 2, 0) = (−1, −1/2, 1/2, 0).

‖w_3‖ = ‖(−1, −1/2, 1/2, 0)‖ = (1/2)‖(−2, −1, 1, 0)‖ = √6/2 = √(3/2).
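Since v_1 and v_2 are already orthogonal here, the distance reduces to one projection and a residual norm. A verification sketch, assuming NumPy:

```python
import numpy as np

v1 = np.array([1.0, -1.0, 1.0, -1.0])
v2 = np.array([0.0, 2.0, 2.0, 0.0])
x0 = np.array([1.0, 0.0, 0.0, 0.0])
z = np.array([0.0, 0.0, 1.0, 0.0])

u = z - x0   # shift so the plane passes through the origin
# Project u onto Span(v1, v2); v1 and v2 are already orthogonal.
p = (u @ v1) / (v1 @ v1) * v1 + (u @ v2) / (v2 @ v2) * v2
dist = np.linalg.norm(u - p)   # distance from z to the plane Π
```

The residual u − p equals (−1, −1/2, 1/2, 0), so dist agrees with √(3/2).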

Linear transformations of R^2

Any linear mapping f : R^2 → R^2 is represented as multiplication of a 2-dimensional column vector by a 2×2 matrix:

f(x) = Ax, i.e., f((x, y)^T) = [a b; c d] (x, y)^T.

Linear transformations corresponding to particular matrices can have various geometric properties.

A = [0 −1; 1 0] — Rotation by 90°.

A = [1/√2 −1/√2; 1/√2 1/√2] — Rotation by 45°.

A = [−1 0; 0 1] — Reflection in the vertical axis.

A = [0 1; 1 0] — Reflection in the line x − y = 0.

A = [1 1/2; 0 1] — Horizontal shear.

A = [1/2 0; 0 1/2] — Scaling.

A = [3 0; 0 1/3] — Squeeze.

A = [1 0; 0 0] — Vertical projection on the horizontal axis.

A = [0 −1; 0 1] — Horizontal projection on the line x + y = 0.

A = [1 0; 0 1] — Identity.
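The geometric behavior of these matrices can be seen by applying them to the standard basis vectors. A small sketch, assuming NumPy (the variable names are chosen here for illustration):

```python
import numpy as np

# A few of the matrices above.
rot90 = np.array([[0, -1], [1, 0]])        # rotation by 90 degrees
reflect_diag = np.array([[0, 1], [1, 0]])  # reflection in the line x - y = 0
proj_horiz = np.array([[1, 0], [0, 0]])    # vertical projection on the horizontal axis

e1 = np.array([1, 0])
e2 = np.array([0, 1])

# The columns of A are the images of e1 and e2:
# rot90 sends e1 to e2; the reflection swaps e1 and e2;
# the projection keeps the x-coordinate and kills the y-coordinate.
```

Reading the columns of a matrix as the images of e_1 and e_2 is the quickest way to identify the transformation.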

Eigenvalues and eigenvectors of a matrix

Definition. Let A be an n×n matrix. A number λ ∈ R is called an eigenvalue of the matrix A if Av = λv for some nonzero column vector v ∈ R^n. The vector v is called an eigenvector of A belonging to (or associated with) the eigenvalue λ.

Remarks. Alternative notation: eigenvalue = characteristic value, eigenvector = characteristic vector. The zero vector is never considered an eigenvector.

Example. A = [2 0; 0 3].

[2 0; 0 3] (1, 0)^T = (2, 0)^T = 2 (1, 0)^T,
[2 0; 0 3] (0, 2)^T = (0, 6)^T = 3 (0, 2)^T.

Hence (1, 0) is an eigenvector of A belonging to the eigenvalue 2, while (0, 2) is an eigenvector of A belonging to the eigenvalue 3.
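The example can be checked directly, and in practice `np.linalg.eig` computes all eigenvalues and eigenvectors at once. A sketch, assuming NumPy:

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])

# Verify the eigenvector relations A v = λ v from the example.
v = np.array([1.0, 0.0])   # eigenvector for λ = 2
w = np.array([0.0, 2.0])   # eigenvector for λ = 3

# eig returns the eigenvalues and a matrix whose columns are eigenvectors.
eigvals, eigvecs = np.linalg.eig(A)
```

For a diagonal matrix the eigenvalues are just the diagonal entries, so eigvals contains 2 and 3.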