Lecture 14: Orthogonality and general vector spaces

1 Symmetric matrices

Recall the definition of the transpose $A^T$ in Lecture note 9.

Definition 1.1. If a square matrix $S$ satisfies
$$ S = S^T, \tag{1.1} $$
then we say $S$ is a symmetric matrix.

Useful properties:

1. For any square matrix $A$,
$$ S_1 = A + A^T, \tag{1.2} $$
$$ S_2 = A^T A, \tag{1.3} $$
are two symmetric matrices.

2. If $S$ is symmetric and invertible, then $S^{-1}$ is also symmetric.

Remark 1.1. If $S^T = -S$, then we say $S$ is a skew-symmetric (or antisymmetric, or antimetric) matrix.

2 Orthogonal vectors, spaces and matrices

2.1 Orthogonal vectors and spaces

Definition 2.1. The dot product of two vectors in $\mathbb{R}^n$ is defined by
$$ u \cdot v = \sum_{k=1}^{n} u_k v_k, $$
for any two vectors $u = [u_1, u_2, \ldots, u_n]^T$ and $v = [v_1, v_2, \ldots, v_n]^T$ in $\mathbb{R}^n$.

Definition 2.2. Two vectors $u$ and $v$ are called orthogonal (or perpendicular) to each other, denoted $u \perp v$, if and only if $u \cdot v = 0$.
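As a quick numerical aside (not part of the original notes), here is a minimal NumPy sketch of properties 1 and 2 above and of the orthogonality test in Definition 2.2; the matrix $A$ and the vectors $u$, $v$ are arbitrary example values.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])   # an arbitrary square matrix

S1 = A + A.T        # property 1: A + A^T is symmetric
S2 = A.T @ A        # property 1: A^T A is symmetric
print(np.allclose(S1, S1.T), np.allclose(S2, S2.T))   # True True

Sinv = np.linalg.inv(S2)                 # property 2: the inverse of a symmetric
print(np.allclose(Sinv, Sinv.T))         # (invertible) matrix is symmetric -> True

u = np.array([1.0, -2.0, 1.0])
v = np.array([2.0, 1.0, 0.0])
print(np.isclose(u @ v, 0.0))            # u . v = 0, so u is orthogonal to v -> True
```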

Definition 2.3. Let $S = \{v_1, v_2, \ldots, v_k\}$ be a basis for a vector space $V$. If the vectors in $S$ are mutually orthogonal (that is, any two of them are orthogonal), then $S$ is called an orthogonal basis for $V$.

For example, the standard basis is an orthogonal basis for $\mathbb{R}^n$:
$$ \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \quad \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix}, \quad \ldots, \quad \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix}. $$

Definition 2.4. Two vector spaces $V$ and $U$ are called orthogonal (or perpendicular) to each other, denoted $V \perp U$, if and only if
$$ u \cdot v = 0 $$
for any $u \in U$ and $v \in V$.

Remark 2.1. Suppose the basis sets for the vector spaces $V$ and $U$ are, respectively,
$$ V = \mathrm{span}(S_v), \qquad U = \mathrm{span}(S_u). $$
Then $V \perp U$ is equivalent to $S_v \perp S_u$, i.e., every vector in $S_v$ is orthogonal to every vector in $S_u$.

For example, any two of the following subspaces of $\mathbb{R}^3$ are perpendicular/orthogonal to each other:
$$ X = \mathrm{span}\left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} \right\}, \quad Y = \mathrm{span}\left\{ \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \right\}, \quad Z = \mathrm{span}\left\{ \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} \right\}, $$
$$ X \perp Y, \quad X \perp Z, \quad Y \perp Z. $$

Another example:
$$ U = \mathrm{span}\left\{ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \right\}, \quad Z = \mathrm{span}\left\{ \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix} \right\}, \qquad U \perp Z. $$
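A minimal NumPy sketch of Remark 2.1 (this illustration is not from the notes): to check $U \perp Z$ it is enough to check that every spanning vector of $U$ is orthogonal to every spanning vector of $Z$. The spanning sets below are the ones from the example above.

```python
import numpy as np

# spanning sets for U (the xy-plane) and Z (the z-axis) in R^3, as in the example above
S_u = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
S_z = [np.array([0.0, 0.0, 1.0])]

def spans_are_orthogonal(S1, S2, tol=1e-12):
    """span(S1) and span(S2) are orthogonal iff every pair of spanning
    vectors has zero dot product (Remark 2.1)."""
    return all(abs(u @ v) < tol for u in S1 for v in S2)

print(spans_are_orthogonal(S_u, S_z))   # True: U is perpendicular to Z
```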

Proposition 2.1 (orthogonal relation between row space and null space). Given a matrix $A$, its null space $\mathrm{Null}(A)$ and row space $\mathrm{Row}(A)$ are perpendicular/orthogonal to each other.

Proof. Let the $j$-th row of the matrix $A_{m\times n}$ be $r_j = [a_{j1}, a_{j2}, \ldots, a_{jn}]$, i.e.,
$$ A_{m\times n} = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix} = \begin{pmatrix} r_1 \\ r_2 \\ \vdots \\ r_m \end{pmatrix}. $$
Then $Ax = 0$ is equivalent to the following:
$$ r_1 \cdot x = 0, \quad r_2 \cdot x = 0, \quad \ldots, \quad r_m \cdot x = 0, $$
and hence, for any scalars $c_1, c_2, \ldots, c_m$,
$$ (c_1 r_1 + c_2 r_2 + \cdots + c_m r_m) \cdot x = 0. $$
It implies that for any $r \in \mathrm{Row}(A)$ and $x \in \mathrm{Null}(A)$, we have
$$ r \cdot x = 0. $$
It follows that $\mathrm{Null}(A) \perp \mathrm{Row}(A)$.

Remark 2.2. For any matrix $A$, its left null space and column space are also perpendicular/orthogonal to each other, i.e.,
$$ \mathrm{Null}(A^T) \perp \mathrm{Col}(A). $$
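A small NumPy sketch of Proposition 2.1 (not from the original notes): for a concrete matrix we take a vector in the null space and verify that it is orthogonal to every row, and hence to the whole row space. The matrix $A$ below is an arbitrary example.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])          # rank-1 example, so Null(A) is 2-dimensional

x = np.array([-2.0, 1.0, 0.0])           # Ax = 0, so x is in Null(A)
print(np.allclose(A @ x, 0.0))           # True

# x is orthogonal to each row, hence to every linear combination of the rows
print(all(np.isclose(row @ x, 0.0) for row in A))   # True
```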

2.2 Orthogonal matrices

Definition 2.5. If the matrix $Q$ satisfies
$$ Q^{-1} = Q^T, \tag{2.1} $$
then we say $Q$ is an orthogonal matrix.

Consider the columns of an orthogonal matrix $Q = [q_1, q_2]$. Then
$$ Q^T Q = \begin{pmatrix} q_1^T \\ q_2^T \end{pmatrix} \begin{pmatrix} q_1 & q_2 \end{pmatrix} = \begin{pmatrix} q_1^T q_1 & q_1^T q_2 \\ q_2^T q_1 & q_2^T q_2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}. \tag{2.2} $$
It implies that the columns of $Q$, i.e., $\{q_1, q_2\}$, are mutually orthogonal (and each has unit length). By using $QQ^T = I$, we can also show that the rows of $Q$ are mutually orthogonal. That is why we have Definition 2.5.

Several 2-by-2 examples of orthogonal matrices:
$$ I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad Q = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, \quad R = \begin{pmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{pmatrix}. \tag{2.3} $$

Remark 2.3. Please check the equality (2.1) for $P$, $Q$, $R$ defined by (2.3).

Useful properties: Let $Q$ be an orthogonal matrix.

1. The solution to the linear system $Qx = b$ is $x = Q^T b$.

2. For any column vector $x \in \mathbb{R}^n$, we have $\|Qx\| = \|x\|$, where $\|x\| = \sqrt{x_1^2 + x_2^2 + \cdots + x_n^2}$.

3. If both $Q_1$ and $Q_2$ are orthogonal, then $Q_1 Q_2$ is also orthogonal.
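A minimal NumPy check of these properties (not part of the notes); the angle and the vectors $b$, $v$ are arbitrary example values.

```python
import numpy as np

theta = 0.7                               # arbitrary angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # the rotation matrix from (2.3)

print(np.allclose(Q.T @ Q, np.eye(2)))    # Q^T Q = I, so Q is orthogonal

b = np.array([1.0, 2.0])
x = Q.T @ b                               # property 1: solution of Qx = b is x = Q^T b
print(np.allclose(Q @ x, b))              # True

v = np.array([3.0, -4.0])
print(np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v)))   # property 2: length preserved

P = np.array([[0.0, 1.0], [1.0, 0.0]])    # another orthogonal matrix (coordinate swap)
prod = Q @ P
print(np.allclose(prod.T @ prod, np.eye(2)))   # property 3: the product is orthogonal
```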

3 More general vector spaces (other than $\mathbb{R}^n$)

Matrix spaces. Consider the space
$$ M_2 = \{\text{all } 2\text{-by-}2 \text{ matrices}\} = \left\{ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} : \ a, b, c, d \in \mathbb{R} \right\}. $$

1. Zero vector: the zero matrix $\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$.

2. Basis set: $\left\{ \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} \right\}$.

3. Dimension: $\dim(M_2) = 4$.

4. Some useful subspaces of $M_2$:

(a) Diagonal matrix space.
$$ D_2 = \left\{ A = \begin{pmatrix} a & 0 \\ 0 & d \end{pmatrix} : \ a, d \in \mathbb{R} \right\} = \mathrm{span}\left\{ \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} \right\}. $$

(b) Symmetric matrix space.
$$ S_2 = \left\{ A \in M_2 : A = A^T \right\} = \left\{ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} : \ b = c \right\} = \mathrm{span}\left\{ \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix} \right\}. $$

(c) Zero trace matrix space.
$$ T_2 = \left\{ A \in M_2 : \mathrm{trace}(A) = 0 \right\} = \left\{ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} : \ a + d = 0 \right\} = \mathrm{span}\left\{ \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}, \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} \right\}. $$

See Examples 1, 2 in Chapter 4.7 of your textbook.

Polynomial spaces. Consider the space
$$ P_2 = \{\text{all polynomials of degree} \le 2\} = \left\{ p(x) = a + bx + cx^2 : \ a, b, c \in \mathbb{R} \right\}. $$

1. Zero vector: $p(x) = 0$.

2. Basis set: $\{1, x, x^2\}$.

3. Dimension: $\dim(P_2) = 3$.

4. Subspaces: $P_1 = \{p(x) = a + bx : \ a, b \in \mathbb{R}\}$ and $P_0 = \{p(x) = a : \ a \in \mathbb{R}\}$.

See Example 4 in Chapter 4.4 and Examples 4, 5 in Chapter 4.7 of your textbook.
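A small sketch (not from the notes) of how these spaces can be handled concretely: relative to the bases above, a 2-by-2 matrix is identified with its coordinate vector in $\mathbb{R}^4$ and a polynomial in $P_2$ with its coefficient vector in $\mathbb{R}^3$, which is the idea behind Remark 3.1 below. The helper functions and the sample matrix are illustrative choices, not from the lecture.

```python
import numpy as np

def matrix_coords(A):
    """Coordinates of a 2x2 matrix in R^4 relative to the basis
    {E11, E12, E21, E22} listed above."""
    return np.array([A[0, 0], A[0, 1], A[1, 0], A[1, 1]])

def poly_coords(a, b, c):
    """Coordinates of p(x) = a + b*x + c*x^2 in R^3 relative to {1, x, x^2}."""
    return np.array([a, b, c])

A = np.array([[2.0, 3.0], [3.0, -2.0]])
print(matrix_coords(A))                                      # [ 2.  3.  3. -2.]
print(np.allclose(A, A.T), np.isclose(np.trace(A), 0.0))     # A lies in both S_2 and T_2

print(poly_coords(1.0, 0.0, -4.0))                           # coordinates of p(x) = 1 - 4x^2
```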

Remark 3.1. Actually, the matrix space $M_2$ is isomorphic to $\mathbb{R}^4$ and the polynomial space $P_2$ is isomorphic to $\mathbb{R}^3$.

Solution spaces of linear differential equations.

First order differential equations with zero right-hand side:
$$ V_1 = \{\text{all solutions to } y' + y = 0\} = \left\{ y(t) = c\,e^{-t} : \ c \in \mathbb{R} \right\} = \mathrm{span}\{ e^{-t} \}. $$
It implies that $\dim(V_1) = 1$.

Second order differential equations with zero right-hand side:
$$ V_2 = \{\text{all solutions to } y'' = 0\} = \{ y(t) = c_1 t + c_2 : \ c_1, c_2 \in \mathbb{R} \} = \mathrm{span}\{ t, \ 1 \}; $$
$$ V_3 = \{\text{all solutions to } y'' + y = 0\} = \{ y(t) = c_1 \sin t + c_2 \cos t : \ c_1, c_2 \in \mathbb{R} \} = \mathrm{span}\{ \sin t, \ \cos t \}; $$
$$ V_4 = \{\text{all solutions to } y'' - y = 0\} = \left\{ y(t) = c_1 e^{t} + c_2 e^{-t} : \ c_1, c_2 \in \mathbb{R} \right\} = \mathrm{span}\{ e^{t}, \ e^{-t} \}. $$
It implies that $\dim(V_2) = \dim(V_3) = \dim(V_4) = 2$.

In Chapter 5, we will learn how to find the linearly independent solutions of the second order linear differential equation
$$ Ay'' + By' + Cy = 0. $$
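As an illustration that is not part of the notes, a short SymPy sketch verifies that the spanning functions listed above really solve the corresponding equations, and that the general solution of $y'' + y = 0$ has two free constants, matching $\dim(V_3) = 2$.

```python
import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')

# spanning (basis) solutions listed above, paired with their differential equations
checks = [
    (sp.exp(-t), lambda f: f.diff(t) + f),        # y' + y = 0
    (t,          lambda f: f.diff(t, 2)),         # y'' = 0
    (sp.sin(t),  lambda f: f.diff(t, 2) + f),     # y'' + y = 0
    (sp.cos(t),  lambda f: f.diff(t, 2) + f),     # y'' + y = 0
    (sp.exp(t),  lambda f: f.diff(t, 2) - f),     # y'' - y = 0
    (sp.exp(-t), lambda f: f.diff(t, 2) - f),     # y'' - y = 0
]
print(all(sp.simplify(ode(f)) == 0 for f, ode in checks))   # True

# dsolve returns a two-parameter general solution, so dim(V_3) = 2
print(sp.dsolve(sp.Eq(y(t).diff(t, 2) + y(t), 0), y(t)))    # y(t) = C1*sin(t) + C2*cos(t)
```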

4 Summary of square matrices

Suppose $A$ is an $n \times n$ square matrix. The following statements are equivalent to each other:

1. $A$ is nonsingular;

2. $A$ is invertible (i.e., there exists $A^{-1}$ such that $AA^{-1} = A^{-1}A = I_n$);

3. $\det(A) \neq 0$;

4. There are no zero rows after Gaussian elimination;

5. $A$ is row equivalent to $I_n$;

6. The nonhomogeneous equation $Ax = b$ has a unique solution for any right-hand side $b$;

7. The homogeneous equation $Ax = 0$ has only the trivial solution (zero solution);

8. The null space of $A$ is $\{0\}$;

9. The columns (rows) of $A$ are linearly independent (i.e., they form a basis for $\mathbb{R}^n$) and $\mathrm{Col}(A) = \mathrm{Row}(A) = \mathbb{R}^n$;

10. $\mathrm{Rank}(A) = \dim(\mathrm{Col}(A)) = \dim(\mathrm{Row}(A)) = n$;

11. $A^T$ is also nonsingular (invertible).

Similarly, the following statements are also equivalent to each other:

1. $A$ is singular;

2. $A$ is non-invertible;

3. $\det(A) = 0$;

4. There is at least one zero row after Gaussian elimination;

5. $A$ is not row equivalent to $I_n$;

6. The nonhomogeneous equation $Ax = b$ has either infinitely many solutions or no solution;

7. The homogeneous equation $Ax = 0$ has nontrivial solutions;

8. There exists $x \neq 0$ such that $x \in \mathrm{Null}(A)$;

9. The columns (rows) of $A$ are linearly dependent;

10. $\mathrm{Rank}(A) = \dim(\mathrm{Col}(A)) = \dim(\mathrm{Row}(A)) < n$;

11. $A^T$ is also singular (non-invertible).
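A minimal NumPy sketch (not part of the notes) that checks several of the equivalent conditions above for one nonsingular and one singular example matrix; the two matrices are arbitrary illustrations.

```python
import numpy as np

def summarize(A):
    """Check several equivalent invertibility conditions for a square matrix A."""
    n = A.shape[0]
    det = np.linalg.det(A)
    rank = np.linalg.matrix_rank(A)
    print(f"det = {det:.3g}, rank = {rank} (n = {n}), "
          f"det != 0: {not np.isclose(det, 0.0)}, rank == n: {rank == n}")

summarize(np.array([[1.0, 2.0],
                    [3.0, 4.0]]))     # det = -2, rank 2  -> nonsingular
summarize(np.array([[1.0, 2.0],
                    [2.0, 4.0]]))     # det = 0,  rank 1  -> singular
```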