Homework set 4 - Solutions


Math 407, Renato Feres

1. Exercise 4.1, page 49 of notes. Let $W := T^m_0 V$ and denote by $GL(W)$ the general linear group of $W$, defined as the group of all linear isomorphisms of $W$ onto itself. For each $\sigma$ in the symmetric group $S_m$, consisting of all permutations of $\{1, 2, \dots, m\}$, we have defined the multilinear map on $V$ given by
$$\Pi(\sigma)(v_1, \dots, v_m) = v_{\sigma^{-1}(1)} \otimes \cdots \otimes v_{\sigma^{-1}(m)}, \qquad \sigma \in S_m.$$
In this problem we regard $\Pi(\sigma)$ as a linear map on $W$, so we write
$$\Pi(\sigma)(v_1 \otimes \cdots \otimes v_m) = v_{\sigma^{-1}(1)} \otimes \cdots \otimes v_{\sigma^{-1}(m)}.$$
Show that $\Pi(\sigma) \in GL(W)$ for each $\sigma \in S_m$, that $\Pi(\sigma\sigma') = \Pi(\sigma)\Pi(\sigma')$, and that $\Pi(\sigma^{-1}) = \Pi(\sigma)^{-1}$. This means that $\Pi : S_m \to GL(W)$ is a group homomorphism, and that $\Pi$ defines a linear representation of $S_m$ with representation space $W$.

Solution. For $\sigma, \sigma' \in S_m$,
$$\begin{aligned}
\Pi(\sigma\sigma')(v_1 \otimes \cdots \otimes v_m)
&= v_{(\sigma\sigma')^{-1}(1)} \otimes \cdots \otimes v_{(\sigma\sigma')^{-1}(m)} \\
&= v_{\sigma'^{-1}(\sigma^{-1}(1))} \otimes \cdots \otimes v_{\sigma'^{-1}(\sigma^{-1}(m))} \\
&= \Pi(\sigma)\bigl(v_{\sigma'^{-1}(1)} \otimes \cdots \otimes v_{\sigma'^{-1}(m)}\bigr) \\
&= \Pi(\sigma)\Pi(\sigma')(v_1 \otimes \cdots \otimes v_m).
\end{aligned}$$
It is immediate from the definition that if $e$ is the identity element of the group, then $\Pi(e)(v_1 \otimes \cdots \otimes v_m) = v_1 \otimes \cdots \otimes v_m$, i.e. $\Pi(e) = I$. Therefore $\Pi(\sigma)\Pi(\sigma^{-1}) = \Pi(\sigma\sigma^{-1}) = \Pi(e) = I$, from which we obtain $\Pi(\sigma^{-1}) = \Pi(\sigma)^{-1}$. In particular each $\Pi(\sigma)$ is invertible, so $\Pi(\sigma) \in GL(W)$.

2. Exercise 4.2, page 50 of notes. Show that the antisymmetrizing operator $A$, defined by
$$A = \frac{1}{m!} \sum_{\sigma \in S_m} \operatorname{sign}(\sigma)\, \Pi(\sigma),$$
satisfies $A^2 = A$.
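As a sanity check on Exercise 4.1, the homomorphism property $\Pi(\sigma\sigma') = \Pi(\sigma)\Pi(\sigma')$ can be verified numerically for small $m$ and $n$ by writing each $\Pi(\sigma)$ as a permutation matrix on the standard tensor basis of $(\mathbb{R}^n)^{\otimes m}$. This is a quick numpy sketch (not part of the notes); the function name `Pi` and the 0-indexed encoding of permutations are my own conventions.

```python
import itertools
import numpy as np

m, n = 3, 2          # Pi(sigma) acts on W = (R^n) tensored with itself m times
dim = n ** m

def Pi(p):
    """Matrix of Pi(sigma) on the tensor basis: e_{i_1}x...xe_{i_m} is sent to
    e_{i_{sigma^-1(1)}}x...xe_{i_{sigma^-1(m)}}. Here p[k] = sigma(k), 0-indexed."""
    pinv = [p.index(j) for j in range(m)]            # sigma^{-1}
    M = np.zeros((dim, dim))
    for idx in itertools.product(range(n), repeat=m):
        tgt = tuple(idx[pinv[j]] for j in range(m))
        M[np.ravel_multi_index(tgt, (n,) * m),
          np.ravel_multi_index(idx, (n,) * m)] = 1.0
    return M

perms = list(itertools.permutations(range(m)))
assert np.allclose(Pi(tuple(range(m))), np.eye(dim))   # Pi(e) = I
for p in perms:
    for q in perms:
        pq = tuple(p[q[k]] for k in range(m))          # (sigma sigma')(k) = sigma(sigma'(k))
        assert np.allclose(Pi(pq), Pi(p) @ Pi(q))      # Pi(sigma sigma') = Pi(sigma) Pi(sigma')
```

Each $\Pi(\sigma)$ is a permutation matrix, so invertibility ($\Pi(\sigma) \in GL(W)$) is immediate in this model.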

Solution. Keep in mind the following observation: for any $\sigma' \in S_m$,
$$A = \frac{1}{m!} \sum_{\sigma \in S_m} \operatorname{sign}(\sigma'\sigma)\, \Pi(\sigma'\sigma),$$
since $\sigma \mapsto \sigma'\sigma$ is a bijection of $S_m$. Then
$$\begin{aligned}
A^2 &= \frac{1}{m!} \sum_{\sigma' \in S_m} \operatorname{sign}(\sigma')\,\Pi(\sigma')\; \frac{1}{m!} \sum_{\sigma \in S_m} \operatorname{sign}(\sigma)\,\Pi(\sigma) \\
&= \frac{1}{m!} \sum_{\sigma' \in S_m} \frac{1}{m!} \sum_{\sigma \in S_m} \operatorname{sign}(\sigma'\sigma)\,\Pi(\sigma'\sigma) \\
&= \frac{1}{m!} \sum_{\sigma' \in S_m} A = A.
\end{aligned}$$

3. Exercise 4.9, page 55 of notes. Show that if $\omega$ is an alternating $n$-form on the $n$-dimensional real vector space $V$ and $T : V \to V$ is any linear map, then
$$\omega(Tv_1, \dots, Tv_n) = \det(T)\,\omega(v_1, \dots, v_n)$$
for arbitrary vectors $v_1, \dots, v_n \in V$.

Solution. We may suppose that $\omega \neq 0$, since the claim is trivial otherwise. Thinking of $\omega$ as a nonzero linear map on $V \otimes \cdots \otimes V$ makes it clear that the set of tuples $(v_1, \dots, v_n)$ for which $\omega(v_1, \dots, v_n) \neq 0$ is dense: the complement of the kernel of a nonzero linear map is a dense set. So, by continuity, it suffices to prove the identity under the assumption $\omega(v_1, \dots, v_n) \neq 0$. In particular, $\alpha = \{v_1, \dots, v_n\}$ must be a basis of $V$. Let $\{v^1, \dots, v^n\}$ be the dual basis. Recall that $v^i(Tv_j)$ is the $(i,j)$-entry of the matrix $[T]$ representing $T$ in the basis $\alpha$, so the isomorphism between $\mathbb{R}^n$ and $V$ obtained from the basis $\alpha$ sends the column vector $([T]_{1j}, \dots, [T]_{nj})^t$ to $Tv_j$. This makes it clear that the function
$$F([T]) := \frac{\omega(Tv_1, \dots, Tv_n)}{\omega(v_1, \dots, v_n)}$$
is multilinear when regarded as a function of the columns of $[T]$. It is also clearly antisymmetric, and $F(I) = 1$, where $I$ is the identity matrix. But the determinant is uniquely characterized by these properties. Since the determinant of the matrix of a linear transformation is the same number regardless of the choice of basis, we must have
$$\det(T) = \frac{\omega(Tv_1, \dots, Tv_n)}{\omega(v_1, \dots, v_n)}.$$
Consequently, the claimed identity holds.

4. Exercise 4.10, page 56 of notes. If $\theta$ is a $k$-form and $\omega$ is an $l$-form, show that
$$(\theta \wedge \omega)(v_1, \dots, v_{k+l}) = \frac{1}{k!\,l!} \sum_{\sigma \in S_{k+l}} \operatorname{sign}(\sigma)\,\theta(v_{\sigma(1)}, \dots, v_{\sigma(k)})\,\omega(v_{\sigma(k+1)}, \dots, v_{\sigma(k+l)}).$$

Solution. The wedge product of forms is defined in the notes by
$$\theta \wedge \omega = \binom{k+l}{k}\,(\theta \otimes \omega) \circ A.$$
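The idempotence in Exercise 4.2 can also be checked numerically. For $m = 2$ the antisymmetrizer is simply $A = \frac{1}{2}(I - S)$, where $S$ is the matrix of the factor-swap $\Pi(\tau)$ on $\mathbb{R}^n \otimes \mathbb{R}^n$. A small numpy sketch (my own, not from the notes):

```python
import numpy as np

n = 3
I = np.eye(n * n)
# Pi(tau) for the transposition tau on R^n tensor R^n: e_i x e_j -> e_j x e_i
S = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        S[j * n + i, i * n + j] = 1.0
# antisymmetrizer for m = 2: A = (1/2!)(sign(e)Pi(e) + sign(tau)Pi(tau)) = (I - S)/2
A = 0.5 * (I - S)
assert np.allclose(A @ A, A)        # A is idempotent, as proved above
```

Since $S^2 = I$, one sees directly that $A^2 = \frac{1}{4}(I - 2S + S^2) = \frac{1}{2}(I - S) = A$, which is exactly what the assertion confirms.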

The definition of $A$ yields
$$\begin{aligned}
(\theta \wedge \omega)(v_1, \dots, v_{k+l})
&= \binom{k+l}{k} \frac{1}{(k+l)!} \sum_{\sigma \in S_{k+l}} \operatorname{sign}(\sigma)\,(\theta \otimes \omega)(v_{\sigma^{-1}(1)}, \dots, v_{\sigma^{-1}(k+l)}) \\
&= \frac{1}{k!\,l!} \sum_{\sigma \in S_{k+l}} \operatorname{sign}(\sigma)\,\theta(v_{\sigma(1)}, \dots, v_{\sigma(k)})\,\omega(v_{\sigma(k+1)}, \dots, v_{\sigma(k+l)}),
\end{aligned}$$
where we used $\binom{k+l}{k}\frac{1}{(k+l)!} = \frac{1}{k!\,l!}$ and the fact that summing over all $\sigma^{-1}$ gives the same result as summing over all $\sigma$ (and $\operatorname{sign}(\sigma^{-1}) = \operatorname{sign}(\sigma)$). Thus the stated identity holds.

5. Exercise 5.10, page 65 of notes. If $D$ is the determinant function, show that it is everywhere differentiable and that its directional derivative at $A$ in direction $W$ is
$$dD_A(W) = D(A)\,\operatorname{tr}(W A^{-1})$$
for all $A \in GL(n,\mathbb{R})$ and all $W \in M(n,\mathbb{R})$.

Solution. It is clear that $D$ is differentiable to every order at all points, since it is a polynomial function of the entries of $A$. Note that $D(A + tW) = D(I + tW A^{-1})\,D(A)$. So
$$dD_A(W) = \frac{d}{dt}\Big|_{t=0} D(A + tW) = D(A)\,\frac{d}{dt}\Big|_{t=0} D(I + tW A^{-1}).$$
Thus it suffices to prove the identity for $A = I$; that is, it suffices to show that
$$\frac{d}{dt}\Big|_{t=0} D(I + tW) = \operatorname{tr}(W).$$
Expressing the determinant explicitly as a function of the columns of the matrices, we have
$$\frac{d}{dt}\Big|_{t=0} D(I + tW) = \frac{d}{dt}\Big|_{t=0} D(e_1 + t w_1, \dots, e_n + t w_n) = \sum_{i=1}^{n} D(e_1, \dots, e_{i-1}, w_i, e_{i+1}, \dots, e_n) = \operatorname{tr}(W).$$
Thus the stated identity holds.

6. Let $M_{\mathrm{symm}}(n,\mathbb{R}) \subset M(n,\mathbb{R})$ be the space of symmetric matrices. Define $F : M(n,\mathbb{R}) \to M_{\mathrm{symm}}(n,\mathbb{R})$ by $F(A) = A^t A$.

(a) Show that for all $A, X \in M(n,\mathbb{R})$, $dF_A(X) = A^t X + X^t A$.

(b) Show that if $A$ is invertible, then $dF_A : M(n,\mathbb{R}) \to M_{\mathrm{symm}}(n,\mathbb{R})$ is surjective.

Solution. (a) We have
$$dF_A(X) = \frac{d}{dt}\Big|_{t=0} F(A + tX) = \frac{d}{dt}\Big|_{t=0} (A + tX)^t (A + tX) = \frac{d}{dt}\Big|_{t=0} \bigl(A^t A + t(A^t X + X^t A) + t^2 X^t X\bigr) = A^t X + X^t A.$$

(b) Let $A$ be invertible. We need to show that for an arbitrary symmetric matrix $S$ there exists $X$ such that $A^t X + X^t A = S$. This holds for $X = \frac{1}{2}(A^t)^{-1} S$, since
$$A^t X + X^t A = \frac{1}{2} A^t (A^t)^{-1} S + \frac{1}{2} S^t A^{-1} A = \frac{1}{2} S + \frac{1}{2} S^t = S,$$
using $S^t = S$.
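The derivative identities in Problems 5 and 6 are easy to test numerically with central differences; since $F(A) = A^t A$ is quadratic, its central difference is even exact up to rounding. A numpy sketch (my own check, not from the notes; random matrices are invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A = rng.standard_normal((n, n))    # invertible with probability 1
W = rng.standard_normal((n, n))
X = rng.standard_normal((n, n))
t = 1e-6

# Problem 5: dD_A(W) = D(A) tr(W A^{-1}), checked by central differences
dD = (np.linalg.det(A + t * W) - np.linalg.det(A - t * W)) / (2 * t)
assert np.allclose(dD, np.linalg.det(A) * np.trace(W @ np.linalg.inv(A)), atol=1e-5)

# Problem 6(a): dF_A(X) = A^t X + X^t A for F(A) = A^t A
dF = ((A + t * X).T @ (A + t * X) - (A - t * X).T @ (A - t * X)) / (2 * t)
assert np.allclose(dF, A.T @ X + X.T @ A, atol=1e-5)

# Problem 6(b): X = (1/2)(A^t)^{-1} S solves A^t X + X^t A = S
S = W + W.T                        # an arbitrary symmetric matrix
Xs = 0.5 * np.linalg.inv(A.T) @ S
assert np.allclose(A.T @ Xs + Xs.T @ A, S)
```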

7. Exercise 5.13, page 67 of notes. Spectral theorem for symmetric matrices. Let $f : U \to \mathbb{R}$ be a differentiable function defined on an open subset of $\mathbb{R}^n$ equipped with the standard inner product. The sphere of radius 1 centered at the origin is denoted $S^{n-1} = \{x \in \mathbb{R}^n : |x| = 1\}$.

(a) Show that $\operatorname{grad}_x f$ is orthogonal to the kernel of $df_x$.

(b) Show that $|\operatorname{grad}_x f|$ is the maximum rate of change of $f$ along any direction:
$$\max_{|v|=1} df_x(v) = |\operatorname{grad}_x f|,$$
and that the maximum is achieved when $v = \operatorname{grad}_x f / |\operatorname{grad}_x f|$.

(c) Let $A \in M(n,\mathbb{R})$ be a symmetric matrix and define $f : \mathbb{R}^n \to \mathbb{R}$ by $f(x) = \frac{1}{2}\langle Ax, x\rangle$. Show that if $x \in S^{n-1}$ is a point where $f$ achieves its maximum or its minimum value, then $x$ is an eigenvector of $A$.

(d) Show that there exists an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$. For this, use a finite induction starting from the existence of one eigenvector, then restricting the function to the intersection of the sphere with the subspace orthogonal to the previously obtained eigenvectors.

Solution. (a) Let $u$ be any vector in the kernel of $df_x$, so that $df_x(u) = 0$. By definition,
$$\langle \operatorname{grad}_x f, u \rangle = df_x(u) = 0,$$
proving the claim.

(b) By the Schwarz inequality, $df_x(v) = \langle \operatorname{grad}_x f, v\rangle \leq |\operatorname{grad}_x f|\,|v|$. On the other hand,
$$df_x\!\left(\frac{\operatorname{grad}_x f}{|\operatorname{grad}_x f|}\right) = \frac{\langle \operatorname{grad}_x f, \operatorname{grad}_x f\rangle}{|\operatorname{grad}_x f|} = |\operatorname{grad}_x f|.$$
This shows that the maximum of $df_x(v)$ over vectors $v$ of length 1 is the norm of the gradient, and that the maximum is attained when $v$ is the direction of the gradient.

(c) A point $x$ of maximum or minimum is a critical point of $f$ restricted to the sphere, meaning that $df_x(v) = 0$ for all $v$ tangent to $S^{n-1}$ at $x$. Note that $x$ is perpendicular to all these tangent vectors, so the gradient vector of $f$ at $x$ is parallel to $x$:
$$\operatorname{grad}_x f = \lambda x \quad \text{for some } \lambda \in \mathbb{R}.$$
On the other hand, the gradient of $f$ satisfies
$$\langle \operatorname{grad}_x f, w\rangle = df_x(w) = \frac{d}{dt}\Big|_{t=0} \frac{1}{2}\langle A(x + tw), x + tw\rangle = \frac{1}{2}\bigl(\langle Ax, w\rangle + \langle Aw, x\rangle\bigr) = \langle Ax, w\rangle,$$
using the symmetry of $A$ in the last step. Therefore the gradient of $f$ at any point $x$ is $\operatorname{grad}_x f = Ax$. Thus if $x$ is a critical point we have $Ax = \lambda x$ for some $\lambda$. But this means that $x$ is an eigenvector of $A$.
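The key computation in (c), $\operatorname{grad}_x f = Ax$ for $f(x) = \frac{1}{2}\langle Ax, x\rangle$, can be confirmed numerically with a finite-difference gradient. A small numpy sketch (my own illustration; the specific matrix is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])             # a symmetric matrix

def f(x):
    return 0.5 * x @ A @ x

x = np.array([0.6, -0.8])              # a point on the unit sphere
h = 1e-6
# central-difference gradient of f at x; exact up to rounding since f is quadratic
grad_fd = np.array([(f(x + h * e) - f(x - h * e)) / (2 * h) for e in np.eye(2)])
assert np.allclose(grad_fd, A @ x, atol=1e-6)   # grad_x f = Ax

# at an eigenvector v, the gradient Av = lambda*v is parallel to v, as in (c)
lam, V = np.linalg.eigh(A)
v = V[:, 1]                            # unit eigenvector, largest eigenvalue
assert np.allclose(A @ v, lam[1] * v)
```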

(d) Let $x$ be the eigenvector obtained in (c), with $Ax = \lambda x$, and let $V$ be the $(n-1)$-dimensional subspace of $\mathbb{R}^n$ perpendicular to $x$. Since $A$ is symmetric, for any $v \in V$,
$$0 = \lambda \langle x, v\rangle = \langle Ax, v\rangle = \langle x, Av\rangle.$$
This means that $Av$ is also perpendicular to $x$, so $V$ is invariant under $A$. We can now repeat the argument of (c) for the sphere in $V$. A simple finite induction then gives an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors of $A$.
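The inductive construction in (d) — maximize $\langle Ax, x\rangle$ on the sphere, then restrict to the orthogonal complement and repeat — can be imitated numerically. The sketch below (my own; the matrix, the shifted power iteration used as the maximizer, and the function name `next_eigenvector` are all illustrative choices, not from the notes) produces an orthonormal eigenbasis by deflation:

```python
import numpy as np

# a fixed symmetric matrix with distinct eigenvalues
A = np.array([[2.0, 1.0, 0.0, 0.0],
              [1.0, 3.0, 1.0, 0.0],
              [0.0, 1.0, 4.0, 1.0],
              [0.0, 0.0, 1.0, 5.0]])
n = len(A)

def next_eigenvector(A, found, iters=10000, seed=0):
    """Maximize <Ax, x> on the unit sphere of the subspace orthogonal to the
    already-found eigenvectors, via power iteration on A + shift*I."""
    rng = np.random.default_rng(seed)
    shift = np.linalg.norm(A, 2) + 1.0   # makes A + shift*I positive definite
    x = rng.standard_normal(n)
    for _ in range(iters):
        x = A @ x + shift * x            # power step on A + shift*I
        for v in found:                  # project back onto the complement
            x -= (v @ x) * v
        x /= np.linalg.norm(x)
    return x

found = []
for _ in range(n):                       # the finite induction of part (d)
    found.append(next_eigenvector(A, found))

Q = np.column_stack(found)                    # orthonormal eigenbasis
lams = np.array([v @ A @ v for v in found])   # Rayleigh quotients = eigenvalues
assert np.allclose(Q.T @ Q, np.eye(n), atol=1e-8)   # orthonormal
assert np.allclose(A @ Q, Q * lams, atol=1e-6)      # columns are eigenvectors
```

The first vector found maximizes $f(x) = \frac{1}{2}\langle Ax, x\rangle$ on the sphere, and each subsequent one maximizes $f$ on the sphere of the invariant complement, mirroring the proof.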