Math 113 Solutions: Homework 8
November 28, 2007
3) Define $c_j = \sqrt{j}\,a_j$ and $d_j = \frac{1}{\sqrt{j}}\,b_j$ for $j = 1, 2, \dots, n$. Let $c = (c_1, c_2, \dots, c_n)$ and $d = (d_1, d_2, \dots, d_n)$ be vectors in $\mathbf{R}^n$. Then the Cauchy-Schwarz inequality on Euclidean $n$-space gives
$$\Big(\sum_{j=1}^{n} c_j d_j\Big)^2 \le \Big(\sum_{j=1}^{n} c_j^2\Big)\Big(\sum_{j=1}^{n} d_j^2\Big).$$
On writing $c_j$ and $d_j$ in terms of $a_j$ and $b_j$ in the above equation, we have, as desired,
$$\Big(\sum_{j=1}^{n} a_j b_j\Big)^2 \le \Big(\sum_{j=1}^{n} j a_j^2\Big)\Big(\sum_{j=1}^{n} \frac{b_j^2}{j}\Big).$$

5) No such inner product exists. We prove this by showing that the parallelogram equality is violated for the vectors $u = (1, 0)$ and $v = (0, 1)$ (the norm in question is $\|(x, y)\| = |x| + |y|$). We have $\|u + v\| = \|(1, 1)\| = 2$ and $\|u - v\| = \|(1, -1)\| = 2$, while $\|u\| = \|v\| = 1$. Thus
$$\|u + v\|^2 + \|u - v\|^2 = 8 \neq 2\big(\|u\|^2 + \|v\|^2\big) = 4.$$

6) We have $\|u + v\|^2 - \|u - v\|^2 = \langle u + v, u + v\rangle - \langle u - v, u - v\rangle$, which, after expanding, yields
$$\big(\langle u, u\rangle + \langle u, v\rangle + \langle v, u\rangle + \langle v, v\rangle\big) - \big(\langle u, u\rangle - \langle u, v\rangle - \langle v, u\rangle + \langle v, v\rangle\big).$$
Collecting terms, and using that the inner product is real (so $\langle v, u\rangle = \langle u, v\rangle$), we have $\|u + v\|^2 - \|u - v\|^2 = 2\langle u, v\rangle + 2\langle v, u\rangle = 4\langle u, v\rangle$, and, on dividing by 4, we have the result.

7) Expanding the right-hand side,
$$\|u + i^p v\|^2 = \langle u + i^p v, u + i^p v\rangle = \langle u, u\rangle + \langle u, i^p v\rangle + \langle i^p v, u\rangle + \langle i^p v, i^p v\rangle.$$
Now $\langle u, i^p v\rangle = \overline{i^p}\,\langle u, v\rangle$ and $\langle i^p v, u\rangle = i^p \langle v, u\rangle$, while $\langle i^p v, i^p v\rangle = \langle v, v\rangle$. Our sum can thus be split:
$$\sum_{p=1}^{4} i^p \|u + i^p v\|^2 = \langle u, u\rangle \sum_{p=1}^{4} i^p + \langle u, v\rangle \sum_{p=1}^{4} i^p\,\overline{i^p} + \langle v, u\rangle \sum_{p=1}^{4} i^{2p} + \langle v, v\rangle \sum_{p=1}^{4} i^p = 4\langle u, v\rangle,$$
since $i^p\,\overline{i^p} = 1$ while $\sum_{p=1}^{4} i^p = \sum_{p=1}^{4} i^{2p} = 0$. On dividing by 4 we have the result.
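As a quick numerical sanity check (not part of the assigned solution), the polarization identity of problem 7 can be verified for random complex vectors with NumPy, using Axler's convention that $\langle u, v\rangle$ is linear in the first slot and conjugate-linear in the second:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5) + 1j * rng.standard_normal(5)
v = rng.standard_normal(5) + 1j * rng.standard_normal(5)

# <u, v> linear in the first slot, conjugate-linear in the second:
inner = np.sum(u * np.conj(v))

# Right-hand side of problem 7: (1/4) * sum_{p=1}^{4} i^p ||u + i^p v||^2
polar = sum(1j**p * np.linalg.norm(u + 1j**p * v)**2 for p in range(1, 5)) / 4

print(abs(polar - inner))  # difference is numerical round-off only
```

The two sides agree to machine precision, as the algebra above predicts.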
9) We have
$$\int_{-\pi}^{\pi} \Big(\frac{1}{\sqrt{2\pi}}\Big)^2 dx = \frac{1}{2\pi}\cdot 2\pi = 1,$$
so $\big\|\frac{1}{\sqrt{2\pi}}\big\| = 1$. Also, $\sin k\big(x + \frac{\pi}{2k}\big) = \cos kx$, so
$$\int_{-\pi}^{\pi} \sin^2 k\Big(x + \frac{\pi}{2k}\Big)\, dx = \int_{-\pi + \frac{\pi}{2k}}^{\pi + \frac{\pi}{2k}} \sin^2 kx\, dx = \int_{-\pi}^{\pi} \sin^2 kx\, dx,$$
where the last equality holds because $\sin^2 kx$ is $2\pi$-periodic (actually the period is $\frac{\pi}{k}$). Thus
$$\Big\|\frac{\cos kx}{\sqrt{\pi}}\Big\|^2 = \Big\|\frac{\sin kx}{\sqrt{\pi}}\Big\|^2.$$
But $\cos^2 kx + \sin^2 kx = 1$, so
$$2 = \int_{-\pi}^{\pi} \Big[\Big(\frac{\cos kx}{\sqrt{\pi}}\Big)^2 + \Big(\frac{\sin kx}{\sqrt{\pi}}\Big)^2\Big]\, dx,$$
and we get
$$\Big\|\frac{\cos kx}{\sqrt{\pi}}\Big\|^2 = \Big\|\frac{\sin kx}{\sqrt{\pi}}\Big\|^2 = 1.$$
Together, our calculations show that $\frac{\sin kx}{\sqrt{\pi}}$ and $\frac{\cos kx}{\sqrt{\pi}}$ have norm 1 for $k = 1, 2, \dots, n$.

It remains to show that pairs of distinct functions in the list are orthogonal. To prove orthogonality with the constant function, observe that $\int_{-\pi}^{\pi} \sin kx\, dx = 0$ since $\sin$ is odd. Since $\cos kx$ is just a phase shift of $\sin kx$, this proves that $\int_{-\pi}^{\pi} \cos kx\, dx = 0$ as well. The same reasoning shows that $\int_{-\pi}^{\pi} \sin ax \cos bx\, dx = 0$, since this integrand is odd as well. The remaining integrals to consider have the form $\int \sin ax \sin bx$ and $\int \cos ax \cos bx$ for $a \neq b$. But now
$$\cos ax \cos bx + \sin ax \sin bx = \cos(ax - bx), \qquad \cos ax \cos bx - \sin ax \sin bx = \cos(ax + bx).$$
For $a \neq b$ we have both $\int_{-\pi}^{\pi} \cos(ax - bx)\, dx = 0$ and $\int_{-\pi}^{\pi} \cos(ax + bx)\, dx = 0$, so that
$$\int_{-\pi}^{\pi} \cos ax \cos bx\, dx = \frac{1}{2}\int_{-\pi}^{\pi} \big(\cos(ax + bx) + \cos(ax - bx)\big)\, dx = 0,$$
$$\int_{-\pi}^{\pi} \sin ax \sin bx\, dx = \frac{1}{2}\int_{-\pi}^{\pi} \big(\cos(ax - bx) - \cos(ax + bx)\big)\, dx = 0.$$
Thus the entire list is orthonormal.

10) Here the inner product is $\langle p, q\rangle = \int_0^1 p(x)q(x)\, dx$. We can put $e_1 = 1$ since this has norm 1. Then
$$x - \langle x, e_1\rangle e_1 = x - \int_0^1 x\, dx = x - \tfrac{1}{2}.$$
This has squared norm $\|x - \tfrac{1}{2}\|^2 = \int_0^1 (x - \tfrac{1}{2})^2\, dx = \tfrac{1}{12}$, so we put $e_2 = \sqrt{12}\,(x - \tfrac{1}{2})$. Finally,
$$x^2 - \langle x^2, e_1\rangle e_1 - \langle x^2, e_2\rangle e_2 = x^2 - \int_0^1 x^2\, dx - \Big(\int_0^1 \sqrt{12}\, x^2\big(x - \tfrac{1}{2}\big)\, dx\Big)\sqrt{12}\,\big(x - \tfrac{1}{2}\big)$$
$$= x^2 - \tfrac{1}{3} - 12\big(\tfrac{1}{4} - \tfrac{1}{6}\big)\big(x - \tfrac{1}{2}\big) = x^2 - \tfrac{1}{3} - \big(x - \tfrac{1}{2}\big) = x^2 - x + \tfrac{1}{6}.$$
This has squared norm
$$\|x^2 - x + \tfrac{1}{6}\|^2 = \int_0^1 \big(x^4 - 2x^3 + \tfrac{4}{3}x^2 - \tfrac{1}{3}x + \tfrac{1}{36}\big)\, dx = \tfrac{1}{180}.$$
Thus $e_3 = \sqrt{180}\,(x^2 - x + \tfrac{1}{6}) = 6\sqrt{5}\,(x^2 - x + \tfrac{1}{6})$.

17) Given $P^2 = P$ and that each vector of $\operatorname{null} P$ is orthogonal to each vector of $\operatorname{range} P$, we show that $P$ is the orthogonal projection onto $\operatorname{range} P$. Given $v \in V$ we can write $v = u + w$ with $u \in \operatorname{range} P$, $w \in (\operatorname{range} P)^\perp$. We need to show that $Pv = u$. We know that $Pv = Pu + Pw$. Observe that $u \in \operatorname{range} P$ implies there exists $x$ with $Px = u$. Then $Pu = P^2 x = Px = u$. We claim that $(\operatorname{range} P)^\perp = \operatorname{null} P$, so that $Pw = 0$. Indeed, the hypothesis that each vector of $\operatorname{null} P$ is orthogonal to each vector of $\operatorname{range} P$ implies that $\operatorname{null} P \subseteq (\operatorname{range} P)^\perp$. But $\dim \operatorname{null} P = \dim V - \dim \operatorname{range} P$ by Rank-Nullity, and $\dim (\operatorname{range} P)^\perp = \dim V - \dim \operatorname{range} P$ since $V = \operatorname{range} P \oplus (\operatorname{range} P)^\perp$. Thus $\operatorname{null} P$ and $(\operatorname{range} P)^\perp$ have the same dimension, so they are actually equal. This completes our proof, since we've shown $Pv = Pu + Pw = u + 0 = u$.

20) Suppose $U$ and $U^\perp$ are invariant under $T$. Write $v \in V$ as $v = u + w$ with $u \in U$, $w \in U^\perp$. Then $P_U T v = P_U(Tu + Tw)$, and $Tu \in U$, $Tw \in U^\perp$ implies $P_U(Tu + Tw) = Tu$. Meanwhile, $T P_U(u + w) = Tu$, so $T P_U v = P_U T v$ for every $v \in V$.

Now suppose that $T P_U = P_U T$. Take $u \in U$. Then $Tu = T P_U u = P_U T u$, which is an element of $U$ since $\operatorname{range} P_U = U$. Thus $Tu$ is contained in $U$, so $U$ is $T$-invariant. Now take $w \in U^\perp$. We have $P_U T w = T P_U w = T0 = 0$. It follows that when we write $Tw = u' + w'$ with $u' \in U$, $w' \in U^\perp$, we have $u' = 0$. Thus $Tw \in U^\perp$, so $U^\perp$ is also $T$-invariant.

22) Since we are given that $p(0) = p'(0) = 0$, $p$ is a polynomial without constant or linear term. This condition is satisfied if and only if $p$ lies in the subspace of $\mathcal{P}_3(\mathbf{R})$ spanned by $(x^2, x^3)$. The quantity to be minimized is $\|2 + 3x - p(x)\|^2$, where the norm is the one inherited from the inner product $\langle\,,\rangle$ on $\mathcal{P}_3(\mathbf{R})$ given by $\langle P, Q\rangle = \int_0^1 P(x)Q(x)\, dx$. By Proposition 6.36, this minimum is achieved exactly when $p(x)$ is the orthogonal projection of $2 + 3x$ onto the span of $(x^2, x^3)$.

To calculate this projection, we first find an orthonormal basis for $\operatorname{span}(x^2, x^3)$ with respect to $\langle\,,\rangle$. We have $\|x^2\|^2 = \int_0^1 x^4\, dx = \tfrac{1}{5}$, so we choose $e_1 = \sqrt{5}\, x^2$. Then
$$x^3 - \langle x^3, e_1\rangle e_1 = x^3 - 5x^2 \int_0^1 x^5\, dx = x^3 - \tfrac{5}{6}x^2.$$
This has squared norm
$$\|x^3 - \tfrac{5}{6}x^2\|^2 = \int_0^1 \big(x^6 - \tfrac{5}{3}x^5 + \tfrac{25}{36}x^4\big)\, dx = \tfrac{1}{252}.$$
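The Gram-Schmidt computation of problem 10 above can be double-checked numerically (an illustration only, not part of the written solution): the claimed output $e_1 = 1$, $e_2 = \sqrt{12}(x - \tfrac12)$, $e_3 = 6\sqrt5(x^2 - x + \tfrac16)$ should have an identity Gram matrix for $\langle p, q\rangle = \int_0^1 p\,q\,dx$.

```python
import numpy as np
from math import sqrt

# The three orthonormal polynomials computed above.
e = [lambda x: np.ones_like(x),
     lambda x: sqrt(12) * (x - 0.5),
     lambda x: 6 * sqrt(5) * (x**2 - x + 1/6)]

# Gauss-Legendre quadrature mapped from [-1, 1] to [0, 1]; with 10 nodes it is
# exact for polynomials of this degree.
nodes, weights = np.polynomial.legendre.leggauss(10)
x = 0.5 * (nodes + 1.0)
w = 0.5 * weights

gram = np.array([[np.sum(w * ei(x) * ej(x)) for ej in e] for ei in e])
print(np.round(gram, 10))  # should be the 3x3 identity matrix
```

The same check, with the weight $\int_0^1$ unchanged, applies verbatim to the basis $(\sqrt5\,x^2,\ 6\sqrt7(x^3 - \tfrac56 x^2))$ of problem 22.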
Thus we set $e_2 = \sqrt{252}\,(x^3 - \tfrac{5}{6}x^2) = 6\sqrt{7}\,(x^3 - \tfrac{5}{6}x^2)$. The best choice for $p(x)$ is then given by $p(x) = \langle 2 + 3x, e_1\rangle e_1 + \langle 2 + 3x, e_2\rangle e_2$, i.e.,
$$p(x) = \Big(\int_0^1 \sqrt{5}\,(2 + 3x)x^2\, dx\Big)\sqrt{5}\,x^2 + \Big(\int_0^1 6\sqrt{7}\,(2 + 3x)\big(x^3 - \tfrac{5}{6}x^2\big)\, dx\Big)\, 6\sqrt{7}\,\big(x^3 - \tfrac{5}{6}x^2\big)$$
$$= \tfrac{85}{12}x^2 - \tfrac{203}{10}\big(x^3 - \tfrac{5}{6}x^2\big) = -\tfrac{203}{10}x^3 + 24x^2.$$

Extra Problem 1: We first show that $V$ is a vector space. Given $(u_1, w_1), (u_2, w_2) \in V$ we have $(u_1, w_1) + (u_2, w_2) = (u_1 + u_2, w_1 + w_2) = (u_2 + u_1, w_2 + w_1)$, since addition in both $U$ and $W$ is commutative. But then $(u_1, w_1) + (u_2, w_2) = (u_2 + u_1, w_2 + w_1) = (u_2, w_2) + (u_1, w_1)$, so addition in $V$ is commutative. Given a third vector $(u_3, w_3) \in V$, then
$$\big((u_1, w_1) + (u_2, w_2)\big) + (u_3, w_3) = (u_1 + u_2, w_1 + w_2) + (u_3, w_3) = \big((u_1 + u_2) + u_3, (w_1 + w_2) + w_3\big)$$
$$= \big(u_1 + (u_2 + u_3), w_1 + (w_2 + w_3)\big) = (u_1, w_1) + (u_2 + u_3, w_2 + w_3) = (u_1, w_1) + \big((u_2, w_2) + (u_3, w_3)\big)$$
by invoking the associativity of addition in both $U$ and $W$. This proves addition on $V$ is associative. To show scalar multiplication on $V$ is associative, take scalars $a, b$ and $(u, w) \in V$. Then $(ab)(u, w) = (abu, abw) = a(bu, bw) = a(b(u, w))$.

The additive identity is $(0, 0)$, where the first $0$ is the identity in $U$ and the second is the identity in $W$. Indeed, $(u_1, w_1) + (0, 0) = (u_1 + 0, w_1 + 0) = (u_1, w_1)$. For the additive inverse to $(u_1, w_1)$ we may choose $(-u_1, -w_1)$, where $-u_1$ is the additive inverse of $u_1$ in $U$ and $-w_1$ is the additive inverse of $w_1$ in $W$. We have $(u_1, w_1) + (-u_1, -w_1) = (u_1 - u_1, w_1 - w_1) = (0, 0)$, which is the additive identity we chose for $V$. We calculate $1(u_1, w_1) = (1 u_1, 1 w_1) = (u_1, w_1)$, so $1$ is the multiplicative identity. Also
$$a\big((u_1, w_1) + (u_2, w_2)\big) = a(u_1 + u_2, w_1 + w_2) = \big(a(u_1 + u_2), a(w_1 + w_2)\big) = (au_1 + au_2, aw_1 + aw_2) = (au_1, aw_1) + (au_2, aw_2) = a(u_1, w_1) + a(u_2, w_2).$$
Similarly,
$$(a + b)(u_1, w_1) = \big((a + b)u_1, (a + b)w_1\big) = (au_1 + bu_1, aw_1 + bw_1) = (au_1, aw_1) + (bu_1, bw_1) = a(u_1, w_1) + b(u_1, w_1),$$
so the distributive properties hold for $V$. We have used in these two calculations the distributive properties of $U$ and $W$. Together, the above calculations show that $V$ is a vector space.

To show that $((u_1, 0), \dots, (u_m, 0), (0, w_1), \dots, (0, w_k))$ forms a basis for $V$, we first check that it spans $V$. Take $(u, w) \in V$. Then we may write $u = a_1 u_1 + \dots + a_m u_m$ for some $a_1, \dots, a_m$, since the $u_i$ span $U$. Similarly we may write $w = b_1 w_1 + \dots + b_k w_k$, since the $w_j$ span $W$. But then
$$a_1(u_1, 0) + \dots + a_m(u_m, 0) + b_1(0, w_1) + \dots + b_k(0, w_k) = (a_1 u_1 + \dots + a_m u_m,\ b_1 w_1 + \dots + b_k w_k) = (u, w).$$
This proves that $((u_1, 0), \dots, (0, w_k))$ spans $V$. To check that the list is linearly independent, suppose that there exist constants $c_1, \dots, c_m, d_1, \dots, d_k$ so that $c_1(u_1, 0) + \dots + c_m(u_m, 0) + d_1(0, w_1) + \dots + d_k(0, w_k) = (0, 0)$. Then $(c_1 u_1 + \dots + c_m u_m,\ d_1 w_1 + \dots + d_k w_k) = (0, 0)$, and in particular $c_1 u_1 + \dots + c_m u_m = 0$ in $U$ and $d_1 w_1 + \dots + d_k w_k = 0$ in $W$. But then the linear independence of the $u_i$ and of the $w_j$ implies $c_1 = \dots = c_m = d_1 = \dots = d_k = 0$. This proves that $((u_1, 0), \dots, (0, w_k))$ is independent. Thus it forms a basis.

To prove that $\iota \in \mathcal{L}(V, X)$ is an isomorphism we first show that it is linear, and then show that it is a bijection. Take a scalar $a$ and two vectors $(u, w)$ and $(u', w')$ in $V$. Then
$$\iota\big(a(u, w) + (u', w')\big) = \iota(au + u', aw + w') = au + u' + aw + w' = a(u + w) + (u' + w') = a\,\iota(u, w) + \iota(u', w').$$
This proves that $\iota$ is linear. To show that $\iota$ is a bijection, observe that $X$ is the set of vectors $\{z \in Z : z = u + w,\ u \in U,\ w \in W\}$, so $\iota$ is a genuine map from $V$ to $X$ (that is, its image is indeed contained in $X$). Moreover, it is onto, since for $x \in X$ we have $x = u + w$ for some $u \in U$, $w \in W$, and then $x = \iota(u, w)$. Furthermore, it is injective, since $\iota(u_1, w_1) = \iota(u_2, w_2)$ implies $u_1 + w_1 = u_2 + w_2$, or $u_1 - u_2 = w_2 - w_1$. The left side here is a member of $U$ and the right side of $W$, so each lies in the intersection. Since $U \cap W = \{0\}$ we have $u_1 - u_2 = w_2 - w_1 = 0$, that is, $u_1 = u_2$ and $w_1 = w_2$, so $(u_1, w_1) = (u_2, w_2)$.

Extra Problem 2: (1) To check that $W$ has commutative and associative addition, an additive identity, and additive inverses is the same set of calculations as in Problem 1. We check that $1 = 1 + 0i$ is the multiplicative identity on $W$. We have $(1 + 0i)(u, w) = 1(u, w) + 0(-w, u) = (u, w) + (0, 0) = (u, w)$, where this calculation uses the fact that $1$ is the multiplicative identity on $V \times V$ and that $0v = 0$ for all $v \in V$.

Next we show that scalar multiplication by complex numbers is associative. For real numbers $a, b, a', b'$ and $(u, v) \in W$ we have
$$\big((a + bi)(a' + b'i)\big)(u, v) = \big((aa' - bb') + (ab' + a'b)i\big)(u, v) = (aa' - bb')(u, v) + (ab' + a'b)(-v, u)$$
$$= (aa'u - bb'u - ab'v - a'bv,\ aa'v - bb'v + ab'u + a'bu).$$
Meanwhile,
$$(a + bi)\big((a' + b'i)(u, v)\big) = (a + bi)\big(a'(u, v) + b'(-v, u)\big) = (a + bi)(a'u - b'v,\ a'v + b'u)$$
$$= a(a'u - b'v,\ a'v + b'u) + b(-a'v - b'u,\ a'u - b'v) = (aa'u - bb'u - ab'v - a'bv,\ aa'v - bb'v + ab'u + a'bu).$$
Since these two expressions are equal, multiplication by complex scalars is associative.

It remains to prove the distributive properties. Take $(u_1, w_1), (u_2, w_2) \in W$ and $a, b \in \mathbf{R}$. Then
$$(a + bi)\big((u_1, w_1) + (u_2, w_2)\big) = (a + bi)(u_1 + u_2, w_1 + w_2) = a(u_1 + u_2, w_1 + w_2) + b\big(-(w_1 + w_2), u_1 + u_2\big)$$
$$= (au_1 - bw_1 + au_2 - bw_2,\ aw_1 + bu_1 + aw_2 + bu_2) = \big((au_1, aw_1) + (-bw_1, bu_1)\big) + \big((au_2, aw_2) + (-bw_2, bu_2)\big) = (a + bi)(u_1, w_1) + (a + bi)(u_2, w_2).$$
Now take a second pair of real numbers $a', b'$. Then
$$\big((a + bi) + (a' + b'i)\big)(u, w) = \big((a + a') + (b + b')i\big)(u, w) = (a + a')(u, w) + (b + b')(-w, u)$$
$$= a(u, w) + a'(u, w) + b(-w, u) + b'(-w, u) = \big(a(u, w) + b(-w, u)\big) + \big(a'(u, w) + b'(-w, u)\big) = (a + bi)(u, w) + (a' + b'i)(u, w).$$
These two calculations prove that scalar multiplication by complex numbers distributes over addition, and together with the previous facts, this shows that $W$ is a vector space over the complex numbers.

(2) To show that $((e_1, 0), \dots, (e_n, 0))$ forms a basis for $W$, we first show that this list spans, and then show that it is linearly independent. Take $(u, w) \in W$. Then, since $(e_1, \dots, e_n)$ is a basis for $V$ and $u, w \in V$, we can find real scalars $a_1, \dots, a_n$ and $b_1, \dots, b_n$ so that $u = a_1 e_1 + \dots + a_n e_n$ and $w = b_1 e_1 + \dots + b_n e_n$. But then
$$(a_1 + b_1 i)(e_1, 0) + \dots + (a_n + b_n i)(e_n, 0) = (a_1 e_1 - b_1 0,\ a_1 0 + b_1 e_1) + \dots + (a_n e_n - b_n 0,\ a_n 0 + b_n e_n)$$
$$= (a_1 e_1 + \dots + a_n e_n,\ b_1 e_1 + \dots + b_n e_n) = (u, w).$$
Hence $((e_1, 0), \dots, (e_n, 0))$ spans all of $W$. To show that this list is linearly independent, suppose there exist real numbers $c_1, \dots, c_n, d_1, \dots, d_n$ with $\sum_{k=1}^{n} (c_k + d_k i)(e_k, 0) = (0, 0)$. Then $\sum_{k=1}^{n} (c_k e_k, d_k e_k) = (0, 0)$, so in $V$ we have $\sum_{k=1}^{n} c_k e_k = 0$ and $\sum_{k=1}^{n} d_k e_k = 0$. But $(e_1, \dots, e_n)$ is linearly independent in $V$, so we must have $c_1 = \dots = c_n = d_1 = \dots = d_n = 0$. Hence $((e_1, 0), \dots, (e_n, 0))$ is independent over $\mathbf{C}$. Thus it is a basis.

(3) Since $T$ maps $V$ to $V$, we have $S(v_1, v_2) = (Tv_1, Tv_2) \in W$, so $S$ maps $W$ to $W$. We show that it is linear. Take $(u_1, w_1), (u_2, w_2) \in W$ and $a, b \in \mathbf{R}$. Then
$$S\big((u_1, w_1) + (u_2, w_2)\big) = S(u_1 + u_2, w_1 + w_2) = \big(T(u_1 + u_2), T(w_1 + w_2)\big) = (Tu_1 + Tu_2,\ Tw_1 + Tw_2) = (Tu_1, Tw_1) + (Tu_2, Tw_2) = S(u_1, w_1) + S(u_2, w_2).$$
Also,
$$S\big((a + bi)(u_1, w_1)\big) = S(au_1 - bw_1,\ aw_1 + bu_1) = \big(T(au_1 - bw_1), T(aw_1 + bu_1)\big) = (aTu_1 - bTw_1,\ aTw_1 + bTu_1) = (a + bi)(Tu_1, Tw_1) = (a + bi)S(u_1, w_1).$$
Together the last two equations show that $S$ is linear.
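The complex-homogeneity of $S$ just proved can be illustrated concretely (a sketch only, with $V = \mathbf{R}^2$ and $T$ an arbitrary matrix standing in for the abstract operator): pairs $(u, w)$ carry the scalar multiplication $(a + bi)(u, w) = (au - bw,\ aw + bu)$ of the problem, and $S(u, w) = (Tu, Tw)$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2))   # stands in for the real operator T
u, w = rng.standard_normal(2), rng.standard_normal(2)

def scalar_mul(a, b, u, w):
    """(a + bi) . (u, w) per the definition in the problem."""
    return a * u - b * w, a * w + b * u

def S(u, w):
    """The complexified operator S(u, w) = (Tu, Tw)."""
    return A @ u, A @ w

a, b = 0.7, -1.3
lhs = S(*scalar_mul(a, b, u, w))        # S((a+bi)(u, w))
rhs = scalar_mul(a, b, *S(u, w))        # (a+bi) S(u, w)
print(np.allclose(lhs[0], rhs[0]) and np.allclose(lhs[1], rhs[1]))  # True
```

This mirrors the displayed computation: $S((a+bi)(u, w)) = (aTu - bTw,\ aTw + bTu) = (a+bi)S(u, w)$.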
For the matrix computation, suppose that $Te_k = a_{1k} e_1 + \dots + a_{nk} e_n$. Note that the $a_{ik}$ are real. Then
$$S(e_k, 0) = (Te_k, T0) = (Te_k, 0) = (a_{1k} e_1, 0) + \dots + (a_{nk} e_n, 0) = (a_{1k} + 0i)(e_1, 0) + \dots + (a_{nk} + 0i)(e_n, 0).$$
Thus the $k$th column of $\mathcal{M}(T, (e_1, \dots, e_n))$ is $(a_{1k}, \dots, a_{nk})^T$, and the $k$th column of $\mathcal{M}(S, ((e_1, 0), \dots, (e_n, 0)))$ is $(a_{1k} + 0i, \dots, a_{nk} + 0i)^T$. This holds for every $k$, so the two matrices are the same.

Suppose that $\lambda \in \mathbf{R}$ is an eigenvalue of $T$ with associated non-zero eigenvector $v$. Then $S(v, 0) = (Tv, T0) = (\lambda v, 0) = (\lambda + 0i)(v, 0)$, so $\lambda$ is also an eigenvalue of $S$. Suppose instead that $\lambda$ is a real eigenvalue of $S$ with non-zero eigenvector $(u, w)$. Then $S(u, w) = (Tu, Tw) = (\lambda + 0i)(u, w) = (\lambda u, \lambda w)$. We must have one of $u, w$ non-zero, say $u$. Then $Tu = \lambda u$, so $\lambda$ is an eigenvalue of $T$.

Before considering the case of complex eigenvalues, we prove the claims made in the hint regarding the conjugation map $c(v_1, v_2) = (v_1, -v_2)$. We have $c(S(v_1, v_2)) = c(Tv_1, Tv_2) = (Tv_1, -Tv_2)$ and $S(c(v_1, v_2)) = S(v_1, -v_2) = (Tv_1, T(-v_2)) = (Tv_1, -Tv_2)$. Thus indeed $c(S(v_1, v_2)) = S(c(v_1, v_2))$. Also, for $\lambda = a + bi$ with $a, b \in \mathbf{R}$, we calculate
$$c(\lambda(v_1, v_2)) = c\big((a + bi)(v_1, v_2)\big) = c(av_1 - bv_2,\ av_2 + bv_1) = (av_1 - bv_2,\ -av_2 - bv_1)$$
and
$$\bar{\lambda}\, c(v_1, v_2) = (a - bi)(v_1, -v_2) = (av_1 - bv_2,\ -av_2 - bv_1),$$
so $c(\lambda(v_1, v_2)) = \bar{\lambda}\, c(v_1, v_2)$. Also, $c\big((v_1, v_2) + (v_1', v_2')\big) = c(v_1 + v_1',\ v_2 + v_2') = (v_1 + v_1',\ -v_2 - v_2') = c(v_1, v_2) + c(v_1', v_2')$, and, for $r$ real, $r(v_1, v_2) = (rv_1, rv_2)$, so $c(r(v_1, v_2)) = (rv_1, -rv_2) = r\, c(v_1, v_2)$. Hence $c$ is real-linear. Thus we've checked the hint.

The second statement that we are to prove, that $(v_1, v_2) \in \operatorname{null}(S - \lambda I)^k$ only if $c(v_1, v_2) \in \operatorname{null}(S - \bar{\lambda} I)^k$, contains the first statement ($\lambda$ an eigenvalue only if $\bar{\lambda}$ is) in the case $k = 1$, so we only prove the more general claim. Suppose $(S - \lambda I)^k(v_1, v_2) = (0, 0)$. Then $(0, 0) = c\big((S - \lambda I)^k(v_1, v_2)\big)$. We show that $c\big((S - \lambda I)^k(v_1, v_2)\big) = (S - \bar{\lambda} I)^k c(v_1, v_2)$, which suffices to prove the result, since then $(S - \bar{\lambda} I)^k c(v_1, v_2) = (0, 0)$. The proof is by induction on $k$. In the base case $k = 0$ it is certainly true that $c\big((S - \lambda I)^0(v_1, v_2)\big) = c(v_1, v_2) = (S - \bar{\lambda} I)^0 c(v_1, v_2)$. Moreover, we know from the hint that $c(S(v_1, v_2)) = S(c(v_1, v_2))$ and $c(\lambda I(v_1, v_2)) = \bar{\lambda} I(c(v_1, v_2))$, so that, for all $(v_1, v_2) \in W$, $c\big((S - \lambda I)(v_1, v_2)\big) = (S - \bar{\lambda} I)(c(v_1, v_2))$ (we've used the real-linearity of $c$). Assuming inductively that $c\big((S - \lambda I)^k(v_1, v_2)\big) = (S - \bar{\lambda} I)^k c(v_1, v_2)$, we have that
$$c\big((S - \lambda I)^{k+1}(v_1, v_2)\big) = (S - \bar{\lambda} I)\, c\big((S - \lambda I)^k(v_1, v_2)\big) = (S - \bar{\lambda} I)(S - \bar{\lambda} I)^k c(v_1, v_2) = (S - \bar{\lambda} I)^{k+1} c(v_1, v_2).$$
Thus our claim holds by induction, which completes the proof.
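The conjugate-eigenvalue claim just proved has a familiar concrete face (an illustration under the identification $(v_1, v_2) \leftrightarrow v_1 + i v_2$, which turns $c$ into ordinary complex conjugation and $S$ into the real matrix of $T$ acting on complex vectors): eigenvalues of a real matrix come in conjugate pairs, with conjugate eigenvectors.

```python
import numpy as np

# Rotation by 90 degrees on R^2: a real matrix with eigenvalues +i and -i.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

lam = 1j
z = np.array([1.0, -1j])   # A z = i z, i.e. z is in null(S - lambda I)
assert np.allclose(A @ z, lam * z)

# Conjugating the eigenvector gives an eigenvector for the conjugate
# eigenvalue, as the proposition predicts for k = 1.
assert np.allclose(A @ np.conj(z), np.conj(lam) * np.conj(z))
```

Here $z = (1, -i)$ corresponds to the pair $(v_1, v_2) = ((1, 0), (0, -1))$, and $\bar{z} = c(v_1, v_2)$.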