GQE ALGEBRA PROBLEMS

JAKOB STREIPEL

Contents
1. Eigenthings
2. Norms, Inner Products, Orthogonality, and Such
3. Determinants, Inverses, and Linear (In)dependence
4. (Invariant) Subspaces

Throughout this document there are [bracketed comments] inserted into the original problems. By and large these serve two purposes, namely to make references to previous problems that are somehow similar or related, and to make note of details in problems that are ambiguous, unclear, or incorrect. To make these comments easier to locate they are accompanied by a bullet in the right margin, as above.

1. Eigenthings

Problem 1.1 (Fall 2018). Let V be the vector space [over C?(1)] of (infinitely differentiable) functions f: R -> R spanned by the set B = { sin(x), cos(x), sin(2x), cos(2x) }.
(a) Show that B is a basis for V.
(b) Let d/dx: V -> V be the linear transformation corresponding to ordinary differentiation with respect to x. Find the matrix M that represents d/dx with respect to the ordered basis B above.
(c) Find one eigenfunction in V for d/dx.

Problem 1.2 (Fall 2018). Let T: R^3 -> R^3 be an invertible linear transformation. Show that there exists a line L in R^3 passing through the origin such that T(L) = L.

Problem 1.3 (Spring 2018). Let A be an n×n anti-Hermitian(2) matrix, i.e., A* = -A.
(a) Show that every eigenvalue of A must be purely imaginary.
(b) Show that if λ_1 and λ_2 are distinct eigenvalues of A with eigenvectors u and v respectively, then u*v = 0.

Updated: March 10, 2019.
(1) Feel free to figure out why I ask this.
(2) Sometimes called skew-Hermitian.

Problem 1.4 (Fall 2017). Let A denote an n×n matrix with entries belonging to the field of complex numbers. Let P(λ) denote the characteristic polynomial of A. Show that det(A) = P(0). Given P(λ) and assuming P(0) ≠ 0, show that A has an inverse, A^{-1}, and find an expression for the characteristic polynomial of A^{-1}, R(λ).
Hint. For the last part, one could examine the determinant of A(A^{-1} - λI).

Problem 1.5 (Spring 2017). Construct a 3×3 real symmetric matrix A such that the eigenvalues of A are 1 with multiplicity two and -1 with multiplicity one, and where x and y [two given vectors whose entries did not survive transcription] are eigenvectors that correspond to the eigenvalue 1.

Problem 1.6 (Spring 2017). Let u and v be vectors in R^n such that u^T v = 0. Find the eigenvalues and describe the bases for the corresponding eigenspaces for the n×n matrix A = vu^T.
Hint. Consider A^2.

Problem 1.7 (Spring 2016). Consider the n×n symmetric matrix H = I - 2uu^T, where u is a unit vector in R^n, i.e., u^T u = 1.
(a) Show that H is an orthogonal matrix.
(b) Show that u is an eigenvector of H and find the corresponding eigenvalue.
(c) If v is a nonzero vector perpendicular to u, show that v is an eigenvector of H and find the corresponding eigenvalue. How many times is that eigenvalue repeated?
(d) Compute the sum of the diagonal entries of H and the sum of the eigenvalues of H.

Problem 1.8 (Spring 2016). Consider a 3×3 matrix A with eigenvalues λ_i and corresponding linearly independent eigenvectors x_i, i = 1, 2, 3.
(a) Suppose λ_1 = λ_2. Choose the true statement from (i), (ii), and (iii) below and give an explanation for your answer.
(i) A can be diagonalised.
(ii) A cannot be diagonalised.
(iii) I need more information to decide.
(b) From the eigenvectors and eigenvalues, give a formula for A and explain each part carefully.
(c) Suppose λ_1 = 2, λ_2 = 5, x_1 = [1 1 1]^T, and x_2 = [1 -2 1]^T. Choose λ_3 and x_3 so that A is symmetric and positive semidefinite but not positive definite.

Problem 1.9 (Spring 2016). Let A be an n×n skew-symmetric matrix; that is, A^T = -A.
(a) Show that if λ is an eigenvalue of A, then -λ is also an eigenvalue of A. [Cf. Problem 1.3 (a).]
(b) Show that if A is invertible, then n must be an even number.
(c) Construct a counterexample to show that if n is an even number, A may not be invertible.

Related Problem. Let A be an n×n real skew-symmetric matrix.
(a) Show that the matrices I - A and I + A are invertible.
(b) Prove that B = (I - A)(I + A)^{-1} is an orthogonal matrix.

Related Problem. Show that the determinant of an n×n skew-symmetric matrix is 0 if n is odd.

Spring 2016 was a very eigenfull qualifier, apparently. Maybe this is why Fall 2016 had nothing eigenlike.

Problem 1.10 (Fall 2015). Suppose that n > 0 is an even integer. Let A be an n-by-n matrix whose entries a_ij are given by
a_ij = 1 if i + j is even, and a_ij = 0 if i + j is odd.
Find the eigenvalues of A and, for each eigenvalue, find one corresponding eigenvector. It is not necessary to normalize the eigenvectors.

Problem 1.11 (Spring 2015). Let A ∈ M_{n×n}(R) be symmetric. Prove that eigenvectors of A corresponding to distinct eigenvalues are orthogonal.

Problem 1.12 (Fall 2014). Let I be the n×n identity matrix and J be the n×n matrix all of whose entries equal 1.
(a) Find the eigenvalues, with multiplicities, of aI + bJ.
Hint. What are the eigenvalues of J?
(b) Use part (a) to determine det(aI + bJ) in terms of a, b, and n.

Problem 1.13 (Fall 2014). For each of the following, construct a 2×2 matrix with entries from the real numbers and the requested properties:
(a) A matrix with no zero entries that is diagonalisable, but not invertible.
(b) A matrix that is invertible, but not diagonalisable (even over the complex field).
(c) A matrix with no real eigenvalues.
(d) A matrix, other than the identity and the zero matrices, that is equal to the square of itself, i.e. a matrix M such that M = M^2.
(e) A matrix with no zero entries whose transpose is also its inverse.

Problem 1.14 (Spring 2014). Prove that a real symmetric 2×2 matrix has repeated eigenvalues if and only if it is a scalar multiple of the identity matrix.

Problem 1.15 (Spring 2014). Let A ∈ R^{n×n} be a real symmetric matrix.
(a) Prove that eigenvectors corresponding to distinct eigenvalues of A are orthogonal; i.e., if Au_1 = λ_1 u_1 and Au_2 = λ_2 u_2, and λ_1 ≠ λ_2, then u_1^T u_2 = 0. [Cf. Problem 1.11.]
(b) Use part (a) to show that if A has n distinct eigenvalues, there exists an orthogonal matrix U such that AU = UΛ, where Λ is an n×n diagonal matrix.
(c) Use the results from parts (a) and (b) to compute A^100 when
A = [ 1  3 ]
    [ 3  1 ].

Solution. Note before we begin that a matrix A being symmetric means A^T = A.

(a) Consider the scalar product (Au_1)^T u_2. We can move the matrix to the second component of the scalar product by transposing, like so:
(Au_1)^T u_2 = u_1^T A^T u_2 = u_1^T (A^T u_2).
The left-hand side above is equal to λ_1 u_1^T u_2, whereas the right-hand side is λ_2 u_1^T u_2. Subtracting one of these from the other we therefore get
λ_1 u_1^T u_2 - λ_2 u_1^T u_2 = (λ_1 - λ_2) u_1^T u_2 = 0,
whence u_1^T u_2 must be zero since λ_1 - λ_2 ≠ 0.

(b) By part (a), if A has n distinct eigenvalues, say λ_1, λ_2, ..., λ_n, it must also have n orthogonal eigenvectors, call them u_1, u_2, ..., u_n. Now let U be the n×n matrix with u_1, u_2, ..., u_n as columns, and let Λ be the diagonal n×n matrix with diagonal elements λ_1, λ_2, ..., λ_n. Comparing AU and UΛ column by column we have precisely Au_i = u_i λ_i for each i = 1, 2, ..., n, as desired. Note for the record that the calculations in part (a) do not by themselves make U an orthogonal matrix: a matrix being orthogonal means not only that the columns are orthogonal to one another, but moreover that they are orthonormal, whence we would have to make sure that all our eigenvectors u_i are normalised by dividing them by their norm.

(c) By diagonalising we can easily compute astronomical powers of a matrix; suppose A = UΛU^{-1}. Then
A^n = (UΛU^{-1})^n = UΛU^{-1} UΛU^{-1} ··· UΛU^{-1}  (n times)  = UΛ^n U^{-1},
and fortunately powers of a diagonal matrix are computed by simply powering up the diagonal elements. Hence we need to find the eigenvalues of A, and find two corresponding eigenvectors. First the eigenvalues:
det(A - λI) = det[ 1-λ, 3 ; 3, 1-λ ] = (1-λ)^2 - 9 = (1-λ-3)(1-λ+3) = -(2+λ)(4-λ),
making the eigenvalues λ_1 = -2 and λ_2 = 4. For corresponding eigenvectors we have, for example,
A [1, -1]^T = -2 [1, -1]^T   and   A [1, 1]^T = 4 [1, 1]^T,
and so take
U = [ 1, 1 ; -1, 1 ],   U^{-1} = (1/2)[ 1, -1 ; 1, 1 ],   and   Λ = [ -2, 0 ; 0, 4 ].
Hence
A^100 = UΛ^100 U^{-1} = (1/2) [ 1, 1 ; -1, 1 ] [ (-2)^100, 0 ; 0, 4^100 ] [ 1, -1 ; 1, 1 ]
      = (1/2) [ 2^100 + 4^100, 4^100 - 2^100 ; 4^100 - 2^100, 2^100 + 4^100 ].
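Since part (c) comes down to a concrete computation, here is a small symbolic check of it. This is an illustrative sketch, not part of the original solution; it assumes the reconstructed matrix A = [1, 3; 3, 1].

```python
# Illustrative check of Problem 1.15(c), assuming A = [[1, 3], [3, 1]];
# not part of the original solution.
import sympy as sp

A = sp.Matrix([[1, 3], [3, 1]])
U = sp.Matrix([[1, 1], [-1, 1]])   # eigenvectors for the eigenvalues -2 and 4
Lam = sp.diag(-2, 4)

assert A * U == U * Lam            # AU = U*Lambda, as in part (b)

A100 = U * Lam**100 * U.inv()      # A^100 via the diagonalisation formula
assert A100 == A**100              # agrees with the direct matrix power
assert A100[0, 0] == (2**100 + 4**100) // 2   # matches the closed form above
print("A^100 check passed")
```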

Problem 1.16 (Fall 2013). Let V be a complex inner product space with inner product ⟨·, ·⟩, and let L(V, V) be the vector space of all linear operators on V. Let ψ ∈ L(V, V) be a linear unitary operator, i.e., ψψ* = ψ*ψ = I, where ψ* is the adjoint operator of ψ under the inner product, satisfying ⟨ψx, y⟩ = ⟨x, ψ*y⟩ for all x, y ∈ V, and where I is the identity operator in L(V, V).
(a) Show that if λ is an eigenvalue of ψ, i.e., ψx = λx for some x ≠ 0, then |λ| = 1.
(b) Demonstrate that if µ is an eigenvalue of ψ, then its complex conjugate is an eigenvalue of ψ*.
(c) Let λ and µ be distinct eigenvalues of ψ with associated eigenvectors x and y, respectively. Prove that x is orthogonal to y, i.e., ⟨x, y⟩ = 0.

Problem 1.17 (Spring 2013). Let B and Q be n×n real matrices, where Q is invertible.
(a) Show that B and its transpose, B^T, have the same characteristic polynomial (and hence the same eigenvalues).
(b) If Q^{-1}BQ is a diagonal matrix, show that the columns of (Q^{-1})^T are a basis for R^n consisting of eigenvectors of B^T. [Cf. Problem 1.15 (b).]

Problem 1.18 (Spring 2012). For A ∈ R^{n×n}, suppose A^T x = µx and Ay = λy. Show that either µ = λ or x and y are orthogonal vectors in R^n. [Cf. Problems 1.11 and 1.15.]

Problem 1.19 (Fall 2011). Let
    [ 0.5  0    0 ]
A = [ 0.3  0.8  1 ],   u = [ 0, 5, 1 ]^T,   and   b = [ 0, 6, 0 ]^T.
    [ 0.2  0.2  0 ]
(a) What is the determinant of A?
(b) Show that u is an eigenvector of A with eigenvalue 1.
(c) What are the other eigenvalues of A?
Hint. These eigenvalues λ should satisfy -0.9 ≤ λ ≤ 0.9.
(d) Find lim_{k→∞} A^k b, if the limit exists.

Problem 1.20 (Spring 2011). Let V be a finite dimensional vector space over the real numbers, and let A: V -> V be a linear mapping. Assume that A^2 = -I, where I denotes the identity mapping on V.
(a) Prove that A has no real eigenvalues.
(b) Let x ∈ V, x ≠ 0. Show that x and Ax are linearly independent.
(c) Prove that V must have an even dimension.

Problem 1.21 (Fall 2010). Let T be the linear transformation R^n -> R^n defined by T(x_1, x_2, ..., x_n) = (x_n, x_1, ..., x_{n-1}).
(a) Find the matrix representation A of T in the standard basis for R^n.
(b) Find the inverse of A, if it exists.
(c) Find all the real numbers c such that A - cI is invertible. [This is somehow an eigenrelated thing.]

Problem 1.22 (Fall 2010). Suppose A is an n×n complex matrix with n distinct eigenvalues. If AB = BA for some matrix B, show that B is diagonalisable.

Hint. If A = S^{-1}DS with D diagonal, show and use the fact that SBS^{-1} commutes with D.

Problem 1.23 (Spring 2010). Let A and B be two linear transformations from R^n to R^n such that A^2 = A and B^2 = B.
(a) Find the smallest set (it is finite) that is guaranteed to contain all of the eigenvalues of A.
(b) Show that A and B have the same range if and only if AB = B and BA = A.

Problem 1.24 (Fall 2009). Given a matrix A whose entries are the unknowns a, b, c, d, e, and f together with a handful of fixed entries [only one of the fixed entries, a 0, survived transcription], find the entries a, b, c, d, e, and f so that all of the following results hold:
(i) the top-left 1×1 block is a matrix with eigenvalue 2;
(ii) the top-left 2×2 block is a matrix with eigenvalues 3 and -3; and
(iii) the top-left 3×3 block is a matrix with eigenvalues 0, 1, and 2.

Problem 1.25 (Spring 2009). Let T: M_{n×n}(R) -> M_{n×n}(R) be the linear transformation defined by T(A) = A^T. You may assume that the only eigenvalues of T are 1 and -1. Find a basis for each eigenspace.

Problem 1.26. Let A and B be n×n matrices satisfying the relation AB - BA = 2B. Let λ be an eigenvalue of A and let v be an eigenvector corresponding to λ. Show that there exists a positive integer k such that B^k v = 0.

2. Norms, Inner Products, Orthogonality, and Such

Problem 2.1 (Spring 2018). Let M be the set of all n×n real matrices. For A ∈ M, define
f(A) = max_{1 ≤ i, j ≤ n} |A_ij|.
Prove f is a norm on M.

Problem 2.2 (Spring 2018). Let P_3 be the vector space of [real?] polynomial functions of degree at most 3 defined on the closed interval [-1, 1].
(a) Show that the set B = { 1, x, x^2, x^3 } is a basis for P_3 and determine the dimension of P_3.
(b) Consider the operator φ: P_3 × P_3 -> R defined by
φ(f, g) = ∫_{-1}^{1} f(x)g(x) dx.
Show that φ is an inner product for P_3.
(c) Find an orthogonal basis B' for P_3 (i.e., a basis where φ(f, g) = 0 for any two distinct f and g in B') with respect to the inner product φ defined in part (b).

Problem 2.3 (Fall 2017). Let ‖·‖ be a norm on R^m and M an m×n real-valued matrix. Define f: R^n -> R by f(x) = ‖Mx‖. What are the necessary conditions on M for f(x) to be a norm on R^n?

Problem 2.4 (Fall 2016). Consider the function f: C^n × C^n -> C defined by f(x, y) := x^T A ȳ, where x, y are vectors in C^n, A is a given n×n complex-valued matrix, and ȳ indicates the complex conjugate of y. If we wish f to be an inner product on C^n over C, what properties must the matrix A satisfy?

Problem 2.5 (Fall 2015). Let P_2(R) be the vector space of real polynomials of degree less than or equal to 2. Define the inner product on P_2(R) by
⟨f, g⟩ = ∫_{-1}^{1} f(x)g(x) dx.
Let W be the subspace of P_2(R) spanned by { x, x^2 }.
(a) Find the orthogonal complement, W^⊥, of W.
(b) Express the polynomial 2 + 3x + 4x^2 as f(x) + g(x) with f(x) ∈ W and g(x) ∈ W^⊥.

Solution. (a) Note how dim P_2(R) = 3 (since there is a constant term, a linear term, and a quadratic term) and dim W = 2. Hence
dim W^⊥ = dim P_2(R) - dim W = 3 - 2 = 1,
meaning that W^⊥ = span{ a + bx + cx^2 } for some polynomial a + bx + cx^2 orthogonal to both x and x^2. Hence we require
0 = ⟨x, a + bx + cx^2⟩ = a⟨x, 1⟩ + b⟨x, x⟩ + c⟨x, x^2⟩
and
0 = ⟨x^2, a + bx + cx^2⟩ = a⟨x^2, 1⟩ + b⟨x^2, x⟩ + c⟨x^2, x^2⟩.
Now since the scalar product in this case is the integral over a symmetric interval, ⟨x, 1⟩ = ⟨x, x^2⟩ = 0 since the corresponding integrands are odd. Additionally
⟨x, x⟩ = ⟨x^2, 1⟩ = ∫_{-1}^{1} x^2 dx = [x^3/3]_{-1}^{1} = 2/3
and
⟨x^2, x^2⟩ = ∫_{-1}^{1} x^4 dx = [x^5/5]_{-1}^{1} = 2/5,
whence our two equations become 0 = (2/3)b, meaning b = 0, and
0 = (2/3)a + (2/5)c,
which we solve for, say, a to get a = -(3/5)c, so
a + bx + cx^2 = -(3/5)c + cx^2 = c(-3/5 + x^2).
Thus finally taking, say, c = -5/3, we have W^⊥ = span{ 1 - (5/3)x^2 }.
(b) To write 2 + 3x + 4x^2 in the way desired, let us find a polynomial in W^⊥ that has constant term 2; this way what remains for the component in W will have only linear and quadratic terms, which is doable since W = span{ x, x^2 }. Hence take
g(x) = 2(1 - (5/3)x^2) = 2 - (10/3)x^2,

whence
f(x) = 2 + 3x + 4x^2 - g(x) = 3x + 4x^2 + (10/3)x^2 = 3x + (12/3)x^2 + (10/3)x^2 = 3x + (22/3)x^2.

Problem 2.6 (Fall 2014). Let A be any m×n matrix with entries from R.
(a) Prove that every vector in the null space of A is orthogonal to every vector in the range of A^T.
(b) Prove that every vector v in R^n can be written as v = u + w where u is in the null space of A and w is in the range of A^T.

Problem 2.7 (Spring 2014). Let A ∈ R^{n×n}. Show that the linear transformation y = Ax on R^n is an isometry, i.e., it preserves (Euclidean) length, if and only if A is an orthogonal matrix, i.e., A^T A = AA^T = I.

Problem 2.8 (Fall 2013). Let ψ_1(x), ..., ψ_n(x) be [real?] continuous functions on [a, b] with a < b and suppose
∫_a^b ψ_i(x) ψ_j(x) dx = δ_ij,   i, j = 1, 2, ..., n,
where δ_ij = 1 if i = j and δ_ij = 0 if i ≠ j.
(a) Let f(x) = Σ_{k=1}^{n} α_k ψ_k(x). Prove α_k = ∫_a^b f(x) ψ_k(x) dx, k = 1, 2, ..., n.
(b) Is the collection of functions ψ_1(x), ..., ψ_n(x) linearly independent on [a, b]? Justify your answer.
(c) Prove ∫_a^b f(x)^2 dx = Σ_{k=1}^{n} α_k^2.

Problem 2.9 (Spring 2013). Let P_1(C) represent the complex polynomials of degree one or less. The letter i represents the square root of negative one. Define an inner product on P_1(C) by
⟨f, g⟩ = ∫_{-1/2}^{1/2} f g dt.
[As written I claim this is not an inner product. Why?] Let T: P_1(C) -> P_1(C) be defined by T(f) = t f(i) - f'(t).
(a) Show that T is a linear transformation.
(b) Find an orthonormal basis for P_1(C).
(c) Find the matrix representation of T relative to the basis from part (b).
(d) Is T Hermitian?

Solution. First let us explain why the aforementioned integral is not an inner product. The problem lies entirely in the lack of a conjugate on one of the functions in the integrand, despite these being complex valued polynomials. Take, for instance, f(t) = i. Then
⟨f, f⟩ = ∫_{-1/2}^{1/2} i^2 dt = ∫_{-1/2}^{1/2} (-1) dt = -1,
or in other words the inner product of something with itself is not necessarily nonnegative, which violates one of the properties of inner products. This is solved by replacing g by its conjugate in the definition. (That said, it will not actually end up mattering for the particular inner products we need to compute in this problem.)
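The failure of positivity just noted is easy to check symbolically; the following is an illustrative sketch (not part of the original solution) using sympy.

```python
# Illustrative sketch: the pairing <f, g> = integral of f*g over [-1/2, 1/2]
# fails positivity without a conjugate, using f(t) = i as in the remark above.
import sympy as sp

t = sp.symbols('t', real=True)
f = sp.I  # the constant polynomial f(t) = i

half = sp.Rational(1, 2)
no_conj = sp.integrate(f * f, (t, -half, half))
with_conj = sp.integrate(f * sp.conjugate(f), (t, -half, half))

print(no_conj)    # -1: <f, f> is negative, so this is not an inner product
print(with_conj)  #  1: with the conjugate, positivity is restored
```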

(a) A transformation being linear means that it preserves the vector space addition and scalar multiplication, so consider
T(af + g) = t(af + g)(i) - (af + g)'(t) = a t f(i) + t g(i) - a f'(t) - g'(t) = a(t f(i) - f'(t)) + t g(i) - g'(t) = a T(f) + T(g),
hence T is linear.
(b) A good way to find an orthonormal basis of a vector space is to start with any basis and orthogonalise it using the Gram-Schmidt process. Let us for that purpose start with the basis { 1, t }, and orthogonalise t as
t - (⟨t, 1⟩ / ⟨1, 1⟩)·1 = t - 0 = t,
since the inner product ⟨1, t⟩ is the integral of an odd function on a symmetric interval and hence vanishes. In other words 1 and t were orthogonal all along, and it remains simply to normalise the two. The polynomial 1 is already normalised since
⟨1, 1⟩ = ∫_{-1/2}^{1/2} 1 dt = 1.
On the other hand
⟨t, t⟩ = ∫_{-1/2}^{1/2} t^2 dt = [t^3/3]_{-1/2}^{1/2} = 1/12,
so the norm of t is 1/√12 = 1/(2√3), whence normalised it becomes √12 t = 2√3 t. Hence our orthonormal basis is B = { 1, 2√3 t }.
(c) To find the matrix representation of a transformation in any basis we need only look at the image of the basis elements under that transformation. We therefore compute
T(1) = t·1 - 0 = t   and   T(2√3 t) = t·2√3 i - 2√3,
and so
[T]_B = [ 0          -2√3 ]
        [ 1/(2√3)     i   ].
(d) Finally a matrix being Hermitian means its conjugate transpose equals itself, which is not the case for [T]_B since
[T]_B* = [ 0       1/(2√3) ]   ≠   [ 0          -2√3 ]   = [T]_B,
         [ -2√3    -i      ]       [ 1/(2√3)     i   ]
and hence not for the linear transformation T either.

Problem 2.10 (Fall 2012). Suppose R^3 has the inner product
⟨u, v⟩ = u_1 v_1 + 2 u_2 v_2 + 3 u_3 v_3.
Using this inner product, do the following:
(a) Find an orthonormal basis which spans the subspace spanned by u_1 = (1, 1, 1) and u_2 = (1, 1, 0).
(b) Find the orthogonal projection of the vector (1, 2, 5) onto the space spanned by u_1 and u_2 from part (a) of this problem.

Problem 2.11 (Fall 2012). Let { x_1, x_2, ..., x_n } be a linearly independent set in R^n. Prove or give a counterexample to the following statement: There exists a constant c > 0 such that
‖α_1 x_1 + ... + α_n x_n‖_{R^n} ≥ c(|α_1| + ... + |α_n|)
for any α_i ∈ R, i = 1, ..., n. In the above, ‖(a_1, ..., a_n)‖_{R^n} = (Σ_{i=1}^{n} a_i^2)^{1/2} for any (a_1, ..., a_n) ∈ R^n.

Problem 2.12 (Fall 2011). Let V be the vector space of all continuous real valued functions defined on (-∞, ∞).
(a) Prove that { 1, e^x, e^{2x} } are linearly independent.
(b) Define the inner product
⟨f, g⟩ = ∫_0^1 f(x)g(x) dx,   f, g ∈ V.
Find an element in V which is orthogonal to e^x. [As written, is this really an inner product on V? I claim not so.]

Problem 2.13 (Spring 2011). In this problem the inner product and norm symbols refer to the standard inner product and norm on R^n:
⟨x, y⟩ = Σ_{j=1}^{n} x_j y_j,   ‖x‖ = (Σ_{j=1}^{n} x_j^2)^{1/2}.
Let x, y be linearly independent vectors in R^n. Define a function φ: R -> R by φ(t) = ‖x + ty‖^2.
(a) Show that φ is a quadratic polynomial that has no real zeros.
(b) Deduce that |⟨x, y⟩| < ‖x‖ ‖y‖.
(c) Prove the Cauchy-Schwarz-Bunyakovsky inequality in R^n: for all v, w ∈ R^n,
|⟨v, w⟩| ≤ ‖v‖ ‖w‖,
with equality if and only if v and w are linearly dependent.

Problem 2.14 (Fall 2010). Let A, B ∈ R^{m×n} and A^T be the transpose of A. Show that ⟨A, B⟩ = tr(A^T B) is an inner product on R^{m×n} over R.

Problem 2.15 (Fall 2010). Let A be an m×n real matrix. Show that the nullspace of the transpose of A equals the orthogonal complement of the range (or column space) of A; that is, show
N(A^T) = R(A)^⊥.
[Cf. Problem 2.6 (a).]

Problem 2.16 (Fall 2009). Define the linear transformation z = Ax, with A ∈ R^{n×n} and x ∈ R^n. Prove that A is an orthogonal matrix if and only if the distance between x and y is invariant under the transformation, i.e.
‖x - y‖_2 = ‖A(x - y)‖_2.
[Cf. Problem 2.7.]

Problem 2.17 (Fall 2009). Let P_2 be the vector space of all polynomials of degree ≤ 2 over R with basis { 1, x, x^2 }, and define
⟨f, g⟩ = f(0)g(0) + f(1)g(1) + f(2)g(2).
(a) Prove that ⟨·, ·⟩ is an inner product on P_2.
(b) Find an orthogonal basis for P_2.

Problem 2.18 (Spring 2009). Let V be a finite-dimensional inner product space over the complex numbers. For x ∈ V we define the norm of x by ‖x‖ = √⟨x, x⟩. A linear transformation T: V -> V is Hermitian if T = T*, where T* is the adjoint of T. Show that if T is Hermitian then
(a) for all x ∈ V, ‖T(x) + ix‖^2 = ‖T(x)‖^2 + ‖x‖^2;
(b) use (a) to deduce that T + iI is invertible, where I is the identity transformation.

3. Determinants, Inverses, and Linear (In)dependence

Problem 3.1 (Spring 2018). Let v_1, ..., v_n be n linearly independent column vectors in R^n. Suppose A is an n×n matrix. Define w_i = Av_i, i = 1, 2, ..., n. Prove that the vectors w_1, w_2, ..., w_n are linearly independent if and only if det A ≠ 0.

Problem 3.2 (Fall 2017). Let A denote an n×n matrix with entries belonging to the field of complex numbers. Let P(λ) denote the characteristic polynomial of A. Show that det(A) = P(0). Given P(λ) and assuming P(0) ≠ 0, show that A has an inverse, A^{-1}, and find an expression for the characteristic polynomial of A^{-1}, R(λ).
Hint. For the last part, one could examine the determinant of A(A^{-1} - λI).

Problem 3.3 (Spring 2017). Let A be an n×n matrix such that for all 1 ≤ i, j ≤ n, a_ij = 1 or a_ij = -1. Prove that the determinant of A is an integer divisible by 2^{n-1}.

Problem 3.4 (Fall 2016). Let n ≥ 3 be an integer and let
V_n(x) = { (x, 1, 1, ..., 1), (1, x, 1, ..., 1), ..., (1, 1, ..., 1, x) } ⊆ R^n,
the set of n vectors whose i-th member has x in the i-th coordinate and 1 in every other coordinate.
(a) For what values of x does V_n(x) fail to be a basis for R^n?
(b) For each such value of x found in (a), what is the dimension of span(V_n(x))?

Related Problem. Compute the determinant of the following n×n matrix:
    [ 0    1    2    3   ...  n-1 ]
    [ 1    0    1    2   ...  n-2 ]
A = [ 2    1    0    1   ...  n-3 ]
    [ 3    2    1    0   ...  n-4 ]
    [ ...                         ]
    [ n-1  n-2  n-3  n-4 ...  0   ]
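For the related problem above, a quick numerical experiment can suggest the general pattern before one proves it. This is an illustrative sketch, assuming the reconstructed matrix with (i, j)-entry |i - j|; it is not part of the original.

```python
# Illustrative sketch: determinants of the matrix with entries |i - j|
# for a few small n, to help guess the general formula.
import numpy as np

for n in range(2, 7):
    A = np.array([[abs(i - j) for j in range(n)] for i in range(n)])
    print(n, round(np.linalg.det(A)))
```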

Problem 3.5 (Spring 2016). Let A be a 3×3 matrix and v be a 3-dimensional column vector. Suppose that { v, Av, A^2 v } is a linearly independent set and that A^3 v = 3Av - 2A^2 v. Prove that the matrix B = [ v  Av  A^4 v ] is invertible.

Problem 3.6 (Spring 2015). Let A ∈ M_{m×n}(R) and suppose that A = QR, where Q ∈ M_{m×n}(R) and R ∈ M_{n×n}(R). Prove that if the columns of A are linearly independent, then R must be invertible.

Problem 3.7 (Spring 2015). Let A ∈ M_{m×n}(R) with rank(A) = r. Prove there exist invertible matrices P ∈ M_{m×m}(R) and Q ∈ M_{n×n}(R) such that
P A Q = [ I_r  0 ]
        [ 0    0 ],
where I_r denotes the r×r identity matrix.

Problem 3.8 (Fall 2014). Let R^{n×n} denote the vector space of n×n real matrices under addition and scalar multiplication. Recall that P ∈ R^{n×n} is a permutation matrix if each column and each row of P has a single entry equal to 1 and n - 1 entries equal to 0. For instance,
    [ 0  1  0  0 ]
P = [ 1  0  0  0 ]
    [ 0  0  0  1 ]
    [ 0  0  1  0 ]
is a permutation matrix in R^{4×4}. Show that the set of permutation matrices in R^{n×n} is linearly dependent for all n ≥ 4. (It is actually true for n ≥ 3.)

Problem 3.9 (Fall 2013). Let J = (c_ij) be the n×n matrix with c_ij = 0 for all i, j = 1, 2, ..., n, except c_{i,i+1} = 1 for i = 1, 2, ..., n-1, where n ≥ 2 is a positive integer.
(a) Prove J^n = 0.
(b) Let A = I + J + J^2 + ... + J^{n-1}. Prove that A is invertible and find A^{-1}.

Problem 3.10 (Spring 2013). Let B be the n×n matrix whose (i, j)-entry is α_k where k = min{ i, j }. Prove that for n ≥ 2 the determinant of B is given by
det B = α_1 ∏_{k=1}^{n-1} (α_{k+1} - α_k).

Problem 3.11 (Spring 2013). Let T: R^n -> R^m be a linear transformation. If { v_1, v_2, ..., v_p } is a linearly dependent subset of R^n, show that { T(v_1), T(v_2), ..., T(v_p) } is a linearly dependent subset of R^m. [Cf. Problem 3.1.]

Problem 3.12 (Spring 2012). Recall that for any matrices C, D ∈ R^{n×n}, det(CD) = (det C)(det D). Let A ∈ R^{n×n} be given.
(a) Suppose B ∈ R^{n×n} is obtained by replacing a row in A with the sum of that row and a multiple of another row. Show that det A = det B.
(b) Suppose B ∈ R^{n×n} is obtained by switching two rows in A. Show that det B = -det A.

(c) Prove that the operations in (a) and (b) can be used to obtain an upper triangular matrix U such that det U = ±det A.

Problem 3.13 (Fall 2011). Let A and B be n×n matrices and I be the n×n identity matrix. Prove I - BA is invertible if I - AB is invertible.

Problem 3.14 (Spring 2011). For n = 1, 2, 3, ..., let A_n denote the n×n matrix whose (i, j) entry is a_ij = min{ i, j }. For example,
      [ 1  1  1  1 ]
A_4 = [ 1  2  2  2 ]
      [ 1  2  3  3 ]
      [ 1  2  3  4 ].
State and prove a formula for det(A_n) that is valid for all n.

Problem 3.15 (Spring 2010). Let A_n be the n×n tridiagonal matrix
      [  1   1   0   0  ...   0   0 ]
      [ -1   1   1   0  ...   0   0 ]
A_n = [  0  -1   1   1  ...   0   0 ]
      [            ...               ]
      [  0   0   0   0  ...  -1   1 ]
whose entries satisfy
(A_n)_ij = 1 if i = j,  1 if i = j - 1,  -1 if i = j + 1,  0 otherwise.
Show that for all n ≥ 1, det A_{n+2} = det A_{n+1} + det A_n.

4. (Invariant) Subspaces

Problem 4.1 (Fall 2018). Let T: V -> V be a linear operator. A subspace W ⊆ V is called T-invariant if T(W) ⊆ W; that is, x ∈ W implies T(x) ∈ W.
(a) Prove that the null space of T is T-invariant.
(b) Let v ∈ V be an eigenvector of T with real eigenvalue λ, and W = span{ v }. Show that W is a T-invariant subspace of V.
(c) Let W be T-invariant. Assume that T is invertible. Show that W is also T^{-1}-invariant. [This requires W to be a finite dimensional subspace of V. More on this below.]

Solution. The first two parts of this problem are straightforward computation. The third less so.
(a) The null space of T is N(T) = ker T = { x ∈ V : T(x) = 0 }. To show that it is T-invariant, take x in the null space, i.e. T(x) = 0. We now need to ensure that the image 0 is in N(T), which is clearly true since T(0) = 0 (since T is linear).
(b) We need to show both that W is a subspace of V and that it is T-invariant. The first point is not very tricky: a subspace is a set that contains the zero vector, is closed under vector addition, and is closed under scalar multiplication. Clearly

0 ∈ W since 0 = 0·v ∈ W. Similarly if x, y ∈ W, then there are scalars c and d such that x = cv and y = dv, so x + y = cv + dv = (c + d)v ∈ W. Finally if d is any scalar, then dx = d(cv) = (dc)v ∈ W.
To see that W is T-invariant, simply let x ∈ W, i.e. x = cv as above, and consider its image under T. We get T(x) = T(cv) = cT(v) by linearity of T. Now v is an eigenvector of T with eigenvalue λ, so T(v) = λv, hence the above is cλv ∈ W, and we are done.
(c) As mentioned in the problem there are some subtleties to this problem. Since W is given to be T-invariant, we know that for any x ∈ W we have T(x) = y ∈ W. Our goal, then, is to conclude from this that T^{-1}(x) ∈ W as well.
The natural step to try is to take what we know above and apply T^{-1} to it, i.e. let y = T(x) ∈ W; then T^{-1}(y) = T^{-1}(T(x)) = x ∈ W. There is a flaw in this argument, however, namely that y was not chosen arbitrarily; it works perfectly fine if y is specifically in the image of T to start with, but we do not know that the image of T is all of W. In other words, we don't know that T restricted to W is surjective.
The key argument then is that since T is invertible on V, it is injective on V (since an invertible linear transformation maps only 0 to 0) and hence also injective when restricted to the subspace W. Now if W is finite dimensional, this injectivity of T restricted to W implies surjectivity.(3) Hence T restricted to W is surjective, so if y ∈ W, there exists some x ∈ W such that T(x) = y, and now we can freely apply the above argument.

Counterexample. Finite dimensionality is not just a trick to be able to use the above argument; it's essential for this result. To see this, let V = R^Z be the vector space of infinite sequences of real numbers in two directions, i.e. we can write v ∈ V as (..., v_{-2}, v_{-1}, v_0, v_1, v_2, ...). Moreover let T: V -> V be the right-hand shift operator, i.e. T(v)_i = v_{i-1}. Note that T is invertible since the left-hand shift operator is its inverse. Then the subspace W of all sequences w with w_i = 0 for i ≤ 0 is invariant under T. However W is not T^{-1}-invariant, since if we take any sequence w ∈ W with w_1 ≠ 0, then T^{-1}(w)_0 = w_1 ≠ 0, so T^{-1}(w) ∉ W.

Problem 4.2 (Spring 2017). Let P_3[x] = { ax^3 + bx^2 + cx + d : a, b, c, d ∈ R } be the set of polynomials of degree 3 or less with real coefficients.
(a) Show that W = { p(x) ∈ P_3[x] : p(1) = 0 } is a subspace of P_3[x].
(b) Find the dimension of W.

(3) This is a general and quite powerful result. If V is a finite dimensional vector space and T: V -> V is a linear operator, then the following are equivalent: T is invertible; T is injective; T is surjective. Feel free to prove this; the rank-nullity theorem will probably come in handy. This is a type of result that occurs in many settings; generally speaking injectivity and surjectivity are the same on finite structures, such as functions on finite sets, homomorphisms on finite fields, continuous maps between finite dimensional connected compact manifolds of equal dimension, etc.
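To make the counterexample in Problem 4.1(c) concrete, here is a small illustrative sketch (my own encoding, not from the original). It models only finitely supported two-sided sequences as dictionaries, which is enough to watch T^{-1} push a sequence out of W.

```python
# Illustrative sketch of the shift-operator counterexample: finitely supported
# two-sided sequences as dicts index -> value, the right shift T, and the
# subspace W = { w : w_i = 0 for all i <= 0 }.
def shift_right(w):          # (T w)_i = w_{i-1}
    return {i + 1: v for i, v in w.items() if v != 0}

def shift_left(w):           # (T^{-1} w)_i = w_{i+1}
    return {i - 1: v for i, v in w.items() if v != 0}

def in_W(w):                 # is w_i = 0 for every i <= 0?
    return all(i > 0 for i, v in w.items() if v != 0)

w = {1: 1.0}                 # w_1 = 1, all other entries 0, so w lies in W
print(in_W(shift_right(w)))  # True:  T(W) is contained in W
print(in_W(shift_left(w)))   # False: T^{-1}(w) has a nonzero entry at index 0
```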

Problem 4.3 (Spring 2015). Let C be the continuous functions on the interval [0, 1]. Let W = { f ∈ C : f(1/4) = 0 }. Prove that W is a subspace of C. [Cf. Problem 4.2 (a).]

Problem 4.4 (Fall 2013). Let n > 1 be an integer. Let V be the vector space F^{n×n} of all n×n matrices over a field F. Let M ∈ V be a fixed vector. Define the set W = { A ∈ V : AM = MA }.
(a) Prove W is a subspace of V.
(b) Consider the special case where F is the field of complex numbers and n = 2. Let
M = [ 1  i ]
    [ i  1 ].
Find a basis for W.

Problem 4.5 (Spring 2013). Let v be a fixed real n×1 vector. Let W be the set of all n×n real matrices B such that Bv = 0. Show that W is a subspace of the n×n real matrices.

Problem 4.6 (Fall 2012). Let V be the vector space of all square n×n matrices over the real numbers. Prove or disprove that the following spaces are subspaces of V:
(a) W = { A ∈ V : A = A^T },
(b) H = { A ∈ V : AA = A }.

Problem 4.7 (Spring 2011). In the space of all 2×2 matrices over the real numbers, find a basis for the subspace of all matrices whose row sums and column sums are all equal.

Problem 4.8 (Fall 2010). Let W_1 and W_2 be subspaces of a vector space V such that W_1 ∪ W_2 is also a subspace of V. Prove that either W_1 ⊆ W_2 or W_2 ⊆ W_1. [The converse of this is also true.]

Problem 4.9 (Spring 2010). Let V be a vector space over R and let V_1 ⊊ V and V_2 ⊊ V be two proper subspaces of V. Show that
(a) V_1 + V_2 = { w : w = v_1 + v_2, v_1 ∈ V_1, v_2 ∈ V_2 } is a subspace of V,
(b) there exists a vector v ∈ V such that v ∉ V_1 and v ∉ V_2.

Problem 4.10 (Fall 2009). Let V be a vector space of sequences { z_n }_{n=0}^∞ over the complex numbers, and let S be the subset of all sequences satisfying z_{n+2} - z_{n+1} - 6z_n = 0 for all n ≥ 0.
(a) Prove that S is a subspace of V.
(b) Prove that dim S = 2.
Hint. z_{n+2} = z_{n+1} + 6z_n, so the values of z_0 and z_1 determine z_n.
(c) Prove that the two sequences { (-2)^n }_{n=0}^∞ and { 3^n }_{n=0}^∞ form a basis for S.
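As a quick sanity check on part (c) of the last problem, one can verify numerically that both candidate basis sequences satisfy the recurrence z_{n+2} - z_{n+1} - 6z_n = 0. This is an illustrative sketch, not part of the original problem set.

```python
# Illustrative check for Problem 4.10(c): both sequences satisfy the recurrence
# z_{n+2} - z_{n+1} - 6 z_n = 0 for the first several indices.
def satisfies_recurrence(z, terms=20):
    return all(z(n + 2) - z(n + 1) - 6 * z(n) == 0 for n in range(terms))

print(satisfies_recurrence(lambda n: (-2) ** n))  # True
print(satisfies_recurrence(lambda n: 3 ** n))     # True
```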