A Proof of the Fundamental Theorem of Algebra through Linear Algebra, due to Derksen. Anant R. Shastri. February 13, 2011


MA106 Linear Algebra, 2011

We present a proof of the Fundamental Theorem of Algebra (FTA): every non-constant polynomial in one variable with coefficients in C has a root in C. The proof goes through a sequence of easily doable exercises. It uses only elementary linear algebra, except that we begin with the intermediate value theorem, which is used to prove that every odd-degree real polynomial has a real root.

Based on an article in Amer. Math. Monthly 110 (2003), pp. 620-623.

Sketch of the Proof. In what follows K denotes an arbitrary field; however, we need worry about only two cases here: K is either R or C.

We begin with a basic result in real analysis, the Intermediate Value Theorem (IVT), which immediately yields:

Every odd-degree polynomial p(t) ∈ R[t] has a real root.

It should be noted that there is no purely algebraic proof of the FTA that avoids the IVT; indeed, all proofs of the FTA use the IVT explicitly or implicitly. The simple reason is that the IVT is equivalent to the completeness axiom used in the construction of the real numbers.

From now onwards we use only linear algebra.
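As a numerical aside (not part of the proof), the IVT argument is effectively constructive: an odd-degree real polynomial has opposite signs at ±M for large M, and bisection then closes in on a root. A minimal sketch in Python; the cubic below is an arbitrary example:

```python
def p(t):
    # an arbitrary odd-degree (cubic) real polynomial
    return t**3 - 2*t - 5

# For large |t| the leading term t^3 dominates, so the signs differ.
a, b = -100.0, 100.0
assert p(a) < 0 < p(b)

# Bisection: the IVT guarantees a root in [a, b], and halving the
# interval keeps the sign change, so the endpoints converge to a root.
for _ in range(200):
    m = (a + b) / 2
    if p(m) < 0:
        a = m
    else:
        b = m

root = (a + b) / 2
print(root)   # a real root of p, close to 2.0945514815
```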

Sketch of the Proof: the Companion Matrix. Let p(t) = t^n + a_1 t^{n-1} + ... + a_n be a monic polynomial of degree n. Its companion matrix C_p is defined to be the n × n matrix
$$C_p = \begin{pmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 & 1 \\ -a_n & -a_{n-1} & \cdots & & -a_1 \end{pmatrix}.$$

Ex. 1. Show that the characteristic polynomial of C_p is p up to sign, i.e., det(C_p − tI_n) = (−1)^n p(t).
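The claim of Ex. 1 can be sanity-checked numerically: if λ is a root of p, then (1, λ, ..., λ^{n-1}) is an eigenvector of C_p with eigenvalue λ. A small sketch in Python, with an arbitrarily chosen quadratic whose roots are known:

```python
# Companion matrix of p(t) = t^n + a_1 t^(n-1) + ... + a_n, following
# the convention in the text (last row = -a_n, ..., -a_1).
def companion(coeffs):  # coeffs = [a_1, ..., a_n]
    n = len(coeffs)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n - 1):
        C[i][i + 1] = 1.0        # superdiagonal of ones
    C[n - 1] = [-a for a in reversed(coeffs)]
    return C

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# Example: p(t) = t^2 - 3t + 2 = (t - 1)(t - 2), so a_1 = -3, a_2 = 2.
C = companion([-3.0, 2.0])
assert C == [[0.0, 1.0], [-2.0, 3.0]]

# Each root lam of p gives an eigenvector (1, lam) of C_p.
for lam in (1.0, 2.0):
    v = [1.0, lam]
    assert matvec(C, v) == [lam * x for x in v]
```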

Ex. 2. Show that every non-constant polynomial p(t) ∈ K[t] of degree n has a root in K iff every endomorphism K^n → K^n has an eigenvalue in K, iff every n × n matrix over K has an eigenvalue in K.

Recall that by an endomorphism of a vector space V we mean a linear transformation V → V.

Ex. 3. Show that every R-linear map f : R^{2n+1} → R^{2n+1} has a real eigenvalue.
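For a concrete instance of Ex. 3 in the smallest interesting dimension: the characteristic polynomial of a real 3×3 matrix is a cubic in t, so it changes sign for large |t|, and bisection recovers a real eigenvalue. A sketch in Python; the matrix A is an arbitrary example (two of its eigenvalues are non-real):

```python
def char_poly_at(A, t):
    # det(A - tI) for a 3x3 matrix, by cofactor expansion along row 0
    M = [[A[i][j] - (t if i == j else 0.0) for j in range(3)] for i in range(3)]
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
            - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
            + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

A = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [2.0,  3.0, 0.5]]   # eigenvalues are i, -i and 0.5

lo, hi = -1e6, 1e6
assert char_poly_at(A, lo) > 0 > char_poly_at(A, hi)  # leading term is -t^3
for _ in range(200):      # bisection, as in the IVT
    mid = (lo + hi) / 2
    if char_poly_at(A, mid) > 0:
        lo = mid
    else:
        hi = mid
eig = (lo + hi) / 2
print(eig)   # the real eigenvalue of A, close to 0.5
```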

Sketch of the Proof. For a positive integer r, let S_1(K, r) denote the following statement: every endomorphism A : K^n → K^n has an eigenvector, for all n not divisible by 2^r. Also let S_2(K, r) denote the statement: any two commuting endomorphisms K^n → K^n have a common eigenvector, for all n not divisible by 2^r.

We shall prove one result over an arbitrary field K and two special results for K = R and K = C.

Sketch of Proof.

(a) S_1(K, r) ⟹ S_2(K, r), for all r ≥ 1.
(b) S_2(R, 1) ⟹ S_1(C, 1).
(c) S_1(C, r) ⟹ S_1(C, r + 1), for all r ≥ 1.

Begin with the IVT, which (by Ex. 3) is the same as S_1(R, 1). Putting K = R in (a), we get S_2(R, 1). Now (b) gives S_1(C, 1), and repeated application of (c) gives S_1(C, k) for all k ≥ 1.

Given n, write n = 2^k l where l is odd. Then S_1(C, k + 1) implies that every polynomial of degree n has a root. And that completes the proof of the FTA.

Ex. 4. Prove statement (a): S_1(K, r) ⟹ S_2(K, r).

We shall prove this by induction on the dimension n of the vector space. For n = 1 this is clear, since any nonzero vector is an eigenvector of every endomorphism. So assume that the statement is true for all smaller values of n, and let A, B be two commuting n × n matrices over K, where n is not divisible by 2^r.

S_1(K, r) implies S_2(K, r). Let λ be an eigenvalue of A and put V_1 = N(A − λI), V_2 = R(A − λI), the null space and the range. Since B commutes with A, it commutes with A − λI as well; therefore B restricts to endomorphisms β_1 : V_1 → V_1 and β_2 : V_2 → V_2. (Refer: Tut. sheet 9.18.)

By the rank-nullity theorem, dim V_1 + dim V_2 = n, and hence dim V_1 or dim V_2 is not divisible by 2^r.

If dim V_1 is not divisible by 2^r, then by S_1(K, r) the map β_1 : V_1 → V_1 has an eigenvector; since every nonzero vector of V_1 is an eigenvector of A, this is a common eigenvector for A and B.

If not, then dim V_2 < n and dim V_2 is not divisible by 2^r. Now consider α, β_2 : V_2 → V_2, the restrictions of A and B, which are commuting endomorphisms. By induction there is a common eigenvector v ∈ V_2 for α and β_2, which is then a common eigenvector for A and B as well.
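The two facts driving this induction step, that B maps N(A − λI) into itself when A and B commute, and that this yields a common eigenvector, can be seen in a tiny concrete case. A sketch in Python; the matrices are chosen ad hoc for illustration:

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

A = [[2.0, 1.0], [0.0, 3.0]]
B = [[5.0, 1.0], [0.0, 6.0]]      # B = A + 3I, so A and B commute
assert matmul(A, B) == matmul(B, A)

# lam = 2 is an eigenvalue of A; N(A - 2I) is spanned by v = (1, 0).
v = [1.0, 0.0]
assert matvec(A, v) == [2.0 * x for x in v]
# B maps this null space into itself: Bv is again a multiple of v ...
assert matvec(B, v) == [5.0 * x for x in v]
# ... so v is a common eigenvector of A and B.
```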

Ex. 5. Show that the space Herm_n(C) of all complex Hermitian n × n matrices is an R-vector space of dimension n^2. (Refer to Exercise 8.5 and a question in the Quiz.)

Recall that a matrix A is Hermitian if it is equal to its conjugate transpose A^*.

The next exercise is from 8.6. Ex. 6. Given A ∈ M_n(C), the mappings
α_A(B) = (1/2)(AB + BA^*),  β_A(B) = (1/2i)(AB − BA^*)
define R-linear maps Herm_n(C) → Herm_n(C). Show that α_A and β_A commute with each other.

Answer:
4i α_A β_A(B) = 2α_A(AB − BA^*) = A(AB − BA^*) + (AB − BA^*)A^* = A(AB + BA^*) − (AB + BA^*)A^* = 2i β_A(AB + BA^*) = 4i β_A α_A(B).
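This computation can also be checked numerically. A sketch in Python implementing α_A and β_A directly from their definitions; A and B below are arbitrary choices, with B Hermitian:

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def ctrans(M):  # conjugate transpose M*
    n = len(M)
    return [[M[j][i].conjugate() for j in range(n)] for i in range(n)]

def add(X, Y, s=1):  # X + s*Y
    return [[X[i][j] + s * Y[i][j] for j in range(len(X))] for i in range(len(X))]

def scale(c, M):
    return [[c * M[i][j] for j in range(len(M))] for i in range(len(M))]

def alpha(A, B):  # (AB + BA*)/2
    return scale(0.5, add(matmul(A, B), matmul(B, ctrans(A))))

def beta(A, B):   # (AB - BA*)/(2i)
    return scale(1 / 2j, add(matmul(A, B), matmul(B, ctrans(A)), s=-1))

A = [[1 + 2j, 3 - 1j], [0.5j, -2 + 1j]]     # arbitrary complex matrix
B = [[2 + 0j, 1 - 1j], [1 + 1j, -3 + 0j]]   # Hermitian: B = B*
assert B == ctrans(B)

lhs = alpha(A, beta(A, B))
rhs = beta(A, alpha(A, B))
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12 for i in range(2) for j in range(2))
```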

Ex. 7. If α_A and β_A have a common eigenvector, then A has an eigenvalue in C.

Answer: If α_A(B) = λB and β_A(B) = μB, then consider AB = (α_A + iβ_A)(B) = (λ + iμ)B. Since B is an eigenvector it is nonzero, so at least one of its column vectors, say u, is nonzero. It follows that Au = (λ + iμ)u, and hence λ + iμ is an eigenvalue of A.

Ex. 8. Prove statement (b): S_2(R, 1) ⟹ S_1(C, 1). (Exercise 11.4)

Solution: By Ex. 6, given A ∈ M_n(C), the maps α_A, β_A : Herm_n(C) → Herm_n(C) are two commuting endomorphisms of the real vector space Herm_n(C), which has dimension n^2 by Ex. 5. If n is odd, so is n^2. By S_2(R, 1), it follows that α_A and β_A have a common eigenvector. By Ex. 7, A has an eigenvalue.

Ex. 9. Show that the space Sym_n(K) of symmetric n × n matrices forms a subspace of M_n(K) of dimension n(n + 1)/2. (Exercise 8.5)

The next one is Ex. 11.5. Ex. 10. Given A ∈ M_n(K), show that
φ_A : B ↦ (1/2)(AB + BA^t),  ψ_A : B ↦ ABA^t
define two commuting endomorphisms of Sym_n(K). Show that if B is a common eigenvector of φ_A and ψ_A, then (A^2 + aA + bI_n)B = 0 for some a, b ∈ K; further, if K = C, conclude that A has an eigenvalue.

Answer: The first part is easy. To see the second part, suppose
φ_A(B) = (1/2)(AB + BA^t) = λB  and  ψ_A(B) = ABA^t = μB,
so that AB + BA^t = 2λB. Multiply this relation by A on the left and use the second to obtain
A^2 B + μB − 2λAB = 0,
which is the same as (A^2 − 2λA + μI_n)B = 0.
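The first part of Ex. 10 is easy to probe numerically as well. The sketch below, in Python with an arbitrary real A and symmetric B, checks that φ_A and ψ_A send symmetric matrices to symmetric matrices and commute as operators (the entries are small integers, so all floating-point arithmetic here is exact):

```python
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(M):
    n = len(M)
    return [[M[j][i] for j in range(n)] for i in range(n)]

def phi(A, B):   # B -> (AB + B A^t)/2
    P, Q = matmul(A, B), matmul(B, transpose(A))
    return [[(P[i][j] + Q[i][j]) / 2 for j in range(len(A))] for i in range(len(A))]

def psi(A, B):   # B -> A B A^t
    return matmul(matmul(A, B), transpose(A))

A = [[1.0, 2.0], [3.0, 4.0]]       # arbitrary
B = [[2.0, -1.0], [-1.0, 5.0]]     # symmetric
assert B == transpose(B)
assert phi(A, B) == transpose(phi(A, B))        # phi_A(B) is symmetric again
assert psi(A, B) == transpose(psi(A, B))        # psi_A(B) is symmetric again
assert phi(A, psi(A, B)) == psi(A, phi(A, B))   # the two operators commute
```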

For the last part, first observe that since B is an eigenvector it is nonzero, so there is at least one column vector v of B which is nonzero. Therefore (A^2 + aA + bI_n)v = 0. Since K = C, we can write A^2 + aA + bI_n = (A − λ_1 I_n)(A − λ_2 I_n). If (A − λ_2 I_n)v = 0, then λ_2 is an eigenvalue of A and we are through. Otherwise put u = (A − λ_2 I_n)v ≠ 0; then (A − λ_1 I_n)u = 0, and hence λ_1 is an eigenvalue of A.

Ex. 11. Prove statement (c): S_1(C, r) ⟹ S_1(C, r + 1). Hence conclude that S_1(C, r) is true for all r ≥ 1.

Answer: Let n = 2^k l where 1 ≤ k ≤ r and l is odd. Let A ∈ M_n(C). Then φ_A, ψ_A are two mutually commuting operators on Sym_n(C), which has dimension n(n + 1)/2 = 2^{k−1} l(2^k l + 1); this is not divisible by 2^r. By the hypothesis S_1(C, r) together with (a), φ_A and ψ_A have a common eigenvector, and then by Ex. 10 with K = C it follows that A has an eigenvalue.
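The dimension count in this step is pure 2-adic bookkeeping: for n = 2^k l with l odd and 1 ≤ k ≤ r, the number n(n + 1)/2 = 2^{k−1} l(2^k l + 1) has 2-adic valuation exactly k − 1 < r. This is easy to verify exhaustively for small cases. A sketch in Python:

```python
def val2(m):
    # 2-adic valuation: the largest e with 2^e dividing m
    e = 0
    while m % 2 == 0:
        m //= 2
        e += 1
    return e

for r in range(1, 6):
    for k in range(1, r + 1):
        for l in (1, 3, 5, 7, 9):
            n = 2**k * l
            d = n * (n + 1) // 2
            assert val2(d) == k - 1   # = 2^(k-1) * odd * odd
            assert d % 2**r != 0      # so S_1(C, r) applies to dim Sym_n
```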

We summarise what we did. In order to prove that every non-constant polynomial over the complex numbers has a root, we note that by Ex. 2 it is enough to show that every n × n complex matrix has an eigenvalue. By Ex. 8 this holds for odd n, which is the statement S_1(C, 1). By Ex. 11 applied repeatedly, we get S_1(C, r) for all r. Given any polynomial of degree n, write n = 2^r l where l is odd. Then S_1(C, r + 1) implies that every polynomial of degree n has a root.

THANK YOU FOR YOUR ATTENTION