MAS 5312, Section 2779: Introduction to Algebra 2
Solutions to Selected Problems, Chapters 11–13

11.2.9 Given a linear map $\varphi : V \to V$ such that $\varphi(W) \subseteq W$, show that $\varphi$ induces linear maps $\varphi_W : W \to W$ and $\bar\varphi : V/W \to V/W$:

Solution. That $\varphi_W$ is a linear map $W \to W$ follows from the definition of subspace. The map $\bar\varphi$ is $\bar\varphi(v + W) = \varphi(v) + W$, which is well-defined since
$$v + W = v' + W \implies v - v' \in W \implies \varphi(v - v') \in W \implies \varphi(v) + W = \varphi(v') + W.$$
The linearity of $\bar\varphi$ follows from that of $\varphi$.

If $\varphi_W$ and $\bar\varphi$ are nonsingular, show that $\varphi$ is nonsingular:

Solution. Suppose $\varphi(v) = 0$. The definition of $\bar\varphi$ shows that $\bar\varphi(v + W) = 0$. Since $\bar\varphi$ is nonsingular, $v + W = 0$ in $V/W$ and thus $v \in W$. Since $\varphi_W$ is nonsingular, $\varphi(v) = 0$ and $v \in W$ imply $v = 0$.

The converse holds if $\dim V < \infty$:

Solution. For a finite-dimensional space, a nonsingular linear map is an automorphism. Since $V$ has finite dimension, so do $W$ and $V/W$. Suppose $\varphi$ is nonsingular. Evidently $\varphi_W$ is nonsingular; suppose now $\bar\varphi(v + W) = 0$. From the definition of $\bar\varphi$ we conclude that $\varphi(v) \in W$, and since $\varphi_W$ is an automorphism there is a $w \in W$ such that $\varphi(w) = \varphi(v)$. Then $\varphi(v - w) = 0$, and as $\varphi$ is nonsingular, $v - w = 0$, i.e. $v = w$, so that $v \in W$. It follows that $v + W = 0$ in $V/W$, as required.

The last statement is false in the infinite-dimensional case: consider $V = K^{\mathbb N}$ and $\varphi((v_i)) = (w_i)$ with $w_0 = 0$ and $w_i = v_{i-1}$ for $i > 0$. The set $W$ of all $(v_i)$ with $v_0 = 0$ is a subspace of $V$ stable under $\varphi$. Furthermore $\varphi$ is nonsingular, but $\bar\varphi$ is identically $0$.
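
The finite-dimensional equivalence in 11.2.9 can be seen concretely with block upper-triangular matrices. The following SymPy sketch is only an illustration (not part of the original solution); the choice $V = \mathbb Q^4$, $W = \operatorname{span}(e_1, e_2)$ and the specific blocks are mine.

```python
# Illustrative sketch (not part of the original solution).  In finite
# dimensions, take V = Q^4 and W = span(e1, e2).  Any phi with phi(W) <= W has
# block upper-triangular matrix [[A, B], [0, D]]: A represents phi|_W and D
# represents the induced map on V/W.  Since det(phi) = det(A)*det(D), phi is
# nonsingular iff phi|_W and the induced map both are, matching 11.2.9.
from sympy import Matrix, zeros

A = Matrix([[1, 2], [0, 3]])      # matrix of phi|_W (invertible, det = 3)
B = Matrix([[5, 7], [11, 13]])    # mixing block; irrelevant to the criterion
D = Matrix([[2, 1], [4, 2]])      # matrix of the induced map on V/W (det = 0)

phi = Matrix.vstack(Matrix.hstack(A, B), Matrix.hstack(zeros(2, 2), D))
assert phi[2:, :2] == zeros(2, 2)            # phi(W) is contained in W

print(phi.det(), A.det() * D.det())          # 0 0: phi singular because D is
```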

11.2.11 Given $\varphi : V \to V$ such that $\varphi^2 = \varphi$:

(a) Show $\operatorname{Im}(\varphi) \cap \operatorname{Ker}(\varphi) = 0$:

Solution. Suppose $v \in \operatorname{Im}(\varphi) \cap \operatorname{Ker}(\varphi)$. Then $\varphi(v) = 0$ and $v = \varphi(w)$ for some $w \in V$; consequently
$$v = \varphi(w) = \varphi^2(w) = \varphi(v) = 0,$$
as required.

(b) Show that $V = \operatorname{Im}(\varphi) \oplus \operatorname{Ker}(\varphi)$:

Solution. We have shown $\operatorname{Im}(\varphi) \cap \operatorname{Ker}(\varphi) = 0$, so by the recognition theorem for direct sums we need only show $V = \operatorname{Im}(\varphi) + \operatorname{Ker}(\varphi)$. In fact for any $v \in V$ we have $v = \varphi(v) + (v - \varphi(v))$; clearly $\varphi(v) \in \operatorname{Im}(\varphi)$, and
$$\varphi(v - \varphi(v)) = \varphi(v) - \varphi^2(v) = \varphi(v) - \varphi(v) = 0$$
shows that $v - \varphi(v) \in \operatorname{Ker}(\varphi)$.

(c) Show that $V$ has a basis for which the matrix of $\varphi$ is diagonal, with all diagonal entries equal to 0 or 1:

Solution. Pick a basis $\{v_i\}_{i \in I}$ of $\operatorname{Im}(\varphi)$ and a basis $\{w_j\}_{j \in J}$ of $\operatorname{Ker}(\varphi)$. By part (b), $\{v_i, w_j\}_{i \in I,\, j \in J}$ is a basis of $V$. Since $\varphi(v_i) = v_i$ (if $v_i = \varphi(u)$ then $\varphi(v_i) = \varphi^2(u) = \varphi(u) = v_i$) and $\varphi(w_j) = 0$ for all $i$ and $j$, the matrix of $\varphi$ in this basis is diagonal, with 1s and 0s on the diagonal.
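
A quick computational check of 11.2.11 (the matrix $S$ below is an arbitrary invertible choice, so this is only an illustration, not part of the original solution):

```python
# Illustrative sketch for 11.2.11: a concrete idempotent phi on Q^3, built by
# conjugating diag(1, 1, 0) by an arbitrary invertible S, so that phi^2 = phi
# without phi being diagonal to begin with.
from sympy import Matrix, diag

S = Matrix([[1, 1, 0], [0, 1, 1], [1, 0, 1]])   # any invertible matrix works
P = S * diag(1, 1, 0) * S.inv()

assert P * P == P                                # phi is idempotent
# Part (b): bases of Im(P) and Ker(P) together give 3 independent vectors,
# so Q^3 = Im(P) + Ker(P) and the two meet trivially.
basis = Matrix.hstack(P.columnspace()[0], P.columnspace()[1], P.nullspace()[0])
assert basis.rank() == 3
# Part (c): P is diagonalizable with eigenvalues 1, 1, 0.
T, D = P.diagonalize()
print(D)                                         # diag(0, 1, 1) up to ordering
```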

11.3.3 $V$ is a vector space of finite dimension and $S \subseteq V^*$ is a subset of the dual space. (Throughout, for $v \in V$ and $x \in V^*$ we write $v(x)$ for $x(v)$, i.e. we identify $V$ with $V^{**}$.)

(a) Show $\operatorname{Ann}(S)$ is a subspace of $V$:

Solution. If $v, w \in \operatorname{Ann}(S)$ then $v(x) = w(x) = 0$ for all $x \in S$, and consequently $(av + bw)(x) = a\,v(x) + b\,w(x) = 0$ for all $x \in S$ and scalars $a, b$. It follows that $av + bw \in \operatorname{Ann}(S)$.

(b) If $W_1, W_2 \subseteq V^*$ are subspaces, show $\operatorname{Ann}(W_1 + W_2) = \operatorname{Ann}(W_1) \cap \operatorname{Ann}(W_2)$ and $\operatorname{Ann}(W_1 \cap W_2) = \operatorname{Ann}(W_1) + \operatorname{Ann}(W_2)$:

Solution. Since $W_1 \subseteq W_1 + W_2$ and $W_2 \subseteq W_1 + W_2$, we have $\operatorname{Ann}(W_1 + W_2) \subseteq \operatorname{Ann}(W_1)$ and $\operatorname{Ann}(W_1 + W_2) \subseteq \operatorname{Ann}(W_2)$, so $\operatorname{Ann}(W_1 + W_2) \subseteq \operatorname{Ann}(W_1) \cap \operatorname{Ann}(W_2)$. Conversely, if $v \in \operatorname{Ann}(W_1) \cap \operatorname{Ann}(W_2)$ and $x \in W_1$, $y \in W_2$, then $v(x) = v(y) = 0$ and thus $v(x + y) = 0$. Since $x$ and $y$ were arbitrary, $v \in \operatorname{Ann}(W_1 + W_2)$. Notice that this part does not require $V$ to have finite dimension.

For the second equality we first choose a basis $x_1, \dots, x_r$ of $W_1 \cap W_2$, and extend it to bases $x_1, \dots, x_r, y_1, \dots, y_s$ of $W_1$ and $x_1, \dots, x_r, z_1, \dots, z_t$ of $W_2$. Then $x_1, \dots, x_r, y_1, \dots, y_s, z_1, \dots, z_t$ is an independent set (check this!), and may be extended to a basis $x_1, \dots, x_r, y_1, \dots, y_s, z_1, \dots, z_t, w_1, \dots, w_u$ of $V^*$. If we identify $V \cong V^{**}$, there is a basis $x_1^*, \dots, x_r^*, y_1^*, \dots, y_s^*, z_1^*, \dots, z_t^*, w_1^*, \dots, w_u^*$ of $V$ whose dual basis of $V^*$ is $x_1, \dots, x_r, y_1, \dots, y_s, z_1, \dots, z_t, w_1, \dots, w_u$. By (a), an element $v = \sum_i a_i x_i^* + \sum_j b_j y_j^* + \sum_k c_k z_k^* + \sum_l d_l w_l^*$ of $V$ is in $\operatorname{Ann}(W_1 \cap W_2)$ if and only if $a_i = 0$ for all $i$. If this is so, $v = v_1 + v_2$ where
$$v_1 = \sum_j b_j y_j^* \in \operatorname{Ann}(W_2), \qquad v_2 = \sum_k c_k z_k^* + \sum_l d_l w_l^* \in \operatorname{Ann}(W_1).$$
Since $v$ was arbitrary, $\operatorname{Ann}(W_1 \cap W_2) \subseteq \operatorname{Ann}(W_1) + \operatorname{Ann}(W_2)$. For the reverse inclusion it suffices to show $\operatorname{Ann}(W_1) \subseteq \operatorname{Ann}(W_1 \cap W_2)$ and $\operatorname{Ann}(W_2) \subseteq \operatorname{Ann}(W_1 \cap W_2)$, but this follows from $W_1 \cap W_2 \subseteq W_1$ and $W_1 \cap W_2 \subseteq W_2$.

(c) With the same notation, show $W_1 = W_2$ if and only if $\operatorname{Ann}(W_1) = \operatorname{Ann}(W_2)$:

Solution. It suffices to show the "if" part. By part (b), the hypothesis $\operatorname{Ann}(W_1) = \operatorname{Ann}(W_2)$ implies
$$\operatorname{Ann}(W_1 \cap W_2) = \operatorname{Ann}(W_1) + \operatorname{Ann}(W_2) = \operatorname{Ann}(W_1) \cap \operatorname{Ann}(W_2) = \operatorname{Ann}(W_1 + W_2).$$
We first claim that if $W \subseteq W' \subseteq V^*$ and $\operatorname{Ann}(W) = \operatorname{Ann}(W')$ then $W = W'$. If not, there is a $v \in W' \setminus W$. Let $x_1, \dots, x_r$ be a basis of $W$; since $v \notin W$, the set $x_1, \dots, x_r, v$ is independent and may be extended to a basis of $V^*$. Then the element $v^*$ of the dual basis (a basis of $V$, under the identification $V \cong V^{**}$) vanishes on $W$ but not on $W'$ (its value on $v$ is 1), so $\operatorname{Ann}(W') \subsetneq \operatorname{Ann}(W)$, a contradiction. Applying this claim to the spaces $W_1 \cap W_2 \subseteq W_1 + W_2 \subseteq V^*$ we see that $W_1 \cap W_2 = W_1 + W_2$, and consequently $W_1 = W_2$ (both lie between $W_1 \cap W_2$ and $W_1 + W_2$).

(d) Show $\operatorname{Ann}(S) = \operatorname{Ann}(\operatorname{Span}(S))$:

Solution. Clearly $\operatorname{Ann}(\operatorname{Span}(S)) \subseteq \operatorname{Ann}(S)$. If $v \in \operatorname{Ann}(S)$ then $s(v) = 0$ for all $s \in S$, and thus $\bigl(\sum_i a_i s_i\bigr)(v) = \sum_i a_i s_i(v) = 0$ for any $s_i \in S$ and scalars $a_i$. Thus $v \in \operatorname{Ann}(\operatorname{Span}(S))$.

(e) If $\{v_1, \dots, v_n\}$ is a basis of $V$ and $S = \{v_1^*, \dots, v_k^*\}$, show $\operatorname{Ann}(S) = \operatorname{Span}(v_{k+1}, \dots, v_n)$:

Solution. If $v = \sum_i a_i v_i$ then $v_i^*(v) = a_i$, so $v \in \operatorname{Ann}(S)$ if and only if $a_1 = \dots = a_k = 0$, i.e. $v \in \operatorname{Span}(v_{k+1}, \dots, v_n)$.

11.3.4 If $V$ has infinite dimension with basis $A$, and $A^* = \{v^* \mid v \in A\}$, show that $A^*$ does not span $V^*$:

Solution. Let $f \in V^*$ be the functional with $f(v) = 1$ for all $v \in A$ (there is a unique such $f$, since a functional is determined by its values on a basis). If $f \in \operatorname{Span}(A^*)$, there is a finite set $T \subseteq A$ such that $f \in \operatorname{Span}(\{v^* \mid v \in T\})$. For $v \in A \setminus T$ we then have $f(v) = 0$, a contradiction. Therefore $f \notin \operatorname{Span}(A^*)$.
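
Part (e) of 11.3.3 can be checked concretely. The sketch below (an illustration, not part of the original solution) identifies $V^*$ with row vectors on $V = \mathbb Q^4$ and takes $S = \{e_1^*, e_2^*\}$; the choices $n = 4$, $k = 2$ are mine.

```python
# Illustration of 11.3.3(e): with V = Q^4 and S = {e1*, e2*}, Ann(S) is the
# null space of the 2x4 matrix whose rows are e1* and e2*, i.e. span(e3, e4).
from sympy import Matrix, eye

n, k = 4, 2
S = eye(n)[:k, :]                 # rows = the first k dual basis functionals
ann = S.nullspace()               # vectors v with s(v) = 0 for every s in S

print([list(v) for v in ann])     # [[0, 0, 1, 0], [0, 0, 0, 1]]
```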

11.4.3 If $R$ is commutative with 1 and $A$ is an $n \times n$ matrix over $R$ such that $Ax = 0$ for $x = (x_1\ x_2\ \cdots\ x_n)^T$, show that $\det(A)\,x_i = 0$ for all $i$:

Solution. Write $A$ in block form as $A = (v_1\ v_2\ \cdots\ v_n)$ with column vectors $v_j \in R^n$. Then $Ax = \sum_j x_j v_j = 0$, and $\det(v_1\ \cdots\ 0\ \cdots\ v_n) = 0$, where the $0 = \sum_j x_j v_j$ stands in column $i$. Since the determinant is a linear function of each of its columns, we may write this as
$$\sum_j x_j \det(A_j) = 0,$$
where $A_j$ is the matrix obtained from $A$ by replacing the $i$th column with the $j$th. If $j \neq i$ then $A_j$ has two identical columns, so $\det(A_j) = 0$. When $j = i$, $A_i = A$, so the equation says that $x_i \det(A) = 0$.

12.1.4 Suppose $R$ is an integral domain, $M$ is an $R$-module, $N$ is a submodule of rank $r$, and $M/N$ has rank $s$; show $M$ has rank $r + s$:

Solution. There are independent elements $n_1, \dots, n_r \in N$ such that $N' = \sum_i R n_i$ is free and $N/N'$ is torsion, and there are elements $m_1, \dots, m_s \in M$ such that $m_1 + N, \dots, m_s + N$ are independent in $M/N$, $\overline{M} = \sum_j R(m_j + N) \subseteq M/N$ is free and $(M/N)/\overline{M}$ is torsion. We will show that $n_1, \dots, n_r, m_1, \dots, m_s$ is an independent set and that, if $M' = \sum_j R m_j + N'$, then $M/M'$ is torsion; this will show that the rank of $M$ is $r + s$.

Suppose $\sum_j a_j m_j + \sum_i b_i n_i = 0$. Passing to $M/N$ gives $\sum_j a_j (m_j + N) = 0$, and we see that $a_j = 0$ for all $j$. Then $\sum_i b_i n_i = 0$, and it follows that $b_i = 0$ for all $i$.

Suppose now $m \in M$ is nonzero. There is a nonzero $r \in R$ such that $r(m + N) \in \overline{M}$, and if we write $r(m + N) = \sum_j a_j(m_j + N)$ then $rm - \sum_j a_j m_j \in N$. Since $N/N'$ is torsion, there is then a nonzero $s \in R$ such that $s\bigl(rm - \sum_j a_j m_j\bigr) \in N'$, and it follows that $srm \in M'$. Since $R$ is an integral domain, $sr \neq 0$, and we have shown that $M/M'$ is torsion.
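
A concrete instance of 12.1.4 over $R = \mathbb Z$ (illustration only, not from the original): take $M = \mathbb Z^3$ and $N$ generated by $2e_1$ and $3e_2$, so $r = 2$ and $M/N \cong \mathbb Z/2 \oplus \mathbb Z/3 \oplus \mathbb Z$ has rank $s = 1$. The elements $2e_1, 3e_2, e_3$ are independent and span a finite-index (hence torsion) submodule of $M$, so $\operatorname{rank} M = 3 = r + s$.

```python
# Illustration for 12.1.4 over Z.  M = Z^3, N = <2e1, 3e2>:
# rank N = 2, rank M/N = 1, and rank M = 3.
from sympy import Matrix

gens = Matrix([[2, 0, 0],     # 2*e1 (spans N together with the next row)
               [0, 3, 0],     # 3*e2
               [0, 0, 1]])    # e3, whose class has infinite order in M/N

print(gens.rank())            # 3: the three generators are independent over Q
print(gens.det())             # 6: nonzero, so Z*2e1 + Z*3e2 + Z*e3 has finite
                              # index in Z^3, i.e. the quotient is torsion
```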

12.1.13 Suppose $R$ is a PID and $M$ is a finitely generated $R$-module; describe $M/\operatorname{Tor}(M)$:

Solution. We first observe that for any family $\{M_i\}_{i \in I}$ of $R$-modules and $M = \bigoplus_i M_i$,
$$\operatorname{Tor}(M) = \bigoplus_i \operatorname{Tor}(M_i).$$
In fact for $(m_i) \in M$ and $r \in R$ we have $r(m_i) = 0$ if and only if $r m_i = 0$ for all $i$, from which it follows that $\operatorname{Tor}(M) \subseteq \bigoplus_i \operatorname{Tor}(M_i)$. Suppose now $(m_i) \in \bigoplus_i \operatorname{Tor}(M_i)$ is nonzero. There is a finite set $J \subseteq I$ such that $i \notin J$ implies $m_i = 0$. For $i \in J$ choose nonzero $r_i \in R$ with $r_i m_i = 0$; then if $r = \prod_{i \in J} r_i$, we have $r m_i = 0$ for all $i$ and thus $r(m_i) = 0$. Since $R$ is an integral domain, $r \neq 0$. Thus $(m_i) \in \operatorname{Tor}(M)$.

If $d$ is a nonzero element of $R$, then $\operatorname{Tor}(R/(d)) = R/(d)$, while $\operatorname{Tor}(R) = 0$. If finally $M \cong \bigoplus_i R/(d_i) \oplus R^n$ with each $d_i \neq 0$, then
$$\operatorname{Tor}(M) = \bigoplus_i \operatorname{Tor}(R/(d_i)) \oplus \operatorname{Tor}(R)^n = \bigoplus_i R/(d_i),$$
and therefore $M/\operatorname{Tor}(M) \cong R^n$ is a free module of finite rank.

12.2.4 Show that two $3 \times 3$ matrices (with coefficients in a field) with the same minimal and characteristic polynomials are similar:

Solution. Suppose $A$ and $B$ are $3 \times 3$ matrices with characteristic polynomial $\Phi$ and minimal polynomial $m$.

1. If $m$ has degree 3, it is the only invariant factor of $A$ and of $B$, and so they are similar.
2. If $m$ has degree 2, there is a linear polynomial $f$ such that $fm = \Phi$. The invariant factors of $A$ (and likewise of $B$) are then $f, m$, so $A$ and $B$ are similar.
3. If $m$ is linear, then $\Phi = m^3$ and both $A$ and $B$ have invariant factors $m, m, m$, so they are similar.

Give a counterexample for $4 \times 4$ matrices:

Solution. Let $l = X - a$ be any linear polynomial; there are $4 \times 4$ matrices $A$, $B$ such that $A$ has invariant factors $l, l, l^2$ and $B$ has invariant factors $l^2, l^2$. Then $A$ and $B$ both have minimal polynomial $l^2$ and characteristic polynomial $l^4$, but they are not similar since their invariant factors are different.

12.2.7 Find the eigenvalues of the matrix
$$\begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{pmatrix}.$$

Solution. Observe that the matrix is the transpose of the companion matrix of $X^4 - 1$, and so has the same characteristic polynomial, namely $X^4 - 1$. The problem does not specify the field! There are two cases:

(a) If the field has characteristic 0, or characteristic $p$ with $p$ odd, the characteristic polynomial has 4 distinct roots $\pm 1, \pm i$, where $i$ is a root of $X^2 + 1$ in an extension of the field of degree 1 or 2.

(b) If the field has characteristic 2, the characteristic polynomial is $(X - 1)^4$, which has 1 as a fourfold root.
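
A quick check of 12.2.7 (illustration, not part of the original solution): SymPy confirms the characteristic polynomial, the four fourth roots of unity over $\mathbb C$, and the factorization in characteristic 2.

```python
# Eigenvalues of the 4x4 cyclic matrix from 12.2.7.
from sympy import Matrix, symbols, factor

x = symbols('x')
C = Matrix([[0, 1, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 1],
            [1, 0, 0, 0]])

print(C.charpoly(x).as_expr())        # x**4 - 1
print(C.eigenvals())                  # {1: 1, -1: 1, I: 1, -I: 1} over C
print(factor(x**4 - 1, modulus=2))    # (x + 1)**4, the characteristic-2 case
```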

12.2.8 Show that the characteristic polynomial of the companion matrix of a polynomial $f(x)$ is $f(x)$ itself:

Solution. Say $f(x) = a_0 + a_1 x + \cdots + a_{n-1}x^{n-1} + x^n$, so the characteristic polynomial of its companion matrix is
$$\begin{vmatrix}
x & 0 & \cdots & 0 & a_0 \\
-1 & x & \cdots & 0 & a_1 \\
0 & -1 & \ddots & \vdots & a_2 \\
\vdots & & \ddots & x & \vdots \\
0 & \cdots & 0 & -1 & x + a_{n-1}
\end{vmatrix}.$$
The assertion is clear if $n = 1$, so we use induction on $n$. Using the cofactor expansion along the first row, this determinant equals
$$x \begin{vmatrix}
x & 0 & \cdots & 0 & a_1 \\
-1 & x & \cdots & 0 & a_2 \\
\vdots & & \ddots & & \vdots \\
0 & \cdots & 0 & -1 & x + a_{n-1}
\end{vmatrix}
\;+\; (-1)^{n+1} a_0 \begin{vmatrix}
-1 & x & & \\
 & -1 & \ddots & \\
 & & \ddots & x \\
 & & & -1
\end{vmatrix}.$$
The first term is $x$ times the characteristic polynomial of the companion matrix of $a_1 + a_2 x + \cdots + x^{n-1}$, which by induction is that polynomial itself. The second determinant is upper triangular with $-1$s on the diagonal, so it equals $(-1)^{n-1}$. The two terms therefore add up to
$$x\,(a_1 + a_2 x + \cdots + x^{n-1}) + (-1)^{n+1}(-1)^{n-1} a_0 = a_0 + a_1 x + \cdots + x^n = f(x),$$
and the assertion follows.
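
The claim in 12.2.8 is easy to spot-check. The sketch below (illustration only; the sample cubic is an arbitrary choice of mine) builds the companion matrix in the convention used above and compares its characteristic polynomial with $f$.

```python
# Check of 12.2.8 for a sample polynomial.  Companion matrix convention as in
# the text: 1s on the subdiagonal, last column -a_0, ..., -a_{n-1}.
from sympy import symbols, zeros

x = symbols('x')
coeffs = [5, -2, 7]            # a_0, a_1, a_2 for f = x**3 + 7*x**2 - 2*x + 5
n = len(coeffs)

C = zeros(n, n)
for i in range(1, n):
    C[i, i - 1] = 1            # subdiagonal 1s
for i in range(n):
    C[i, n - 1] = -coeffs[i]   # last column: -a_0, ..., -a_{n-1}

f = x**3 + 7*x**2 - 2*x + 5
print((C.charpoly(x).as_expr() - f).expand())   # 0: char poly of C equals f
```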

12.2.10 Find all similarity classes of $6 \times 6$ matrices over $\mathbb Q$ with minimal polynomial $(X + 2)^2(X - 1)$:

Solution. It suffices to find the possible lists of invariant factors that yield this minimal polynomial. The characteristic polynomial has degree six and the last invariant factor (the minimal polynomial) has degree three, so the degrees of the other invariant factors must add up to three. In each case the rational canonical form is the block diagonal matrix whose blocks are the companion matrices of the invariant factors. Here is how the possibilities break down.

If there is only one other invariant factor, it is necessarily equal to $(X + 2)^2(X - 1)$; the rational canonical form then consists of two copies of the companion matrix of $(X + 2)^2(X - 1) = X^3 + 3X^2 - 4$ on the diagonal.

If there are two other invariant factors, they have degrees one and two and successively divide each other (and $(X + 2)^2(X - 1)$ as well). The possibilities are:
1. $X - 1,\ (X - 1)(X + 2),\ (X + 2)^2(X - 1)$
2. $X + 2,\ (X - 1)(X + 2),\ (X + 2)^2(X - 1)$
3. $X + 2,\ (X + 2)^2,\ (X + 2)^2(X - 1)$

If there are three other invariant factors, they are all linear and, since they must divide one another, all equal. The possibilities are:
1. $X - 1,\ X - 1,\ X - 1,\ (X + 2)^2(X - 1)$
2. $X + 2,\ X + 2,\ X + 2,\ (X + 2)^2(X - 1)$

This gives six similarity classes in all.

12.2.11 Find all similarity classes of $6 \times 6$ matrices over $\mathbb Q$ with characteristic polynomial $(X^4 - 1)(X^2 - 1)$:

Solution. Since $(X^4 - 1)(X^2 - 1) = (X^2 + 1)(X + 1)^2(X - 1)^2$ is the irreducible factorization in $\mathbb Q[X]$, the possible lists of invariant factors are
1. $(X^4 - 1)(X^2 - 1)$
2. $(X + 1)(X - 1),\ (X^2 + 1)(X + 1)(X - 1)$
3. $X + 1,\ (X^2 + 1)(X + 1)(X - 1)^2$
4. $X - 1,\ (X^2 + 1)(X + 1)^2(X - 1)$

The corresponding rational canonical forms are, in order, the block diagonal matrices whose blocks are the companion matrices of these invariant factors.
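
One of the 12.2.10 classes can be verified directly. The sketch below (illustration only, not from the original; the helper `companion` and the chosen case are mine) builds the rational canonical form for the invariant factors $X + 2,\ (X + 2)^2,\ (X + 2)^2(X - 1)$ and checks that $(X + 2)^2(X - 1)$ is indeed the minimal polynomial while no proper divisor annihilates the matrix.

```python
# Verification of one case of 12.2.10: invariant factors
# X + 2, (X + 2)**2, (X + 2)**2 * (X - 1).
from sympy import symbols, zeros, eye, diag

x = symbols('x')

def companion(poly):
    """Companion matrix: 1s on the subdiagonal, -coefficients in last column."""
    a = poly.as_poly(x).all_coeffs()[1:][::-1]   # a_0, ..., a_{n-1}
    n = len(a)
    C = zeros(n, n)
    for i in range(1, n):
        C[i, i - 1] = 1
    for i in range(n):
        C[i, n - 1] = -a[i]
    return C

A = diag(companion(x + 2), companion((x + 2)**2), companion((x + 2)**2 * (x - 1)))
I6 = eye(6)

print(A.charpoly(x).as_expr().factor())          # (x - 1)*(x + 2)**5
print((A + 2*I6)**2 * (A - I6) == zeros(6, 6))   # True: m(A) = 0
print((A + 2*I6) * (A - I6) == zeros(6, 6),      # False
      (A + 2*I6)**2 == zeros(6, 6))              # False: nothing smaller works
```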

12.3.22 If $A \in M_{n \times n}(\mathbb C)$ satisfies $A^3 - A = 0$, then $A$ is diagonalizable:

Solution. The minimal polynomial of $A$ divides $X^3 - X = X(X + 1)(X - 1)$, and since all elementary divisors must divide it, the elementary divisors are all linear. This implies that $A$ is diagonalizable: if $X - a$ is one of the elementary divisors, the corresponding summand of $\mathbb C^n$ is $\mathbb C[X]/(X - a) \cong \mathbb C$, which has dimension one.

Is this true over an arbitrary field?

Solution. It is false for a field of characteristic 2, for then $X^3 - X = X(X + 1)^2$, and the minimal polynomial can be $X(X + 1)^2$. If, say, this is both the minimal and the characteristic polynomial (so that $n = 3$), the Jordan canonical form consists of a block of size one with eigenvalue 0 and a block of size two with eigenvalue 1, so $A$ is not diagonalizable. Note, however, that if the characteristic is $p \neq 2$, the situation is the same as for $\mathbb C$, since $X^3 - X$ still has three distinct roots.

12.3.23 There is no $A \in M_{3 \times 3}(\mathbb Q)$ with $A^8 = I$ but $A^4 \neq I$:

Solution. The condition says that the minimal polynomial of $A$ divides $X^8 - 1$ but not $X^4 - 1$. Since $X^8 - 1 = (X^4 - 1)(X^4 + 1)$, some irreducible factor of the minimal polynomial must divide $X^4 + 1$. Since the latter is irreducible over $\mathbb Q$, the minimal polynomial would have degree at least four, which is absurd for a $3 \times 3$ matrix.

12.3.31 A nilpotent matrix is similar to a block diagonal matrix whose blocks have 1s on the first superdiagonal and 0s elsewhere:

Solution. If $A$ is nilpotent, say $A^k = 0$, then $Av = \lambda v$ implies $0 = A^k v = \lambda^k v$, so if $v \neq 0$ then $\lambda = 0$. In other words, 0 is the only eigenvalue of $A$. Since the eigenvalues all lie in the field, $A$ is similar to a matrix in Jordan canonical form, and a Jordan form whose eigenvalues are all 0 is of the asserted type.
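
The factorizations underlying 12.3.22 and 12.3.23 can be confirmed quickly (illustration only, not part of the original solutions):

```python
# Factorizations behind 12.3.22 and 12.3.23.
from sympy import symbols, factor

x = symbols('x')

print(factor(x**3 - x))               # x*(x - 1)*(x + 1): distinct linear factors
print(factor(x**3 - x, modulus=2))    # x*(x + 1)**2: the characteristic-2 failure
print(factor(x**8 - 1))               # (x - 1)*(x + 1)*(x**2 + 1)*(x**4 + 1):
                                      # x**4 + 1 stays irreducible over Q, so any
                                      # A with A**8 = I, A**4 != I needs size >= 4
```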

13.1.4 The map $a + b\sqrt 2 \mapsto a - b\sqrt 2$ is an isomorphism of $\mathbb Q(\sqrt 2)$ with itself:

Solution. Say $\varphi(a + b\sqrt 2) = a - b\sqrt 2$. This is evidently a bijection. Furthermore
$$\varphi\bigl((a + b\sqrt 2) + (c + d\sqrt 2)\bigr) = \varphi\bigl((a + c) + (b + d)\sqrt 2\bigr) = (a + c) - (b + d)\sqrt 2 = (a - b\sqrt 2) + (c - d\sqrt 2) = \varphi(a + b\sqrt 2) + \varphi(c + d\sqrt 2)$$
and
$$\varphi\bigl((a + b\sqrt 2)(c + d\sqrt 2)\bigr) = \varphi\bigl((ac + 2bd) + (ad + bc)\sqrt 2\bigr) = (ac + 2bd) - (ad + bc)\sqrt 2 = (a - b\sqrt 2)(c - d\sqrt 2) = \varphi(a + b\sqrt 2)\,\varphi(c + d\sqrt 2).$$

13.1.7 The polynomial $X^3 - nX + 2$ is irreducible in $\mathbb Q[X]$ if $n \neq -1, 3, 5$:

Solution. If it is not irreducible it must have a linear factor, and thus a root in $\mathbb Q$. Since it is monic, a rational root is an integer and divides the constant term 2. So it suffices to substitute $X = \pm 1, \pm 2$ and see what this says about $n$:
$$1^3 - n + 2 = 0 \implies n = 3$$
$$(-1)^3 - n(-1) + 2 = 0 \implies n = -1$$
$$2^3 - 2n + 2 = 0 \implies n = 5$$
$$(-2)^3 - (-2)n + 2 = 0 \implies n = 3.$$
Thus the polynomial has a rational root only when $n \in \{-1, 3, 5\}$, and it is irreducible otherwise.
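
A spot check of 13.1.7 over a sample range of $n$ (illustration only; the range $-10 \le n \le 10$ is an arbitrary choice):

```python
# X**3 - n*X + 2 is reducible over Q exactly for n in {-1, 3, 5} within the
# sampled range, matching 13.1.7.
from sympy import symbols, factor_list, QQ

x = symbols('x')
reducible = [n for n in range(-10, 11)
             if len(factor_list(x**3 - n*x + 2, x, domain=QQ)[1]) > 1]
print(reducible)    # [-1, 3, 5]
```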

13.2.4 Find the degrees over $\mathbb Q$ of $2 + \sqrt 3$ and $1 + 2^{1/3} + 2^{2/3}$:

Solution. $2 + \sqrt 3$ is a root of $(X - 2)^2 - 3 = 0$, so it has degree at most 2. If it had degree 1 then $\sqrt 3 \in \mathbb Q$, which is absurd. So $2 + \sqrt 3$ has degree 2.

Let $\alpha = 1 + 2^{1/3} + 2^{2/3}$. This is contained in $\mathbb Q(2^{1/3})$, which has degree 3 over $\mathbb Q$, so the degree of $\alpha$ is either 1 or 3. If it were 1, then $\alpha$ and $\alpha^2$ would be in $\mathbb Q$. The equations
$$1 + 0 \cdot 2^{1/3} + 0 \cdot 2^{2/3} = 1$$
$$1 + 1 \cdot 2^{1/3} + 1 \cdot 2^{2/3} = \alpha$$
$$5 + 4 \cdot 2^{1/3} + 3 \cdot 2^{2/3} = \alpha^2$$
say that $(1, 2^{1/3}, 2^{2/3})$ is a solution of a $3 \times 3$ linear system with matrix
$$\begin{pmatrix} 1 & 0 & 0 \\ 1 & 1 & 1 \\ 5 & 4 & 3 \end{pmatrix}$$
and whose right-hand side is entirely rational. Since the matrix is rational and invertible (in fact it has determinant $-1$), this would imply that $2^{1/3}$ and $2^{2/3}$ are rational, which is false. Thus $\alpha$ has degree 3.

13.2.7 Show that $\mathbb Q(\sqrt 2, \sqrt 3) = \mathbb Q(\sqrt 2 + \sqrt 3)$. Conclude that $[\mathbb Q(\sqrt 2 + \sqrt 3) : \mathbb Q] = 4$. Find an irreducible polynomial satisfied by $\sqrt 2 + \sqrt 3$:

Solution. Since $\sqrt 2, \sqrt 3 \in \mathbb Q(\sqrt 2, \sqrt 3)$ we have $\sqrt 2 + \sqrt 3 \in \mathbb Q(\sqrt 2, \sqrt 3)$ and thus $\mathbb Q(\sqrt 2 + \sqrt 3) \subseteq \mathbb Q(\sqrt 2, \sqrt 3)$. To show the reverse inclusion it is actually easier to show first that $[\mathbb Q(\sqrt 2, \sqrt 3) : \mathbb Q] = 4$. First, $[\mathbb Q(\sqrt 2) : \mathbb Q] = 2$, and $\sqrt 3$ is a root of a quadratic with coefficients in $\mathbb Q(\sqrt 2)$ (in fact in $\mathbb Q$), so the degree of $\sqrt 3$ over $\mathbb Q(\sqrt 2)$ is either 1 or 2. If it were 1, then $\sqrt 3 = a + b\sqrt 2$ with $a, b \in \mathbb Q$. Squaring yields $3 = a^2 + 2b^2 + 2ab\sqrt 2$, which forces $ab = 0$ since $\sqrt 2 \notin \mathbb Q$. But if $a = 0$ then $3 = 2b^2$, so $\sqrt{3/2} = b$ is rational, and if $b = 0$ then $\sqrt 3 = a$ is rational; both are false. Thus $\sqrt 3$ has degree 2 over $\mathbb Q(\sqrt 2)$, and it follows that $[\mathbb Q(\sqrt 2, \sqrt 3) : \mathbb Q] = 4$.

Now $1, \sqrt 2$ is a basis of $\mathbb Q(\sqrt 2)/\mathbb Q$ and $1, \sqrt 3$ is a basis of $\mathbb Q(\sqrt 2, \sqrt 3)/\mathbb Q(\sqrt 2)$, so $1, \sqrt 2, \sqrt 3, \sqrt 6$ is a basis of $\mathbb Q(\sqrt 2, \sqrt 3)/\mathbb Q$. Since $\mathbb Q(\alpha) \subseteq \mathbb Q(\sqrt 2, \sqrt 3)$, the element $\alpha = \sqrt 2 + \sqrt 3$ has degree 1, 2 or 4 over $\mathbb Q$, and $\mathbb Q(\alpha) = \mathbb Q(\sqrt 2, \sqrt 3)$ if and only if its degree is 4. If it had degree 1, then $\sqrt 2 + \sqrt 3$ would be rational and so would $(\sqrt 2 + \sqrt 3)^2 = 5 + 2\sqrt 6$, implying that $\sqrt 6$ is rational, which is false. If $\alpha$ had degree 2, then $1, \alpha, \alpha^2$ would be linearly dependent over $\mathbb Q$. But $a + b\alpha + c\alpha^2 = 0$ reads
$$(a + 5c) + b\sqrt 2 + b\sqrt 3 + 2c\sqrt 6 = 0,$$
which implies $a = b = c = 0$ since $1, \sqrt 2, \sqrt 3, \sqrt 6$ are linearly independent over $\mathbb Q$. Therefore $\alpha$ has degree 4 over $\mathbb Q$, and $\mathbb Q(\alpha) = \mathbb Q(\sqrt 2, \sqrt 3)$.

Since $\alpha$ has degree 4 over $\mathbb Q$, it is enough to find a monic polynomial of degree 4 in $\mathbb Q[X]$ of which $\alpha$ is a root. In fact the characteristic polynomial of the map $\mathbb Q(\alpha) \to \mathbb Q(\alpha)$ given by $x \mapsto \alpha x$ is such a polynomial; in the basis $1, \sqrt 2, \sqrt 3, \sqrt 6$ the matrix of this map is
$$\begin{pmatrix} 0 & 2 & 3 & 0 \\ 1 & 0 & 0 & 3 \\ 1 & 0 & 0 & 2 \\ 0 & 1 & 1 & 0 \end{pmatrix}$$
and the polynomial is
$$\Phi(X) = \begin{vmatrix} X & -2 & -3 & 0 \\ -1 & X & 0 & -3 \\ -1 & 0 & X & -2 \\ 0 & -1 & -1 & X \end{vmatrix} = X^4 - 10X^2 + 1.$$
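
Both 13.2.4 and 13.2.7 can be verified symbolically (illustration only, not part of the original solutions):

```python
# Verification of 13.2.4 and 13.2.7.
from sympy import sqrt, Rational, symbols, minimal_polynomial, Matrix

x = symbols('x')

# 13.2.4: degrees of 2 + sqrt(3) and 1 + 2**(1/3) + 2**(2/3).
print(minimal_polynomial(2 + sqrt(3), x))          # x**2 - 4*x + 1 (degree 2)
print(minimal_polynomial(1 + 2**Rational(1, 3) + 2**Rational(2, 3), x))
                                                   # x**3 - 3*x**2 - 3*x - 1 (degree 3)

# 13.2.7: multiplication by alpha = sqrt(2) + sqrt(3) in the basis
# 1, sqrt(2), sqrt(3), sqrt(6); its characteristic polynomial is the
# irreducible polynomial of alpha.
A = Matrix([[0, 2, 3, 0],
            [1, 0, 0, 3],
            [1, 0, 0, 2],
            [0, 1, 1, 0]])
print(A.charpoly(x).as_expr())                     # x**4 - 10*x**2 + 1
print(minimal_polynomial(sqrt(2) + sqrt(3), x))    # x**4 - 10*x**2 + 1
```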