
1. Four definitions of Schur functions

1.1. Lecture 1: Jacobi's definition (ca. 1850). Fix $\lambda = (\lambda_1 \geq \cdots \geq \lambda_n)$ and $X = \{x_1,\ldots,x_n\}$.

(1) $a_\alpha = \det\left(x_i^{\alpha_j}\right)$ for $\alpha = (\alpha_1,\ldots,\alpha_n) \in \mathbb{N}^n$
(2) Jacobi's def: $s_\lambda = a_{\lambda+\delta}/a_\delta$, where $\delta = (n-1, n-2, \ldots, 1, 0)$
(3) $e_k = s_{(1^k)} = \sum_{i_1 < i_2 < \cdots < i_k} x_{i_1} x_{i_2} \cdots x_{i_k}$
(4) $h_k = s_{(k)} = \sum_{i_1 \leq i_2 \leq \cdots \leq i_k} x_{i_1} x_{i_2} \cdots x_{i_k}$

1.2. Lecture 2: The Jacobi-Trudi Formula.

Theorem 1. [Fundamental Thm of Symmetric Functions] As rings, $\mathbb{Z}[x_1,\ldots,x_n]^{S_n} \cong \mathbb{Z}[e_1,\ldots,e_n]$.

Proof. We need to show every symmetric polynomial can be written as a polynomial function of the $e_i$'s. Any monomial in the $e_i$'s is of the form $e_{i_1} e_{i_2} \cdots e_{i_j}$, so we might as well write the indices in decreasing order since the $e_i$'s commute. Define $e_\lambda := e_{\lambda_1} e_{\lambda_2} \cdots e_{\lambda_p}$ for any partition $\lambda = (\lambda_1 \geq \cdots \geq \lambda_p > 0)$.

Consider the expansion of $e_\lambda$ into monomials. Each monomial can be thought of as a filling of $\lambda$ where row $i$ is filled in increasing order and represents a monomial chosen from $e_{\lambda_i}$ for $1 \leq i \leq p$. For example, if $\lambda = (3,2,1,1)$ then two valid fillings are

    S = 1 2 3        T = 4 7 9
        1 2              1 2
        1                1
        1                9

We use the notation $x^T$ to mean the monomial determined by the content of $T$. In the example, $x^S = x_1^4 x_2^2 x_3$ and $x^T = x_1^2 x_2 x_4 x_7 x_9^2$. Thus,

$$e_\lambda = e_{\lambda_1} e_{\lambda_2} \cdots e_{\lambda_p} = \sum_{\substack{\text{fillings } T \text{ of } \lambda \\ \text{with rows strictly increasing}}} x^T = m_{\lambda'} + \sum_{\mu} a_{\lambda,\mu}\, m_\mu$$

where the sum is over all partitions $\mu$ of size $|\lambda|$ which come after $\lambda'$ (the conjugate of $\lambda$) in reverse lexicographic order.
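Definitions (2)-(4) can be spot-checked by direct computation. The following Python sketch (my addition for this transcription, not part of the original notes; all helper names are my own) evaluates the bialternant quotient at a rational point and compares it with the subset and multiset sums defining $e_k$ and $h_k$:

```python
# Sketch for illustration (not from the original notes).
from fractions import Fraction
from itertools import permutations, combinations, combinations_with_replacement
from math import prod

x = [Fraction(v) for v in (2, 3, 5)]  # sample point; any distinct values work
n = len(x)

def det(m):
    # determinant by permutation expansion (fine for these tiny matrices)
    N = len(m)
    return sum(
        (-1) ** sum(p[i] > p[j] for i in range(N) for j in range(i + 1, N))
        * prod(m[i][p[i]] for i in range(N))
        for p in permutations(range(N))
    )

def a(alpha):
    # the alternant a_alpha = det(x_i^{alpha_j}) of definition (1)
    return det([[xi ** aj for aj in alpha] for xi in x])

def schur(lam):
    # Jacobi's definition (2): s_lambda = a_{lambda+delta} / a_delta
    lam = list(lam) + [0] * (n - len(lam))
    delta = list(range(n - 1, -1, -1))
    return a([l + d for l, d in zip(lam, delta)]) / a(delta)

e = lambda k: sum(prod(s) for s in combinations(x, k))                   # (3)
h = lambda k: sum(prod(s) for s in combinations_with_replacement(x, k))  # (4)

assert schur([1, 1]) == e(2) and schur([1, 1, 1]) == e(3)  # s_{(1^k)} = e_k
assert schur([2]) == h(2) and schur([3]) == h(3)           # s_{(k)} = h_k
```

Exact rational arithmetic avoids any floating-point doubt; the denominator $a_\delta$ is a Vandermonde determinant, nonzero at distinct values.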

From this expansion we see that the $a_{\lambda,\mu}$'s are non-negative integers in a matrix with 1s down the diagonal. This matrix is lower triangular if the partitions of the same size as $\lambda$ are ordered in reverse lexicographic order. Such a matrix is invertible, so the $\{e_\lambda : \lambda \text{ a partition with } \lambda_1 \leq n\}$ form a basis of $\mathbb{Z}[X]^{S_n}$. We conclude that every symmetric polynomial is expressible in a unique way in terms of the $e_i$'s. This implies $\mathbb{Z}[X]^{S_n} \cong \mathbb{Z}[e_1,\ldots,e_n]$ and that the $e_i$'s are algebraically independent.

Corollary 1.1. The $e_i$'s are algebraically independent, so they minimally generate the ring of symmetric polynomials.

Remark 1.2. There are relations in $\mathbb{Z}[x_1,\ldots,x_n]^{S_n}$ among the $h_i$'s for $i \geq 1$ since there are an infinite number of them.

HW 1: Express $h_{n+1}(x_1,\ldots,x_n)$ in terms of the $h_i$'s for $1 \leq i \leq n$. Computer exploration might help here.

Interestingly, the $h_i$'s become algebraically independent as $n$ approaches infinity. Define the ring $\Lambda$ = the inverse limit as $n \to \infty$ of the rings $\mathbb{Z}[x_1,\ldots,x_n]^{S_n}$. Thus $\mathbb{Z}[x_1,\ldots,x_n]^{S_n} = \Lambda|_{x_i = 0 \text{ for } i > n}$. Define generating functions for the homogeneous and elementary symmetric functions in $\Lambda$ by

$$H(t) = \sum_{r \geq 0} h_r t^r = \prod_{i \geq 1} \frac{1}{1 - x_i t}, \qquad E(t) = \sum_{r \geq 0} e_r t^r = \prod_{i \geq 1} (1 + x_i t).$$

Note that $H(t)\,E(-t) = 1$. Equating coefficients of $t^r$ we get

(1.1) $\displaystyle\sum_{i=0}^{r} (-1)^i e_i h_{r-i} = 0 \quad \text{for } r \geq 1.$

Define the ring homomorphism $\omega : \Lambda \to \Lambda$ by $\omega(e_r) = h_r$. By (1.1), we see that $\omega$ is an involution, proving the following proposition.

Proposition 1.3. The ring $\Lambda = \mathbb{Z}[h_1, h_2, \ldots] = \mathbb{Z}[e_1, e_2, \ldots]$.

Going back to the symmetric polynomial situation, since Schur functions are symmetric they must be expressible as polynomials in the $e_i$'s. This gives us another way to define Schur functions.
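Identity (1.1) is easy to check by machine. This short Python sketch (my addition, not from the notes; the sample point is arbitrary) verifies $\sum_{i=0}^{r} (-1)^i e_i h_{r-i} = 0$ in four variables for small $r$:

```python
# Sketch for illustration (not from the original notes).
from fractions import Fraction
from itertools import combinations, combinations_with_replacement
from math import prod

x = [Fraction(v) for v in (2, 3, 5, 7)]  # any sample point works

def e(k):
    # elementary symmetric polynomial; e_0 = 1, e_k = 0 for k < 0 or k > n
    return sum((prod(s) for s in combinations(x, k)), Fraction(0)) if k >= 0 else Fraction(0)

def h(k):
    # complete homogeneous symmetric polynomial; h_0 = 1, h_k = 0 for k < 0
    return sum((prod(s) for s in combinations_with_replacement(x, k)), Fraction(0)) if k >= 0 else Fraction(0)

# H(t) E(-t) = 1  <=>  sum_{i=0}^r (-1)^i e_i h_{r-i} = 0 for all r >= 1
for r in range(1, 8):
    assert sum((-1) ** i * e(i) * h(r - i) for i in range(r + 1)) == 0
```

The identity holds in any finite number of variables as well, since both $H(t)$ and $E(t)$ become finite products over the same variable set.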

Theorem 2. [Jacobi-Trudi Formula] (aka Giambelli Formula) Let $\lambda = (\lambda_1 \geq \cdots \geq \lambda_p)$ and let $q = \lambda_1$. Then

(1.2) $s_\lambda = \det\left(h_{\lambda_i - i + j}\right)_{1 \leq i,j \leq p}$
(1.3) $\phantom{s_\lambda} = \det\left(e_{\lambda'_i - i + j}\right)_{1 \leq i,j \leq q}$

where by definition $e_i = h_i = 0$ if $i < 0$ and $e_0 = h_0 = 1$.

Example: Assume $\lambda = (4,2,2)$ so $\lambda' = (3,3,1,1)$. Then $s_\lambda$ is the determinant of the matrix

$$\begin{pmatrix} e_3 & e_4 & e_5 & e_6 \\ e_2 & e_3 & e_4 & e_5 \\ 0 & 1 & e_1 & e_2 \\ 0 & 0 & 1 & e_1 \end{pmatrix}$$

HW 2: Assume $A, B \in SL_N(\mathbb{F})$ are inverses of each other. Then for any two $k$-subsets $I, J \subseteq [N]$ we have $\det(A_{I,J}) = (-1)^{\Sigma(I)+\Sigma(J)} \det(B_{J^c, I^c})$. Here $A_{I,J}$ is the submatrix of $A$ taking entries only in rows indexed by elements of $I$ and columns indexed by elements of $J$. Also, write $\Sigma(I)$ for the sum of the elements in $I$. Note, the matrix on the left will usually be a different size from the matrix on the right. This identity generalizes the formula for the inverse matrix in terms of cofactors: $B_{i,j} = (-1)^{i+j} \det(A_{\{j\}^c, \{i\}^c})$.

HW 3: For any matrix $X = (x_{i,j})$, we have $\det(x_{i,j}) = \det\left((-1)^{i+j} x_{i,j}\right)$.

Proof. Let us prove the second equality first (due to Aitken). Let $N$ be an integer larger than both $p$ and $q$; $N = p + q$ is a good choice. Set

$$H = (h_{i-j})_{0 \leq i,j < N}, \qquad E = \left((-1)^{i-j} e_{i-j}\right)_{0 \leq i,j < N}.$$

Both $E$ and $H$ are lower triangular with 1s down the diagonal, so $\det(H) = \det(E) = 1$. Furthermore, because of (1.1) we know $E$ and $H$ are inverses of each other. Thus HW 2 applies. Consider the minor of $H$ indexed by rows $I = \{\lambda_i + p - i : 1 \leq i \leq p\}$ and columns $J = \{p - j : 1 \leq j \leq p\}$. This minor is exactly (1.2). Then by HW 2, we have that this minor is equal to $(-1)^{\Sigma(I)+\Sigma(J)}$ times its complementary cofactor in $E^t$.

What are the indices for the cofactor in terms of $\lambda$? Draw the Ferrers diagram for $\lambda$ in a $p$ by $q$ rectangle. Starting at the lower left corner, number the up and

right steps around the SW perimeter of $\lambda$ by $0, 1, 2, \ldots, p + q - 1 = N - 1$. The up steps occur at $\{\lambda_i + p - i : 1 \leq i \leq p\}$, i.e. $\lambda + \delta_p$. The complementary set consists of the horizontal steps. These horizontal steps are in bijection with the up steps for $\lambda'$ in the transposed picture. The map is given by complementing the numbers in value, so the set is given by

$$I^c = \{N - 1 - \lambda'_i - (q - i) : 1 \leq i \leq q\} = \{p - 1 - \lambda'_i + i : 1 \leq i \leq q\}.$$

Similarly, $J^c = \{p - 1 + j : 1 \leq j \leq q\}$. So by HW 2,

$$\det\left(h_{\lambda_i - i + j}\right)_{1 \leq i,j \leq p} = (-1)^{|\lambda|} \det\left((-1)^{\lambda'_i - i + j}\, e_{\lambda'_i - i + j}\right)_{1 \leq i,j \leq q}.$$

Distributing the $(-1)^{|\lambda|}$ through the rows, one factor $(-1)^{\lambda'_i}$ into row $i$, we get the equivalent form $\det\left((-1)^{i+j} e_{\lambda'_i - i + j}\right) = \det\left(e_{\lambda'_i - i + j}\right)$ by HW 3.

Now the proof of the first equality: $s_\lambda = \det\left(h_{\lambda_i - i + j}\right)$. Let $e_r^{(k)} = e_r(x_1, \ldots, \widehat{x_k}, \ldots, x_n)$ for each $1 \leq k \leq n$ and $r \geq 0$. Define

$$E^{(k)}(t) = \sum_{r=0}^{n-1} e_r^{(k)} t^r = \prod_{i \neq k} (1 + x_i t).$$

Then

$$H(t)\, E^{(k)}(-t) = \frac{1}{1 - x_k t}.$$

Picking out the coefficient of $t^a$ on both sides gives

$$\sum_{j=1}^{n} h_{a-n+j}\, (-1)^{n-j} e^{(k)}_{n-j} = x_k^a.$$

Let $\alpha = (\alpha_1, \ldots, \alpha_n) \in \mathbb{N}^n$ (a weak composition) and define $H_\alpha = \left(h_{\alpha_i - n + j}\right)_{1 \leq i,j \leq n}$ and $M = \left((-1)^{n-i} e^{(k)}_{n-i}\right)_{1 \leq i,k \leq n}$. Then $H_\alpha M = \left(x_k^{\alpha_i}\right)$. Taking determinants of both sides gives $\det(H_\alpha)\det(M) = \det\left(x_k^{\alpha_i}\right) = a_\alpha$. We find $\det(M)$ by making a judicious choice of $\alpha$:

$$\det(H_\delta) = \det\left(h_{n-i-n+j}\right) = \det\left(h_{j-i}\right) = 1$$

since it is upper triangular with 1s along the diagonal. So $\det(M) = a_\delta$. Thus $a_\alpha/a_\delta = \det(H_\alpha)$, and taking $\alpha = \lambda + \delta$ gives

$$s_\lambda = \frac{a_{\lambda+\delta}}{a_\delta} = \det(H_{\lambda+\delta}) = \det\left(h_{\lambda_i + n - i - n + j}\right) = \det\left(h_{\lambda_i - i + j}\right)_{1 \leq i,j \leq n}.$$
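Both determinant formulas, including the worked example $\lambda = (4,2,2)$, $\lambda' = (3,3,1,1)$, can be checked numerically against the bialternant. A Python sketch (my own scaffolding, not part of the notes; the sample point and helper names are assumptions):

```python
# Sketch for illustration (not from the original notes).
from fractions import Fraction
from itertools import permutations, combinations, combinations_with_replacement
from math import prod

x = [Fraction(v) for v in (2, 3, 5, 7)]
n = len(x)

def det(m):
    # determinant by permutation expansion (fine at this size)
    N = len(m)
    return sum(
        (-1) ** sum(p[i] > p[j] for i in range(N) for j in range(i + 1, N))
        * prod(m[i][p[i]] for i in range(N))
        for p in permutations(range(N))
    )

def e(k):
    return sum((prod(s) for s in combinations(x, k)), Fraction(0)) if k >= 0 else Fraction(0)

def h(k):
    return sum((prod(s) for s in combinations_with_replacement(x, k)), Fraction(0)) if k >= 0 else Fraction(0)

def schur(lam):
    # bialternant: s_lambda = a_{lambda+delta} / a_delta (Jacobi's definition)
    lam = list(lam) + [0] * (n - len(lam))
    delta = range(n - 1, -1, -1)
    alpha = [l + d for l, d in zip(lam, delta)]
    return det([[xi ** a for a in alpha] for xi in x]) / det([[xi ** d for d in delta] for xi in x])

lam, conj = (4, 2, 2), (3, 3, 1, 1)  # lambda and its conjugate
p, q = len(lam), len(conj)
jt_h = det([[h(lam[i] - i + j) for j in range(p)] for i in range(p)])   # (1.2)
jt_e = det([[e(conj[i] - i + j) for j in range(q)] for i in range(q)])  # (1.3)
assert schur(lam) == jt_h == jt_e
```

Note how the convention $e_i = h_i = 0$ for $i < 0$ and $e_0 = h_0 = 1$ falls out of the empty-sum behavior of `combinations`.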

Compare with the statement of the theorem, where it is a $p \times p$ matrix. To finish the proof, just note the block lower triangular form of the $n \times n$ matrix here.

N.B. The final form of the Jacobi-Trudi determinants does not depend on the number of variables $n$ as long as $n$ is larger than the length of the first row or column of the partition. Thus, the expansion of Schur functions into homogeneous or elementary symmetric functions is stable under the inverse limit. This gives us a way to define Schur functions in $\Lambda$.

Corollary 1.4. $\omega(s_\lambda) = s_{\lambda'}$.

One more big result comes from Jacobi's definition of Schur functions.

Theorem 3. [Pieri's Formula]

(1.4) $\displaystyle s_\lambda\, e_k = \sum_{\mu/\lambda \text{ vertical strip of size } k} s_\mu$
(1.5) $\displaystyle s_\lambda\, h_k = \sum_{\mu/\lambda \text{ horizontal strip of size } k} s_\mu$

where a vertical strip is a skew shape with no two cells in the same row and a horizontal strip has no two cells in the same column.

For example, if $\lambda = (3,3,1)$ then the shapes $\mu$ that occur, all with multiplicity one, in Pieri's formula for $s_\lambda e_3$ are

$$(4,4,2),\quad (4,4,1,1),\quad (4,3,2,1),\quad (4,3,1,1,1),\quad (3,3,2,1,1),\quad (3,3,1,1,1,1).$$

So $s_{(3,3,1)}\, e_3 = s_{(4,4,2)} + s_{(4,4,1,1)} + s_{(4,3,2,1)} + s_{(4,3,1,1,1)} + s_{(3,3,2,1,1)} + s_{(3,3,1,1,1,1)}$.

N.B. The Pieri formula completely determines the way Schur functions multiply since we know the Jacobi-Trudi formula. In fact, we can use this rule alone to define the Schur functions.
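Enumerating the vertical strips of size 3 that can be added to $(3,3,1)$ gives exactly six partitions, and the resulting expansion can be confirmed numerically via the bialternant in six variables. A Python sketch (my addition, not from the notes):

```python
# Sketch for illustration (not from the original notes).
from fractions import Fraction
from itertools import permutations, combinations
from math import prod

x = [Fraction(v) for v in (2, 3, 5, 7, 11, 13)]  # need >= 6 variables here
n = len(x)

def det(m):
    N = len(m)
    return sum(
        (-1) ** sum(p[i] > p[j] for i in range(N) for j in range(i + 1, N))
        * prod(m[i][p[i]] for i in range(N))
        for p in permutations(range(N))
    )

def schur(lam):
    # bialternant s_lambda = a_{lambda+delta} / a_delta
    lam = list(lam) + [0] * (n - len(lam))
    delta = range(n - 1, -1, -1)
    alpha = [l + d for l, d in zip(lam, delta)]
    return det([[xi ** a for a in alpha] for xi in x]) / det([[xi ** d for d in delta] for xi in x])

e3 = sum(prod(s) for s in combinations(x, 3))  # e_3 in six variables

# mu such that mu/(3,3,1) is a vertical strip of size 3
strips = [(4, 4, 2), (4, 4, 1, 1), (4, 3, 2, 1), (4, 3, 1, 1, 1),
          (3, 3, 2, 1, 1), (3, 3, 1, 1, 1, 1)]
assert schur((3, 3, 1)) * e3 == sum(schur(mu) for mu in strips)
```

Six variables suffice because the longest partition appearing, $(3,3,1,1,1,1)$, has six rows.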

Proof. Expand

$$a_{\lambda+\delta}\, e_k = \sum_{w \in S_n} \mathrm{sgn}(w)\, x^{w(\lambda+\delta)} \sum_{i_1 < \cdots < i_k \leq n} x_{i_1} x_{i_2} \cdots x_{i_k}$$
$$= \sum_{w \in S_n} \mathrm{sgn}(w)\, x^{w(\lambda+\delta)} \sum_{i_1 < \cdots < i_k \leq n} x_{w(i_1)} x_{w(i_2)} \cdots x_{w(i_k)}$$
$$= \sum_{\substack{\chi \in \{0,1\}^n \\ |\chi| = k}} a_{\lambda+\chi+\delta}.$$

Note that if any $\lambda + \chi + \delta$ appearing in the last sum is not strictly decreasing, then it must have two equal parts, so that term vanishes. The remaining non-vanishing terms all correspond with adding a vertical strip to $\lambda$. The second formulation follows by applying $\omega$.

HW 4: Say that $\mu$ is an even partition if all of its parts are even numbers. Show

$$\left(\sum_{\mu \text{ even}} s_\mu\right)\left(\sum_{k=0}^{n} e_k\right) = \sum_{\lambda} s_\lambda.$$
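The key alternant identity in this proof, $a_{\lambda+\delta}\, e_k = \sum_{|\chi|=k} a_{\lambda+\chi+\delta}$, can also be checked directly for a small case. A Python sketch (my addition; taking $\lambda = (2,1)$ in $n = 3$ variables is an arbitrary choice):

```python
# Sketch for illustration (not from the original notes).
from fractions import Fraction
from itertools import permutations, combinations
from math import prod

x = [Fraction(v) for v in (2, 3, 5)]
n, lam, k = 3, (2, 1, 0), 2
delta = list(range(n - 1, -1, -1))

def det(m):
    N = len(m)
    return sum(
        (-1) ** sum(p[i] > p[j] for i in range(N) for j in range(i + 1, N))
        * prod(m[i][p[i]] for i in range(N))
        for p in permutations(range(N))
    )

def a(alpha):
    # alternant a_alpha = det(x_i^{alpha_j}); vanishes when alpha repeats a part
    return det([[xi ** aj for aj in alpha] for xi in x])

e_k = sum(prod(s) for s in combinations(x, k))
lhs = a([l + d for l, d in zip(lam, delta)]) * e_k

# sum over 0/1 vectors chi with |chi| = k of a_{lambda + chi + delta}
rhs = Fraction(0)
for ones in combinations(range(n), k):
    chi = [1 if i in ones else 0 for i in range(n)]
    rhs += a([l + c + d for l, c, d in zip(lam, chi, delta)])

assert lhs == rhs
```

Terms where $\lambda + \chi + \delta$ has a repeated part contribute a determinant with two equal columns and so vanish automatically, just as in the proof.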