A hyperfactorial divisibility (long version)
Darij Grinberg


Let us define a function $H: \mathbb{N} \to \mathbb{N}$ by
\[ H(n) = \prod_{k=0}^{n-1} k! \qquad \text{for every } n \in \mathbb{N}. \]
Our goal is to prove the following theorem:

Theorem 0 (MacMahon). We have
\[ H(b+c) \cdot H(c+a) \cdot H(a+b) \mid H(a) \cdot H(b) \cdot H(c) \cdot H(a+b+c) \]
for every $a \in \mathbb{N}$, every $b \in \mathbb{N}$ and every $c \in \mathbb{N}$.

Remark: Here, we denote by $\mathbb{N}$ the set $\{0, 1, 2, \ldots\}$ and not the set $\{1, 2, 3, \ldots\}$, as some authors do.

Before we come to the proof, let us first make some definitions:

Notations.

- For any matrix $A$, we denote by $A_i^j$ the entry in the $j$-th column and the $i$-th row of $A$. (This is usually denoted by $A_{ij}$ or by $A_{i,j}$.)

- Let $R$ be a ring. Let $u \in \mathbb{N}$ and $v \in \mathbb{N}$, and let $a_{i,j}$ be an element of $R$ for every $(i,j) \in \{1,2,\ldots,u\} \times \{1,2,\ldots,v\}$. Then, we denote by $\left(a_{i,j}\right)_{1 \le i \le u,\ 1 \le j \le v}$ the $u \times v$ matrix $A \in R^{u \times v}$ which satisfies $A_i^j = a_{i,j}$ for every $(i,j) \in \{1,2,\ldots,u\} \times \{1,2,\ldots,v\}$.

- Let $R$ be a commutative ring with unity. Let $P \in R[X]$ be a polynomial, and let $j \in \mathbb{N}$. Then, we denote by $\operatorname{coeff}_j P$ the coefficient of the polynomial $P$ before $X^j$. In particular, this implies $\operatorname{coeff}_j P = 0$ for every $j > \deg P$. Thus, for every $P \in R[X]$ and every $d \in \mathbb{N}$ satisfying $\deg P \le d$, we have $P(X) = \sum_{k=0}^{d} \left(\operatorname{coeff}_k P\right) X^k$.

- If $n$ and $m$ are two integers, then the binomial coefficient $\dbinom{m}{n} \in \mathbb{Q}$ is defined by
\[ \binom{m}{n} = \begin{cases} \dfrac{m(m-1)\cdots(m-n+1)}{n!}, & \text{if } n \ge 0; \\ 0, & \text{if } n < 0. \end{cases} \]
It is well-known that $\dbinom{m}{n} \in \mathbb{Z}$ for all $n \in \mathbb{Z}$ and $m \in \mathbb{Z}$.
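As a quick sanity check (not part of the original note), the divisibility in Theorem 0 can be verified numerically for small values; the following Python sketch uses an ad-hoc helper `H` implementing the product defined above:

```python
from math import factorial, prod

def H(n):
    # H(n) = 0! * 1! * ... * (n-1)!, as defined in the note
    return prod(factorial(k) for k in range(n))

# Check the divisibility of Theorem 0 for all small a, b, c.
for a in range(6):
    for b in range(6):
        for c in range(6):
            numerator = H(a) * H(b) * H(c) * H(a + b + c)
            denominator = H(b + c) * H(c + a) * H(a + b)
            assert numerator % denominator == 0, (a, b, c)
print("Theorem 0 holds for all a, b, c < 6")
```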

We recall a fact from linear algebra:

Theorem 1 (Vandermonde determinant). Let $R$ be a commutative ring with unity. Let $m \in \mathbb{N}$. Let $a_1, a_2, \ldots, a_m$ be $m$ elements of $R$. Then,
\[ \det\left( \left(a_i^{j-1}\right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \]

We are more interested in a corollary (and generalization) of this fact:

Theorem 2 (generalized Vandermonde determinant). Let $R$ be a commutative ring with unity. Let $m \in \mathbb{N}$. For every $j \in \{1,2,\ldots,m\}$, let $P_j \in R[X]$ be a polynomial such that $\deg P_j \le j-1$. Let $a_1, a_2, \ldots, a_m$ be $m$ elements of $R$. Then,
\[ \det\left( \left(P_j(a_i)\right)_{1\le i\le m,\ 1\le j\le m} \right) = \left( \prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j \right) \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \]

Both Theorems 1 and 2 can be deduced from the following lemma:

Lemma 3. Let $R$ be a commutative ring with unity. Let $m \in \mathbb{N}$. For every $j \in \{1,2,\ldots,m\}$, let $P_j \in R[X]$ be a polynomial such that $\deg P_j \le j-1$. Let $a_1, a_2, \ldots, a_m$ be $m$ elements of $R$. Then,
\[ \det\left( \left(P_j(a_i)\right)_{1\le i\le m,\ 1\le j\le m} \right) = \left( \prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j \right) \cdot \det\left( \left(a_i^{j-1}\right)_{1\le i\le m,\ 1\le j\le m} \right). \]
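Before proving these statements, here is a small numerical check of Theorem 1 on a random integer example (my own sketch, not from the note; `det` is a naive Laplace-expansion determinant written only for this test):

```python
from fractions import Fraction
from random import randint

def det(matrix):
    # naive Laplace expansion along the first row; fine for small matrices
    n = len(matrix)
    if n == 1:
        return Fraction(matrix[0][0])
    return sum((-1) ** j * Fraction(matrix[0][j])
               * det([row[:j] + row[j + 1:] for row in matrix[1:]])
               for j in range(n))

m = 5
a = [randint(-10, 10) for _ in range(m)]
vandermonde = [[a[i] ** (j - 1) for j in range(1, m + 1)] for i in range(m)]  # entries a_i^(j-1)
rhs = 1
for i in range(m):
    for j in range(m):
        if i > j:
            rhs *= a[i] - a[j]
assert det(vandermonde) == rhs
print("Theorem 1 checked on a random example:", rhs)
```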

Proof of Lemma 3. For every $j \in \{1,2,\ldots,m\}$, we have
\[ P_j(X) = \sum_{k=0}^{m-1} \left(\operatorname{coeff}_k P_j\right) X^k \]
(since $\deg P_j \le j-1 \le m-1$, because $j \le m$). Thus, for every $i \in \{1,2,\ldots,m\}$ and $j \in \{1,2,\ldots,m\}$, we have
\[ P_j(a_i) = \sum_{k=0}^{m-1} \left(\operatorname{coeff}_k P_j\right) a_i^k = \sum_{k=1}^{m} a_i^{k-1} \operatorname{coeff}_{k-1} P_j \]
(here we substituted $k-1$ for $k$ in the sum). Hence,
\[ \left(P_j(a_i)\right)_{1\le i\le m,\ 1\le j\le m} = \left(a_i^{j-1}\right)_{1\le i\le m,\ 1\le j\le m} \cdot \left(\operatorname{coeff}_{i-1} P_j\right)_{1\le i\le m,\ 1\le j\le m} \qquad (2) \]
according to the definition of the product of two matrices (indeed, this definition yields that the $(i,j)$-th entry of the right hand side of (2) is $\sum_{k=1}^{m} a_i^{k-1} \operatorname{coeff}_{k-1} P_j = P_j(a_i)$, which is precisely the $(i,j)$-th entry of the left hand side; this proves (2)).

But the matrix $\left(\operatorname{coeff}_{i-1} P_j\right)_{1\le i\le m,\ 1\le j\le m}$ is upper triangular, since $\operatorname{coeff}_{i-1} P_j = 0$ for every $(i,j) \in \{1,2,\ldots,m\}^2$ satisfying $i > j$ (because $i > j$ yields $i-1 > j-1 \ge \deg P_j$, and therefore $\operatorname{coeff}_{i-1} P_j = 0$). Hence,
\[ \det\left( \left(\operatorname{coeff}_{i-1} P_j\right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j \]
(since the determinant of an upper triangular matrix equals the product of its diagonal entries). Now, (2) yields
\[ \det\left( \left(P_j(a_i)\right)_{1\le i\le m,\ 1\le j\le m} \right) = \det\left( \left(a_i^{j-1}\right)_{1\le i\le m,\ 1\le j\le m} \right) \cdot \det\left( \left(\operatorname{coeff}_{i-1} P_j\right)_{1\le i\le m,\ 1\le j\le m} \right) = \left( \prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j \right) \cdot \det\left( \left(a_i^{j-1}\right)_{1\le i\le m,\ 1\le j\le m} \right), \]
and thus, Lemma 3 is proven.

Proof of Theorem 1. For every $j \in \{1,2,\ldots,m\}$, define a polynomial $P_j \in R[X]$ by $P_j(X) = \prod_{k=1}^{j-1} (X - a_k)$. Then, $P_j$ is a monic polynomial of degree $j-1$ (since $P_j$ is a product of $j-1$ monic polynomials of degree $1$ each, namely of the polynomials $X - a_k$ for $k \in \{1,2,\ldots,j-1\}$). In other words, $\deg P_j = j-1$ and $\operatorname{coeff}_{j-1} P_j = 1$ for every $j \in \{1,2,\ldots,m\}$. Obviously, $\deg P_j = j-1$ yields $\deg P_j \le j-1$ for every $j \in \{1,2,\ldots,m\}$. Thus, Lemma 3 yields
\[ \det\left( \left(P_j(a_i)\right)_{1\le i\le m,\ 1\le j\le m} \right) = \left( \prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j \right) \cdot \det\left( \left(a_i^{j-1}\right)_{1\le i\le m,\ 1\le j\le m} \right). \qquad (3) \]

But the matrix $\left(P_j(a_i)\right)_{1\le i\le m,\ 1\le j\le m}$ is lower triangular, since $P_j(a_i) = 0$ for every $(i,j) \in \{1,2,\ldots,m\}^2$ satisfying $i < j$ (in fact, $i < j$ yields $i \le j-1$, since $i$ and $j$ are integers, so that the product $P_j(a_i) = \prod_{k=1}^{j-1} (a_i - a_k)$ contains the factor $a_i - a_i = 0$ for $k = i$, and hence vanishes). Hence,
\[ \det\left( \left(P_j(a_i)\right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{j=1}^{m} P_j(a_j) \]
(since the determinant of a lower triangular matrix equals the product of its diagonal entries). Thus, (3) rewrites as
\[ \prod_{j=1}^{m} P_j(a_j) = \left( \prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j \right) \cdot \det\left( \left(a_i^{j-1}\right)_{1\le i\le m,\ 1\le j\le m} \right). \]
Since $\prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j = \prod_{j=1}^{m} 1 = 1$, this simplifies to
\[ \det\left( \left(a_i^{j-1}\right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{j=1}^{m} P_j(a_j) = \prod_{j=1}^{m} \prod_{k=1}^{j-1} \left(a_j - a_k\right) = \prod_{\substack{(j,k)\in\{1,2,\ldots,m\}^2;\\ k<j}} \left(a_j - a_k\right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right) \]
(here we used $P_j(a_j) = \prod_{k=1}^{j-1} (a_j - a_k)$, and in the last step we renamed $j$ and $k$ as $i$ and $j$ in the product). Hence, Theorem 1 is proven.

Proof of Theorem 2. Lemma 3 yields
\[ \det\left( \left(P_j(a_i)\right)_{1\le i\le m,\ 1\le j\le m} \right) = \left( \prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j \right) \cdot \det\left( \left(a_i^{j-1}\right)_{1\le i\le m,\ 1\le j\le m} \right) = \left( \prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j \right) \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right) \]
(by Theorem 1). Hence, Theorem 2 is proven.
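To see Theorem 2 in action, here is a numerical check with the concrete (and arbitrary) choice $P_j(X) = (2X+3)^{j-1}$, for which $\deg P_j = j-1$ and $\operatorname{coeff}_{j-1} P_j = 2^{j-1}$; this example is mine, not the note's:

```python
from fractions import Fraction
from random import randint

def det(matrix):
    # naive Laplace expansion along the first row; fine for small matrices
    n = len(matrix)
    if n == 1:
        return Fraction(matrix[0][0])
    return sum((-1) ** j * Fraction(matrix[0][j])
               * det([row[:j] + row[j + 1:] for row in matrix[1:]])
               for j in range(n))

m = 4
a = [randint(-8, 8) for _ in range(m)]
# P_j(X) = (2X + 3)^(j-1), so deg P_j = j - 1 and coeff_{j-1} P_j = 2^(j-1)
matrix = [[(2 * a[i] + 3) ** (j - 1) for j in range(1, m + 1)] for i in range(m)]
rhs = 1
for j in range(1, m + 1):
    rhs *= 2 ** (j - 1)           # the leading coefficients
for i in range(m):
    for j in range(m):
        if i > j:
            rhs *= a[i] - a[j]    # the Vandermonde-type product
assert det(matrix) == rhs
print("Theorem 2 checked on a random example:", rhs)
```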

A consequence of Theorem 2 is the following fact:

Corollary 4. Let $R$ be a commutative ring with unity. Let $m \in \mathbb{N}$. Let $a_1, a_2, \ldots, a_m$ be $m$ elements of $R$. Then,
\[ \det\left( \left( \prod_{k=1}^{j-1} \left(a_i - k\right) \right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \]

Proof of Corollary 4. For every $j \in \{1,2,\ldots,m\}$, define a polynomial $P_j \in R[X]$ by $P_j(X) = \prod_{k=1}^{j-1} (X - k)$. Then, $P_j$ is a monic polynomial of degree $j-1$ (since $P_j$ is a product of $j-1$ monic polynomials of degree $1$ each, namely of the polynomials $X - k$ for $k \in \{1,2,\ldots,j-1\}$). In other words, $\deg P_j = j-1$ and $\operatorname{coeff}_{j-1} P_j = 1$ for every $j \in \{1,2,\ldots,m\}$. Obviously, $\deg P_j = j-1$ yields $\deg P_j \le j-1$ for every $j \in \{1,2,\ldots,m\}$. Thus, Theorem 2 yields
\[ \det\left( \left(P_j(a_i)\right)_{1\le i\le m,\ 1\le j\le m} \right) = \left( \prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j \right) \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \]
Since $P_j(a_i) = \prod_{k=1}^{j-1} (a_i - k)$ for every $i \in \{1,2,\ldots,m\}$ and $j \in \{1,2,\ldots,m\}$, and since $\prod_{j=1}^{m} \operatorname{coeff}_{j-1} P_j = \prod_{j=1}^{m} 1 = 1$, this rewrites as
\[ \det\left( \left( \prod_{k=1}^{j-1} \left(a_i - k\right) \right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \]
Hence, Corollary 4 is proven.
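A quick numerical check of Corollary 4 over the integers (a sketch of my own; the helper `entry` computes the product $(a_i-1)(a_i-2)\cdots(a_i-(j-1))$):

```python
from fractions import Fraction
from random import randint

def det(matrix):
    # naive Laplace expansion along the first row; fine for small matrices
    n = len(matrix)
    if n == 1:
        return Fraction(matrix[0][0])
    return sum((-1) ** j * Fraction(matrix[0][j])
               * det([row[:j] + row[j + 1:] for row in matrix[1:]])
               for j in range(n))

def entry(x, j):
    # the product (x - 1)(x - 2)...(x - (j - 1)); an empty product equals 1
    result = 1
    for k in range(1, j):
        result *= x - k
    return result

m = 4
a = [randint(-8, 8) for _ in range(m)]
matrix = [[entry(a[i], j) for j in range(1, m + 1)] for i in range(m)]
rhs = 1
for i in range(m):
    for j in range(m):
        if i > j:
            rhs *= a[i] - a[j]
assert det(matrix) == rhs
print("Corollary 4 checked on a random example:", rhs)
```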

We shall need the following simple lemma:

Lemma 5. Let $m \in \mathbb{N}$. Then,
\[ \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(i - j\right) = H(m). \]

Proof of Lemma 5. We have
\[ \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} (i-j) = \prod_{\substack{(i,j)\in\{0,1,\ldots,m-1\}^2;\\ i+1>j+1}} \left((i+1)-(j+1)\right) = \prod_{\substack{(i,j)\in\{0,1,\ldots,m-1\}^2;\\ i>j}} (i-j) \]
(here we substituted $i+1$ and $j+1$ for $i$ and $j$ in the product, and then used the fact that $i+1 > j+1$ is equivalent to $i > j$). Now,
\[ \prod_{\substack{(i,j)\in\{0,1,\ldots,m-1\}^2;\\ i>j}} (i-j) = \prod_{i\in\{0,1,\ldots,m-1\}} \ \prod_{\substack{j\in\{0,1,\ldots,m-1\};\\ j<i}} (i-j) = \prod_{i\in\{0,1,\ldots,m-1\}} \ \prod_{j=0}^{i-1} (i-j) \]
(here we used that, for $i \in \{0,1,\ldots,m-1\}$, the condition ($j \in \{0,1,\ldots,m-1\}$ and $j < i$) is equivalent to ($j \in \mathbb{N}$ and $j < i$), because $j < i \le m-1$ already forces $j \le m-1$). Substituting $i-j$ for $j$ in the inner product, we obtain
\[ \prod_{i\in\{0,1,\ldots,m-1\}} \ \prod_{j=0}^{i-1} (i-j) = \prod_{i\in\{0,1,\ldots,m-1\}} \ \prod_{j=1}^{i} j = \prod_{i\in\{0,1,\ldots,m-1\}} i! = \prod_{k=0}^{m-1} k! = H(m) \]
(in the last step we renamed $i$ as $k$ in the product). Hence, Lemma 5 is proven.
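Lemma 5 is easy to confirm numerically (a small sketch of my own, not part of the note):

```python
from math import factorial, prod

def H(n):
    # H(n) = 0! * 1! * ... * (n-1)!
    return prod(factorial(k) for k in range(n))

for m in range(8):
    lhs = prod(i - j for i in range(1, m + 1) for j in range(1, m + 1) if i > j)
    assert lhs == H(m), m
print("Lemma 5 holds for all m < 8")
```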

Let us now prepare for the proof of Theorem 0. Fix $a \in \mathbb{N}$, $b \in \mathbb{N}$ and $c \in \mathbb{N}$. We have
\[ \frac{H(a+b+c)}{H(a+b)} = \frac{\prod_{k=0}^{a+b+c-1} k!}{\prod_{k=0}^{a+b-1} k!} = \prod_{k=a+b}^{a+b+c-1} k! = \prod_{i=1}^{c} (a+b+i-1)! \qquad (4) \]
(here we substituted $a+b+i-1$ for $k$ in the product). Similarly,
\[ \frac{H(b+c)}{H(b)} = \prod_{k=b}^{b+c-1} k! = \prod_{i=1}^{c} (b+i-1)! \qquad (5) \]
(here we substituted $b+i-1$ for $k$ in the product), and
\[ \frac{H(c+a)}{H(a)} = \prod_{k=a}^{c+a-1} k! = \prod_{i=1}^{c} (a+i-1)! \qquad (6) \]
(here we substituted $a+i-1$ for $k$ in the product).

Next, we show a lemma:

Lemma 6. For every $i \in \mathbb{N}$ and $j \in \mathbb{N}$ satisfying $i \ge 1$ and $j \ge 1$, we have
\[ \binom{a+b+i-1}{a-j+i} = \frac{(a+b+i-1)!}{(a+i-1)!\,(b+j-1)!} \prod_{k=1}^{j-1} (a+i-k). \]

Proof of Lemma 6. Let $i \in \mathbb{N}$ and $j \in \mathbb{N}$ be such that $i \ge 1$ and $j \ge 1$. One of the following two cases must hold:

Case 1: We have $a-j+i \ge 0$.
Case 2: We have $a-j+i < 0$.

In Case 1, we have
\[ (a+i-1)! = \left( \prod_{k=1}^{a+i-j} k \right) \left( \prod_{k=a+i-j+1}^{a+i-1} k \right) = (a+i-j)! \prod_{k=1}^{j-1} (a+i-k) \]
(here we substituted $a+i-k$ for $k$ in the second product), so that
\[ \prod_{k=1}^{j-1} (a+i-k) = \frac{(a+i-1)!}{(a+i-j)!}. \qquad (7) \]
Now,
\[ \binom{a+b+i-1}{a-j+i} = \frac{(a+b+i-1)!}{(a-j+i)!\,\bigl((a+b+i-1)-(a-j+i)\bigr)!} = \frac{(a+b+i-1)!}{(a+i-j)!\,(b+j-1)!} \]
(since $(a+b+i-1)-(a-j+i) = b+j-1$), and therefore
\[ \binom{a+b+i-1}{a-j+i} = \frac{(a+b+i-1)!}{(a+i-1)!\,(b+j-1)!} \cdot \frac{(a+i-1)!}{(a+i-j)!} = \frac{(a+b+i-1)!}{(a+i-1)!\,(b+j-1)!} \prod_{k=1}^{j-1} (a+i-k) \]
(by (7)). Hence, Lemma 6 holds in Case 1.

In Case 2, we have $a-j+i < 0$, so that $a+i < j$ and thus $1 \le a+i \le j-1$. Hence, the product $\prod_{k=1}^{j-1} (a+i-k)$ contains the factor $a+i-(a+i) = 0$ (namely, its factor for $k = a+i$), so that
\[ \prod_{k=1}^{j-1} (a+i-k) = 0, \]
and therefore the right hand side of the claimed identity is $0$. On the other hand, $\dbinom{a+b+i-1}{a-j+i} = 0$, since $a-j+i < 0$. Hence, both sides of the claimed identity equal $0$, and Lemma 6 holds in Case 2.

Hence, Lemma 6 holds in each of the two cases, and this completes the proof of Lemma 6.
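Here is a small numerical check of Lemma 6 (my own sketch; the values $a = 3$, $b = 4$ are arbitrary test values, and `binom` implements the binomial coefficient exactly as defined in the Notations):

```python
from math import factorial

def binom(m, n):
    # binomial coefficient as defined in the note: m(m-1)...(m-n+1)/n! if n >= 0, else 0
    if n < 0:
        return 0
    numerator = 1
    for t in range(n):
        numerator *= m - t
    return numerator // factorial(n)   # exact division for an integer upper index

a, b = 3, 4   # arbitrary small test values for the fixed a and b
for i in range(1, 7):
    for j in range(1, 7):
        lhs = binom(a + b + i - 1, a - j + i)
        prod_part = 1
        for k in range(1, j):
            prod_part *= a + i - k
        # compare lhs with (a+b+i-1)!/((a+i-1)!(b+j-1)!) * prod_part, avoiding division
        assert (lhs * factorial(a + i - 1) * factorial(b + j - 1)
                == factorial(a + b + i - 1) * prod_part), (i, j)
print("Lemma 6 checked for 1 <= i, j <= 6 with a = 3, b = 4")
```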

Another trivial lemma:

Lemma 7. Let $R$ be a commutative ring with unity. Let $u \in \mathbb{N}$ and $v \in \mathbb{N}$, and let $a_{i,j}$ be an element of $R$ for every $(i,j) \in \{1,2,\ldots,u\} \times \{1,2,\ldots,v\}$.

(a) Let $\alpha_1, \alpha_2, \ldots, \alpha_u$ be $u$ elements of $R$. Then,
\[ \left( \begin{cases} \alpha_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le u,\ 1\le j\le u} \cdot \left(a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v} = \left(\alpha_i a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v}. \]

(b) Let $\beta_1, \beta_2, \ldots, \beta_v$ be $v$ elements of $R$. Then,
\[ \left(a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v} \cdot \left( \begin{cases} \beta_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le v,\ 1\le j\le v} = \left(a_{i,j}\beta_j\right)_{1\le i\le u,\ 1\le j\le v}. \]

(c) Let $\alpha_1, \alpha_2, \ldots, \alpha_u$ be $u$ elements of $R$, and let $\beta_1, \beta_2, \ldots, \beta_v$ be $v$ elements of $R$. Then,
\[ \left( \begin{cases} \alpha_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le u,\ 1\le j\le u} \cdot \left(a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v} \cdot \left( \begin{cases} \beta_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le v,\ 1\le j\le v} = \left(\alpha_i a_{i,j} \beta_j\right)_{1\le i\le u,\ 1\le j\le v}. \]

(d) Let $\alpha_1, \alpha_2, \ldots, \alpha_u$ be $u$ elements of $R$, and let $\beta_1, \beta_2, \ldots, \beta_v$ be $v$ elements of $R$. If $u = v$, then
\[ \det\left( \left(\alpha_i a_{i,j} \beta_j\right)_{1\le i\le u,\ 1\le j\le v} \right) = \left( \prod_{i=1}^{u} \alpha_i \right) \left( \prod_{i=1}^{v} \beta_i \right) \det\left( \left(a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v} \right). \]

Proof of Lemma 7. (a) For every $(i,j) \in \{1,2,\ldots,u\} \times \{1,2,\ldots,v\}$, the $(i,j)$-th entry of the product on the left hand side is
\[ \sum_{l=1}^{u} \begin{cases} \alpha_i, & \text{if } l=i; \\ 0, & \text{if } l\ne i \end{cases} \cdot a_{l,j} = \alpha_i a_{i,j} \]
(indeed, the addend for $l = i$ equals $\alpha_i a_{i,j}$, while every addend with $l \ne i$ equals $0 \cdot a_{l,j} = 0$; here we are using the fact that $i \in \{1,2,\ldots,u\}$, so that the value $l = i$ actually occurs in the sum). This is precisely the $(i,j)$-th entry of $\left(\alpha_i a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v}$, and thus Lemma 7 (a) is proven.

(b) For every $(i,j) \in \{1,2,\ldots,u\} \times \{1,2,\ldots,v\}$, the $(i,j)$-th entry of the product on the left hand side is
\[ \sum_{l=1}^{v} a_{i,l} \cdot \begin{cases} \beta_l, & \text{if } j=l; \\ 0, & \text{if } j\ne l \end{cases} = a_{i,j}\beta_j \]
(the addend for $l = j$ equals $a_{i,j}\beta_j$, and all other addends vanish). This proves Lemma 7 (b).

(c) We have
\[ \left( \begin{cases} \alpha_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le u,\ 1\le j\le u} \cdot \left(a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v} \cdot \left( \begin{cases} \beta_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le v,\ 1\le j\le v} = \left(\alpha_i a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v} \cdot \left( \begin{cases} \beta_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le v,\ 1\le j\le v} = \left(\alpha_i a_{i,j} \beta_j\right)_{1\le i\le u,\ 1\le j\le v}, \]
where the first equality follows from Lemma 7 (a) and the second from Lemma 7 (b) (applied to $\alpha_i a_{i,j}$ instead of $a_{i,j}$). Thus, Lemma 7 (c) is proven.

(d) The matrix $\left( \begin{cases} \alpha_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le u,\ 1\le j\le u}$ is diagonal (its $(i,j)$-th entry is $0$ whenever $j \ne i$). Since the determinant of a diagonal matrix equals the product of its diagonal entries, this yields
\[ \det\left( \left( \begin{cases} \alpha_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le u,\ 1\le j\le u} \right) = \prod_{i=1}^{u} \alpha_i. \]
Similarly,
\[ \det\left( \left( \begin{cases} \beta_i, & \text{if } j=i; \\ 0, & \text{if } j\ne i \end{cases} \right)_{1\le i\le v,\ 1\le j\le v} \right) = \prod_{i=1}^{v} \beta_i. \]
Now assume $u = v$. Taking determinants in Lemma 7 (c) and using the multiplicativity of the determinant, we obtain
\[ \det\left( \left(\alpha_i a_{i,j} \beta_j\right)_{1\le i\le u,\ 1\le j\le v} \right) = \left( \prod_{i=1}^{u} \alpha_i \right) \cdot \det\left( \left(a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v} \right) \cdot \left( \prod_{i=1}^{v} \beta_i \right) = \left( \prod_{i=1}^{u} \alpha_i \right) \left( \prod_{i=1}^{v} \beta_i \right) \det\left( \left(a_{i,j}\right)_{1\le i\le u,\ 1\le j\le v} \right). \]
Thus, Lemma 7 (d) is proven.
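Lemma 7 (d) can be spot-checked numerically as follows (a sketch of my own, using the same naive exact determinant as before):

```python
from fractions import Fraction
from random import randint

def det(matrix):
    # naive Laplace expansion along the first row; fine for small matrices
    n = len(matrix)
    if n == 1:
        return Fraction(matrix[0][0])
    return sum((-1) ** j * Fraction(matrix[0][j])
               * det([row[:j] + row[j + 1:] for row in matrix[1:]])
               for j in range(n))

u = 4
A = [[randint(-5, 5) for _ in range(u)] for _ in range(u)]
alpha = [randint(1, 5) for _ in range(u)]
beta = [randint(1, 5) for _ in range(u)]
scaled = [[alpha[i] * A[i][j] * beta[j] for j in range(u)] for i in range(u)]
lhs = det(scaled)
rhs = det(A)
for i in range(u):
    rhs *= alpha[i] * beta[i]
assert lhs == rhs
print("Lemma 7 (d) checked on a random example:", lhs)
```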

Now, let us prove Theorem 0.

Proof of Theorem 0. We have
\[ \det\left( \left( \binom{a+b+i-1}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right) = \det\left( \left( \frac{(a+b+i-1)!}{(a+i-1)!\,(b+j-1)!} \prod_{k=1}^{j-1} (a+i-k) \right)_{1\le i\le c,\ 1\le j\le c} \right) \]
(by Lemma 6)
\[ = \left( \prod_{i=1}^{c} \frac{(a+b+i-1)!}{(a+i-1)!} \right) \left( \prod_{i=1}^{c} \frac{1}{(b+i-1)!} \right) \det\left( \left( \prod_{k=1}^{j-1} (a+i-k) \right)_{1\le i\le c,\ 1\le j\le c} \right) \]
(by Lemma 7 (d), applied to $R = \mathbb{Q}$, $u = c$, $v = c$, $a_{i,j} = \prod_{k=1}^{j-1} (a+i-k)$, $\alpha_i = \dfrac{(a+b+i-1)!}{(a+i-1)!}$ and $\beta_i = \dfrac{1}{(b+i-1)!}$). Since
\[ \det\left( \left( \prod_{k=1}^{j-1} (a+i-k) \right)_{1\le i\le c,\ 1\le j\le c} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,c\}^2;\\ i>j}} \bigl((a+i)-(a+j)\bigr) = \prod_{\substack{(i,j)\in\{1,2,\ldots,c\}^2;\\ i>j}} (i-j) = H(c) \]
(by Corollary 4, applied to $R = \mathbb{Z}$, $m = c$ and $a_i = a+i$ for every $i \in \{1,2,\ldots,c\}$, and then by Lemma 5, applied to $m = c$), this becomes
\[ \det\left( \left( \binom{a+b+i-1}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right) = H(c) \prod_{i=1}^{c} \frac{(a+b+i-1)!}{(a+i-1)!\,(b+i-1)!}. \qquad (8) \]

Now,
\[ \frac{H(a)\,H(b)\,H(c)\,H(a+b+c)}{H(b+c)\,H(c+a)\,H(a+b)} = H(c) \cdot \frac{H(a+b+c)}{H(a+b)} \cdot \frac{H(b)}{H(b+c)} \cdot \frac{H(a)}{H(c+a)} = H(c) \cdot \frac{\prod_{i=1}^{c} (a+b+i-1)!}{\left(\prod_{i=1}^{c} (b+i-1)!\right)\left(\prod_{i=1}^{c} (a+i-1)!\right)} \]
(by (4), (5) and (6))
\[ = H(c) \prod_{i=1}^{c} \frac{(a+b+i-1)!}{(a+i-1)!\,(b+i-1)!} = \det\left( \left( \binom{a+b+i-1}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right) \qquad (9) \]
(by (8)). The right hand side of (9) is an integer, since $\dbinom{a+b+i-1}{a-j+i} \in \mathbb{Z}$ for all $i \in \{1,2,\ldots,c\}$ and $j \in \{1,2,\ldots,c\}$. Hence,
\[ \frac{H(a)\,H(b)\,H(c)\,H(a+b+c)}{H(b+c)\,H(c+a)\,H(a+b)} \in \mathbb{Z}. \]
In other words, $H(b+c) \cdot H(c+a) \cdot H(a+b) \mid H(a) \cdot H(b) \cdot H(c) \cdot H(a+b+c)$. Thus, Theorem 0 is proven.
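The determinant formula (9) established in this proof can be confirmed on small examples (a sketch of my own, using exact arithmetic via `fractions.Fraction`):

```python
from fractions import Fraction
from math import factorial, prod

def H(n):
    return prod(factorial(k) for k in range(n))

def binom(m, n):
    # binomial coefficient as defined in the note (0 for a negative lower index)
    if n < 0:
        return 0
    numerator = 1
    for t in range(n):
        numerator *= m - t
    return numerator // factorial(n)

def det(matrix):
    # naive Laplace expansion along the first row; fine for small matrices
    n = len(matrix)
    if n == 1:
        return Fraction(matrix[0][0])
    return sum((-1) ** j * Fraction(matrix[0][j])
               * det([row[:j] + row[j + 1:] for row in matrix[1:]])
               for j in range(n))

for (a, b, c) in [(1, 1, 2), (2, 3, 3), (4, 2, 3)]:
    matrix = [[binom(a + b + i - 1, a - j + i) for j in range(1, c + 1)]
              for i in range(1, c + 1)]
    ratio = Fraction(H(a) * H(b) * H(c) * H(a + b + c),
                     H(b + c) * H(c + a) * H(a + b))
    assert det(matrix) == ratio, (a, b, c)
print("Equation (9) checked on a few examples")
```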

Remarks.

1. Theorem 0 was briefly mentioned (with a combinatorial interpretation, but without proof) on the first page of [1]. It also follows from a product formula in [3] (and, equivalently, from the corresponding formula in [4]), since
\[ \frac{H(a)\,H(b)\,H(c)\,H(a+b+c)}{H(b+c)\,H(c+a)\,H(a+b)} = \prod_{i=1}^{c} \frac{(a+b+i-1)!\,(i-1)!}{(a+i-1)!\,(b+i-1)!}. \]
It is also generalized in [2], Section 429 (where one has to consider the limit $x \to 1$).

2. We can prove more:

Theorem 8. For every $a \in \mathbb{N}$, every $b \in \mathbb{N}$ and every $c \in \mathbb{N}$, we have
\[ \frac{H(a)\,H(b)\,H(c)\,H(a+b+c)}{H(b+c)\,H(c+a)\,H(a+b)} = \det\left( \left( \binom{a+b+i-1}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right) = \det\left( \left( \binom{a+b}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right). \]

We recall a useful fact to help us in the proof:

Theorem 9 (the Vandermonde convolution identity). Let $x \in \mathbb{Z}$ and $y \in \mathbb{Z}$. Let $q \in \mathbb{Z}$. Then,
\[ \binom{x+y}{q} = \sum_{k \in \mathbb{Z}} \binom{x}{k} \binom{y}{q-k}. \]
The sum on the right hand side is an infinite sum, but only finitely many of its addends are nonzero.
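Theorem 9 holds for all integers $x$, $y$, $q$, with binomial coefficients taken as defined in the Notations; here is a brute-force check on a small range (my own sketch, not part of the note):

```python
from math import factorial

def binom(m, n):
    # binomial coefficient as defined in the note (0 for a negative lower index)
    if n < 0:
        return 0
    numerator = 1
    for t in range(n):
        numerator *= m - t
    return numerator // factorial(n)

for x in range(-5, 6):
    for y in range(-5, 6):
        for q in range(-3, 8):
            # addends vanish unless 0 <= k <= q, so a finite window suffices
            total = sum(binom(x, k) * binom(y, q - k) for k in range(0, max(q, -1) + 1))
            assert total == binom(x + y, q), (x, y, q)
print("Theorem 9 checked for small x, y, q")
```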

Proof of Theorem 8. For every $i \in \{1,2,\ldots,c\}$ and every $j \in \{1,2,\ldots,c\}$, we have
\[ \binom{a+b+i-1}{a-j+i} = \sum_{k \in \mathbb{Z}} \binom{a+b}{k} \binom{i-1}{a-j+i-k} \]
(by Theorem 9, applied to $x = a+b$, $y = i-1$ and $q = a-j+i$)
\[ = \sum_{l \in \mathbb{Z}} \binom{a+b}{a-j+l} \binom{i-1}{i-l} \]
(here we substituted $a-j+l$ for $k$ in the sum). In this sum, every addend with $l \notin \{1,2,\ldots,i\}$ vanishes: indeed, if $l \notin \{1,2,\ldots,i\}$, then the condition $0 \le i-l \le i-1$ is false, and since $i-1 \ge 0$, this yields $\dbinom{i-1}{i-l} = 0$. Hence,
\[ \binom{a+b+i-1}{a-j+i} = \sum_{l=1}^{i} \binom{i-1}{i-l} \binom{a+b}{a-j+l} = \sum_{l=1}^{c} \binom{i-1}{i-l} \binom{a+b}{a-j+l} \]
(here we replaced the $\sum_{l=1}^{i}$ sign by a $\sum_{l=1}^{c}$ sign; this is allowed since $c \ge i$ and since all addends for $l > i$ are zero, because $\dbinom{i-1}{i-l} = 0$ when $i-l < 0$). Thus,
\[ \left( \binom{a+b+i-1}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} = \left( \sum_{l=1}^{c} \binom{i-1}{i-l} \binom{a+b}{a-j+l} \right)_{1\le i\le c,\ 1\le j\le c} = \left( \binom{i-1}{i-j} \right)_{1\le i\le c,\ 1\le j\le c} \cdot \left( \binom{a+b}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \qquad (10) \]
(by the definition of the product of two matrices).

Now, the matrix $\left( \binom{i-1}{i-j} \right)_{1\le i\le c,\ 1\le j\le c}$ is lower triangular, since $\dbinom{i-1}{i-j} = 0$ for every $(i,j) \in \{1,2,\ldots,c\}^2$ satisfying $i < j$ (because $i < j$ yields $i-j < 0$ and thus $\dbinom{i-1}{i-j} = 0$). Since the determinant of a lower triangular matrix equals the product of its diagonal entries, this yields
\[ \det\left( \left( \binom{i-1}{i-j} \right)_{1\le i\le c,\ 1\le j\le c} \right) = \prod_{j=1}^{c} \binom{j-1}{j-j} = \prod_{j=1}^{c} \binom{j-1}{0} = \prod_{j=1}^{c} 1 = 1. \qquad (11) \]

Now, (10) yields
\[ \det\left( \left( \binom{a+b+i-1}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right) = \det\left( \left( \binom{i-1}{i-j} \right)_{1\le i\le c,\ 1\le j\le c} \right) \cdot \det\left( \left( \binom{a+b}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right) = \det\left( \left( \binom{a+b}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right) \]
(by (11)). Combined with (9), this yields
\[ \frac{H(a)\,H(b)\,H(c)\,H(a+b+c)}{H(b+c)\,H(c+a)\,H(a+b)} = \det\left( \left( \binom{a+b+i-1}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right) = \det\left( \left( \binom{a+b}{a-j+i} \right)_{1\le i\le c,\ 1\le j\le c} \right). \]
Thus, Theorem 8 is proven.
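A short numerical confirmation of Theorem 8 (a sketch of my own; it checks only the second determinant against the hyperfactorial ratio, since the first one was already checked after the proof of Theorem 0):

```python
from fractions import Fraction
from math import factorial, prod

def H(n):
    return prod(factorial(k) for k in range(n))

def binom(m, n):
    if n < 0:
        return 0
    numerator = 1
    for t in range(n):
        numerator *= m - t
    return numerator // factorial(n)

def det(matrix):
    n = len(matrix)
    if n == 1:
        return Fraction(matrix[0][0])
    return sum((-1) ** j * Fraction(matrix[0][j])
               * det([row[:j] + row[j + 1:] for row in matrix[1:]])
               for j in range(n))

for (a, b, c) in [(1, 1, 2), (2, 3, 3), (3, 1, 4)]:
    ratio = Fraction(H(a) * H(b) * H(c) * H(a + b + c),
                     H(b + c) * H(c + a) * H(a + b))
    second = [[binom(a + b, a - j + i) for j in range(1, c + 1)] for i in range(1, c + 1)]
    assert det(second) == ratio, (a, b, c)
print("Theorem 8 checked on a few examples")
```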

3. We notice a particularly well-known consequence of Corollary 4:

Corollary 10. Let $m \in \mathbb{N}$. Let $a_1, a_2, \ldots, a_m$ be $m$ integers. Then,
\[ H(m) \cdot \det\left( \left( \binom{a_i-1}{j-1} \right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \qquad (12) \]
In particular, $H(m) \mid \prod_{(i,j)\in\{1,2,\ldots,m\}^2;\ i>j} \left(a_i - a_j\right)$.

Proof of Corollary 10. Corollary 4 (applied to $R = \mathbb{Z}$) yields
\[ \det\left( \left( \prod_{k=1}^{j-1} (a_i-k) \right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \qquad (13) \]
Now, for every $i \in \{1,2,\ldots,m\}$ and $j \in \{1,2,\ldots,m\}$, we have
\[ \binom{a_i-1}{j-1} = \frac{(a_i-1)(a_i-2)\cdots\bigl((a_i-1)-(j-1)+1\bigr)}{(j-1)!} = \frac{1}{(j-1)!} \prod_{k=0}^{j-2} \bigl((a_i-1)-k\bigr) = \frac{1}{(j-1)!} \prod_{k=1}^{j-1} (a_i-k) \qquad (14) \]
(here we substituted $k-1$ for $k$ in the product). Therefore,
\[ \det\left( \left( \binom{a_i-1}{j-1} \right)_{1\le i\le m,\ 1\le j\le m} \right) = \det\left( \left( \frac{1}{(j-1)!} \prod_{k=1}^{j-1} (a_i-k) \right)_{1\le i\le m,\ 1\le j\le m} \right) = \left( \prod_{i=1}^{m} \frac{1}{(i-1)!} \right) \det\left( \left( \prod_{k=1}^{j-1} (a_i-k) \right)_{1\le i\le m,\ 1\le j\le m} \right) \]
(by (14), and then by Lemma 7 (d), applied to $R = \mathbb{Q}$, $u = m$, $v = m$, $a_{i,j} = \prod_{k=1}^{j-1} (a_i-k)$, $\alpha_i = 1$ and $\beta_i = \dfrac{1}{(i-1)!}$)
\[ = \left( \prod_{i=1}^{m} \frac{1}{(i-1)!} \right) \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right) \]
(by (13)). Multiplying both sides by $\prod_{i=1}^{m} (i-1)!$ and using $\prod_{i=1}^{m} (i-1)! = \prod_{k=0}^{m-1} k! = H(m)$ (here we substituted $k+1$ for $i$ in the product), we obtain
\[ H(m) \cdot \det\left( \left( \binom{a_i-1}{j-1} \right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right), \]
which is precisely (12). Moreover, $\det\left( \left( \binom{a_i-1}{j-1} \right)_{1\le i\le m,\ 1\le j\le m} \right) \in \mathbb{Z}$ (since $\binom{a_i-1}{j-1} \in \mathbb{Z}$ for all $i$ and $j$), so that (12) yields $H(m) \mid \prod_{(i,j)\in\{1,2,\ldots,m\}^2;\ i>j} \left(a_i - a_j\right)$. Thus, Corollary 10 is proven.
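Corollary 10 can be spot-checked as follows (my own sketch, with random integers $a_1, \ldots, a_m$):

```python
from fractions import Fraction
from math import factorial, prod
from random import randint

def H(n):
    return prod(factorial(k) for k in range(n))

def binom(m, n):
    if n < 0:
        return 0
    numerator = 1
    for t in range(n):
        numerator *= m - t
    return numerator // factorial(n)

def det(matrix):
    n = len(matrix)
    if n == 1:
        return Fraction(matrix[0][0])
    return sum((-1) ** j * Fraction(matrix[0][j])
               * det([row[:j] + row[j + 1:] for row in matrix[1:]])
               for j in range(n))

m = 5
a = [randint(-10, 10) for _ in range(m)]
matrix = [[binom(a[i] - 1, j - 1) for j in range(1, m + 1)] for i in range(m)]
rhs = 1
for i in range(m):
    for j in range(m):
        if i > j:
            rhs *= a[i] - a[j]
assert H(m) * det(matrix) == rhs
print("Corollary 10 checked; in particular H(m) divides", rhs)
```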

Corollary 11. Let $m \in \mathbb{N}$. Let $a_1, a_2, \ldots, a_m$ be $m$ integers. Then,
\[ H(m) \cdot \det\left( \left( \binom{a_i}{j-1} \right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \]

Proof of Corollary 11. The equality (12), applied to $a_i + 1$ instead of $a_i$, yields
\[ H(m) \cdot \det\left( \left( \binom{(a_i+1)-1}{j-1} \right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \bigl((a_i+1) - (a_j+1)\bigr) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \]
Since $(a_i+1)-1 = a_i$ for every $i \in \{1,2,\ldots,m\}$, this rewrites as
\[ H(m) \cdot \det\left( \left( \binom{a_i}{j-1} \right)_{1\le i\le m,\ 1\le j\le m} \right) = \prod_{\substack{(i,j)\in\{1,2,\ldots,m\}^2;\\ i>j}} \left(a_i - a_j\right). \]
This proves Corollary 11.

References

[1] Theresia Eisenkölbl, Rhombus tilings of a hexagon with three fixed border tiles, J. Combin. Theory Ser. A 88 (1999), 368-378; arXiv:math/9712261v2 [math.CO], http://arxiv.org/abs/math/9712261v2

[2] Percy Alexander MacMahon, Combinatory Analysis, vol. 2, Cambridge University Press, 1916; reprinted by Chelsea, New York, 1960; http://www.archive.org/details/combinatoryanaly02macmuoft (see also: Percy Alexander MacMahon, Combinatory Analysis, vol. 1, Cambridge University Press, 1915, http://www.archive.org/details/combinatoryanal01macmuoft; and Percy Alexander MacMahon, An introduction to Combinatory analysis, Cambridge University Press, 1920, http://www.archive.org/details/introductiontoco00macmrich).

[3] Christian Krattenthaler, Advanced Determinant Calculus: A Complement, Linear Algebra Appl. 411 (2005), 68-166; arXiv:math/0503507v2 [math.CO], http://arxiv.org/abs/math.CO/0503507v2

[4] Christian Krattenthaler, Advanced Determinant Calculus, Séminaire Lotharingien Combin. 42 (1999), "The Andrews Festschrift", paper B42q, 67 pp.; arXiv:math/9902004v3 [math.CO], http://arxiv.org/abs/math/9902004v3