SIAM J. MATH. ANAL. Vol. 23, No. 4, pp. 959-964, July 1992
© 1992 Society for Industrial and Applied Mathematics

ORTHOGONAL POLYNOMIALS AND A DISCRETE BOUNDARY VALUE PROBLEM I*

RYSZARD SZWARC†

Abstract. Let $\{P_n\}_{n=0}^{\infty}$ be a system of orthogonal polynomials with respect to a measure $\mu$ on the real line. Sufficient conditions are given under which any product $P_nP_m$ is a linear combination of the $P_k$'s with positive coefficients.

Key words. orthogonal polynomials, recurrence formula

AMS(MOS) subject classifications. 33A65, 39A70

Let us consider the following problem: we are given a probability measure $\mu$ on the real line $\mathbb{R}$ all of whose moments $\int x^{2n}\,d\mu(x)$ are finite. Let $\{P_n(x)\}$ be an orthonormal system in $L^2(\mathbb{R}, d\mu)$ obtained from the sequence $1, x, x^2, \dots$ by the Gram-Schmidt procedure. We assume that the support of $\mu$ is an infinite set, so that $1, x, x^2, \dots$ are linearly independent. Clearly $P_n$ is a polynomial of degree $n$ which is orthogonal to all polynomials of degree less than $n$. It can be taken to have positive leading coefficient. The product $P_nP_m$ is a polynomial of degree $n+m$ and it can be expressed uniquely as a linear combination of the polynomials $P_0, P_1, \dots, P_{n+m}$:

$P_nP_m = \sum_{k=0}^{n+m} c(n, m, k)P_k$

with real coefficients $c(n, m, k)$. Actually, if $k < |n-m|$ then $c(n, m, k) = 0$. This is because

$c(n, m, k) = \langle P_nP_m, P_k\rangle_{L^2(d\mu)} = \langle P_n, P_mP_k\rangle_{L^2(d\mu)} = \langle P_m, P_nP_k\rangle_{L^2(d\mu)}.$

Hence if $k < |n-m|$ then either $k+m < n$ or $k+n < m$ and one of the above scalar products vanishes. Finally we get

(1)  $P_nP_m = \sum_{k=|n-m|}^{n+m} c(n, m, k)P_k$.

We ask when the coefficients $c(n, m, k)$ are nonnegative for all $n, m, k = 0, 1, 2, \dots$. The positivity of the coefficients $c(n, m, k)$ (also called the linearization coefficients) gives rise to a convolution structure on $\ell^1(\mathbb{N})$, and if an additional boundedness condition is satisfied then $\ell^1(\mathbb{N})$ with this new operation resembles the convolution algebra of the circle (see [2]).

Analogously to (1), we have

(2)  $xP_n = \gamma_nP_{n+1} + \beta_nP_n + \alpha_nP_{n-1}$  for $n = 0, 1, 2, \dots$

(we apply the convention $\alpha_0 = \gamma_{-1} = 0$). The coefficients $\alpha_n$ and $\gamma_n$ are strictly positive. If the measure $\mu$ is symmetric, i.e., $d\mu(x) = d\mu(-x)$, then $\beta_n = 0$. When the $P_n$ are normalized so that $\|P_n\|_{L^2(\mu)} = 1$, then we can check easily that $\alpha_{n+1} = \gamma_n$. Hence, if we put $\lambda_n = \gamma_n$, we get

(3)  $xP_n = \lambda_nP_{n+1} + \beta_nP_n + \lambda_{n-1}P_{n-1}$  for $n = 0, 1, 2, \dots$.

* Received by the editors January 1, 1990; accepted for publication August 16, 1991.
† Mathematical Institute, University of Wrocław, pl. Grunwaldzki 2/4, 50-384 Wrocław, Poland.
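For a concrete measure the coefficients in (1) can be computed directly from $c(n,m,k) = \langle P_nP_m, P_k\rangle_{L^2(d\mu)}$. A minimal numerical sketch for the Legendre case, $d\mu = dx/2$ on $[-1,1]$, using Gauss-Legendre quadrature (the function names and the sample degrees $n=4$, $m=2$ are illustrative choices):

```python
import numpy as np
from numpy.polynomial import legendre as leg

def orthonormal_legendre(n, x):
    """Evaluate p_n, the Legendre polynomial orthonormal with respect to the
    probability measure dx/2 on [-1, 1]:  p_n = sqrt(2n+1) * P_n."""
    coef = np.zeros(n + 1)
    coef[n] = 1.0
    return np.sqrt(2 * n + 1) * leg.legval(x, coef)

def linearization_coefficient(n, m, k):
    """c(n, m, k) = integral of p_n p_m p_k d(mu), computed by Gauss-Legendre
    quadrature of high enough order to be exact for degree n + m + k."""
    nodes, weights = leg.leggauss((n + m + k) // 2 + 1)
    values = (orthonormal_legendre(n, nodes)
              * orthonormal_legendre(m, nodes)
              * orthonormal_legendre(k, nodes))
    return 0.5 * np.dot(weights, values)

# c(n, m, k) vanishes for k < |n - m| and k > n + m, and is nonnegative
# in between; Legendre is the symmetric Jacobi case alpha = beta = 0.
n, m = 4, 2
for k in range(n + m + 2):
    print(k, round(linearization_coefficient(n, m, k), 6))
```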

Favard [4] proved that the converse is also true, i.e., any system of polynomials satisfying (3) is orthonormal with respect to a probability measure $\mu$ (not necessarily unique). In the case of bounded sequences $\lambda_n$ and $\beta_n$ we can recover the measure $\mu$ in the following way. Consider the linear operator $L$ on $\ell^2(\mathbb{N})$ given by

(4)  $(La)_n = \lambda_na_{n+1} + \beta_na_n + \lambda_{n-1}a_{n-1}$,  $n = 0, 1, 2, \dots$.

Then $L$ is a self-adjoint operator on $\ell^2(\mathbb{N})$. Let $dE(x)$ be the spectral resolution associated with $L$. Then the system $\{P_n\}$ is orthonormal with respect to the measure $d\mu(x) = d(E(x)\delta_0, \delta_0)$.

The statement of the positivity of $c(n, m, k)$ does not require orthonormalization of the polynomials $P_n$. We can as well consider another normalization, i.e., let $\widetilde{P}_n = \sigma_nP_n$ where $\sigma_n$ is a sequence of positive numbers. The problem of positive coefficients in the product of the $\widetilde{P}_n$'s is equivalent to that of the $P_n$'s. Moreover, it is easy to check that the polynomials $\widetilde{P}_n$ satisfy a recurrence relation of the form

(5)  $x\widetilde{P}_n = \gamma_n\widetilde{P}_{n+1} + \beta_n\widetilde{P}_n + \alpha_n\widetilde{P}_{n-1}$  for $n = 0, 1, 2, \dots$,

and the unique relation connecting $\alpha_n$, $\gamma_n$ and the coefficients $\lambda_n$ from (3) is $\alpha_{n+1}\gamma_n = \lambda_n^2$; the sequence of diagonal coefficients $\beta_n$ remains unchanged. From this observation it follows that if polynomials $\widetilde{P}_n$ satisfy (5), then after appropriate renormalization the polynomials $P_n = c_n\widetilde{P}_n$ satisfy

(6)  $xP_n = \alpha_{n+1}P_{n+1} + \beta_nP_n + \gamma_{n-1}P_{n-1}$.

Consider the particular case of monic normalization, i.e., assume that the leading coefficient of every $P_n$ is 1. Then the recurrence formula is

(7)  $xP_n = P_{n+1} + \beta_nP_n + \lambda_{n-1}P_{n-1}$.

In 1970 Askey proved the following theorem concerning the monic case.

THEOREM (Askey [1]). Let $P_n$ satisfy (7) and let the sequences $\lambda_n$ and $\beta_n$ be increasing ($\lambda_n \ge 0$); then the linearization coefficients in the formula

$P_nP_m = \sum_{k=|n-m|}^{n+m} c(n, m, k)P_k$

are nonnegative.

This theorem applies to the Hermite, Laguerre, and Jacobi polynomials with $\alpha + \beta \ge 1$ (see [7]). However, it does not cover the symmetric Jacobi polynomials with $\alpha = \beta$ when $-\frac12 \le \alpha < \frac12$ (and, in particular, the Legendre polynomials, when $\alpha = \beta = 0$). Recall that the problem of positive linearization for Jacobi polynomials was completely solved by Gasper in [5] and [6]. In particular, the $c(n, m, k)$ are positive for $\alpha \ge \beta$ and $\alpha + \beta + 1 \ge 0$.

The aim of this paper is to give a generalization of Askey's result so that it covers the symmetric Jacobi polynomials for $\alpha \ge -\frac12$. One of the results is as follows.

THEOREM 1. If the polynomials $P_n$ satisfy

$xP_n = \gamma_nP_{n+1} + \beta_nP_n + \alpha_nP_{n-1}$

and
(i) $\alpha_n$, $\beta_n$, and $\alpha_n + \gamma_n$ are increasing sequences ($\gamma_n, \alpha_n \ge 0$),
(ii) $\alpha_n \le \gamma_n$ for $n = 0, 1, 2, \dots$,
then $c(n, m, k) \ge 0$ (see (1)).

It is remarkable that the assumptions on the $\alpha_n$'s and $\gamma_n$'s are separated from those on the $\beta_n$'s.
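In the bounded case this recovery can be carried out numerically: truncating $L$ to an $N \times N$ matrix, its eigenvalues, weighted by the squared first entries of the corresponding eigenvectors, form a discrete measure approximating $d(E(x)\delta_0, \delta_0)$. A small sketch for the Legendre coefficients $\lambda_n = (n+1)/\sqrt{(2n+1)(2n+3)}$, $\beta_n = 0$ (the truncation size $N = 20$ and the moment check are illustrative choices):

```python
import numpy as np

# Truncated Jacobi matrix of the orthonormal Legendre system:
# beta_n = 0, lambda_n = (n+1)/sqrt((2n+1)(2n+3)).
N = 20
n = np.arange(N - 1)
lam = (n + 1) / np.sqrt((2 * n + 1) * (2 * n + 3))
J = np.diag(lam, 1) + np.diag(lam, -1)      # diagonal beta_n = 0

x, V = np.linalg.eigh(J)                    # spectral decomposition of the truncation
w = V[0, :] ** 2                            # masses of the discrete measure at the eigenvalues

# The discrete measure reproduces the moments of dmu = dx/2 on [-1, 1],
# namely 1/(k+1) for even k and 0 for odd k.
for k in range(6):
    exact = 1.0 / (k + 1) if k % 2 == 0 else 0.0
    print(k, round(np.dot(w, x ** k), 12), exact)
```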

Before giving a proof let us explain how Askey's theorem can be derived from Theorem 1. If polynomials $\widetilde{P}_n$ satisfy the assumptions of Askey's theorem, then after orthonormalization of the $\widetilde{P}_n$'s we get a system of polynomials $P_n$ satisfying (3), i.e.,

$xP_n = \lambda_nP_{n+1} + \beta_nP_n + \lambda_{n-1}P_{n-1}$  for $n = 0, 1, 2, \dots$,

and if $\lambda_n$ and $\beta_n$ are increasing then, putting $\alpha_n = \lambda_{n-1}$ and $\gamma_n = \lambda_n$, we can see easily that the assumptions of Theorem 1 are also satisfied.

Example. Consider the symmetric Jacobi polynomials $R_n^{(\alpha)}$ normalized by $R_n^{(\alpha)}(1) = 1$. They satisfy the following recurrence formula:

$xR_n^{(\alpha)} = \dfrac{n+2\alpha+1}{2n+2\alpha+1}\,R_{n+1}^{(\alpha)} + \dfrac{n}{2n+2\alpha+1}\,R_{n-1}^{(\alpha)}$.

In this case

$\alpha_n = \dfrac{n}{2n+2\alpha+1}$, $\quad\gamma_n = \dfrac{n+2\alpha+1}{2n+2\alpha+1}$, $\quad\beta_n = 0$.

Observe that $\alpha_n + \gamma_n = 1$ and $\alpha_n$ is increasing when $\alpha \ge -\frac12$. We also have $\alpha_n \le \gamma_n$ when $\alpha \ge -\frac12$, so Theorem 1 applies to these polynomials.

Instead of showing Theorem 1 we will prove a more general result.

THEOREM 2. Let the polynomials $P_n$ satisfy

$xP_n = \gamma_nP_{n+1} + \beta_nP_n + \alpha_nP_{n-1}$

and let, for some sequence of positive numbers $q_n$, the polynomials $\widetilde{P}_n = q_nP_n$ satisfy

$x\widetilde{P}_n = \widetilde{\gamma}_n\widetilde{P}_{n+1} + \beta_n\widetilde{P}_n + \widetilde{\alpha}_n\widetilde{P}_{n-1}$.

Assume also that
(i) $\beta_m \le \beta_n$ for any $m \le n$,
(ii) $\widetilde{\alpha}_m \le \alpha_n$ for any $m < n$,
(iii) $\widetilde{\alpha}_m + \widetilde{\gamma}_m \le \alpha_n + \gamma_n$ for any $m < n - 1$,
(iv) $\widetilde{\alpha}_m \le \gamma_n$ for any $m \le n$.
Then the linearization coefficients $c(n, m, k)$ in the formula

$P_nP_m = \sum_{k=|n-m|}^{n+m} c(n, m, k)P_k$

are nonnegative.

Setting $\widetilde{\alpha}_n = \alpha_n$ and $\widetilde{\gamma}_n = \gamma_n$, we can easily see that Theorem 2 implies Theorem 1.

Proof. First observe that we have $\alpha_{n+1}\gamma_n = \widetilde{\alpha}_{n+1}\widetilde{\gamma}_n$. Moreover, by the remarks preceding (6) we may assume that $P_n$ and $\widetilde{P}_n$ satisfy

$xP_n = \alpha_{n+1}P_{n+1} + \beta_nP_n + \gamma_{n-1}P_{n-1}$,
$x\widetilde{P}_n = \widetilde{\alpha}_{n+1}\widetilde{P}_{n+1} + \beta_n\widetilde{P}_n + \widetilde{\gamma}_{n-1}\widetilde{P}_{n-1}$.

The rest of the proof will follow from a maximum principle for a discrete boundary value problem. Let $L$ and $\widetilde{L}$ be the linear operators acting on sequences $\{a_n\}_{n\in\mathbb{N}}$ by the rule

(8)  $(La)_n = \alpha_{n+1}a_{n+1} + \beta_na_n + \gamma_{n-1}a_{n-1}$,  $(\widetilde{L}a)_n = \widetilde{\alpha}_{n+1}a_{n+1} + \widetilde{\beta}_na_n + \widetilde{\gamma}_{n-1}a_{n-1}$

(in the situation of Theorem 2 we have $\widetilde{\beta}_n = \beta_n$, but the maximum principle below is stated for general coefficients).
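A quick numerical check of the Example, under the conventions $\alpha_0 = 0$, $\gamma_0 = 1$ (the cutoff $N = 50$ and the sample parameter values, written `a` in the code, are illustrative choices): the hypotheses of Theorem 1 hold for these coefficients when $\alpha \ge -\frac12$ and fail below that value.

```python
import numpy as np

def symmetric_jacobi_coeffs(a, N):
    """Recurrence coefficients of R_n^(a) normalized by R_n^(a)(1) = 1:
    alpha_n = n/(2n+2a+1), gamma_n = (n+2a+1)/(2n+2a+1), beta_n = 0,
    with the conventions alpha_0 = 0 and gamma_0 = 1."""
    n = np.arange(1, N, dtype=float)
    alpha = np.concatenate(([0.0], n / (2 * n + 2 * a + 1)))
    gamma = np.concatenate(([1.0], (n + 2 * a + 1) / (2 * n + 2 * a + 1)))
    return alpha, gamma

def theorem1_hypotheses(a, N=50):
    """Check (i) alpha_n and alpha_n + gamma_n nondecreasing and
    (ii) alpha_n <= gamma_n (beta_n = 0 is trivially nondecreasing)."""
    alpha, gamma = symmetric_jacobi_coeffs(a, N)
    increasing = np.all(np.diff(alpha) >= 0) and np.all(np.diff(alpha + gamma) >= 0)
    return bool(increasing and np.all(alpha <= gamma))

# The hypotheses hold for a >= -1/2 and fail below that value.
for a in (-0.75, -0.5, -0.25, 0.0, 1.0):
    print(a, theorem1_hypotheses(a))
```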

Let $\widetilde{L}_m$ and $L_n$ denote the operators acting on complex functions $u(n, m)$, $n, m \in \mathbb{N}$, as $\widetilde{L}$ and $L$ but with respect to the $m$- or $n$-variable, treating the other variable as a parameter. Let us consider the following problem: for functions $u: \mathbb{N}\times\mathbb{N} \to \mathbb{C}$,

(9)  $(L_n - \widetilde{L}_m)u = 0$,  $u(n, 0) \ge 0$.

THEOREM 3. Assume that $\widetilde{\alpha}_n > 0$ for $n \ge 1$ (we follow the convention $\alpha_0 = \widetilde{\alpha}_0 = 0$) and
(i) $\widetilde{\beta}_m \le \beta_n$ for any $m \le n$,
(ii) $\widetilde{\alpha}_m \le \alpha_n$ for any $m < n$,
(iii) $\widetilde{\alpha}_m + \widetilde{\gamma}_m \le \alpha_n + \gamma_n$ for any $m < n - 1$,
(iv) $\widetilde{\alpha}_m \le \gamma_n$ for any $m \le n$.
Then $u(n, m) \ge 0$ for $m \le n$.

Proof. On the contrary, assume that $u$ is negative at some points. Let $(n, m+1)$ be the lowest point in the domain $\{(s, t): t \le s\}$ for which $u(n, m+1) < 0$. It means that $u(s, t)$ is nonnegative if $t \le m$. Consider the right triangle with vertices $A = (n, m)$, $B = (n-m, 0)$, and $C = (n+m, 0)$, as illustrated in Fig. 1.

[FIG. 1. The triangle $ABC$ and the lattice point sets $\Omega_1$, $\Omega_2$, $\Omega_3$, $\Omega_4$.]

All lattice points in the triangle $ABC$ we divide into two subsets: $\Omega_1$, consisting of the points $(k, l)$ such that $k - l \equiv n - m \pmod 2$, and the rest, $\Omega_2$. Let $\Omega_3$ denote the lattice points connecting $(n-m-1, 0)$ and $(n, m+1)$ (except $(n, m+1)$), and let $\Omega_4$ denote those which connect $(n+m+1, 0)$ with $(n, m+1)$ (except $(n, m+1)$). In the figure the points of $\Omega_1$, $\Omega_2$, $\Omega_3$, and $\Omega_4$ are marked by different symbols.

Assume that $(L_n - \widetilde{L}_m)u = 0$. Thus $\sum_{(x,y)\in\Omega_1}(L_n - \widetilde{L}_m)u(x, y) = 0$. If we calculate the terms $(L_n - \widetilde{L}_m)u(x, y)$ and sum them up, we obtain a sum of the values of the function $u(s, t)$ with some coefficients $c_{s,t}$, where $(s, t)$ runs throughout the set $\Omega_1\cup\Omega_2\cup\Omega_3\cup\Omega_4\cup\{(n, m+1)\}$. Namely,

$0 = \sum_{(x,y)\in\Omega_1}(L_n - \widetilde{L}_m)u(x, y) = \sum_{i=1}^{4}\ \sum_{(s,t)\in\Omega_i} c_{s,t}\,u(s, t) + c_{n,m+1}\,u(n, m+1)$.

It is not hard to compute the coefficients $c_{s,t}$, so we just list them below.

(i) $(s, t)\in\Omega_1$:  $c_{s,t} = \beta_s - \widetilde{\beta}_t$;
(ii) $(s, t)\in\Omega_2$:  $c_{s,t} = \alpha_s + \gamma_s - (\widetilde{\alpha}_t + \widetilde{\gamma}_t)$;
(iii) $(s, t)\in\Omega_3$:  $c_{s,t} = \gamma_s - \widetilde{\alpha}_t$;
(iv) $(s, t)\in\Omega_4$:  $c_{s,t} = \alpha_s - \widetilde{\alpha}_t$;
(v) $c_{n,m+1} = -\widetilde{\alpha}_{m+1}$.

By the assumptions of the theorem all the coefficients $c_{s,t}$ are nonnegative, while $c_{n,m+1}$ is strictly negative. Since $u(s, t) \ge 0$ for $(s, t)\in\Omega_1\cup\Omega_2\cup\Omega_3\cup\Omega_4$ and $u(n, m+1) < 0$, the sum we were dealing with cannot be zero. This gives a contradiction.

Let us return to the proof of Theorem 2. Let $P_n$ and $\widetilde{P}_n$ be as above, with $\widetilde{P}_n = \sigma_nP_n$ for a strictly positive sequence $\sigma_n$. If

$P_nP_m = \sum_{k=|n-m|}^{n+m} c(n, m, k)P_k$,  $\quad P_n\widetilde{P}_m = \sum_{k=|n-m|}^{n+m} \widetilde{c}(n, m, k)P_k$,

then $\widetilde{c}(n, m, k) = \sigma_m c(n, m, k)$. Therefore, in order to prove $c(n, m, k) \ge 0$ it suffices to show that $\widetilde{c}(n, m, k) \ge 0$ for $n \ge m$. Since

$L_n(P_n\widetilde{P}_m) = xP_n\widetilde{P}_m = \widetilde{L}_m(P_n\widetilde{P}_m)$

and the polynomials $P_k$ are linearly independent, for any $k$ the function $u(n, m) = \widetilde{c}(n, m, k)$ is a solution of (9). Obviously, $u(n, 0) = \widetilde{c}(n, 0, k)$, which vanishes unless $n = k$; in particular, $u(n, 0) \ge 0$. Hence by Theorem 3 we get $u(n, m) = \widetilde{c}(n, m, k) \ge 0$. This completes the proof of Theorem 2.

COROLLARY. Let the polynomials $P_n$ satisfy

$xP_n = \gamma_nP_{n+1} + \beta_nP_n + \alpha_nP_{n-1}$

and let
(i) $\beta_n$ and $\alpha_n$ be increasing ($\alpha_n > 0$ for $n \ge 1$, $\alpha_0 = 0$);
(ii) $\alpha_m + \gamma_m \le \alpha_{n+1} + \gamma_{n-1}$ for $m < n - 1$;
(iii) $\alpha_m \le \gamma_n$ for $m \le n$.
Then the linearization coefficients $c(n, m, k)$ in (1) are nonnegative.

Proof. By the remarks preceding (6), after appropriate renormalization of the $P_n$ we obtain polynomials $\widetilde{P}_n$ satisfying (6). Then we get the required result by applying Theorem 2.

Example. Consider the Jacobi polynomials $P_n^{(\alpha,\beta)}$. They satisfy the recurrence formula

$xP_n^{(\alpha,\beta)} = \dfrac{2(n+1)(n+\alpha+\beta+1)}{(2n+\alpha+\beta+1)(2n+\alpha+\beta+2)}\,P_{n+1}^{(\alpha,\beta)} + \dfrac{\beta^2-\alpha^2}{(2n+\alpha+\beta)(2n+\alpha+\beta+2)}\,P_n^{(\alpha,\beta)} + \dfrac{2(n+\alpha)(n+\beta)}{(2n+\alpha+\beta)(2n+\alpha+\beta+1)}\,P_{n-1}^{(\alpha,\beta)}$.

Applying the corollary yields that for $\alpha \ge \beta$ and $\alpha + \beta \ge 0$ we get positive linearization coefficients. However, for $\alpha \ge \beta$ and $\alpha + \beta < 0$ the sequence

$\beta_n = \dfrac{\beta^2-\alpha^2}{(2n+\alpha+\beta)(2n+\alpha+\beta+2)}$

is decreasing and we cannot apply any of the preceding results, although we know from [5] and [6] that the condition $\alpha + \beta + 1 \ge 0$ is sufficient.
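In the same spirit, expanding $xP_nP_m$ once through the recurrence for $P_m$ and once through the recurrence applied to the expansion (1) itself gives, on comparing the coefficients of $P_k$,

$\gamma_m c(n, m+1, k) = \gamma_{k-1}c(n, m, k-1) + (\beta_k - \beta_m)c(n, m, k) + \alpha_{k+1}c(n, m, k+1) - \alpha_m c(n, m-1, k)$,

which determines the row $c(n, m+1, \cdot)$ from the two previous rows, starting from $c(n, 0, k) = \delta_{nk}$. A computational sketch for the Jacobi case, using the recurrence of the Example with the convention $\alpha_0 = 0$ (the parameter values $\alpha = 1$, $\beta = \frac12$, the degree $n = 6$, and the tolerance are illustrative choices):

```python
import numpy as np

def jacobi_recurrence(a, b, N):
    """Coefficients of x P_n = gamma_n P_{n+1} + beta_n P_n + alpha_n P_{n-1}
    for the Jacobi polynomials P_n^{(a,b)}; alpha_0 = 0 by convention."""
    n = np.arange(1, N, dtype=float)
    s = 2 * n + a + b
    gamma = np.concatenate(([2.0 / (a + b + 2)],
                            2 * (n + 1) * (n + a + b + 1) / ((s + 1) * (s + 2))))
    beta = np.concatenate(([(b - a) / (a + b + 2)],
                           (b * b - a * a) / (s * (s + 2))))
    alpha = np.concatenate(([0.0],
                            2 * (n + a) * (n + b) / (s * (s + 1))))
    return alpha, beta, gamma

def linearization_table(n, alpha, beta, gamma):
    """Rows c(n, m, .) for m = 0, ..., n, built from c(n, 0, k) = delta_{nk}
    by the three-term recursion described above."""
    K = 2 * n + 1
    c = np.zeros((n + 1, K))
    c[0, n] = 1.0                       # P_n P_0 = P_n
    for m in range(n):
        for k in range(K):
            rhs = (beta[k] - beta[m]) * c[m, k]
            if k > 0:
                rhs += gamma[k - 1] * c[m, k - 1]
            if k + 1 < K:
                rhs += alpha[k + 1] * c[m, k + 1]
            if m > 0:
                rhs -= alpha[m] * c[m - 1, k]
            c[m + 1, k] = rhs / gamma[m]
    return c

# In the regime of the Corollary (a >= b, a + b >= 0) every computed
# coefficient comes out nonnegative, in agreement with Gasper's result.
a, b, n = 1.0, 0.5, 6
coeffs = jacobi_recurrence(a, b, 2 * n + 2)
table = linearization_table(n, *coeffs)
print(np.all(table > -1e-12))
```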

In part II of this paper we will discuss the problem of positive linearization under the assumption that $\beta_n$ is decreasing, starting from $n = 1$. This is more delicate because the assumptions on the $\alpha_n$'s and $\gamma_n$'s cannot be separated from those on the $\beta_n$'s.

REFERENCES

[1] R. ASKEY, Linearization of the product of orthogonal polynomials, in Problems in Analysis, R. Gunning, ed., Princeton University Press, Princeton, NJ, 1970, pp. 223-228.
[2] ——, Orthogonal Polynomials and Special Functions, Regional Conference Series in Applied Mathematics 21, Society for Industrial and Applied Mathematics, Philadelphia, PA, 1975.
[3] R. ASKEY AND G. GASPER, Linearization of the product of Jacobi polynomials. III, Canad. J. Math., 23 (1971), pp. 119-122.
[4] J. FAVARD, Sur les polynômes de Tchebycheff, C. R. Acad. Sci. Paris, 200 (1935), pp. 2052-2055.
[5] G. GASPER, Linearization of the product of Jacobi polynomials. I, Canad. J. Math., 22 (1970), pp. 171-175.
[6] ——, Linearization of the product of Jacobi polynomials. II, Canad. J. Math., 22 (1970), pp. 582-593.
[7] G. SZEGŐ, Orthogonal Polynomials, Fourth ed., Amer. Math. Soc. Colloq. Publ. 23, American Mathematical Society, Providence, RI, 1975.