Fast algorithms based on asymptotic expansions of special functions
Alex Townsend (MIT)
CAM Colloquium, Cornell University, 23 January 2015
Introduction: Two trends in the computing era
Tabulations → Software: logarithms, Gauss quadrature, special functions.
Classical analysis ≈ Numerical analysis: fast multipole method, butterfly schemes, iterative algorithms, finite element method.
Introduction: A selection of my research
Algebraic geometry: solving $p(x, y) = q(x, y) = 0$ [Nakatsukasa, Noferini, & T., 2015], [Battles & Trefethen, 2004].
Approximation theory: low-rank approximation $f(x, y) \approx \sum_{j=1}^{K} c_j(y)\, r_j(x)$ [Trefethen & T., 2014].
Introduction: Many special functions are trigonometric-like
Trigonometric functions $\cos(\omega x)$, $\sin(\omega x)$; Chebyshev polynomials $T_n(x)$; Legendre polynomials $P_n(x)$; Bessel functions $J_\nu(z)$; Airy functions $\mathrm{Ai}(x)$.
Also Jacobi polynomials, Hermite polynomials, parabolic cylinder functions, etc.
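This "trigonometric-like" behavior is easy to check numerically. The sketch below (an illustration, not from the talk) uses SciPy's `eval_legendre` and the classical large-$n$ envelope $\sqrt{2/(\pi(n+\tfrac12))}$: away from the endpoints, $P_n(\cos\theta)(\sin\theta)^{1/2}$ oscillates with nearly constant amplitude, just like a scaled cosine.

```python
import numpy as np
from scipy.special import eval_legendre

n = 50
theta = np.linspace(0.2, np.pi - 0.2, 2001)   # stay away from theta = 0, pi
y = eval_legendre(n, np.cos(theta)) * np.sqrt(np.sin(theta))

# Classical large-n asymptotics predict the nearly constant envelope:
amp = np.sqrt(2.0 / (np.pi * (n + 0.5)))
```

For $n = 50$ the observed peak of $|y|$ agrees with `amp` to within a few percent.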
Introduction: Asymptotic expansions of special functions
Legendre polynomials (Stieltjes):
$$P_n(\cos\theta) \sim C_n \sum_{m=0}^{M-1} h_{m,n}\, \frac{\cos\!\big((n+m+\tfrac12)\theta - (m+\tfrac12)\tfrac{\pi}{2}\big)}{(2\sin\theta)^{m+1/2}}, \qquad n \to \infty.$$
Bessel functions (Hankel):
$$J_0(z) \sim \sqrt{\frac{2}{\pi z}}\left(\cos\!\big(z-\tfrac{\pi}{4}\big)\sum_{m=0}^{M-1}\frac{(-1)^m a_{2m}}{z^{2m}} \;-\; \sin\!\big(z-\tfrac{\pi}{4}\big)\sum_{m=0}^{M-1}\frac{(-1)^m a_{2m+1}}{z^{2m+1}}\right), \qquad z \to \infty.$$
Introduction: Numerical pitfalls of an asymptotic expansionist
$$J_0(z) = \sqrt{\frac{2}{\pi z}}\left(\cos\!\big(z-\tfrac{\pi}{4}\big)\sum_{m=0}^{M-1}\frac{(-1)^m a_{2m}}{z^{2m}} \;-\; \sin\!\big(z-\tfrac{\pi}{4}\big)\sum_{m=0}^{M-1}\frac{(-1)^m a_{2m+1}}{z^{2m+1}}\right) + R_M(z).$$
Fix M. Where is the asymptotic expansion accurate?
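A minimal sketch of the truncated Hankel expansion, using the classical coefficients $a_k = (-1)^k((2k-1)!!)^2/(8^k k!)$ (the slide does not define $a_k$, so this formula is stated here as an assumption from the standard literature):

```python
import numpy as np
from math import factorial

def hankel_a(k):
    """Classical Hankel coefficient for nu = 0: a_k = (-1)^k ((2k-1)!!)^2 / (8^k k!)."""
    double_fact = 1.0
    for j in range(1, 2 * k, 2):   # (2k-1)!! = 1 * 3 * ... * (2k-1)
        double_fact *= j
    return (-1) ** k * double_fact ** 2 / (8.0 ** k * factorial(k))

def j0_asy(z, M=5):
    """Evaluate J_0(z) for large z by truncating Hankel's asymptotic expansion."""
    chi = z - np.pi / 4
    P = sum((-1) ** m * hankel_a(2 * m) / z ** (2 * m) for m in range(M))
    Q = sum((-1) ** m * hankel_a(2 * m + 1) / z ** (2 * m + 1) for m in range(M))
    return np.sqrt(2 / (np.pi * z)) * (P * np.cos(chi) - Q * np.sin(chi))
```

For $z = 50$ and $M = 5$ the truncation error is near machine precision; for small $z$ the same expansion is useless, which is exactly the pitfall the slide warns about.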
Introduction: Talk outline
Using asymptotic expansions of special functions:
I. Computing Gauss quadrature rules. [Hale & T., 2013], [T., Trogdon, & Olver, 2015]
II. Fast transforms based on asymptotic expansions. [Hale & T., 2014], [T., 2015]
Part I: Computing Gauss quadrature rules
$P_n(x)$ = Legendre polynomial of degree $n$.
Part I: Definition
$n$-point quadrature rule: $\int_{-1}^{1} f(x)\,dx \approx \sum_{k=1}^{n} w_k f(x_k)$.
Gauss quadrature rule: $x_k$ = $k$th root of $P_n$, $\quad w_k = 2(1-x_k^2)^{-1}\,[P_n'(x_k)]^{-2}$.
1. Exactly integrates polynomials of degree $2n-1$. Best possible.
2. Computed by the Golub–Welsch algorithm. [Golub & Welsch, 1969]
3. In 2004: no scheme for computing Gauss rules in $O(n)$. [Bailey, Jeyabalan, & Li, 2004]
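The Golub–Welsch idea can be sketched in a few lines (the textbook symmetric-Jacobi-matrix formulation, not the authors' code): the Gauss–Legendre nodes are the eigenvalues of a symmetric tridiagonal matrix, and the weights come from the first components of its eigenvectors. The dense eigendecomposition cost is what the Newton-based approach of Part I avoids.

```python
import numpy as np

def gauss_legendre_gw(n):
    """Golub-Welsch: Gauss-Legendre nodes/weights via the symmetric Jacobi matrix."""
    k = np.arange(1, n)
    beta = k / np.sqrt(4 * k**2 - 1)        # off-diagonal recurrence coefficients
    J = np.diag(beta, 1) + np.diag(beta, -1)
    x, V = np.linalg.eigh(J)                # eigenvalues are the nodes
    w = 2 * V[0, :]**2                      # weights from first eigenvector components
    return x, w
```

Usage: `x, w = gauss_legendre_gw(5)` returns the 5-point rule; the weights sum to 2 and integrate polynomials up to degree 9 exactly.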
Part I: Gauss quadrature experiments
[Plots: quadrature error and execution time against n.]
The Golub–Welsch algorithm is beautiful, but not the state-of-the-art. There is another way...
Part I: Newton's method (with asymptotics and a change of variables)
Solve $P_n(\cos\theta)(\sin\theta)^{1/2} = 0$, $\theta \in (0, \pi)$.
1. Evaluation scheme for $P_n(\cos\theta)(\sin\theta)^{1/2}$.
2. Evaluation scheme for $\frac{d}{d\theta}\big[P_n(\cos\theta)(\sin\theta)^{1/2}\big]$.
3. Initial guesses for $\theta_k = \cos^{-1}(x_k)$.
Part I: Evaluation scheme for $P_n$
Interior (in the grey region):
$$P_n(\cos\theta)(\sin\theta)^{1/2} \approx C_n\, (\sin\theta)^{1/2} \sum_{m=0}^{M-1} h_{m,n}\, \frac{\cos\!\big((n+m+\tfrac12)\theta - (m+\tfrac12)\tfrac{\pi}{2}\big)}{(2\sin\theta)^{m+1/2}}.$$
Boundary:
$$P_n(\cos\theta)(\sin\theta)^{1/2} \approx \theta^{1/2}\left(J_0(\rho\theta)\sum_{m=0}^{M-1}\frac{A_m(\theta)}{\rho^{2m}} + \theta\, J_1(\rho\theta)\sum_{m=0}^{M-1}\frac{B_m(\theta)}{\rho^{2m+1}}\right), \qquad \rho = n + \tfrac12.$$
Theorem. For $\varepsilon > 0$, $n \ge 200$, and initial guesses $\theta_k^{\mathrm{init}} = (k - \tfrac12)\pi/n$ [Baratella & Gatteschi, 1988], Newton's method converges and the nodes are computed to an accuracy of $O(\varepsilon)$ in $O(n\log(1/\varepsilon))$ operations.
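The Newton iteration itself is short. For clarity this sketch evaluates $P_n$ and $P_n'$ by the three-term recurrence rather than the interior/boundary asymptotic formulas, and uses a classical Tricomi-style initial guess; both choices are illustrative assumptions (the $O(n)$ method replaces the recurrence with the asymptotic evaluation schemes above).

```python
import numpy as np

def legendre_and_deriv(n, x):
    """Evaluate P_n(x) and P_n'(x) via the three-term recurrence."""
    p0, p1 = np.ones_like(x), x
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    dp = n * (x * p1 - p0) / (x**2 - 1)   # valid at interior points only
    return p1, dp

def legendre_roots_newton(n, tol=1e-14):
    k = np.arange(1, n + 1)
    # Tricomi-style initial guesses in theta-space (a classical approximation)
    x = np.cos((k - 0.25) * np.pi / (n + 0.5))
    for _ in range(10):
        p, dp = legendre_and_deriv(n, x)
        dx = p / dp
        x -= dx
        if np.max(np.abs(dx)) < tol:
            break
    return np.sort(x)
```

For moderate $n$ this converges in three or four Newton steps; the recurrence makes each step $O(n)$ per node, i.e. $O(n^2)$ overall, which is the bottleneck the asymptotic evaluation removes.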
Part I: Initial guesses for the nodes
Interior: use Tricomi's initial guesses. [Tricomi, 1950]
Boundary: use Olver's initial guesses. [Olver, 1974]
[Plot: max error in the initial guesses.]
There are now better initial guesses. [Bogaert, 2014]
Part I: The race to compute high-order Gauss quadrature [T., 2015]
Part I: Applications
1. Psychological: Gauss quadrature rules are cheap.
2. Experimental mathematics: high-precision quadrature. [Bailey, Jeyabalan, & Li, 2004]
3. Cosmic microwave background: noise removal by $L^2$-projection.
[Plot: quadrature error for $\int_{-1}^{1} \frac{dx}{1 + 100^2 x^2}$.] Generically, composite rules converge algebraically.
Part II: Fast transforms based on asymptotic expansions
$$P_N \mathbf{c} = \begin{pmatrix} P_0(\cos\theta_0) & \cdots & P_{N-1}(\cos\theta_0) \\ \vdots & & \vdots \\ P_0(\cos\theta_{N-1}) & \cdots & P_{N-1}(\cos\theta_{N-1}) \end{pmatrix} \begin{pmatrix} c_0 \\ \vdots \\ c_{N-1} \end{pmatrix}, \qquad \theta_k = \frac{k\pi}{N-1}.$$
Part II: Fast transforms as evaluation schemes
DFT: $f_k = \sum_{n=0}^{N-1} c_n z_k^n$, $\quad z_k = e^{2\pi i k/N}$.
DCT: $f_k = \sum_{n=0}^{N-1} c_n \cos(n\theta_k)$, equivalently $f_k = \sum_{n=0}^{N-1} c_n T_n(\cos\theta_k)$, $\quad \theta_k = k\pi/(N-1)$.
DLT: $f_k = \sum_{n=0}^{N-1} c_n P_n(\cos\theta_k)$.
DHT: $f_k = \sum_{n=0}^{N-1} c_n J_\nu(j_{\nu,n} r_k)$, $\quad r_k = j_{\nu,k}/j_{\nu,N}$, where $j_{\nu,n}$ = $n$th positive root of $J_\nu$.
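As a baseline, the DLT can be written directly as an $O(N^2)$ matrix-vector product (a sketch; the node choice $\theta_k = k\pi/(N-1)$ follows the definition of $P_N\mathbf{c}$ earlier in the talk):

```python
import numpy as np

def dlt_direct(c):
    """Direct discrete Legendre transform: f_k = sum_n c_n P_n(cos theta_k), O(N^2).
    Requires N >= 2 so the nodes theta_k = k*pi/(N-1) are defined."""
    N = len(c)
    x = np.cos(np.arange(N) * np.pi / (N - 1))
    # Build the columns P_n(x_k) by the three-term recurrence
    P = np.zeros((N, N))
    P[:, 0] = 1.0
    P[:, 1] = x
    for n in range(1, N - 1):
        P[:, n + 1] = ((2 * n + 1) * x * P[:, n] - n * P[:, n - 1]) / (n + 1)
    return P @ c
```

This dense baseline is what the asymptotic decomposition on the next slides accelerates.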
Part II: Related work
[Pictured: James Cooley, Leslie Greengard, Emmanuel Candès, Daniel Potts.]
Ahmed, Tukey, Gauss, Gentleman, Natarajan, Rao; Alpert, Barnes, Hut, Martinsson, Rokhlin, Mohlenkamp, Tygert; Boag, Demanet, Michielssen, O'Neil, Ying; Driscoll, Healy, Steidl, Tasche. Many others: Candel, Cree, Johnson, Mori, Orszag, Suda, Sugihara, Takami, etc.
Fast transforms are very important.
Part II: Asymptotic expansions as a matrix decomposition
The asymptotic expansion
$$P_n(\cos\theta_k) = C_n \sum_{m=0}^{M-1} h_{m,n}\, \frac{\cos\!\big((n+m+\tfrac12)\theta_k - (m+\tfrac12)\tfrac{\pi}{2}\big)}{(2\sin\theta_k)^{m+1/2}} + R_{M,n}(\theta_k)$$
gives a matrix decomposition (a sum of diagonally scaled DCTs and DSTs):
$$P_N = \sum_{m=0}^{M-1}\left( D_{u_m}\, \mathcal{C}_N\, D_{h_m} + D_{v_m} \begin{pmatrix} 0 & 0 & 0 \\ 0 & \mathcal{S}_{N-2} & 0 \\ 0 & 0 & 0 \end{pmatrix} D_{h_m} \right) + R_{M,N}, \qquad P_N = P_N^{\mathrm{ASY}} + R_{M,N}.$$
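The identity behind this decomposition is elementary: since $(n+m+\tfrac12)\theta - (m+\tfrac12)\tfrac{\pi}{2} = n\theta + (m+\tfrac12)(\theta - \tfrac{\pi}{2})$, each expansion term splits into a cosine sum and a sine sum with row-dependent diagonal scalings. A dense sketch (the fast version replaces the dense sums by FFT-based DCTs/DSTs, and the $h_{m,n}$ and $(2\sin\theta_k)^{-m-1/2}$ scalings are absorbed into $c$ and the row scaling here):

```python
import numpy as np

def asy_term_matvec(c, m, theta):
    """One expansion term as diagonally scaled cosine + sine sums.
    Uses cos(n*theta + phi) = cos(phi)cos(n*theta) - sin(phi)sin(n*theta),
    with phi = (m + 1/2)(theta - pi/2). Dense for clarity, O(N^2)."""
    n = np.arange(len(c))
    phi = (m + 0.5) * (theta - np.pi / 2)
    cos_sum = np.cos(np.outer(theta, n)) @ c   # DCT-like part
    sin_sum = np.sin(np.outer(theta, n)) @ c   # DST-like part
    return np.cos(phi) * cos_sum - np.sin(phi) * sin_sum
```

The test below checks this against direct evaluation of $\sum_n c_n \cos\big((n+m+\tfrac12)\theta - (m+\tfrac12)\tfrac{\pi}{2}\big)$.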
Part II: Be careful and stay safe
[Plot: the error curve $|R_{M,n}(\theta_k)| = \varepsilon$.] Fix M. Where is the asymptotic expansion accurate?
The entries of $R_{M,N}$ are $R_{M,n}(\theta_k)$, with
$$|R_{M,n}(\theta_k)| \le \frac{2\, C_n\, h_{M,n}}{(2\sin\theta_k)^{M+1/2}}.$$
Part II: Partitioning and balancing competing costs
Theorem. The matrix-vector product $f = P_N \mathbf{c}$ can be computed in $O\big(N(\log N)^2/\log\log N\big)$ operations.
[Builds: partitions of the matrix for $\alpha$ too small, $\alpha$ too large, and the balanced choice $\alpha = O(1/\log N)$.]
With $\alpha = O(1/\log N)$, the asymptotic pieces $P_N^{(j)}\mathbf{c}$ and the direct-evaluation piece $P_N^{\mathrm{EVAL}}\mathbf{c}$ each cost $O\big(N(\log N)^2/\log\log N\big)$ operations.
Part II: Numerical results
[Plots: execution time against N for the direct approach and the asymptotics-based transform; modification of rough signals.]
No precomputation.
$$\int_{-1}^{1} P_n(x)\, e^{i\omega x}\, dx = i^n \sqrt{\frac{2\pi}{\omega}}\, J_{n+\frac12}(\omega).$$
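The displayed identity is easy to sanity-check numerically (a sketch using SciPy's `eval_legendre` and `jv` together with a dense trapezoid rule; the grid size is chosen large enough that quadrature error is negligible):

```python
import numpy as np
from scipy.special import eval_legendre, jv

def legendre_fourier(n, w, pts=200001):
    """Approximate the integral of P_n(x) e^{i w x} over [-1, 1] by the trapezoid rule."""
    x = np.linspace(-1.0, 1.0, pts)
    f = eval_legendre(n, x) * np.exp(1j * w * x)
    h = x[1] - x[0]
    return h * (f.sum() - 0.5 * (f[0] + f[-1]))
```

Comparing `legendre_fourier(n, w)` with $i^n \sqrt{2\pi/\omega}\, J_{n+1/2}(\omega)$ confirms the identity to quadrature accuracy.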
Future work: Fast spherical harmonic transform
$$f(\theta, \phi) = \sum_{l=0}^{\infty} \sum_{m=-l}^{l} \alpha_l^m\, P_l^m(\cos\theta)\, e^{im\phi}.$$
Existing approaches: [Mohlenkamp, 1999], [Rokhlin & Tygert, 2006], [Tygert, 2008].
[Plot: precomputation, fast transform, and direct approach timings against N.]
A new generation of fast algorithms with no precomputation.
Thank you
References
D. H. Bailey, K. Jeyabalan, and X. S. Li, A comparison of three high-precision quadrature schemes, 2005.
P. Baratella and L. Gatteschi, The bounds for the error term of an asymptotic approximation of Jacobi polynomials, Lecture Notes in Mathematics, 1988.
Z. Battles and L. N. Trefethen, An extension of Matlab to continuous functions and operators, SISC, 2004.
I. Bogaert, Iteration-free computation of Gauss–Legendre quadrature nodes and weights, SISC, 2014.
G. H. Golub and J. H. Welsch, Calculation of Gauss quadrature rules, Math. Comput., 1969.
N. Hale and A. Townsend, Fast and accurate computation of Gauss–Legendre and Gauss–Jacobi quadrature nodes and weights, SISC, 2013.
N. Hale and A. Townsend, A fast, simple, and stable Chebyshev–Legendre transform using an asymptotic formula, SISC, 2014.
Y. Nakatsukasa, V. Noferini, and A. Townsend, Computing the common zeros of two bivariate functions via Bézout resultants, Numerische Mathematik, 2014.
F. W. J. Olver, Asymptotics and Special Functions, Academic Press, New York, 1974.
A. Townsend and L. N. Trefethen, An extension of Chebfun to two dimensions, SISC, 2013.
A. Townsend, T. Trogdon, and S. Olver, Fast computation of Gauss quadrature nodes and weights on the whole real line, to appear in IMA J. Numer. Anal., 2015.
A. Townsend, The race to compute high-order Gauss quadrature, SIAM News, 2015.
A. Townsend, A fast analysis-based discrete Hankel transform using asymptotic expansions, submitted, 2015.
F. G. Tricomi, Sugli zeri dei polinomi sferici ed ultrasferici, Ann. Mat. Pura Appl., 1950.