Improving ECMWF's IFS model
Nils Wedi (wedi@ecmwf.int), Anna Agusti-Panareda, Gianpaolo Balsamo, Peter Bauer, Peter Bechtold, Willem Deconinck, Mikhail Diamantakis, Mats Hamrud, Christian Kuehnlein, Martin Leutbecher, Sarah-Jane Lock, Sylvie Malardel, Kristian Mogensen, George Mozdzynski, Pirkka Ollinaho, Irina Sandu, Piotr Smolarkiewicz, and many other colleagues
European Centre for Medium-Range Weather Forecasts (ECMWF)
July 27, 2016
Fit for the future
Research and development into:
- physics and dynamics coupling
- quantifying uncertainty
- Earth-system complexity
- novel and adaptive numerical methods
A flexible support infrastructure for European NWP science and services, to:
- maximise energy efficiency
- assure the best possible use of the available computing resources
Affordability: the art and cost of computing (Bauer et al., 2015)
[Chart: projected electricity cost per year (millions) for a 50-member ensemble vs. a single member, from 2015/16 to 2025, against a hard limit of ~20 MW for an entire HPC facility]
Technology advances may move the energy scale by one order of magnitude, so we need to be ready! (Aurora 180 PF, HPCwire, April 2015)
Big Data challenge, the 3 Vs: high Volume, high Velocity, high Variety
- Velocity: exponentially growing data archive (1995: 14 TB/year; 2015: 100 TB/day)
- Volume: satellite observations, 1996 to 2018 (projected)
- Variety: increase in products
Is the global spectral transform model dead? A technology applied at ECMWF for 30+ years
SISL: semi-implicit semi-Lagrangian time stepping with spectral transforms
Grid-point space (no grid staggering of prognostic variables):
- semi-Lagrangian advection
- physical parametrizations
- products of terms
FFT to Fourier space, then LT to spectral space.
Spectral space:
- horizontal gradients
- semi-implicit calculations
- horizontal diffusion
Inverse LT and inverse FFT return the fields to grid-point space.
(FFT: Fast Fourier Transform; LT: Legendre Transform)
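The division of labour between grid-point and spectral space can be illustrated with a minimal 1-D sketch (plain numpy, not IFS code; the real model combines FFTs in longitude with Legendre transforms in latitude, and the grid sizes here are arbitrary): nonlinear products are formed on the grid, while derivatives become simple multiplications in wave space.

```python
import numpy as np

# 1-D illustration of the spectral-transform idea: transform to wave
# space with an FFT, compute the horizontal gradient there (multiply
# by i*k), and transform back to grid-point space.
n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
f = np.sin(3.0 * x)                      # field in grid-point space

f_hat = np.fft.fft(f)                    # grid-point -> Fourier space
k = np.fft.fftfreq(n, d=1.0 / n)         # integer wavenumbers
dfdx_hat = 1j * k * f_hat                # derivative in spectral space
dfdx = np.fft.ifft(dfdx_hat).real        # Fourier -> grid-point space

# The spectral derivative is exact for band-limited fields: 3*cos(3x)
assert np.allclose(dfdx, 3.0 * np.cos(3.0 * x))
```

The same idea makes the semi-implicit (Helmholtz) solve and horizontal diffusion trivial in spectral space, which is why those steps sit on the spectral side of the cycle above.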
MPI communication cost at large core counts (230K cores)
Application performance on the Cray XC30: IFS data communication rates
[Chart: communication rates (Tb/s) for the semi-Lagrangian exchanges (MPI_send/recv), the FFT transposes (G2L, L2G; MPI_Alltoallv), the Legendre-transform transposes (L2M, M2L; MPI_Alltoallv), and the inside-spectral transposes (S2M, M2S; MPI_send/recv), comparing TCo1279 and TCo1999]
TCo1279 on 360 nodes; TCo1999 on 720 nodes; dt = 450 s; 48 h forecast
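The Alltoallv costs above come from data transpositions: before the Legendre transform, each task must hold complete latitudes for its subset of zonal wavenumbers, so every task exchanges one sub-block with every other task. A hedged, pure-numpy toy of that pattern (no MPI; task count and field sizes are invented for illustration):

```python
import numpy as np

# Toy model of the transposition behind the spectral transforms:
# P "tasks" each own a block of latitudes for all wavenumbers; after
# an all-to-all exchange each task owns all latitudes for a slice of
# wavenumbers -- the MPI_Alltoallv pattern measured on this slide.
P = 4                  # number of tasks (illustrative)
nlat, nwave = 8, 8     # latitudes and zonal wavenumbers (illustrative)

field = np.arange(nlat * nwave, dtype=float).reshape(nlat, nwave)

# blocks[p][q]: the piece task p sends to task q.
blocks = [np.array_split(row_block, P, axis=1)
          for row_block in np.array_split(field, P, axis=0)]

# All-to-all: task p receives blocks[q][p] from every task q.
recv = [[blocks[q][p] for q in range(P)] for p in range(P)]

# After the exchange, task p owns all latitudes of wavenumber slice p.
owned = [np.concatenate(recv[p], axis=0) for p in range(P)]
assert np.allclose(np.concatenate(owned, axis=1), field)
```

Every task touches every other task, which is why these transposes dominate the communication budget at large core counts.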
Time-to-solution at 13 km: IFS vs. models using an FV discretization (Michalakes et al., NGGPS AVEC report, 2015)
Scaling efficiency at 3 km: IFS (Michalakes et al., NGGPS AVEC report, 2015)
Time-to-solution at 3 km (an operational need!): IFS (adapted from Michalakes et al., NGGPS AVEC report, 2015)
RAPS14 performance: IFS model performance on the Cray XC30
[Chart: forecast days per day vs. number of Cray XC30 nodes, for EC-Earth (AR5 and AR6), ECMWF ensemble, and ECMWF HRES configurations at resolutions TL95, TL159, TL255, TL511, TL799, TL1023, TL1279, TCo639, and TCo1279]
A new grid for ECMWF
Equal-area (MPI) parallel decomposition (1600 tasks)
6,599,680 points x 137 levels at ~9 km: just below 1 billion points
EUROPEAN CENTRE FOR MEDIUM-RANGE WEATHER FORECASTS
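The point count on this slide can be reproduced from the construction of ECMWF's octahedral reduced Gaussian grid (the ~9 km grid here is TCo1279 on the O1280 grid): latitude row i, counting from the pole, carries 4i + 16 points, which sums to 4N^2 + 36N points for N rows per hemisphere. A small sketch (the helper name is mine, not an IFS routine):

```python
def octahedral_points(n: int) -> int:
    """Total number of grid points of the octahedral grid O<n>:
    latitude row i (from the pole) has 4*i + 16 points, over both
    hemispheres; closed form 4*n**2 + 36*n."""
    return 2 * sum(4 * i + 16 for i in range(1, n + 1))

assert octahedral_points(1280) == 6_599_680       # the number on the slide
assert octahedral_points(1280) * 137 == 904_156_160  # just below 1 billion
```

The near-equal area of the cells is what makes the equal-area MPI decomposition across 1600 tasks well balanced.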
Equations beyond the hydrostatic system: the finite-volume module (FVM)
O640 Held-Suarez test with real orography: surface pressure after 50 days of simulation
The compressible equations provide the most efficient solution as well as flexibility in the time integration procedure. [Courtesy Piotr Smolarkiewicz, Christian Kühnlein]
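For orientation, a generic sketch of the compressible equation set referred to here, in Lagrangian form with D/Dt the material derivative (FVM's exact formulation, following Smolarkiewicz and Kühnlein, uses its own choice of variables and may differ in detail):

```latex
\frac{D\rho}{Dt} = -\rho\,\nabla\cdot\mathbf{u}, \qquad
\frac{D\mathbf{u}}{Dt} = -\frac{1}{\rho}\nabla p - g\hat{\mathbf{z}}
  - 2\,\boldsymbol{\Omega}\times\mathbf{u}, \qquad
\frac{D\theta}{Dt} = 0
```

Here \(\rho\) is density, \(\mathbf{u}\) velocity, \(p\) pressure, \(\theta\) potential temperature, and \(\boldsymbol{\Omega}\) the Earth's rotation; no hydrostatic approximation is made, so vertically propagating acoustic modes are retained and handled implicitly.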
Further reading:
Traditional science workflow [Schulthess 2015]
Future science workflow: Energy-efficient SCalable Algorithms for weather Prediction at Exascale (ESCAPE), www.hpcescape.eu [Schulthess 2015]