Profiling and scalability of the high resolution NCEP model for Weather and Climate Simulations
Phani R., Sahai A. K., Suryachandra Rao A., Jeelani SMD
Indian Institute of Tropical Meteorology, Dr. Homi Bhabha Road, Pashan, Pune

Abstract — Coupled climate modeling has become one of the most challenging fields with the development of multicore architectures and its importance in daily weather and climate prediction. At IITM, the NCEP Climate Forecast System (CFS) and its atmospheric component, the Global Forecast System (GFS), have been installed on the PRITHVI cluster (~70 TF), an IBM Power6 architecture. Here, we investigate the scalability of the hybrid MPI-OpenMP models (CFS, GFS) and the limitations of hybrid spectral models at high resolutions. This paper also examines the performance and scaling of these two models on the teraflop cluster as the number of threads is varied, and describes preliminary results for the high resolution simulations.

Keywords — CFS; GFS; scalability; profiling

I. INTRODUCTION

With the advancement of science and of multi-core architectures in High Performance Computing (HPC), the need for, and expectation of, accurate weather and climate predictions from government agencies and agricultural organizations have increased. One of the daunting challenges for weather forecasters and climate modelers is to build more scalable models [1-3]. Efforts are being made to improve weather forecasts by running the models at high resolution with more sophisticated physics and radiation calls, which is very costly. In addition, the ensemble technique now used for seasonal and climate predictions makes the experiments highly computationally expensive. Presently, on an IBM Power6, the Global Forecast System (GFS) model at T574 spectral resolution (~27 km) takes ~24 minutes for a one-day simulation on 128 cores, which is time consuming. We examine how this can be scaled by varying the number of MPI tasks and OpenMP threads.
Also, with recent innovations increasing the density of cores per node and per chip, the challenge lies in scaling the model on these architectures by varying the MPI and OpenMP tasks. Speed-up and model performance are two main issues that are strongly governed by the system architecture [4,5]. The speed-up can be improved by increasing the number of MPI tasks or OpenMP threads in the hybrid MPI-OpenMP implementation of the model. Recent results suggest that work sharing between cores in a hybrid MPI-OpenMP implementation yields much improved performance and consumes less communication and I/O bandwidth [2]. Multicore architectures have therefore gained much significance for running hybrid MPI-OpenMP models [2,5,6]. In this paper, we examine the Climate Forecast System (CFS) model and its atmospheric component, the Global Forecast System (GFS) model, and their performance on the IBM Power6 PRITHVI. We confine our results to three spectral resolutions: CFS T382, CFS T126 and GFS T574. We study the scalability of GFS T574 by varying the number of threads, and of CFS T382 by varying the number of cores. The high-resolution model (GFS T382) is also compared with the low-resolution model (GFS T126), and we look closely at the variations between these simulations.

II. THE MODEL

The NCEP Climate Forecast System (CFS) version 2 [7] is a fully coupled ocean-land-atmosphere dynamical seasonal prediction model installed on PRITHVI (IITM) under the MOU between the Ministry of Earth Sciences (MOES) and NCEP. This model is the recently released operational version from NCEP. At IITM, the CFS model is run at two resolutions, T382 (~35 km) and T126 (~100 km), for seasonal and extended-range prediction. The atmospheric component of the CFS is the GFS model, which includes both global analysis and forecast components, so the GFS can also be used for weather forecasting.
The oceanic component of the CFS is MOM4p0 and the land-surface model is Noah. Interoperability is achieved by coupling the ocean, atmosphere, sea-ice and land-surface components through the Earth System Modeling Framework (ESMF) coupler, running under the Multiple Program Multiple Data (MPMD) paradigm.
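Under MPMD, each coupled component runs as its own executable on its own block of MPI ranks. The bookkeeping can be sketched as below; the helper and its names are illustrative, not the actual CFS launch interface, and the 128/60/1 split is the example quoted in the Ocean Model subsection.

```python
# Illustrative sketch (not CFS code): splitting an MPMD core budget
# among the ocean, coupler and atmosphere components of a coupled run.

def allocate_cores(total: int, ocean: int, coupler: int = 1) -> dict:
    """Return the per-component core counts for a given total budget."""
    atmosphere = total - ocean - coupler
    if atmosphere <= 0:
        raise ValueError("no cores left for the atmosphere component")
    return {"ocean": ocean, "coupler": coupler, "atmosphere": atmosphere}

# 128 cores with 60 for the ocean and 1 for the coupler leaves
# 67 cores for the atmospheric (GFS) component.
print(allocate_cores(128, ocean=60))
```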
A. Ocean Model

The NCEP CFS contains the MOM4p0 ocean model [8], a finite difference version of the ocean primitive equations configured under the Boussinesq and hydrostatic approximations. The model uses a tripolar grid with the transition latitude typically taken at 65°N; the other two poles are situated over land, which has no consequences for running the numerical ocean model. The horizontal layout is a staggered Arakawa B grid, with geometric height as the vertical coordinate. The model has varying resolution in the meridional direction, 0.25° between 10°S and 10°N, gradually increasing to 0.5° poleward of 30°N and 30°S, and a 0.5° resolution in the zonal direction. The time-step for the ocean model was 1800 seconds. Running the CFS differs from running the GFS in that the CFS must allocate some processors to the ocean component, which the GFS does not require. For example, if the model is allocated 128 cores with 60 cores for the ocean, the GFS runs on 67 cores, with one core allocated to the coupler.

B. Atmospheric Model

The GFS is a global atmospheric numerical weather prediction model developed by NCEP-NOAA [9,10]. It is a spectral triangular model with a hybrid MPI-OpenMP parallel implementation and has 64 levels in the vertical. Domain decomposition and data communication depend on the numerical methods used in the model. Most atmospheric models are three-dimensional, and the domain decomposition can be in up to three dimensions. Because of its domain-dependent computations, such as cloud microphysics and parameterization schemes, the NCEP GFS is a three-dimensional model with a one-dimensional decomposition. This limits the maximum number of MPI tasks for the GFS to the number of latitudes, but OpenMP threads can still be used. In this paper, we discuss the GFS T574 (~27 km) along with the CFS T382 (~35 km) model.
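The one-dimensional decomposition described above can be sketched as follows; the function is an illustrative stand-in for the model's latitude partitioning, not GFS source code. The 576 latitude rows used in the example correspond to the T382 grid quoted in the Fig. 4 caption.

```python
# Illustrative sketch (not GFS code): a 1D decomposition that deals
# contiguous blocks of latitude rows to MPI ranks. The rank count can
# never usefully exceed the number of latitude rows.

def decompose_latitudes(nlat, ntasks):
    """Assign each MPI task a contiguous range of latitude indices."""
    if ntasks > nlat:
        raise ValueError("more MPI tasks than latitude rows")
    base, extra = divmod(nlat, ntasks)
    blocks, start = [], 0
    for rank in range(ntasks):
        size = base + (1 if rank < extra else 0)
        blocks.append(range(start, start + size))
        start += size
    return blocks

# 576 latitude rows (the T382 grid of Fig. 4) over 400 tasks:
# 176 ranks get 2 rows each and 224 ranks get 1 row, so load
# imbalance grows as the task count approaches the row count.
blocks = decompose_latitudes(576, 400)
print(len(blocks), sum(len(b) for b in blocks))  # -> 400 576
```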
For each of these resolutions, forecast simulations were run for 8 days and the total runtimes averaged to one day. Both models were run in forecast mode with 6-hourly output; the time-step was 600 seconds for the CFS and 120 seconds for the GFS. In the last part of the paper, we compare the GFS T126 with the GFS T382 results under observed SST forcing. The GFS T126 (forced Sea Surface Temperature (SST)) and GFS T382 (forced SST) simulations were performed for 55 years and 40 years respectively, taking 21 days and 48 days respectively on 128 cores.

III. THE SYSTEM

PRITHVI's IBM Power6 processors support Simultaneous Multithreading (SMT), one of the multi-core technologies on which hybrid models are expected to give better performance. SMT is a processor technology that allows two separate instruction streams (threads) to run concurrently on the same physical processor, improving overall throughput. PRITHVI has 117 nodes with 128 GB of memory each, and each node is equipped with 16 Power6 processors, i.e. 32 physical cores. With SMT switched on, there are 64 logical cores (virtual CPUs). Since each node runs its own operating system, nodes can be rebooted or repaired independently of the others, resulting in higher availability of the overall system. Note that only the 32 physical cores within each node have direct access to the memory. Also, a hybrid model can use at most 64 cores per node on the IBM Power6 architecture; we can therefore vary the OpenMP threads up to a maximum of 32, with a minimum of 2 MPI tasks per node. IBM's MPI and LAPI are used for parallel communication, and the InfiniBand network is from QLogic.

Figure 1. GFS scalability plot for 128 and 400 MPI tasks, varying the number of threads.

IV. EFFICIENCY OF THREADING

A climate model solves the Navier-Stokes dynamical equations at each grid point by one of several methods. Among the finite difference, finite volume, finite element, spectral element and spectral methods, the spectral method gives good exponential convergence (for smooth solutions) and eliminates the pole problem when using spherical coordinates. Its main disadvantages are the occurrence of the Gibbs phenomenon and the limitation imposed by the spectral truncation. The climate models presented here, CFS and GFS, belong to this class. When running the CFS or the GFS, the atmospheric spectral dynamical core supports only a 1D decomposition over latitude. For this reason, we cannot increase the number of MPI tasks beyond the spectral truncation. For example, at GFS T574 spectral resolution the spectral triangular truncation is 574, and the number of MPI tasks cannot be larger than 574. The other option is to run the climate models with OpenMP threads, and we have performed experiments varying the OpenMP threads across different cores rather than running them on the same core. There was an expectation that running threads on different cores of a multi-core architecture would perform better than running them on a single core. We address this question on our multi-core PRITHVI system, which allows threads to run on virtual and physical cores concurrently. Our aim is to run the model with varying numbers of OpenMP threads within each node and study the scalability.

V. RESULTS AND DISCUSSION

Hybrid model scalability is best understood by varying the MPI tasks and the OpenMP threads. An initial experiment was performed on the GFS T574 atmospheric model, varying the OpenMP threads. In this experiment the total number of MPI tasks was fixed while the OpenMP threads per node were increased from 1 to 32, thereby decreasing the MPI tasks per node from 64 to 2 (1 OpenMP thread gives 64 MPI tasks per node, 2 threads give 32, 8 threads give 8, and 16 threads give 4). Two such experiments were performed, one with 128 MPI tasks and the other with 400 MPI tasks. For example, with 400 MPI tasks and 8 OpenMP threads, the model ran on 3200 cores, ~50 nodes, ~30 TF. Fig. 1 shows the scalability of GFS T574 for 400 and 128 MPI tasks. The GFS model is highly scalable with 128 MPI tasks, but with 400 MPI tasks it scales well only up to 8 threads and not efficiently beyond that. Although the 128 MPI task configuration scales better than the 400 MPI task configuration, it is slower.
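The node arithmetic behind these configurations can be checked with a small helper; the names are illustrative, and the 64 logical cores per node follow from the SMT description in Section III.

```python
# Illustrative sketch: core and node counts for a hybrid MPI-OpenMP run
# on nodes exposing 64 logical cores (32 physical Power6 cores, 2-way SMT).

NODE_LOGICAL_CORES = 64

def layout(mpi_tasks, omp_threads):
    """Return (total cores, nodes needed) for a hybrid run."""
    total = mpi_tasks * omp_threads
    nodes = -(-total // NODE_LOGICAL_CORES)  # ceiling division
    return total, nodes

print(layout(400, 8))  # -> (3200, 50), the ~50-node run quoted above
print(layout(128, 8))  # -> (1024, 16)
```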
With 8 OpenMP threads, the model takes 10.7 minutes with 128 MPI tasks (9.6 TF) but 5.3 minutes with 400 MPI tasks (30 TF). Keeping the same number of MPI tasks and OpenMP threads per node while changing the total number of MPI tasks thus has a strong effect on the timings. In fact, both curves approach a limit asymptotically as the number of threads per node is increased; this may be an inherent limitation of the model. The speed-up of a model is usually calculated as the ratio of the time taken on a single core to the time taken on a given number of cores; in this work, we define it as the ratio of the time taken with a single thread to the time taken with a given number of threads. The speed-up of the GFS model is the same for 128 and 400 MPI tasks up to 8 threads but departs from linearity at 16 threads, as plotted in Fig. 2. From these two figures, we see that one must be careful in selecting the right approach, since both curves have the same speed-up up to 8 threads. Although the 128 MPI task curve shows good scalability, 400 MPI tasks with 8 threads is the better option for running the model. Simply increasing the OpenMP threads on hybrid architectures is not always an option; a cap on the number of threads is required. In this experiment, 8 OpenMP threads per node give efficient scalability.

Figure 2. Speed-up of the GFS T574 model with varying numbers of threads for 128 and 400 MPI tasks.

Figure 3. Total time taken for a 1-day simulation of the GFS T574 and CFS T382 models with one OpenMP thread, varying the total number of cores. The number of MPI tasks cannot be increased beyond the spectral truncation.

The performance of the CFS and GFS models without threading is shown in Fig. 3. The MPI tasks were varied in both models with the OpenMP threads set to one. For the CFS T382, the number of cores for the ocean was fixed at 60. Both models are scalable up to the limit of their spectral truncations. It is interesting to compare the time taken by GFS T574 with threads (Fig. 1) and without threads (Fig. 3). For 512 total tasks, GFS T574 without threads takes 10.5 minutes, while 128 MPI tasks with 4 threads takes 15.9 minutes. The pure MPI job takes less time than the threaded job because the GFS is a spectral dynamical core with a 1D decomposition, so the MPI tasks can go up to the spectral truncation. Mean climatological precipitation over the Indian subcontinent is shown in Fig. 4, where the GFS (forced SST) T126 and T382 spectral resolutions are compared with observations. The observations are from the India Meteorological Department (IMD) and the Global Precipitation Climatology Project (GPCP). The IMD and GPCP data are on a 1° x 1° grid, while the GFS (forced SST) T126 and T382 data are on 1° x 1° and 0.3° x 0.3° grids respectively. Fig. 4 shows that the observations are in good agreement with the model over land, and that the increase in resolution sharpens the orographic features considerably. We observe a few dark patches that recur where the mountains are located; this is the Gibbs phenomenon inherent in spectral models at high resolution. In conclusion, the model output does not vary with the threads or the cores (the RMS error difference is zero), which leads us to believe that the hybrid MPI-OpenMP parallel implementation on multi-core architectures has tremendous potential for improved scalability. We have studied the performance and scalability of the GFS and CFS models on a multi-core architecture. GFS T574 is scalable up to 32 OpenMP threads with 128 MPI tasks, but the same is not true for 400 MPI tasks.
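The thread speed-up defined above (single-thread time divided by N-thread time) can be computed from a runtime table as below. The timing values are illustrative placeholders, not the measured GFS numbers, chosen only to mimic the reported loss of linearity beyond 8 threads.

```python
# Illustrative sketch: speed-up relative to the single-thread run,
# as defined in the text. The runtimes below are placeholder values,
# not the measured GFS T574 timings.

def thread_speedup(minutes_by_threads):
    """Map each thread count to t(1 thread) / t(n threads)."""
    t1 = minutes_by_threads[1]
    return {n: t1 / t for n, t in sorted(minutes_by_threads.items())}

# Near-linear up to 8 threads, then saturation (placeholder data).
times = {1: 40.0, 2: 20.5, 4: 10.6, 8: 5.6, 16: 4.9, 32: 4.6}
for n, s in thread_speedup(times).items():
    print(f"{n:2d} threads: speed-up {s:.2f}")
```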
Experimental results with the GFS T574 spectral resolution show that beyond 8 OpenMP threads the linearity in the speedup is lost. In this analysis, 400 MPI tasks with 8 OpenMP threads give better scalable results. Furthermore, the model improvement is performed with the high resolution model. This issue was addressed by looking at the CFS T126 and CFS T382 spectral models and comparing with the observational data. Though the orographic precipitation features have considerably increased over land but came across with Gibbs phenomenon at few places. Mere increase in the resolution may give us better results but the Gibbs phenomenon has to be taken into account in running the model. The GFS and CFS models are scalable with the MPI-tasks being less than the spectral triangular truncations. With MPI tasks near to the spectral triangular truncation, OpenMP threading gives better performance on the hybrid architectures ACKNOWLEDGMENT IITM is fully funded by Ministry of Earth Sciences. PR would like to thank Prof. Ravi Nanjundiah, IISc for his valuable inputs and Rajan, IBM. REFERENCES [1] Chunhua Liao, Zhenying Liu, Lei Huang and Barbara Chapman, Evaluating OpenMP on Chip MultiThreading Platforms, Lecture Notes in Computer Science, vol. 4315, pp , [2] David Champ, Christoph Garth, Hank Childs, Dave Pugmire and Kenneth I. Joy, Streamline Integration Using MPI-Hybrid Parallelism on a Large Multicore Architecture, IEEE Transactions on Visualization and Computer Graphics, 12, [3] H Hirata, K Kimura, S Nagamine, Y Mochizuki, A Nishimura, Y Nakase, T Nishizawa, An elementary processor architecture with simultaneous instruction issuing from multiple threads, Proceedings of the 19th annual international symposium on Computer architecture, , [4] I Foster, W. 
Gropp, R.Stevens, Parallel Scalability of the Spectral Transform Method", Proceedings of the fifth SIAM conference on "Parallel Processing on Scientific Computing, page 307, [5] John Drakea, Ian Foster, John Michalakesb, Brian Toonenb, Patrick Worleya, "Design and performance of a scalable parallel community climate model", Parallel Computing, 21, , [6] David Tam, Reza Azimi, Michael Stumm, Thread clustering: sharing-aware sheduling on SMP-CMp-SMT, Proceedings of the 2nd ACM SIGOPS/EuroSys European Conference on Computer Systems, [7] Suranjana, Saha et.al., The NCEP Climate Forecast System Version2, 2012, Submitted to Journal of Climate. [8] Stephen Griffies M. Harrison, Ronald C. Pacanowski and Antony Rosati, A technical guide to MoM4, GFDL Ocean Group Technical Report 5, NOAA, [9] NCEP NOAA: Envoronomental Modelling Center, The GFS Atmospheric model NCEP Office Note 442, Global Climate and Weather Modelling Branch, EMC, Camp Springs, Maryland, [10] Han J, H-L Pan, Revision and Convection of Vertical Diffusion in the NCEP Global Forecast System, Weather and Forecasting, 2011, 26,
Figure 4. Mean climatological precipitation for a) GFS T126 (forced Sea Surface Temperature (SST)) over 55 years at 384 x 190 grid resolution, b) GFS T382 (forced SST) over 40 years at 1152 x 576 grid resolution, c) IMD station observational data at 360 x 180 grid resolution, d) GPCP observational data at 360 x 180 grid resolution.
More informationESiWACE. A Center of Excellence for HPC applications to support cloud resolving earth system modelling
ESiWACE A Center of Excellence for HPC applications to support cloud resolving earth system modelling Joachim Biercamp, Panagiotis Adamidis, Philipp Neumann Deutsches Klimarechenzentrum (DKRZ) Motivation:
More informationOptimization strategy for MASNUM surface wave model
Hillsboro, September 27, 2018 Optimization strategy for MASNUM surface wave model Zhenya Song *, + * First Institute of Oceanography (FIO), State Oceanic Administrative (SOA), China + Intel Parallel Computing
More informationPolar COAWST. Coupled Atmosphere (Land) Ocean Sea Ice Wave Sediment Transport Modeling System for Polar Regions
U.S. Department of the Interior U.S. Geological Survey Polar COAWST Coupled Atmosphere (Land) Ocean Sea Ice Wave Sediment Transport Modeling System for Polar Regions David Bromwich 1, Le-Sheng Bai 1 Michael
More informationData Assimilation: Finding the Initial Conditions in Large Dynamical Systems. Eric Kostelich Data Mining Seminar, Feb. 6, 2006
Data Assimilation: Finding the Initial Conditions in Large Dynamical Systems Eric Kostelich Data Mining Seminar, Feb. 6, 2006 kostelich@asu.edu Co-Workers Istvan Szunyogh, Gyorgyi Gyarmati, Ed Ott, Brian
More informationAn Investigation of Reforecasting Applications for NGGPS Aviation Weather Prediction: An Initial Study of Ceiling and Visibility Prediction
An Investigation of Reforecasting Applications for NGGPS Aviation Weather Prediction: An Initial Study of Ceiling and Visibility Prediction Kathryn L. Verlinden, Oregon State University David Bright, WFO
More informationMoving to a simpler NCEP production suite
Moving to a simpler NCEP production suite Unified coupled global modeling Hendrik L. Tolman Director, Environmental Modeling Center NOAA / NWS / NCEP Hendrik.Tolman@NOAA.gov page 1 of 14 Content The suite
More informationPerformance of WRF using UPC
Performance of WRF using UPC Hee-Sik Kim and Jong-Gwan Do * Cray Korea ABSTRACT: The Weather Research and Forecasting (WRF) model is a next-generation mesoscale numerical weather prediction system. We
More informationExascale I/O challenges for Numerical Weather Prediction
Exascale I/O challenges for Numerical Weather Prediction A view from ECMWF Tiago Quintino, B. Raoult, S. Smart, A. Bonanni, F. Rathgeber, P. Bauer, N. Wedi ECMWF tiago.quintino@ecmwf.int SuperComputing
More informationACCELERATING WEATHER PREDICTION WITH NVIDIA GPUS
ACCELERATING WEATHER PREDICTION WITH NVIDIA GPUS Alan Gray, Developer Technology Engineer, NVIDIA ECMWF 18th Workshop on high performance computing in meteorology, 28 th September 2018 ESCAPE NVIDIA s
More informationProspects for subseasonal forecast of Tropical Cyclone statistics with the CFS
Prospects for subseasonal forecast of Tropical Cyclone statistics with the CFS Augustin Vintzileos (1)(3), Tim Marchok (2), Hua-Lu Pan (3) and Stephen J. Lord (1) SAIC (2) GFDL (3) EMC/NCEP/NOAA During
More informationWorld Weather Research Programme WWRP. PM Ruti WMO
World Weather Research Programme WWRP PM Ruti WMO Societal challenges: a 10y vision High Impact Weather and its socio-economic effects in the context of global change Water: Modelling and predicting the
More informationReflecting on the Goal and Baseline of Exascale Computing
Reflecting on the Goal and Baseline of Exascale Computing Thomas C. Schulthess!1 Tracking supercomputer performance over time? Linpack benchmark solves: Ax = b!2 Tracking supercomputer performance over
More informationUsing Aziz Supercomputer
The Center of Excellence for Climate Change Research Using Aziz Supercomputer Mansour Almazroui Director, Center of Excellence for Climate Change Research (CECCR) Head, Department of Meteorology King Abdulaziz
More informationICON-ESM MPI-M s next-generation Earth system model
ICON-ESM MPI-M s next-generation Earth system model Climate and Earth system models are applied to simulate the past, present, and projected future climate, and to advance understanding of processes that
More informationMultiple Ocean Analysis Initialization for Ensemble ENSO Prediction using NCEP CFSv2
Multiple Ocean Analysis Initialization for Ensemble ENSO Prediction using NCEP CFSv2 B. Huang 1,2, J. Zhu 1, L. Marx 1, J. L. Kinter 1,2 1 Center for Ocean-Land-Atmosphere Studies 2 Department of Atmospheric,
More informationLecture 1. Amplitude of the seasonal cycle in temperature
Lecture 6 Lecture 1 Ocean circulation Forcing and large-scale features Amplitude of the seasonal cycle in temperature 1 Atmosphere and ocean heat transport Trenberth and Caron (2001) False-colour satellite
More informationWRF Modeling System Overview
WRF Modeling System Overview Jimy Dudhia What is WRF? WRF: Weather Research and Forecasting Model Used for both research and operational forecasting It is a supported community model, i.e. a free and shared
More informationParalleliza(on and Performance of the NIM Weather Model on CPU, GPU and MIC Architectures
Paralleliza(on and Performance of the NIM Weather Model on CPU, GPU and MIC Architectures Mark Gove? NOAA Earth System Research Laboratory We Need Be?er Numerical Weather Predic(on Superstorm Sandy Hurricane
More informationM.Sc. in Meteorology. Physical Meteorology Prof Peter Lynch. Mathematical Computation Laboratory Dept. of Maths. Physics, UCD, Belfield.
M.Sc. in Meteorology Physical Meteorology Prof Peter Lynch Mathematical Computation Laboratory Dept. of Maths. Physics, UCD, Belfield. Climate Change???????????????? Tourists run through a swarm of pink
More informationHMON (HNMMB): Development of a new Hurricane model for NWS/NCEP operations
1 HMON (HNMMB): Development of a new Hurricane model for NWS/NCEP operations Avichal Mehra, EMC Hurricane and Mesoscale Teams Environmental Modeling Center NOAA / NWS / NCEP HMON: A New Operational Hurricane
More informationP3.1 Development of MOS Thunderstorm and Severe Thunderstorm Forecast Equations with Multiple Data Sources
P3.1 Development of MOS Thunderstorm and Severe Thunderstorm Forecast Equations with Multiple Data Sources Kathryn K. Hughes * Meteorological Development Laboratory Office of Science and Technology National
More informationIntroduction of a Stabilized Bi-Conjugate Gradient iterative solver for Helmholtz s Equation on the CMA GRAPES Global and Regional models.
Introduction of a Stabilized Bi-Conjugate Gradient iterative solver for Helmholtz s Equation on the CMA GRAPES Global and Regional models. Peng Hong Bo (IBM), Zaphiris Christidis (Lenovo) and Zhiyan Jin
More informationDevelopment of Super High Resolution Global and Regional Climate Models
Development of Super High Resolution Global and Regional Climate Models Project Representative Akira Noda Meteorological Research Institute Authors Akira Noda 1, Shoji Kusunoki 1 and Masanori Yoshizaki
More informationHYBRID GODAS STEVE PENNY, DAVE BEHRINGER, JIM CARTON, EUGENIA KALNAY, YAN XUE
STEPHEN G. PENNY UNIVERSITY OF MARYLAND (UMD) NATIONAL CENTERS FOR ENVIRONMENTAL PREDICTION (NCEP) HYBRID GODAS STEVE PENNY, DAVE BEHRINGER, JIM CARTON, EUGENIA KALNAY, YAN XUE NOAA CLIMATE REANALYSIS
More information1. Header Land-Atmosphere Predictability Using a Multi-Model Strategy Paul A. Dirmeyer (PI) Zhichang Guo (Co-I) Final Report
1. Header Land-Atmosphere Predictability Using a Multi-Model Strategy Paul A. Dirmeyer (PI) Zhichang Guo (Co-I) Final Report 2. Results and Accomplishments Output from multiple land surface schemes (LSS)
More informationModel Systems at MPI-M. Marco Giorgetta
Model Systems at MPI-M Marco Giorgetta Content What is a model in climate research as used here for IPCC? What is a model? Components of the climate system Climate models What is inside? Climate models
More informationWind Flow Modeling The Basis for Resource Assessment and Wind Power Forecasting
Wind Flow Modeling The Basis for Resource Assessment and Wind Power Forecasting Detlev Heinemann ForWind Center for Wind Energy Research Energy Meteorology Unit, Oldenburg University Contents Model Physics
More informationIntroduction to Climatology. GEOG/ENST 2331: Lecture 1
Introduction to Climatology GEOG/ENST 2331: Lecture 1 Us! Graham Saunders (RC 2006C) graham.saundersl@lakeheadu.ca! Jason Freeburn (RC 2004) jtfreebu@lakeheadu.ca Graham Saunders! Australian Weather Bureau!
More informationRecent advances in the GFDL Flexible Modeling System
Recent advances in the GFDL Flexible Modeling System 4th ENES HPC Workshop Toulouse, FRANCE V. Balaji and many others NOAA/GFDL and Princeton University 6 April 2016 V. Balaji (balaji@princeton.edu) GFDL
More informationClimate Change Modelling: BASICS AND CASE STUDIES
Climate Change Modelling: BASICS AND CASE STUDIES TERI-APN s Training program on Urban Climate Change Resilience 22 nd 23 rd January, 2014 Goa Saurabh Bhardwaj Associate Fellow Earth Science & Climate
More informationSPECIAL PROJECT FINAL REPORT
SPECIAL PROJECT FINAL REPORT All the following mandatory information needs to be provided. Project Title: Sensitivity of decadal forecast to atmospheric resolution and physics Computer Project Account:
More informationDeciphering the desiccation trend of the South Asian monsoon hydroclimate in a warming world
Deciphering the desiccation trend of the South Asian monsoon hydroclimate in a warming world R. Krishnan Centre for Climate Change Research (CCCR) Indian Institute of Tropical Meteorology, Pune Collaborators:
More informationThe Influence of Intraseasonal Variations on Medium- to Extended-Range Weather Forecasts over South America
486 MONTHLY WEATHER REVIEW The Influence of Intraseasonal Variations on Medium- to Extended-Range Weather Forecasts over South America CHARLES JONES Institute for Computational Earth System Science (ICESS),
More informationLong Range Forecast Update for 2014 Southwest Monsoon Rainfall
Earth System Science Organization (ESSO) Ministry of Earth Sciences (MoES) India Meteorological Department PRESS RELEASE New Delhi, 9 June 2014 Long Update for 2014 Southwest Monsoon Rainfall HIGHLIGHTS
More informationAN INTERNATIONAL SOLAR IRRADIANCE DATA INGEST SYSTEM FOR FORECASTING SOLAR POWER AND AGRICULTURAL CROP YIELDS
AN INTERNATIONAL SOLAR IRRADIANCE DATA INGEST SYSTEM FOR FORECASTING SOLAR POWER AND AGRICULTURAL CROP YIELDS James Hall JHTech PO Box 877 Divide, CO 80814 Email: jameshall@jhtech.com Jeffrey Hall JHTech
More informationScalability Programme at ECMWF
Scalability Programme at ECMWF Picture: Stan Tomov, ICL, University of Tennessee, Knoxville Peter Bauer, Mike Hawkins, George Mozdzynski, Tiago Quintino, Deborah Salmond, Stephan Siemen, Yannick Trémolet
More informationMesoscale meteorological models. Claire L. Vincent, Caroline Draxl and Joakim R. Nielsen
Mesoscale meteorological models Claire L. Vincent, Caroline Draxl and Joakim R. Nielsen Outline Mesoscale and synoptic scale meteorology Meteorological models Dynamics Parametrizations and interactions
More informationPolar Meteorology Group, Byrd Polar Research Center, The Ohio State University, Columbus, Ohio
JP2.14 ON ADAPTING A NEXT-GENERATION MESOSCALE MODEL FOR THE POLAR REGIONS* Keith M. Hines 1 and David H. Bromwich 1,2 1 Polar Meteorology Group, Byrd Polar Research Center, The Ohio State University,
More information3.4 THE IMPACT OF CONVECTIVE PARAMETERIZATION SCHEMES ON CLIMATE SENSITIVITY
3.4 THE IMPACT OF CONVECTIVE PARAMETERIZATION SCHEMES ON CLIMATE SENSITIVITY David J. Karoly*, Lance M. Leslie and Diandong Ren School of Meteorology, University of Oklahoma, Norman OK and Mark Leplastrier
More informationDevelopment of a High-Resolution Coupled Atmosphere-Ocean-Land General Circulation Model for Climate System Studies
Chapter 1 Earth Science Development of a High-Resolution Coupled Atmosphere-Ocean-Land General Circulation Model for Climate System Studies Project Representative Tatsushi Tokioka Frontier Research Center
More informationClimate Hazards Group, Department of Geography, University of California, Santa Barbara, CA, USA. 2
Forecasting seasonal agricultural droughts in East Africa using satellite based observations, land surface models and dynamical weather/climate forecasts Shraddhanand Shukla 1, Amy McNally 3,4, Greg Husak
More informationNonlinear atmospheric response to Arctic sea-ice loss under different sea ice scenarios
Nonlinear atmospheric response to Arctic sea-ice loss under different sea ice scenarios Hans Chen, Fuqing Zhang and Richard Alley Advanced Data Assimilation and Predictability Techniques The Pennsylvania
More informationRegional Climate Simulations with WRF Model
WDS'3 Proceedings of Contributed Papers, Part III, 8 84, 23. ISBN 978-8-737852-8 MATFYZPRESS Regional Climate Simulations with WRF Model J. Karlický Charles University in Prague, Faculty of Mathematics
More informationProcessing NOAA Observation Data over Hybrid Computer Systems for Comparative Climate Change Analysis
Processing NOAA Observation Data over Hybrid Computer Systems for Comparative Climate Change Analysis Xuan Shi 1,, Dali Wang 2 1 Department of Geosciences, University of Arkansas, Fayetteville, AR 72701,
More information