Long-range Interactions in Particle Simulations: ScaFaCoS. Olaf Lenz


Long-range Interactions in Particle Simulations: ScaFaCoS 1/33

Outline
- Many-particle simulations
- Short-range interactions
- Long-range interactions: ScaFaCoS
- If time permits: ESPResSo and coarse-graining
Many slides courtesy of Axel Arnold (ICP Stuttgart) and Godehard Sutmann (FZ Jülich). The focus is on algorithms, not so much on the mathematics (the time is too short, and I am not the right person for it). 2/33

Many-particle Simulations
Many-particle simulations are a standard tool in many branches of science:
- Plasma physics
- Physical chemistry
- Biochemistry and biophysics (Molecular Dynamics, MD)
- Soft matter
- Materials science
- Astrophysics and cosmology 3/33

The New Trinity of Physics?
- Experiment
- Theory
- Computer simulations (a.k.a. "computer experiments") 4/33

Usage of FZ Jülich Supercomputers
- Tier-0: JUGENE, a Blue Gene/P
- Tier-1: JUROPA, a Bull/Sun Nehalem cluster
[Charts: usage of JUGENE and JUROPA by field — astrophysics, biophysics, chemistry, earth & environment, plasma physics, soft matter, fluid dynamics, elementary particle physics, computer science, condensed matter, materials science.] 5/33

How To Do a Many-Particle Simulation?
- System: N particles at positions x_i, with O(N²) (pair) interactions between them
- Numerically integrate Newton's equations of motion; every timestep involves computing the forces f(x_i) and propagating the x_i
- Computational cost of a naïve implementation:
  - Propagating the particles: O(N)
  - Computing the interactions f(x_i): O(N²) 6/33
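To make the cost structure concrete, here is a minimal sketch (my own illustration, not code from the talk) of the naïve O(N²) force loop and one velocity-Verlet step, using Lennard-Jones particles as a stand-in for the pair interaction:

    import numpy as np

    def lj_forces(x, eps=1.0, sig=1.0):
        """All-pairs Lennard-Jones forces; cost scales as O(N^2)."""
        n = len(x)
        f = np.zeros_like(x)
        for i in range(n):
            for j in range(i + 1, n):
                r = x[i] - x[j]
                r2 = np.dot(r, r)
                sr6 = (sig * sig / r2) ** 3
                # F = -dU/dr * r_hat, with U = 4 eps (sr12 - sr6)
                fij = 24.0 * eps * (2.0 * sr6 * sr6 - sr6) / r2 * r
                f[i] += fij
                f[j] -= fij
        return f

    def velocity_verlet(x, v, f, dt, m=1.0):
        """Propagate positions and velocities by one timestep (O(N))."""
        v_half = v + 0.5 * dt * f / m
        x_new = x + dt * v_half
        f_new = lj_forces(x_new)
        v_new = v_half + 0.5 * dt * f_new / m
        return x_new, v_new, f_new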

Example: All-Atom Molecular Dynamics
- e.g. AMBER, GROMACS, NAMD, …
- Interactions (AMBER/Sander force field): Lennard-Jones (modelling van der Waals attraction and Pauli exclusion), H-bonds, Coulomb interaction 7/33

Short-range vs. Long-range Interactions
- Many interactions are short-ranged, i.e. it is possible to cut off the interaction at a certain distance
- Formal definition: a pair potential U(r) is short-ranged if it decays faster than r^(-d), where d is the spatial dimension, so that the contribution from beyond any cutoff remains bounded
- Revisiting AMBER: the Lennard-Jones and H-bond terms (~r^-6 and steeper) are short-ranged; the Coulomb interaction (~r^-1) is long-ranged 8/33

Speeding Up Short-range Interactions: Cell Lists
- a.k.a. linked-cell algorithm
- Split the simulation box into cells with side length ≥ r_cut
- Create and maintain a list of the particles in each cell
- Only compute interactions with particles in the same and neighboring cells
- Good news: reduces the time complexity to O(N) (at constant density)!
D. Frenkel, B. Smit: Understanding Molecular Simulation. Academic Press, San Diego, 2002. 9/33
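A minimal sketch of the linked-cell idea (function names are mine, not from any package): bin the particles into cells of side at least r_cut and collect candidate pairs only from the same and neighboring cells. A final distance check against r_cut is still needed on the returned pairs.

    import numpy as np
    from itertools import product

    def build_cells(x, box, r_cut):
        n_cells = max(1, int(box // r_cut))   # cells per side, side >= r_cut
        cells = {}
        for i, xi in enumerate(x):
            idx = tuple((xi // (box / n_cells)).astype(int) % n_cells)
            cells.setdefault(idx, []).append(i)
        return cells, n_cells

    def candidate_pairs(x, box, r_cut):
        cells, n_cells = build_cells(x, box, r_cut)
        pairs = set()
        for idx, members in cells.items():
            # same cell plus all neighboring cells, with periodic wrap-around
            for off in product((-1, 0, 1), repeat=len(idx)):
                nb = tuple((c + o) % n_cells for c, o in zip(idx, off))
                for i in members:
                    for j in cells.get(nb, ()):
                        if i < j:
                            pairs.add((i, j))
        return pairs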

Speeding Up Short-range Interactions: Verlet Lists
- For each particle, store a list of all particles within a distance r_s = r_cut + skin
- Verlet lists need to be updated only when a particle has travelled further than half the skin
- Reduces the number of interaction partners by ~40% compared to cell lists
- Used on top of cell lists
- Significant memory requirements — most probably not a very good algorithm in the future!
D. Frenkel, B. Smit: Understanding Molecular Simulation. Academic Press, San Diego, 2002. 10/33
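In the same spirit, a sketch (again with my own names) of a Verlet list with skin, together with the rebuild criterion:

    import numpy as np

    def build_verlet_list(x, r_cut, skin):
        """Store all partners within r_s = r_cut + skin."""
        r_s = r_cut + skin
        n = len(x)
        nlist = [[] for _ in range(n)]
        for i in range(n):
            for j in range(i + 1, n):      # in practice built from cell lists
                if np.linalg.norm(x[i] - x[j]) < r_s:
                    nlist[i].append(j)
        return nlist, x.copy()             # remember positions at build time

    def needs_rebuild(x, x_at_build, skin):
        """Rebuild once any particle has moved further than skin/2."""
        disp = np.linalg.norm(x - x_at_build, axis=1)
        return disp.max() > 0.5 * skin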

Parallelizing Short-range Interactions
- Nowadays rarely atomic decomposition; typically domain decomposition
- Next-neighbor communication; the size of the border region is r_cut
- Can employ cell-list data structures
- Various tricks of the trade, improved algorithms, load balancing
- Still not very good for exascale computing: domains should not become smaller than r_cut, and a few hundred particles per process are required for efficiency
Stolen from W. Smith, Daresbury Lab, Warrington, UK. 11/33

Periodic Boundary Conditions
- Simulated systems are much smaller than real systems, so boundaries make up a significant part of the system!
- Trick: periodic boundary conditions — the simulated system has infinitely many copies of itself in all directions
- A particle at the right boundary interacts with the image of the particle at the left boundary
- Minimum image convention: each particle interacts only with the closest image of another particle
- Pseudo-infinite system without boundaries; significantly reduces boundary-related errors 12/33
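For a cubic box the minimum image convention is a one-liner; this sketch (a standard textbook trick, with my own wrapper name) folds each distance component into [-L/2, L/2):

    import numpy as np

    def minimum_image(r, box_l):
        """Fold a distance vector onto its closest periodic image."""
        return r - box_l * np.round(r / box_l)

    # usage: dr = minimum_image(x[i] - x[j], box_l); r2 = np.dot(dr, dr)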

Long-range Interactions: the Coulomb Potential
- For long-range interactions (LRI), these tricks do not work!
- Why can't we just cut off anyway? Long-range interactions cause qualitatively different behavior, e.g. charge inversion and like-charge attraction
- Direct summation: O(N²), or equivalently solving the Poisson equation
- Worse in PBC: several periodic images need to be taken into account, and the sum converges slowly 13/33
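For reference, what that slowly convergent sum looks like (a standard textbook expression in Gaussian units, not given explicitly on the slide): the electrostatic energy under periodic boundary conditions is the lattice sum

\[ E \;=\; \frac{1}{2} \sum_{\mathbf{n}\in\mathbb{Z}^3} \;{\sum_{i,j=1}^{N}}' \; \frac{q_i q_j}{\lvert \mathbf{r}_{ij} + \mathbf{n}L \rvert}, \]

where the prime excludes i = j in the n = 0 cell. The sum over image cells n converges only conditionally — its value depends on the order of summation — which is why direct truncation fails and dedicated methods are needed.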

Methods for Long-range Interactions
- Luckily, a number of smart algorithms do exist
- Unfortunately, they are complex, and often they do not parallelize easily
- Prefactors and parallel scaling are highly platform- and implementation-dependent
- LRI often account for a significant part of the computational cost 14/33

Enter: ScaFaCoS
- "Scalable Fast Coulomb Solvers": a library that provides various of these algorithms through a common interface, with a focus on scalability
- Financial support by the BMBF HPC-1 call
- Coordinated by the Forschungszentrum Jülich (IAS/JSC)
- University partners: Bonn, Chemnitz, Stuttgart, Wuppertal
- Research centres: MPIP Mainz, Fraunhofer SCAI St. Augustin
- Industry: BASF, Cognis, IBM
- Alas, not finished yet! Only preliminary results; publication expected in Q2 2012 15/33

Ewald Summation
- Not implemented in ScaFaCoS, discussed here just for understanding; useful mostly for periodic boundary conditions
- Two problems with the Coulomb interaction: a slowly decaying, long tail, and a singularity at each particle position
- Idea: separate the two problems by splitting the charge distribution into a smeared part plus a correction (P. P. Ewald, 1888-1985)
- Far field: smear out the charges with Gaussians — a smooth distribution in periodic BC; solve the Poisson equation in reciprocal space
- Near field: subtract the effect of the far field — yields a short-ranged interaction with a cutoff; compute directly in real space 16/33
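In formulas (standard textbook identities in Gaussian units; the slide only sketches them), the Gaussian smearing splits the Coulomb kernel as

\[ \frac{1}{r} \;=\; \frac{\operatorname{erfc}(\alpha r)}{r} \;+\; \frac{\operatorname{erf}(\alpha r)}{r}, \]

where the erfc term is the short-ranged near field and the erf term the smooth far field. The far field is then summed in reciprocal space,

\[ E_{\mathrm{far}} \;=\; \frac{2\pi}{V} \sum_{\mathbf{k}\neq 0} \frac{e^{-k^2/4\alpha^2}}{k^2} \Bigl\lvert \sum_j q_j e^{i\mathbf{k}\cdot\mathbf{r}_j} \Bigr\rvert^2 \;-\; \frac{\alpha}{\sqrt{\pi}} \sum_j q_j^2, \]

with the last term removing the self-interaction of the smeared charges.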

Ewald Summation
- Near field: compute the short-ranged near-field interaction directly
- Far field (Ewald sum): Fourier-transform the smeared-out charge distribution; cut off in Fourier space; solve the Poisson equation in reciprocal space via the Green's function; back-transform the potential to real space; differentiate the potential to get fields or forces
- Overall complexity: O(N^(3/2)) 17/33

P3M/PME: Ewald Summation on a Mesh
- Idea: use the discrete FFT (O(N log N)) instead of the continuous Fourier transform; requires a regular mesh
- Far field: assign the charges onto the mesh; forward-FFT the mesh; solve the Poisson equation by applying the Green's function ("influence function"); back-FFT the mesh; interpolate the potentials back onto the charge positions
- Force computation: either differentiate the potential in Fourier space ("ik differentiation"), then back-transform and interpolate the forces (3 FFTs, high precision); or back-transform the potential, then differentiate and interpolate analytically in real space (1 FFT, lower precision)
- Near field: as in Ewald summation 18/33
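The reciprocal-space core of the far field fits in a few lines; this is a minimal sketch of my own (not the ScaFaCoS P3M code), with the charge-assignment and interpolation steps omitted:

    import numpy as np

    def poisson_fft(rho_mesh, box_l):
        """Solve nabla^2 phi = -4 pi rho on a periodic cubic mesh via FFT."""
        m = rho_mesh.shape[0]
        k = 2.0 * np.pi * np.fft.fftfreq(m, d=box_l / m)   # angular wavenumbers
        kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
        k2 = kx**2 + ky**2 + kz**2
        rho_k = np.fft.fftn(rho_mesh)
        # continuum Green's function 4 pi / k^2 (zero for the k = 0 mode)
        green = np.divide(4.0 * np.pi, k2, out=np.zeros_like(k2), where=k2 > 0)
        phi_k = green * rho_k
        # "ik differentiation": E = -grad phi, done in Fourier space (high precision)
        e_field = [np.real(np.fft.ifftn(-1j * ki * phi_k)) for ki in (kx, ky, kz)]
        return np.real(np.fft.ifftn(phi_k)), e_field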

P3M/PME: Ewald Summation on a Mesh
- Particle-Particle Particle-Mesh (P3M, Hockney/Eastwood 1988) and Particle-Mesh Ewald (PME, Darden 1993): similar algorithms, differences in the details
- Used for a long time, the standard method in MD
- Good error estimates exist — very useful for tuning the algorithm to a required accuracy
- Not so good for inhomogeneous systems
- Overall complexity: O(N log N)
- Parallelization: the FFT requires all-to-all communication
- In ScaFaCoS: P3M (Stuttgart), NFFT-based solver (Chemnitz)
[Graph: scaling of the parallel 3D FFT in ScaFaCoS, mesh 512³, up to 262144 cores — scaling seems to be good.] 19/33

Multigrid
- Multigrid Poisson solver (Brandt 1977)
- Near field: as in Ewald summation
- Far field: hierarchy of successively coarser grids; smooth out the high-frequency errors on each grid, downsample (restrict) the residual error to the next coarser grid, solve the Poisson equation on the coarsest grid, then prolongate the correction back up to the finer grids until the finest grid is reached
- Complexity: O(N)
- Open BC (PBC in development)
- Parallelization: [scaling graph]
- In ScaFaCoS: PP3MG (Wuppertal), VMG (Bonn) 20/33
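A toy sketch of one multigrid V-cycle for the 1D Poisson problem -u'' = f with zero boundary values (my own illustration; the PP3MG and VMG solvers are full 3D, parallel implementations):

    import numpy as np

    def smooth(u, f, h, sweeps=3):
        """Damped Jacobi: removes high-frequency error components."""
        for _ in range(sweeps):
            u[1:-1] += 0.5 * ((u[:-2] + u[2:] + h * h * f[1:-1]) / 2.0 - u[1:-1])
        return u

    def residual(u, f, h):
        r = np.zeros_like(u)
        r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)
        return r

    def v_cycle(u, f, h):
        u = smooth(u, f, h)                    # pre-smooth
        if len(u) > 5:
            r_c = residual(u, f, h)[::2].copy()  # restrict residual (injection)
            e_c = v_cycle(np.zeros_like(r_c), r_c, 2 * h)
            e = np.zeros_like(u)               # prolongate correction (linear)
            e[::2] = e_c
            e[1:-1:2] = 0.5 * (e_c[:-1] + e_c[1:])
            u += e
        return smooth(u, f, h)                 # post-smooth

This assumes a grid of 2^k + 1 points including the boundaries; repeated V-cycles reduce the error by a constant factor per cycle, giving the O(N) overall complexity.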

Barnes-Hut Tree Code
- Tree code (Barnes/Hut 1986)
- Bundle charges in a hierarchical tree; multipole expansion of the bundles
- Propagate the multipole terms up the tree
- Depending on the distance to a bundle, use the multipole terms of higher tree levels
- Open or closed BC
- Complexity: O(N log N)
- Parallelization: [scaling graph]
- In ScaFaCoS: PEPC (Jülich) 21/33
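A compressed sketch of the Barnes-Hut idea in 2D, with monopoles only and positive charges for simplicity (my own toy version; the actual PEPC code is a parallel 3D implementation with higher multipole orders):

    import numpy as np

    class Node:
        def __init__(self, center, size):
            self.center, self.size = center, size
            self.charge = 0.0                    # monopole: total charge...
            self.weighted = np.zeros(2)          # ...and charge-weighted position
            self.children, self.particle = [], None

    def child_for(node, x):
        """Find or create the quadrant child that contains position x."""
        quad = x > node.center
        for c in node.children:
            if np.array_equal(c.center > node.center, quad):
                return c
        c = Node(node.center + (quad - 0.5) * node.size / 2.0, node.size / 2.0)
        node.children.append(c)
        return c

    def insert(node, x, q):
        node.charge += q
        node.weighted += q * x
        if node.particle is None and not node.children:
            node.particle = (x, q)               # empty leaf: store particle here
            return
        if node.particle is not None:            # occupied leaf: push particle down
            (px, pq), node.particle = node.particle, None
            insert(child_for(node, px), px, pq)
        insert(child_for(node, x), x, q)

    def potential(node, x, theta=0.5):
        """Use a bundle's monopole whenever size/distance < theta."""
        if node.particle is not None:
            px, pq = node.particle
            d = np.linalg.norm(x - px)
            return pq / d if d > 0 else 0.0
        if not node.children:
            return 0.0
        d = np.linalg.norm(x - node.weighted / node.charge)
        if node.size < theta * d:                # far away: monopole suffices
            return node.charge / d
        return sum(potential(c, x, theta) for c in node.children)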

Fast Multipole Method
- FMM (Greengard/Rokhlin 1987)
- Like the Barnes-Hut tree code on a grid: propagate the multipole terms up a hierarchy of coarser grids; use the multipole terms of a coarser grid for larger distances
- Open BC (PBC in development)
- Complexity: O(N)
- Parallelization: [scaling graph]
- In ScaFaCoS: FMM (Jülich) 22/33

Maxwell Equation Molecular Dynamics (MEMD)
- MEMD (Maggs/Rossetto 2002; Pasichnyk/Dünweg 2004)
- In principle, Maxwell's equations are local — the problem is the large speed of light
- Simply assume that the speed of light is small: the statistics is still correct!
- Method: simulate the propagation of E, j, and H on a cubic lattice (plaquettes), using the same time steps as the MD
- Can handle dielectric variations
- Complexity: O(N)
- Parallelization: the algorithm is fully local!
A. C. Maggs and V. Rossetto, PRL 88:196402, 2002; I. Pasichnyk and B. Dünweg, JPCM 16:3999, 2004.
- In ScaFaCoS: MEMD (Stuttgart) 23/33

Summary
- MD poses some challenges for the exascale
- Long-range interactions are a significant problem
- ScaFaCoS will provide the best state-of-the-art algorithms for computing long-range interactions

Acknowledgements
- Bonn — Multigrid: M. Griebel, R. Wildenhues
- Chemnitz — Parallel sorting: M. Hofmann, G. Rünger; NFFT/PFFT: M. Pippig, D. Potts
- Jülich — Project coordination: G. Sutmann; Interface: R. Halver; FMM: H. Dachsel, I. Kabadshow; Barnes-Hut: L. Arnold, P. Gibbon, R. Speck; Multigrid: B. Steffen
- Stuttgart — MEMD: C. Holm, F. Fahrenberger; P3M, MMM, ELC: A. Arnold, O. Lenz
- Wuppertal — Multigrid: M. Bolten 24/33

And now for something completely different... 25/33

Coarse-grained simulations ESPResSo 26/33

Coarse-graining
- Even with exascale, many systems are out of reach
- Instead of going to higher resolution, go to lower resolution! Model only the important degrees of freedom and reduce the number of model parameters ("top-down" modelling)
- Pros: much shorter simulation times; no exascale HPC machines needed; the parameter space can be sampled much better; the effects of the parameters are better understood
- Cons: quantitative results are often not as good; often requires non-standard methods 27/33

Coarse-graining
[Diagram: hierarchy of models on length and time scales — quantum, atomistic, molecular, soft fluid, finite elements — with coarse-graining moving up both scales.] 28/33

ESPResSo
- "Extensible Simulation Package for Research on Soft matter systems"
- Developed originally at the Max Planck Institute for Polymer Research in Mainz; now maintained at the ICP Stuttgart
- Created by 3 PhD students during their theses (Axel Arnold, Bernward Mann and Hanjo Limbach)
- http://espressomd.org
- Open source, GPL
- Intended for coarse-grained bead-spring models (with a focus on charged systems) 29/33

Why Yet Another Simulation Package?
- Bead-spring models combine several atoms into a single bead and often require uncommon methods:
  - Special interactions (DPD, Gay-Berne ellipsoids, …)
  - Special integrators (MCPD, hybrid MC/MD, …)
  - Combinations with lattice models (lattice Boltzmann, MEMD, …)
  - Uncommon simulation protocols (simulated annealing, parallel tempering, …)
  - Special constraints (walls, pores, …)
- Standard MD simulation packages (GROMACS, NAMD, AMBER, …) are not flexible enough to deal with these models — the package must be flexible!
- In research, new methods are constantly being developed, and building new methods into highly optimized code is very hard — the package must be extensible! 30/33

ESPResSo Overview
- Simulation core written in ANSI C, MPI-parallelized
- The simulation core is controlled by a scripting language (currently Tcl, in the future Python); a simulation is defined by an ESPResSo script
- Free and open source (GPL)
- Extensible: readability preferred over performance — but of course not as optimized as e.g. GROMACS

    ...
    setmd box_l 10.0 10.0 10.0
    integrate 1000
    # compute kinetic energy
    set e_kin [analyze energy kinetic]
    ...
31/33

Methods in ESPResSo
- Integrators and ensembles: velocity-Verlet algorithm (NVE), Langevin thermostat (NVT), barostat by Dünweg (NPT), quaternion integrator for non-spherical particles or point-like dipoles
- Non-bonded interactions: Lennard-Jones, Gay-Berne, Buckingham, Morse, …
- Bonded interactions: harmonic, FENE, tabulated, bond-angle interactions, dihedral interactions
- Long-range interactions: for electrostatics P3M, MMM1D, MMM2D, Ewald, ELC and MEMD; for point-like dipoles dipolar P3M
- Hydrodynamic interactions: DPD, lattice-Boltzmann fluid (on GPGPUs) coupled to the particle simulation
- Constraints: particles can be fixed in any direction; walls, pores, spheres, …
- Analysis: energy components, pressure tensor, forces, distribution functions, structure factors, polymer-specific analysis functions (radius of gyration, …), output to VMD
…and it is continuously growing. 32/33

That's all. Thanks for your attention! 33/33