KIPAC Computing Resources and Infrastructure. Stuart Marshall. Program Review 6/7/2006


KIPAC Objectives: Address science questions arising at the interface between physics and astronomy: dark matter, dark energy, the CMB, the size, shape, and age of the universe, and the growth of the first structures. Bridge the campus and SLAC research communities and facilitate collaboration among the three federal agencies: DOE, NASA, and NSF. We are building, from the ground up, the infrastructure that will enable these research programs to succeed, and a substantial portion of that infrastructure is computing capability. Our computing capabilities have so far met only the needs of the first KIPAC members, and they warrant significant development if we are to realize our potential.

The Science of KIPAC
Particle astrophysics:
- Black holes, neutron stars, white dwarfs
- GRBs, magnetars, supernovae
- Accretion disks and jets
- Relativistic shocks, particle acceleration, UHECRs
- Solar physics
Cosmology:
- Dark energy, dark matter
- Gravitational lenses
- Clusters of galaxies and the intergalactic medium
- Microwave background observations
- First stars, galaxy formation
- Supernovae

Current Computation: Diverse range of topics
- Multivariate Monte Carlo simulations of X-ray clusters of galaxies: cluster physics, cosmological constraints (Peterson)
- LSST simulations: photons from galaxies to detector (Peterson)
- Optical and X-ray studies of galaxy clusters: constraining cosmological parameters (Allen)
- Neutron stars: pulsar magnetospheres, nuclear burning on rotating neutron stars during type I X-ray bursts (Spitkovsky)
- X-ray spectroscopy and atomic processes (Sako, Gu)
- Gravitational lensing, strong and weak (P. Marshall, Bradac)
- Formation of the first objects in the universe: stars, dwarf galaxies, supermassive black holes (Abel)
- Gamma-ray burst modeling: relativistic jet and accretion disk simulations, prediction of observables (Zhang, Granot)
- Supersymmetric particle physics models and observables: comparison with data (Baltz)
Figure: Anatoly Spitkovsky, relativistic wind interacting with a rotating pulsar magnetosphere, used to model the recently discovered binary pulsar system J0737.

Current Computation: Scope
Range of computing scales:
- Interactive small jobs: 20-30 processors, 24/7, on desktop and batch systems
- Medium scale (<100 processors):
  - Global shared memory (SGI) for quick algorithm development
  - Distributed memory (MPI on 32 and 128 processors at SLAC)
  - Fully independent jobs (up to ~3000 processors at SLAC)
- Large scale (>128 processors): big codes (ENZO, RAM, TRISTAN) on big machines at NERSC, ORNL, NCSA, SDSC, and perhaps SLAC
- DOE huge-memory architecture prototype (future uses)
- Visualization facility: prototype 3-D projection display and software
Software: IDL, Mathematica, IRAF, AIPS, HEAsoft, PGPLOT, CIAO, and many personal/group packages. To date, management has been fairly ad hoc.
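
A minimal sketch, assuming mpi4py and NumPy are available, of the distributed-memory (MPI) pattern behind the medium- and large-scale jobs above: each rank owns a slice of a 1-D domain and exchanges boundary ("ghost") cells with its neighbors before updating. Run with, e.g., mpirun -np 32 python demo.py.

    # Toy 1-D domain decomposition with ghost-cell exchange (illustrative sketch).
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    n_local = 1000                      # cells owned by this rank
    u = np.zeros(n_local + 2)           # +2 ghost cells at the ends
    u[1:-1] = rank                      # toy initial data

    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    # Exchange ghost cells with neighbors (a no-op at the domain edges).
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)

    # A simple smoothing update that needs the ghost values.
    u[1:-1] = 0.5 * u[1:-1] + 0.25 * (u[:-2] + u[2:])
    print(f"rank {rank}/{size}: mean = {u[1:-1].mean():.3f}")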

Current Computation: Resources
- 14 Apple Xserve G5s (2 file servers, 2 interactive, 10 batch), 9.6 TB storage
- SGI Altix with 72 processors at 1.5 GHz, 220 GB memory, 8 TB storage
- 21-node dual-CPU Opteron cluster for the XOC group
- 15 TB of NFS storage on Sun systems

HPC on SGI Altix
Large parallel runs on Red, with heavy disk I/O as well as message passing, using the code ENZO (75k lines, written by Greg Bryan). Timesteps, of which there can be tens or hundreds, are typically between 2 and 8 gigabytes each and consist of thousands to tens of thousands of files. Small runs use 4-16 processors; larger (cosmology) runs typically use 32, and using more than 32 is considered inconsiderate to fellow users. Users manage processor allocation themselves. Run times range from one day to several weeks, depending on start and stop times and the complexity of the run. Simulation complexity can be very difficult to estimate in advance; until the actual collapse of primordial objects in the simulation, it can be impossible to estimate when simulations will become complex. (Matt Turk)
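
To make that data-management load concrete, here is a small helper sketched under the assumption of a layout in which each timestep dump lands in its own subdirectory (the directory convention is an assumption for illustration, not a description of ENZO's actual on-disk format); it reports the file count and total size of each dump.

    # Report per-dump file counts and sizes for a simulation run directory.
    import os
    import sys

    def dump_stats(run_dir):
        for name in sorted(os.listdir(run_dir)):
            path = os.path.join(run_dir, name)
            if not os.path.isdir(path):
                continue
            files = [os.path.join(dirpath, f)
                     for dirpath, _, fnames in os.walk(path) for f in fnames]
            total = sum(os.path.getsize(f) for f in files)
            print(f"{name}: {len(files)} files, {total / 1e9:.2f} GB")

    if __name__ == "__main__":
        dump_stats(sys.argv[1] if len(sys.argv) > 1 else ".")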

Visualization: Prototypes
Small 3-D projection system. Rendering uses VisIt, developed at LLNL (http://www.llnl.gov/visit/), which is based on VTK with generalized parallel support for analysis and rendering. Can render on the Altix or on the new 4-core Sun visualization host. Currently forwarding X11; will soon use a full client/server architecture to forward OpenGL primitives directly through SSH. Stereoscopic rendering with VisIt, as well as particle data in partiview (from NCSA). Moving analysis tools to VisIt for generalized parallel architecture support. Apple G5-based multi-screen display.
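
VisIt is scriptable through its Python command-line interface, so the kind of scripted rendering described above might look like the following sketch (run with visit -cli -s render.py; the dataset name and the "Density" variable are hypothetical placeholders, not names from an actual KIPAC run).

    # Inside `visit -cli`, VisIt's functions live in the global namespace.
    OpenDatabase("dump.silo")            # hypothetical dataset file
    AddPlot("Pseudocolor", "Density")    # color-map the density field
    DrawPlots()

    s = SaveWindowAttributes()
    s.format = s.PNG
    s.fileName = "density_frame"
    SetSaveWindowAttributes(s)
    SaveWindow()                         # writes density_frame0000.png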

Visualization: Research and Development
Three timesteps from a galaxy and star formation simulation, rendered with an optical model that maps the density and temperature fields to emission and absorption coefficients. The dark regions in the center result from non-illuminated gas in the foreground, which absorbs large amounts of light. The images were created with a new volume renderer developed by Ralf Kaehler (ZIB), John Wise, and Tom Abel (KIPAC). It ray-traces thousands of grid patches at different spatial resolutions and composites them consistently, giving rates of 10 frames per second for a complicated data set that used to take minutes per frame with a software-based approach.
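
A minimal sketch of the emission/absorption optical model just described: density and temperature samples along a single ray are mapped to emission and absorption coefficients and composited front to back. The specific transfer functions here are illustrative assumptions, not the renderer's actual mapping.

    # Front-to-back compositing of one ray under an emission/absorption model.
    import numpy as np

    def integrate_ray(density, temperature, ds=1.0):
        emission = density * np.sqrt(temperature)   # assumed emissivity model
        absorption = 0.1 * density                  # assumed opacity model
        intensity, transmittance = 0.0, 1.0
        for e, a in zip(emission, absorption):
            intensity += transmittance * e * ds     # light added by this segment
            transmittance *= np.exp(-a * ds)        # light absorbed behind it
        return intensity

    rng = np.random.default_rng(0)
    print(integrate_ray(rng.random(64), rng.random(64)))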

What's to come
- Black hole coalescence (Andres Escala*): high-resolution SPH simulations of the inspiral and merger of MBH binaries. The code is GADGET with MPI.
- Black hole / galaxy connection (Stelios Kazantzidis): multiscale N-body + SPH simulations of merging galaxies with SMBHs at high spatial resolution. The code is GASOLINE, an SPH code optimized for MPI scaling to many processors.
- Reionization of the universe (Marcelo Alvarez*): large-scale radiative transfer simulations of cosmic reionization. The code, C2-Ray, runs on SMP and MPI architectures.
- Radiation transfer in curved spacetime (Steven Fuerst): a general formulation for relativistic effects in photon transport and the motion of emitters. The code is suitable for larger-scale deployment on shared-memory machines.
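
As a reminder of what the SPH codes named above compute at every step, here is a minimal sketch of the kernel-weighted density estimate with the standard cubic spline kernel. It is a brute-force O(N^2) illustration, not GADGET's or GASOLINE's implementation (those use tree-based neighbor searches).

    # SPH density estimate: rho_i = sum_j m_j W(|r_i - r_j|, h).
    import numpy as np

    def w_cubic(q, h):
        """Cubic spline kernel in 3-D with support radius h (q = r/h)."""
        sigma = 8.0 / (np.pi * h**3)
        w = np.where(q <= 0.5, 1 - 6*q**2 + 6*q**3, 2*(1 - q)**3)
        return sigma * np.where(q <= 1.0, w, 0.0)

    def sph_density(pos, mass, h):
        r = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        return (mass[None, :] * w_cubic(r / h, h)).sum(axis=1)

    rng = np.random.default_rng(1)
    pos = rng.random((200, 3))
    print(sph_density(pos, np.full(200, 1.0 / 200), h=0.2)[:5])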

Looking Forward: what is needed
- Basic services: network (FKB, campus); accounts (~100 in the KI group)
- Individual computing: desktops and laptops; ~85 assigned seats in FKB, over 100 during the summer
- Communication and collaboration
- Software:
  - Commercial: IDL, Mathematica, Matlab, ...
  - Community and OSS: IRAF, HEAsoft, Imcat, SM, HippoDraw, skycat, Starlink, PGPLOT, partiview, MySQL, ...
  - $30K+ this year; growth depends on the mix of commercial and OSS packages

Communication
KIPAC mailing lists are an important tool for keeping our community informed. Admins manage the addition and deletion of visitors and students via a web interface.

Collaboration
Collaboration web space based on the open-source package Plone. A prototype has been set up and tested for several user groups. The next step is to install a new instance on a dedicated server, and to continue discussions with SCCS about hosting similar capabilities at SLAC.

Supporting KIPAC Research: Services needed
- Storage and data management: it is not unusual for postdocs and research groups to have multi-TB storage requirements. We are purchasing ~30 TB this year, with a roughly constant dollar need for storage expected; data lifecycle management.
- Computing cycles, from simple to full HPC: interactive servers; large-memory machines with a few CPUs each; an MPI cluster of 64 nodes with 4 cores each; continued SMP development.
- Scientific visualization: a large-pixel display with software; a 3-D display with software development.
[Architecture diagram: users' desktops and a large SAN (StorNext FS) served by multiple fibre-channel-connected file servers; "Red", the 72-processor, 220 GB SGI Altix with 8 TB CXFS; an AMD cluster of 64 nodes with 8 GB each, added to the shared HPC cluster for numerical simulations and batch data analysis; Xsan clients; a 4-processor dual-core Sun analysis engine with 32 GB; campus backup; an Apple G5-driven Cinema display wall (4 x 30" screens, 5040x3800 resolution); and a dual-core 2-processor Sun/AMD host with 16 GB and an NVIDIA FX4500 for data visualization, high-performance analysis, and a stereo display wall.]

Supporting KIPAC Research: People
Staff: Marshall, Kunz, Morris, ... With the completion of FKB we are just coming together as a team. Kunz brings deep experience at SLAC as a physicist and developer; Morris has expertise in modern codes for cosmological simulations and data analysis. A new position is opening to support the design, implementation, and development of state-of-the-art visualization for KIPAC.
KIPAC/SCCS collaboration: fortnightly meetings for coordination and development of KIPAC computing resources. Major milestones: integration of the small Apple Xserve cluster and file servers, and of the 72-processor SGI Altix with 220 GB of memory; this was a large effort, and many publications are due to it. Ongoing work: search for a suitable networked storage system (Xsan, GPFS, ...); implementation of automated backup via IBM's TSM on the SLAC silos; possible use of the SLAC batch system for MPI jobs; costing and planning for new hardware.
An acknowledgement: SCCS has made major efforts to support new hardware and operating systems, which have enabled much of the successful work KIPAC researchers have published. Continued support for scientific computing by SCCS is critical!

Long Term: Computational Challenges & Development
- High-performance computing development platform: the current 72-processor, 220 GB SGI Altix SMP is useful for developing new numerical techniques, for optimization, and for some production runs. Maintaining this type of resource is desirable.
- Dedicated batch computing: more resources with higher interconnect speeds would see heavy use.
- Data and file space management: progress in this area would benefit a broad range of SCCS users.
- Scientific visualization: enables human interaction with the vast amounts of data generated by numerical simulations.
- Personal computing environment and training: expand the current desktop environments into a well-maintained software and development environment tailored to experimental, observational, and theoretical users, with resources to train them in its effective use.
- Information and project management: deploy modern tools to document and communicate KIPAC's efforts internally and publicly, and to enable national and international collaborative project management.
- Continue and expand our ongoing collaboration with SCCS; collaboration with other groups at SLAC with similar needs and interests must be greatly increased!

Growth plan:

Year | People | Desks | Storage | Software ($K) | Cycles | Visualization
2006 | 75 | 94 | 10TB FNAS + 30TB NAS | 35 | 128n8Gb | 8d + s3dstereo
2007 | 80 | 100 | 10TB FNAS + 30TB NAS | 40 | 1 8x8 * | display wall + install
2008 | 85 | 106 | 60TB BA + 20TB FNAS + 30TB | 45 | SMP | 3-tile 3-D stereo
2009 | 90 | 113 | 90TB BA + 20TB FNAS + 30TB | 45 | 3 | new GPUs + active stereo system
2010 | 95 | 119 | 120TB BA + 40TB FNAS + 60TB | 45 | 3 | visualization workstation
2011 | 100 | 125 | 180TB BA + 40TB FNAS + 90TB | 50 | 5 | display wall upgrade
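
As a quick sanity check on the growth of the storage column above, a short script can total the TB figures in each year's entry (the entries are copied verbatim from the plan; the parsing is a simple regex over the "NNTB" tokens).

    # Sum the TB figures per year in the storage plan.
    import re

    plan = {
        2006: "10TB FNAS + 30TB NAS",
        2007: "10TB FNAS + 30TB NAS",
        2008: "60TB BA + 20TB FNAS + 30TB",
        2009: "90TB BA + 20TB FNAS + 30TB",
        2010: "120TB BA + 40TB FNAS + 60TB",
        2011: "180TB BA + 40TB FNAS + 90TB",
    }
    for year, entry in sorted(plan.items()):
        total = sum(int(tb) for tb in re.findall(r"(\d+)TB", entry))
        print(f"{year}: {total} TB")   # 40, 40, 110, 140, 220, 310 TB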