ComPASS Project Overview Panagiotis Spentzouris, Fermilab ComPASS PI

ComPASS
• Community Petascale Project for Accelerator Science and Simulation (ComPASS), proposed January 2007, funded August 2007

ComPASS program motivation
• Accelerators enable many important applications, both in basic research and applied science
• Different machine attributes are emphasized for different applications
  Variety of particle beams and operation principles
  Different energies and intensities
• Strong overlap in underlying basic accelerator science concepts → a wide spectrum of requirements for very complex instruments. Assisting their design and operation requires an equally complex set of computational tools.

Multi-scale, multi-physics requirements
• Wide range of scales: accelerator complex (10^3 m) → EM wavelength → component (10^-1 m) → particle bunch (10^-3 m) → PIC grid
  Simulations need to connect the scales and allow inclusion of multiple physics effects at each level
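A quick way to see the difficulty (my arithmetic, not from the slide): even before the PIC grid enters, the span from the machine scale down to the bunch scale covers six orders of magnitude,
\[
\frac{10^{3}\,\mathrm{m}}{10^{-3}\,\mathrm{m}} = 10^{6},
\]
so no single uniform grid can resolve both ends, and the solvers at the different scales must be coupled.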

ComPASS vision
• Accelerator science and technology objectives: achieve higher energy and intensity, faster and cheaper machine design, more reliable operation
• To support these objectives, ComPASS aims to develop integrated modelling capabilities
  Multi-physics, multi-scale for beam dynamics: the “virtual accelerator”
  Thermal, mechanical, and electromagnetic for component design: “virtual prototyping”
  Support and guide R&D for new high-gradient acceleration techniques
• Deployment of such tools requires utilization of massive computing resources!

ComPASS
• Management structure ensures program execution and adaptability of priorities
• ~$2.8M/year (now ~$3.0M/year, thanks Lali!), with the budget planned to ramp up to $3.3M for the last (5th) year
• Project information at

ComPASS Plan
• Develop multi-physics and integrated system modeling capabilities on DOE HPC resources
  Build on the high-fidelity single-physics HPC codes developed under SciDAC1
  Deploy tools under development by the SciDAC2 teams in applied mathematics and computer science
  Algorithm development and performance optimization for petascale resources
  Development of infrastructure for multi-physics, multi-component simulations
• Requirements
  Availability and support of the necessary environment (scientific libraries) on these resources

[Software stack diagram: scientific software libraries (CS and math algorithms, solvers) → physics algorithms → accelerator modeling framework (geometry, physics models) → accelerator applications, with analysis and visualization infrastructure alongside; the layers are provided by ComPASS software, the SciDAC CETs/Institutes, and the facilities]

Computationally challenging physics, common to most applications
• Machine design: particles affected by machine components, other beam particles, or other beams
  Space charge
  Beam-beam
  Electron cloud
  Electron cooling
  Intrabeam scattering
  Accurate machine description (optics, position, feedback, etc.)
• Component design
  Impedance
  Wakefields
  Multipacting
  Thermal, mechanical
• New acceleration techniques
  Laser and plasma wakefields
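To make the first group concrete: the standard mean-field treatment of space charge (a textbook formulation, not spelled out on the slide) solves the Poisson equation for the bunch's self-potential in the beam frame and applies the resulting fields as momentum kicks between optics steps,
\[
\nabla^{2}\phi = -\frac{\rho}{\varepsilon_{0}}, \qquad \mathbf{E} = -\nabla\phi, \qquad \Delta\mathbf{p} = q\,(\mathbf{E} + \mathbf{v}\times\mathbf{B})\,\Delta t .
\]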

Basic Energy Sciences Priorities
• Studies of metals, crystals, and amorphous materials
  ORNL SNS, LANL LANSCE (spallation neutron sources)
• Studies of the arrangement of atoms in organic and inorganic materials
  LBNL ALS, BNL NSLS, ANL APS, SLAC SSRL (synchrotron light sources), and the LCLS (FEL) starting up at SLAC

High Energy Physics Priorities
• High energy frontier
  Use high-energy colliders to discover new particles and directly probe the properties of nature
  – FNAL Tevatron, CERN LHC, future lepton collider
• High intensity frontier
  Use intense beams to uncover the elusive properties of neutrinos and to observe rare processes that probe physics beyond the Standard Model
  – Future high-intensity proton source at Fermilab

Nuclear Physics Priorities
• Study the properties of nuclear matter and the structure of the nucleus
  CEBAF (electrons) at JLab, RHIC (heavy ions) at BNL, and a future polarized electron-ion collider (eRHIC, ELIC)
• Study nuclei far from stability
  Future rare isotope accelerator

Year 1, HEP priorities changed: first test of ComPASS program flexibility
• When the project was proposed, the design of the ILC was the highest HEP priority, so the ILC dominated our application planning
• The change of HEP priorities triggered changes in the application development plan
  – More emphasis on finalizing Tevatron Run-II applications
  – Increased LARP involvement (LHC, PS2)
  – Plan major participation in Project-X design studies
  – Shift future linear collider design application development to high-gradient concept design studies and generic SRF (Project-X linac)
• No changes in the capabilities development plan
  – Justifies the original planning

ComPASS major thrust areas for HPC accelerator physics
• Electromagnetics (EM): modelling of electromagnetic fields in complex accelerating cavities and other accelerator components, to maximize acceleration while minimizing effects that degrade beam quality [Component design]
• Beam dynamics (BD): modelling the evolution of beams through beam optics systems, including self-forces and other interaction forces [Machine design]
• Advanced acceleration (AA): tools that guide the R&D for new high-gradient acceleration techniques such as plasma or laser wakefield accelerators [New concept design]
• Common computer science and applied math activities to achieve performance and develop the simulation environment [Enabling technologies]

Enabling Technologies: collaboration with SciDAC CETs and Institutes
• Scalable parallel eigensolvers (with TOPS), to enable simulation of complete systems of rf cavities with many millions of degrees of freedom
• Domain-specific scalable linear solvers (with TOPS), for large EM systems
• Meshing technology for shape adaptation in EM, essential for cost-effective design of rf cavities (with ITAPS and TOPS)
• Poisson solvers that perform and scale on petascale platforms (with TOPS), essential for applications involving a mean-field treatment of space charge
• Parallel adaptive refinement for finite elements, to improve accuracy and reduce computational cost (with ITAPS and CSCAPES)
• Utilization of remote and interactive visualization tools (with ISUV)
• Deployment of performance analysis and optimization tools (with PERI)
• Embedded boundary methods for EM structure PIC simulations (with ITAPS)
• Mesh refinement and optimized preconditioning in reduced-PIC and spectral-method-based dispersionless solvers (with APDEC, TOPS, and PERI)
• “Computational quality-of-service” infrastructure and interoperable components for BD applications (with TASCS, TOPS, and PERI)
• High-performance parallel data management and analysis tools for BD modelling (with VACET)
• Implementation of effective load balancing for particle-field simulations, to improve PIC performance (with ITAPS and CSCAPES)
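As a toy illustration of the first item (my sketch, not ComPASS code; the production EM eigensolvers operate on curl-curl finite-element matrices with millions of degrees of freedom, distributed across many nodes), the cavity-mode problem is a large sparse eigenproblem, and its structure can be exercised at small scale with a shift-invert Lanczos solver:

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Toy stand-in for a cavity eigenproblem: lowest modes of the 2D Laplacian
# on an n-by-n grid with Dirichlet walls (40,000 unknowns for n = 200).
n = 200
main = 2.0 * np.ones(n)
off = -np.ones(n - 1)
T = sp.diags([off, main, off], [-1, 0, 1])
I = sp.identity(n)
A = (sp.kron(I, T) + sp.kron(T, I)).tocsc()

# Shift-invert Lanczos targets the eigenvalues nearest sigma -- the same
# strategy used to hunt for cavity modes near a design frequency.
vals, vecs = spla.eigsh(A, k=4, sigma=0.0, which='LM')
print("lowest eigenvalues:", vals)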

Electromagnetics
• Continue to develop and optimize capabilities for finite-difference and finite-element methods, in both the time and the frequency domain
• Begin developing an integrated environment including mechanical and thermal properties
• Application focus
  Finalize ILC cryomodule calculations (including imperfections)
  – Transfer expertise to Project-X SRF linac applications
  Design and optimization of the LHC crab cavity; multipacting; wakefield effects in collimators; PS2 impedance calculation
  Wakefield effects in the high-energy linac; ring impedance calculation for Project-X
  Cavity design optimization including EM, thermal, and mechanical analysis; dark current simulation; support for high-gradient R&D
  Utilize existing application capabilities for NP and BES needs (JLab, SNS)

Beam Dynamics
• Mainly electrostatic PIC codes. Development focus:
  Poisson solver performance enhancement, new solver development, and PIC infrastructure (a minimal solver sketch follows below)
  Enhance the framework infrastructure to allow multi-physics capabilities
  Benchmark and share physics modules between the different frameworks
• Applications
  Finalized ILC applications (RTML, ML, DR)
  Began involvement with Project-X (space charge, e-cloud, impedance, …)
  Increased Tevatron, LHC, and LARP involvement (utilizing existing capabilities)
  – Beam-beam effects and mitigation
  – PS2 design (space charge, impedance, e-cloud)
  Continued/ramping-up NP applications (RHIC, eRHIC, ELIC, FRIB)
  – Beam-beam
  – Electron cooling
  Continued light source applications for BES (LCLS, future facilities)
  – Microbunch instability
  – Emittance preservation
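For orientation, here is a minimal sketch of the kind of Poisson solve at the core of an electrostatic PIC step (my toy with periodic boundaries on a 2D grid; the production solvers in codes such as Synergia and IMPACT handle open boundaries, 3D, and parallel decomposition):

import numpy as np

def solve_poisson_periodic(rho, dx):
    # Solve  div(grad(phi)) = -rho/eps0  on a periodic 2D grid via FFT:
    # in Fourier space, -k^2 * phi_hat = -rho_hat/eps0.
    eps0 = 8.8541878128e-12
    k = 2.0 * np.pi * np.fft.fftfreq(rho.shape[0], d=dx)
    kx, ky = np.meshgrid(k, k, indexing='ij')
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                       # dodge division by zero for the mean mode
    phi_hat = np.fft.fft2(rho) / (eps0 * k2)
    phi_hat[0, 0] = 0.0                  # pin the (arbitrary) mean potential to zero
    return np.real(np.fft.ifft2(phi_hat))

# Deposit a toy Gaussian bunch, solve, and take E = -grad(phi) for the kicks.
n, dx = 128, 1.0e-3
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x, indexing='ij')
rho = np.exp(-(X**2 + Y**2) / (2.0 * (5 * dx) ** 2))
phi = solve_poisson_periodic(rho, dx)
Ex, Ey = np.gradient(-phi, dx)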

Advanced Accelerators
• Help design linear colliders based on staging PWFA or LWFA stages
• Develop integrated codes for modeling a staged wakefield “system”
• Develop high-fidelity modeling for optimizing a single LWFA or PWFA stage
• Enable routine modeling of experiments: BELLA, FACET
• Enable (near) real-time steering of experiments
• Code validation against numerous worldwide experiments
• Code verification
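For context (a standard plasma-physics result, not from the slide): the appeal of these techniques comes from the cold, linear wakefield limit, where the accelerating field scales with the plasma density as
\[
E_{0} = \frac{m_{e}\,c\,\omega_{p}}{e} \simeq 96\,\sqrt{n_{0}\,[\mathrm{cm^{-3}}]}\ \mathrm{V/m},
\]
so a plasma with n_0 = 10^18 cm^-3 supports E_0 ≈ 96 GV/m, roughly three orders of magnitude beyond conventional RF structures.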

ComPASS at DOE HPC facilities
• ComPASS applications run high-concurrency jobs both at NERSC and at the ASCR LCFs
• ComPASS utilized ~100% of its NERSC allocation for ’08 and is on track to do the same in ’09, despite the instability issues with Franklin. We ported (and continue to port) our codes to the ALCF BG/P, where in ’08 we used close to 20M hours, and we are on track to surpass that in ’09
• In addition to the generic ComPASS allocations, ComPASS codes are used in other LCF allocations (mostly INCITE) that are specific to certain applications
  These specific allocations focus on high-concurrency runs (~10k cores or more; for example, INCITE at ORNL’s NCCS)

ComPASS at DOE HPC facilities
• ComPASS-specific job-size distribution on Franklin (repo m778) for calendar ’09 (right), compared with all repos (left): ComPASS large jobs (>30k cores) account for ~10%, compared to 0.3% across all repos
• High concurrency also for average jobs (>8k cores), compared with all other repos (see next talks)

ComPASS on the LCFs
• INCITE project progress at NCCS: ~20% of the runs use ~25k cores
• ALCF average job size: ~8.2k cores

Summary
• Accelerators are complex instruments with a wide spectrum of design requirements, depending on the application
  Multi-scale, multi-physics modeling requirements
• To help maximize performance and minimize cost, ComPASS is successfully developing the new generation of HPC accelerator modeling tools, aiming to
  Provide integrated multi-physics capabilities
  Utilize petascale-capable solvers and algorithms
• The ComPASS target applications are well aligned with the accelerator science priorities of HEP, NP, and BES
• The success of the program relies on the effectiveness of the ComPASS collaboration with the SciDAC CETs and Institutes, and on the support of the ASCR LCFs in providing the necessary scientific software infrastructure