Scientific Computing At Jefferson Lab

Presentation transcript:

Scientific Computing at Jefferson Lab
Amber Boehnlein, Chief Information Officer and IT Division Director

S&T Recommendations

Received recommendations at the July 2017 S&T review to assess computing needs for the Experimental and Theory programs:
- Generate a cost-effective plan to ensure sufficient computing resources for data analysis and simulation in FY18, and a longer-term approach to address the needs in FY2019 and beyond. The plan for FY2019+ resources should include a timeline and a detailed plan to evaluate the feasibility of the proposed approach, including off-site computing resources, such that the plan can be in place and tested before the FY2019 running. Synergy with the theory computing needs should be considered.
- Develop a 5-year plan of computational needs related to Office of Nuclear Physics (NP) activities for the theory group on local JLab computers, in the context of all available computing resources. Submit to DOE NP by June 1, 2018.

DOE Operations review charge for July 2018: "Do the plans for the provision of computing resources meet the needs of the Facility and the planned experimental program?"

Jefferson Lab User's Group Meeting

Experimental Computing Performance Plan

Experimental program: local resources are at a 'Shutdown' level; investment needed.

Spreadsheet model: projections of key parameters, benchmarked against actual performance; construct run scenarios (a minimal sketch of such a projection follows below).

[Charts: Projections; Usage, May 2017-May 2018]

June 26, 2018 JSA Director's Operations Review
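As an illustration of the spreadsheet-style projection described above, here is a minimal sketch in Python. All event counts, per-event times, and capacities are hypothetical placeholders, not the lab's actual benchmarks; the point is only the shape of the calculation (benchmarked rate times run scenario gives core-hours, compared against local capacity).

```python
# Minimal sketch of a spreadsheet-style capacity projection.
# All numbers below are hypothetical placeholders, not JLab benchmarks.

def core_hours(events, sec_per_event):
    """Core-hours needed to reconstruct `events` at `sec_per_event` per core."""
    return events * sec_per_event / 3600.0

# Hypothetical run scenario: events recorded per year and the benchmarked
# single-core reconstruction time per event.
scenario = {
    "Hall D": {"events": 5.0e10, "sec_per_event": 3.0},
    "Hall B": {"events": 2.0e10, "sec_per_event": 2.0},
}

farm_capacity_mch = 37.0  # local farm capacity, M core-hours/year (illustrative)

demand_mch = sum(core_hours(s["events"], s["sec_per_event"])
                 for s in scenario.values()) / 1.0e6

print(f"Projected demand  : {demand_mch:6.1f} M core-hours/year")
print(f"Local capacity    : {farm_capacity_mch:6.1f} M core-hours/year")
print(f"Off-site shortfall: {max(0.0, demand_mch - farm_capacity_mch):6.1f} M core-hours/year")
```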

Experimental Computing Performance Plan

Ensure adequate computing resources with an $850K investment in FY18:
- Use the local farm for reconstruction, calibration, and analysis.
- Use distributed resources for MC.
- Scale storage and associated bandwidth to support all resources.

Open Science Grid: 6 GlueX institutions contribute resources; ~1M core-hours in a recent 2-week period; expect 35M-50M core-hours yearly. Investigating options for CLAS12.

NERSC: GlueX reconstruction code at NERSC; scale test in July; anticipate 70M core-hours/year.

Cloud computing available for bursts.

Current and Projected Capacity (the bandwidth rows are converted to data volumes in the sketch below):

                              Current   FY19   FY20
  CPU (M core-hours/year)        37      70     90
  Scratch & cache disk (PB)     0.65     1.1      2
  Tape (GB/s)                     3       5       7
  WAN bandwidth (Gbps)           10

June 26, 2018 JSA Director's Operations Review
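The bandwidth rows of the capacity table are easiest to interpret as data volumes. A minimal sketch of that conversion follows; the choice of which cells to convert (the 10 Gbps WAN figure and the FY19 tape value of 5 GB/s) and treating them as fully available sustained rates are illustrative assumptions, while the unit arithmetic itself is standard.

```python
# Convert the capacity-table bandwidth figures into data volumes.
# Rates come from the table above; treating them as sustained,
# fully available ceilings is an illustrative assumption.

def gbps_to_tb_per_day(gbps):
    """Gigabits/s -> terabytes/day (decimal units)."""
    return gbps / 8.0 * 86400 / 1.0e3

def gb_per_s_to_pb_per_year(gb_per_s):
    """Gigabytes/s -> petabytes/year (decimal units)."""
    return gb_per_s * 86400 * 365 / 1.0e6

wan_gbps = 10        # WAN bandwidth row
tape_gb_per_s = 5    # tape row, FY19 column

print(f"WAN  {wan_gbps} Gbps ~ {gbps_to_tb_per_day(wan_gbps):.0f} TB/day")
print(f"Tape {tape_gb_per_s} GB/s ~ {gb_per_s_to_pb_per_year(tape_gb_per_s):.0f} PB/year sustained ceiling")
```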

LQCD Computing

Estimated science needs are captured in the Exascale Requirements process; advances will allow previously unaffordable calculations critical to the 12 GeV science program.

Two-stage process:
- Highly parallel calculations require supercomputers.
- Computationally intensive analysis uses 'smaller' scale resources.
Both types of resources must scale together to accomplish the science; analyzing results from the Summit system requires a doubling of JLab resources.

NP funded a new 4-year Jefferson Lab based hardware program for USQCD allocations:
- Extended the SciPhy XVI Xeon Phi KNL cluster (17K cores) by 12,240 cores.
- Deploy a new GPU cluster this fall.

Software deliverables from SciDAC and the Exascale Computing Project are making excellent progress: 9.1x faster on 8x fewer GPUs (~73x gain); 4.1x faster on 2x fewer (~8x gain); the arithmetic is worked out in the sketch below.

June 26, 2018 JSA Director's Operations Review
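The "~73x" and "~8x" gains quoted above combine the measured speedup with the reduction in hardware. A minimal sketch of that arithmetic, assuming "gain" means throughput per unit of hardware:

```python
# Effective gain = wall-clock speedup x reduction in resources used,
# i.e. improvement in work done per GPU (assumed interpretation of "gain").
def effective_gain(speedup, resource_reduction):
    return speedup * resource_reduction

print(effective_gain(9.1, 8))  # 72.8 -> "~73x gain" (9.1x faster on 8x fewer GPUs)
print(effective_gain(4.1, 2))  # 8.2  -> "~8x gain"  (4.1x faster on 2x fewer GPUs)
```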