Scientific Computing at Fermilab
Lothar Bauerdick, Deputy Head, Scientific Computing Division
(Title-slide photo: one of seven 10,000-slot tape robots)

Mission of Fermilab
- Particle Physics: producing world-class particle physics results
- Operations Capability: developing and operating advanced accelerators, detectors, and computing facilities
- Future Facilities: designing and building the next generation of facilities
- Science and Technology R&D: R&D into technologies for future facilities

Lab Scientific Roadmap (roadmap figure)

A long history of provisioning and operating storage for all Fermilab users: approaching 100 Petabytes of data…
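As a rough cross-check of the "approaching 100 Petabytes" figure against the seven 10,000-slot tape robots mentioned on the title slide, the back-of-the-envelope sketch below multiplies slots by an assumed per-cartridge capacity; the ~1.5 TB cartridge size is an era-typical assumption for illustration, not a number from the slides.

```python
# Back-of-the-envelope check of raw tape capacity (illustrative only).
# Assumptions: 7 robots x 10,000 slots each (title slide) and an assumed
# ~1.5 TB per cartridge, typical of that era's media.
robots = 7
slots_per_robot = 10_000
tb_per_cartridge = 1.5          # assumed, not from the slides

total_tb = robots * slots_per_robot * tb_per_cartridge
print(f"Raw capacity if every slot is filled: {total_tb / 1000:.0f} PB")
# -> roughly 105 PB, consistent with "approaching 100 Petabytes of data"
```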

...and computing for the experiments and scientific projects to which Fermilab is contributing:
- LHC: US CMS Tier-1 Computing Center
- Tevatron: continued analysis and preservation of CDF and D0 data
- Computing for ~10 neutrino / dark matter / dark energy experiments
- Significant contributor to the Open Science Grid
- HPC systems for Lattice Quantum Chromodynamics (LQCD)
- Experiment data acquisition and accelerator control systems
Resources include >30,000 cores of cluster-based and distributed high-throughput computing, ~30 Petabytes of high-performance disk cache, heavily used 8x10 Gigabit links to the WAN, >100 Gigabit data transfers on the data-center LAN, and new 100 Gigabit wide-area connectivity being commissioned.
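The cluster-based and distributed high-throughput resources above are typically reached through a batch system; as a minimal sketch, the example below uses the HTCondor Python bindings, which are widely used on Fermilab and OSG resources. The executable name and the resource requests are hypothetical, and it assumes a schedd is reachable from the submit host.

```python
# Minimal high-throughput job submission sketch using the HTCondor Python
# bindings (pip install htcondor). Script name and requests are hypothetical.
import htcondor

job = htcondor.Submit({
    "executable": "run_analysis.sh",    # hypothetical user script
    "arguments": "$(Process)",
    "output": "job_$(Process).out",
    "error": "job_$(Process).err",
    "log": "jobs.log",
    "request_cpus": "1",
    "request_memory": "2GB",
    "request_disk": "4GB",
})

schedd = htcondor.Schedd()              # local scheduler daemon
result = schedd.submit(job, count=100)  # queue 100 independent jobs
print("Submitted cluster", result.cluster())
```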

Typical US CMS Tier-1 computing utilization: jobs and disk I/O (utilization plots)
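The slide itself is a pair of monitoring plots. A minimal sketch of how such a jobs/disk-I/O trend could be drawn from a monitoring export is shown below; the CSV file name and its columns are assumptions for illustration, not a real Fermilab monitoring interface.

```python
# Sketch: plot running-job and disk-I/O trends from a hypothetical
# monitoring export "tier1_utilization.csv" with columns
# timestamp,running_jobs,disk_io_gbps. Purely illustrative.
import csv
from datetime import datetime
import matplotlib.pyplot as plt

times, jobs, io = [], [], []
with open("tier1_utilization.csv") as f:
    for row in csv.DictReader(f):
        times.append(datetime.fromisoformat(row["timestamp"]))
        jobs.append(int(row["running_jobs"]))
        io.append(float(row["disk_io_gbps"]))

fig, ax1 = plt.subplots()
ax1.plot(times, jobs, label="running jobs")
ax1.set_ylabel("running jobs")
ax2 = ax1.twinx()                      # second axis for the I/O trend
ax2.plot(times, io, color="tab:red", label="disk I/O (GB/s)")
ax2.set_ylabel("disk I/O (GB/s)")
ax1.set_title("US CMS Tier-1 utilization (illustrative)")
fig.autofmt_xdate()
plt.show()
```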

R&D and support of physics software, for example:
- Accelerator modeling and simulation
- High-speed data acquisition and accelerator controls
- Distributed computing and databases
- Frameworks for data simulation, processing, and analysis
- Scientific Linux: added value on top of the Red Hat Linux OS for HEP and the wider community
- Monte Carlo event generators, simulation packages, track finders, pattern recognition algorithms, high-volume statistical analysis, etc.
Plus many centralized computing services that achieved ISO 20000 certification in 2012.
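As a toy illustration of the kind of high-volume statistical analysis these packages support, the sketch below histograms a simulated resonance on top of a smooth background with NumPy; all numbers are invented for illustration and do not correspond to any Fermilab analysis.

```python
# Toy high-volume statistical analysis: histogram a simulated "signal"
# peak on top of an exponential background. All numbers are invented.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=1)
background = rng.exponential(scale=50.0, size=1_000_000) + 60.0
signal = rng.normal(loc=125.0, scale=2.0, size=5_000)
masses = np.concatenate([background, signal])

counts, edges = np.histogram(masses, bins=200, range=(60.0, 200.0))
centers = 0.5 * (edges[:-1] + edges[1:])

plt.step(centers, counts, where="mid")
plt.xlabel("reconstructed mass [GeV] (toy)")
plt.ylabel("events / bin")
plt.show()
```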

Computing needs for CMS are increasing significantly:
- Processing needs will increase by a factor of ~5 in the next 3-5 years, while computing budgets are expected to remain constant.
- Data sizes will increase to the exabyte scale; the aim is to provide globally federated data storage solutions to reduce the total cost of storage.
- Must adapt to new multi-core technologies to preserve time-to-results.
- Must be agile in using whatever CPU resources are available, leading to investigations into the use of HPC (DOE and NSF machines) and private, commercial, and public clouds.
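At its simplest, the "adapt to multi-core" point is about processing independent events in parallel on a single node. The sketch below shows that generic pattern with Python's multiprocessing module; it is an illustration only, not the actual CMS software framework, and the per-event function is a stand-in.

```python
# Generic multi-core pattern: farm independent "events" out to worker
# processes on one node. Illustrative only, not the CMS framework.
from multiprocessing import Pool
import os

def process_event(event_id: int) -> float:
    """Stand-in for per-event reconstruction; returns a dummy result."""
    return sum((event_id * i) % 7 for i in range(10_000)) / 10_000.0

if __name__ == "__main__":
    events = range(1_000)
    with Pool(processes=os.cpu_count()) as pool:
        results = pool.map(process_event, events, chunksize=50)
    print(f"processed {len(results)} events on {os.cpu_count()} cores")
```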

Total computing needs of the smaller experiments will outpace the available on-site resources in 2-3 more years:
- Adapting to opportunistic use of OSG-accessible resources
- Deploying virtualization to increase the efficiency of sharing across existing hardware
- Exploring/evaluating bursting to off-site cloud resources for peak and instantaneous demands
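A toy sketch of the "burst to off-site cloud for peak demand" idea: compare queued demand to free on-site capacity and decide how many extra cloud slots to request. The function name, thresholds, and numbers are invented for illustration; in practice this decision is made by the provisioning layer (e.g. GlideinWMS), not by code like this.

```python
# Toy cloud-bursting policy: if queued demand exceeds free on-site capacity
# by more than a threshold, request extra off-site slots. Purely
# illustrative; all names and numbers are invented.
def slots_to_burst(queued_jobs: int,
                   onsite_free_slots: int,
                   burst_threshold: int = 500,
                   max_cloud_slots: int = 2000) -> int:
    """Return how many off-site cloud slots to request (0 if none)."""
    backlog = queued_jobs - onsite_free_slots
    if backlog <= burst_threshold:
        return 0                      # on-site capacity can absorb the peak
    return min(backlog - burst_threshold, max_cloud_slots)

# Example: 4,000 jobs queued, 1,200 free on-site slots
print(slots_to_burst(4000, 1200))     # -> 2000 (capped at max_cloud_slots)
```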


OSG is ramping up activity towards provisioning across heterogeneous resources.

Plus: sustaining the environment, and the Illinois Accelerator Research Center.