
Jürgen Knobloch/CERN Slide 1 A Global Computer – the Grid Is Reality by Jürgen Knobloch October 31, 2007

Jürgen Knobloch/CERN Slide 2 LHC gets ready …

Jürgen Knobloch/CERN Slide 3 … what about Computing?
– The Challenge
– Starting to grasp the scope
– Going for the Grid
– Are we there?
– What other use is the Grid?
– Where do we go from here?

Jürgen Knobloch/CERN Slide 4 The LHC Computing Challenge
– Signal/Noise
– Data volume: high rate × large number of channels × 4 experiments → 15 PetaBytes of new data each year
– Compute power: event complexity × number of events × thousands of users → 100k of (today's) fastest CPUs
– Worldwide analysis & funding: computing funded locally in major regions & countries; efficient analysis everywhere → GRID technology
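As a rough sanity check of the 15 PB/year figure, here is a minimal back-of-envelope sketch; the trigger rate, event size and live time below are ballpark assumptions for illustration, not numbers taken from the slide.

```python
# Back-of-envelope estimate of annual LHC raw-data volume.
# All inputs are ballpark assumptions for illustration only.

trigger_rate_hz = 200        # events/s written out per experiment (assumed)
event_size_mb = 1.5          # MB per recorded event (assumed)
live_seconds_per_year = 1e7  # canonical "accelerator year" of ~10^7 seconds

# 1 PB = 1e9 MB, so divide the MB total by 1e9
per_experiment_pb = trigger_rate_hz * event_size_mb * live_seconds_per_year / 1e9
total_pb = 4 * per_experiment_pb  # 4 experiments

print(f"~{per_experiment_pb:.1f} PB/experiment/year, ~{total_pb:.0f} PB total")
# -> ~3.0 PB/experiment/year, ~12 PB total: same order as the 15 PB quoted
```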

Jürgen Knobloch/CERN Slide 5 Timeline LHC Computing
[Timeline graphic with milestones: LHC approved; ATLAS & CMS approved; ALICE approved; LHCb approved; LHC start]
– ATLAS & CMS CTP: 10^7 MIPS, 100 TB disk
– "Hoffmann" Review: 7×10^7 MIPS, 1,900 TB disk
– Computing TDRs: 55×10^7 MIPS (140 MSi2K), 70,000 TB disk
Figures are ATLAS (or CMS) requirements for the first year at design luminosity.

CSC 2007 LCG Evolution of CPU Capacity at CERN
[Chart: CPU capacity at CERN across accelerator eras – SC (0.6 GeV), PS (28 GeV), ISR (300 GeV), SPS (400 GeV), ppbar (540 GeV), LEP (100 GeV), LEP II (200 GeV), LHC (14 TeV); costs in 2007 Swiss Francs, including infrastructure (computer centre, power, cooling, …) and physics tapes. Slide from Les Robertson]

Requirements Match Jürgen Knobloch/CERN Slide 7 Tape & disk requirements: more than 10 times what CERN alone could provide

Options as seen in 1996 Jürgen Knobloch/CERN Slide 8

Jürgen Knobloch/CERN Slide 9 Timeline Grids
[Timeline graphic with milestones: EU DataGrid; GriPhyN, iVDGL, PPDG; GRID 3; OSG; LCG 1; LCG 2; EGEE 1; EGEE 2; EGEE; WLCG; Data Challenges; Service Challenges; Cosmics; First physics]

Jürgen Knobloch/CERN Slide 10 WLCG Collaboration
The Collaboration:
– 4 LHC experiments
– ~250 computing centres
– 12 large centres (Tier-0, Tier-1)
– 38 federations of smaller "Tier-2" centres
– Growing to ~40 countries
– Grids: EGEE, OSG, NorduGrid
Technical Design Reports – WLCG, 4 experiments: June 2005
Memorandum of Understanding – agreed in October 2005
Resources – 5-year forward look
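The tiered structure behind these numbers can be pictured as a simple hierarchy. The sketch below is an illustrative toy model only, not an official WLCG API; the role descriptions follow the MoU in broad strokes, and the Tier-1 site list is the one given on slide 35.

```python
# Illustrative model of the WLCG tier hierarchy described on this slide.
# Roles per tier are paraphrased; this is a sketch, not an official structure.

wlcg = {
    "Tier-0": {
        "sites": ["CERN"],
        "role": "raw data recording, first-pass reconstruction, distribution to Tier-1s",
    },
    "Tier-1": {
        "sites": ["TRIUMF", "GridKa", "IN2P3", "CNAF", "SARA/NIKHEF",
                  "NDGF", "ASCC", "RAL", "BNL", "FNAL", "PIC"],
        "role": "custodial storage, reprocessing, serving data to Tier-2s",
    },
    "Tier-2": {
        "sites": "38 federations of smaller centres (~250 sites in total)",
        "role": "simulation and end-user analysis",
    },
}

for tier, info in wlcg.items():
    print(f"{tier}: {info['role']}")
```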

Jürgen Knobloch/CERN Slide 11 CPU & Disk Requirements
[Charts: CPU and disk requirements from 2006 on; CERN's share: ~10%]

Events at LHC Jürgen Knobloch/CERN Slide 12

Trigger and Data Acquisition Jürgen Knobloch/CERN Slide 13

Tier-0 Recording Jürgen Knobloch/CERN Slide 14

Tier Jürgen Knobloch/CERN Slide 15

Jürgen Knobloch/CERN Slide 16 Centers around the world form a supercomputer. The EGEE and OSG projects are the basis of the Worldwide LHC Computing Grid project (WLCG). Inter-operation between grids is working!

Jürgen Knobloch/CERN Slide 17 Middleware
Security
– Virtual Organization Management (VOMS)
– MyProxy
Data management
– File catalogue (LFC)
– File Transfer Service (FTS)
– Storage Element (SE)
– Storage Resource Management (SRM)
Job management
– Workload Management System (WMS)
– Logging and Bookkeeping (LB)
– Computing Element (CE)
– Worker Nodes (WN)
Information system
– Monitoring: BDII (Berkeley Database Information Index), R-GMA (Relational Grid Monitoring Architecture) → aggregate service information from multiple Grid sites; now moved to SAM (Site Availability Monitoring)
– Monitoring & visualization (GridView, Dashboard, GridMap, etc.)
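To make the job-management chain concrete, here is a hedged sketch of how a user script might drive a gLite WMS submission. The JDL attributes and the glite-wms-job-submit command are standard parts of the middleware listed above, but the file names, job contents, and the helper function are illustrative assumptions.

```python
# Sketch: submitting a job through the gLite Workload Management System (WMS).
# Assumes a gLite UI with glite-wms-job-submit on PATH and a valid VOMS proxy
# (e.g. created beforehand with: voms-proxy-init --voms <vo>).
import subprocess
import tempfile

JDL = """\
Executable    = "analysis.sh";
Arguments     = "run42.cfg";
StdOutput     = "job.out";
StdError      = "job.err";
InputSandbox  = {"analysis.sh", "run42.cfg"};
OutputSandbox = {"job.out", "job.err"};
"""

def submit(jdl_text: str) -> str:
    """Write the JDL to a temp file, submit it, and return the WMS job ID."""
    with tempfile.NamedTemporaryFile("w", suffix=".jdl", delete=False) as f:
        f.write(jdl_text)
        path = f.name
    # -a: delegate the user's proxy automatically for this single submission
    result = subprocess.run(
        ["glite-wms-job-submit", "-a", path],
        capture_output=True, text=True, check=True,
    )
    # The job ID (an https://... URL) appears on its own line in the output
    return next(line for line in result.stdout.splitlines()
                if line.startswith("https://"))

# job_id = submit(JDL)   # then poll with: glite-wms-job-status <job_id>
```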

Moore Delivered for CPU & Disk Jürgen Knobloch/CERN Slide 18
[Charts comparing projections from the 1996 slides with actual CPU and disk growth] Expectations fulfilled!

Network was a Concern … Jürgen Knobloch/CERN Slide 19
[Chart annotation: "We are now here"]

Jürgen Knobloch/CERN Slide 20 LHCOPN Architecture

Jürgen Knobloch/CERN Slide 21 Networking

Data Transfer out of Tier-0 Jürgen Knobloch/CERN Slide 22

Jürgen Knobloch/CERN Slide 23 Site reliability

Site Reliability – Tier-2 Sites
83 Tier-2 sites being monitored
Targets (CERN + Tier-1s):
                Before July   July 07   Dec 07   Avg. last 3 months
Each site          88%          91%       93%         89%
8 best sites       88%          93%       95%         93%
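As a hedged illustration of where percentages like those in the table come from: SAM-style site availability is essentially the fraction of successful periodic service tests per site, compared against the target. The sample data and threshold handling below are invented for the example; real WLCG reliability also corrects for scheduled downtime.

```python
# Sketch: computing site reliability from periodic test results (SAM-style).
# Sample outcomes are invented; True means the test of a site service passed.

tests = {  # site -> list of test outcomes over a month
    "SITE-A": [True] * 93 + [False] * 7,    # 93% -> meets the Dec 07 target
    "SITE-B": [True] * 85 + [False] * 15,   # 85% -> below target
}
TARGET = 0.93  # "each site" target for Dec 07 from the table above

for site, results in tests.items():
    reliability = sum(results) / len(results)  # fraction of passed tests
    status = "OK" if reliability >= TARGET else "below target"
    print(f"{site}: {reliability:.0%} ({status})")
```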

Jürgen Knobloch/CERN Slide 25 ARDA Dashboard

Gridmap Jürgen Knobloch/CERN Slide 26

Increasing workloads (Slide 27; EGEE'07, 2nd October 2007)
[Chart: growing job workloads; …% non-LHC]

Jürgen Knobloch/CERN Slide 28 Many Grid Applications
At present there are about 20 applications from more than 10 domains on the EGEE Grid infrastructure:
– Astronomy & Astrophysics – MAGIC, Planck
– Computational Chemistry
– Earth Sciences – Earth Observation, Solid Earth Physics, Hydrology, Climate
– Fusion
– High Energy Physics – 4 LHC experiments (ALICE, ATLAS, CMS, LHCb), BaBar, CDF, DØ, ZEUS
– Life Sciences – Bioinformatics (Drug Discovery, Xmipp_MLrefine, etc.)
– Condensed Matter Physics
– Computational Fluid Dynamics
– Computer Science/Tools
– Civil Protection
– Finance (through the Industry Task Force)

Grid Applications Jürgen Knobloch/CERN Slide 29
[Image collage: Medical Metadata, Seismology, Chemistry, Astronomy, Fusion, Particle Physics]

Available Infrastructure (Slide 30; EGEE'07, 2nd October 2007)
– EGEE: ~250 sites, >45,000 CPUs
– OSG: ~15 sites for LHC, > … CPUs
– ¼ of the resources are contributed by groups external to the project
– >25k simultaneous jobs

Ramp-up Needed for Startup
[Charts: installed capacity vs. pledge vs. usage vs. target usage over Jul/Sep/Apr milestones; required growth factors of ~2.9× and ~3.7×]

Jürgen Knobloch/CERN Slide 32 3D – Distributed Deployment of Databases for LCG
– Oracle Streams replication with downstream capture (ATLAS, LHCb)
– Squid/Frontier web caching (CMS)
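The Squid/Frontier approach on this slide turns database reads into cacheable HTTP requests, so repeated queries are served near the worker nodes instead of hitting the central database. Below is a minimal sketch of that idea under stated assumptions: the host names and URL layout are hypothetical, and the real Frontier client handles query encoding and decoding itself.

```python
# Sketch of the Frontier idea: conditions-DB reads become HTTP GETs routed
# through a local Squid cache. Host names and URL layout are hypothetical.
import urllib.request

SQUID_PROXY = "http://squid.example-site.org:3128"          # local cache (assumed)
FRONTIER_URL = "http://frontier.example.org:8000/Frontier"  # central server (assumed)

def fetch_payload(encoded_query: str) -> bytes:
    """Fetch a cacheable payload for an encoded DB query via the site Squid."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": SQUID_PROXY})
    )
    # Identical URLs yield identical responses; that is what makes them
    # cacheable by a standard Squid proxy with no DB-specific logic.
    return opener.open(f"{FRONTIER_URL}?p1={encoded_query}").read()

# First call goes through to the Frontier server; subsequent identical
# calls are answered from the Squid cache close to the worker nodes.
```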

The Next Step Jürgen Knobloch/CERN Slide 33

EGI – European Grid Initiative Jürgen Knobloch/CERN Slide 34
– EGI Design Study proposal to the European Commission (started Sept 07)
– Supported by 37 National Grid Initiatives (NGIs)
– 2-year project to prepare the set-up and operation of a new organizational model for a sustainable pan-European grid infrastructure after the end of EGEE-3

Jürgen Knobloch/CERN Slide 35 Tier-1 Centers: TRIUMF (Canada); GridKa (Germany); IN2P3 (France); CNAF (Italy); SARA/NIKHEF (NL); Nordic Data Grid Facility (NDGF); ASCC (Taipei); RAL (UK); BNL (US); FNAL (US); PIC (Spain). The Grid is now in operation, working on: reliability, scaling up, sustainability.