Alain Romeyer - Dec. 2004: Grid computing for CMS

Alain Romeyer (Mons, Belgium) - Dec. 2004

Grid computing for CMS
- What is the Grid? Let's start with an analogy.
- How does it work? (Some basic ideas)
- Grid for LHC and the CMS computing model
- Conclusion

What is the Grid?
- What is not a Grid: a cluster, a network-attached storage device, a scientific instrument, a network, etc. Each may be an important component of a Grid, but by itself does not constitute a Grid.
- For us: a new way of doing science! An integrated, advanced cyber-infrastructure that delivers:
  - Computing capacity
  - Data capacity
  - Communication capacity
- Coordinated resource sharing and problem solving in dynamic virtual organisations, with no centralized control
- Uses standard and open protocols and interfaces
- Delivers non-trivial qualities of service

An analogy: electric power
[Figure: on-demand access to electricity; over time, improved quality and economies of scale]

By analogy
- Decouple production and consumption
- Enable on-demand access
- Achieve economies of scale
- Enhance consumer flexibility
- Enable new devices
On a variety of scales: department, campus, enterprise, Internet.

Not a perfect analogy...
- I import electricity but must export data.
- "Computing" is not interchangeable but highly heterogeneous: computers, data, sensors, services, ...
- So the story is more complicated.
- But more significantly, the sum can be greater than the parts:
  - Dynamic allocation of resources
  - Access to distributed services
  - Virtualization & distributed service management

How does it work? Grid responsibilities
- Security infrastructure: authentication (identity), authorization (rights)
- Information management: soft-state registration, discovery, selection, monitoring
- Resource management: remote service invocation, reservation, allocation, resource specification
- Data management: high-performance remote data access, cataloguing, replication, staging

How does it work? Security - Authentication
Grid Security Infrastructure (GSI):
- Public-key infrastructure (asymmetric cryptography)
- You need to be associated to a Virtual Organisation (VO)
- You need a certificate delivered by a Certification Authority (CA)
A certificate (X.509, an international standard) is a digitally signed document attesting to the binding of a public key to an individual entity. It contains:
- A subject name (identifying the user/person)
- The user's public key
- The identity of the CA
- The digital signature of the CA
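The hash-then-sign idea behind the CA's digital signature can be sketched in a few lines. This is a toy illustration only: the tiny RSA key, the subject string and the bare-digest signing are assumptions for readability, whereas real GSI uses 1024/2048-bit keys, padding, and the full X.509 certificate format.

```python
import hashlib

# Toy RSA key pair with tiny primes -- illustration only, not secure.
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+ modular inverse)

# Hypothetical certificate content: subject name bound to a public key.
message = b"subject=/O=Grid/CN=Alain Romeyer; pubkey=..."

# CA side: hash the content into a message digest, then "encrypt" the
# digest with the CA's private key -> the digital signature.
digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
signature = pow(digest, d, n)

# Relying party: recompute the digest and compare it with the signature
# "decrypted" using the CA's public key.
assert pow(signature, e, n) == digest
print("signature valid")
```

Anyone holding the CA's public key can run the last check, which is why a single trusted CA signature lets every Grid site authenticate a user it has never seen before.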

How does it work? Security - Authentication (continued)
[Diagram: certificate-signing flow. The user registers with a VO and sends a certificate request to the CA; the CA hashes the request into a message digest, encrypts the digest with its private key to form the digital signature, and returns the signed public certificate.]

How does it work? Management
- The user submits a request (a JDL file) to the Network Server, which hands it to the Workload Manager.
- Computing Elements (each with a Local Resource Management System, LRMS) and Storage Elements publish their characteristics, status and available services to the Information Service.
- To find the best actions to satisfy the request, the Workload Manager performs match-making: it asks the Information Service for the Grid status and the Resource Location Service for where the data is and what its status is, then decides where to submit the job.
- Job control and submission are handled by Condor-G; data transfer goes to and from the Storage Element.
- End of job: the outputs are stored in your "sandbox" and you ask to download them.
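A request of the kind described above is written in the Job Description Language (JDL), a ClassAd-style attribute list. The sketch below is a minimal hypothetical example: the script name, attribute values and the particular Requirements/Rank expressions are illustrative assumptions, not taken from the talk.

```
Executable    = "cmsAnalysis.sh";
Arguments     = "dataset42";
StdOutput     = "std.out";
StdError      = "std.err";
InputSandbox  = {"cmsAnalysis.sh"};
OutputSandbox = {"std.out", "std.err"};
Requirements  = other.GlueCEInfoTotalCPUs > 10;
Rank          = -other.GlueCEStateEstimatedResponseTime;
```

The InputSandbox is shipped with the job; the OutputSandbox lists the files you later ask to download, as described above. Requirements and Rank are what the match-making step evaluates against the Computing Elements' published characteristics.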

Some Grid e-science projects
- Sloan Digital Sky Survey
- ALMA
- LHC: ALICE, ATLAS, CMS, LHCb

EGEE: Enabling Grids for E-sciencE (a 2-year project)
Funded by the EU, with 3 core areas:
1) Build a consistent, robust and secure Grid network that will attract additional computing resources.
2) Continuously improve and maintain the middleware in order to deliver a reliable service to users.
3) Attract new users from industry as well as science, and ensure they receive the high standard of training and support they need.
Two pilot applications selected:
- Biomedical Grids (bioinformatics and healthcare data)
- Large Hadron Collider Computing Grid (LCG)

LHC Computing Grid (LCG)
A 2-phase project:
- Phase I: development phase + a series of computing data challenges
- Phase II (2006-2008): real production and deployment phase
Physicists working together worldwide:
- Petabytes of data will be generated each year (20 million CDs, a stack roughly 20 km high)
- Analysing this will require the equivalent of 70,000 of today's fastest PC processors (~192 years)
LCG goal: prepare the computing infrastructure for the simulation, processing and analysis of LHC data for the 4 experiments.
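The CD comparison on the slide can be checked with back-of-the-envelope arithmetic. The per-CD capacity (~700 MB) and thickness (~1.2 mm) are assumed typical values, not figures from the talk.

```python
# Rough check of the "20 million CDs == ~20 km stack" claim.
cds = 20_000_000
cd_gb = 0.7        # assumed ~700 MB per CD
cd_mm = 1.2        # assumed ~1.2 mm per CD in a stack

total_pb = cds * cd_gb / 1e6   # GB -> PB
stack_km = cds * cd_mm / 1e6   # mm -> km

print(f"{total_pb:.0f} PB on {cds:,} CDs, stacked ~{stack_km:.0f} km high")
```

With these assumptions, 20 million CDs hold about 14 PB and stack to roughly 24 km, which is consistent with the slide's "petabytes per year" and "~20 km" figures.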

LCG status (22/09/2004)
- Total sites: 82
- Total CPUs: 7269
- Total storage: 6558 TB

CMS data production at LHC
- p-p bunch crossing every 25 ns: 40 MHz (1000 TB/sec)
- Level 1 Trigger: 75 kHz (50 GB/sec)
- High Level Trigger: 100 Hz (100 MB/sec), running on a cluster of ~1000-2000 PCs
- Data recording & offline analysis
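The rejection factors implied by the trigger rates above follow directly from the numbers on the slide; this small sketch just makes them explicit.

```python
# Trigger rates from the slide.
bunch_rate_hz = 40e6   # one bunch crossing every 25 ns -> 40 MHz
l1_out_hz = 75e3       # Level 1 Trigger output
hlt_out_hz = 100.0     # High Level Trigger output

l1_rejection = bunch_rate_hz / l1_out_hz   # events discarded by Level 1
hlt_rejection = l1_out_hz / hlt_out_hz     # further rejection by the HLT
overall = bunch_rate_hz / hlt_out_hz       # total online reduction

print(f"L1 keeps 1 in {l1_rejection:.0f}, HLT 1 in {hlt_rejection:.0f}, "
      f"overall 1 in {overall:,.0f}")
```

Only about one bunch crossing in 400,000 survives to be recorded, which is why the offline data volume is 100 MB/sec rather than 1000 TB/sec.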

CMS computing model
[Diagram: hierarchical tier model]
- Online system: ~PByte/sec from the experiment; ~MBytes/sec into the CERN centre
- Tier 0 (+1): CERN centre, with PBs of disk and a tape robot
- Tier 1 centres: FNAL, INFN, IN2P3, RAL (~Gbps links)
- Tier 2 centres (~Gbps links)
- Tier 3: institutes (0.1 to 10 Gbps), each with a physics data cache
- Tier 4: workstations
Physicists work on analysis "channels"; data for these channels should be cached by the institute server.

DC04 Data Challenge (March-April 2004)
T0 at CERN in DC04:
- 25 Hz input event rate
- Reconstruct quasi-realtime
- Events filtered into streams
- Distribute data to the T1's
T1 centres in DC04: PIC Barcelona, FZK Karlsruhe, CNAF Bologna, RAL Oxford, IN2P3 Lyon, FNAL Chicago
- Pull data from T0 to T1 and store it
- Make data available to PRS
- Demonstrate quasi-realtime "fake" analysis

DC04 processing rate
- Processed about 30M events
- Got above 25 Hz on many short occasions, but only one full day above 25 Hz with the full system
[Plots: T0 events processed vs. days; T0 event processing rate (Hz)]
- DC04 demonstrated that the system can work, at least for a well-controlled data flow/analysis and for a few expert users
- Next challenge: make it usable by average physicists, and demonstrate that the performance scales acceptably
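As a sanity check on the slide's figures, 30M events at the 25 Hz target corresponds to about two weeks of continuous running, consistent with a challenge spanning March-April.

```python
# How long does it take to process 30M events at the 25 Hz target rate?
events = 30e6
rate_hz = 25.0

seconds = events / rate_hz   # 1.2 million seconds
days = seconds / 86400       # ~13.9 days of uninterrupted running

print(f"{days:.1f} days of continuous running at {rate_hz:.0f} Hz")
```

Since the challenge rarely sustained 25 Hz for a full day, the real calendar time was longer than this ideal figure, which is exactly the scaling concern the slide raises.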

Conclusion
- The Grid is becoming a reality.
- Management is the crucial issue: it is not yet fully implemented and will be addressed by the EGEE project.
- For HEP, LCG II is already available and working.
- CMS DC04 has shown that the system is starting to work.
- The next data challenge will be crucial: the system must be usable by the standard physicist, with performance reasonable for LHC.

Conclusion (continued)
Belgrid project: "a Belgian Grid initiative"
- Brings together academic, public and private partners
- Goal: share the local computing resources using Grid technologies
- Status: GridFTP between sites is working
- Plan: distributed computing
BEgrid (Belnet): Grid computing for Belgian research
- Belnet is an official CA, so certificates are also valid for use in EGEE
- 5 universities connected (KULeuven, UA, UG, ULB and VUB)
- Runs LCG II and follows the EGEE middleware