Porting Scientific Applications on GRID: CERN Experience
Patricia Méndez Lorenzo, CERN (IT-PSS/ED)
Enabling Grids for E-sciencE (INFSO-RI-508833) – www.eu-egee.org
Trieste, 10th February 2006 – ICTP/INFM-Democritos Workshop on Porting Scientific Applications on Computational GRIDs

Outline
◘ This is an introductory presentation
◘ Let's see what CERN is, what LCG is, its elements, its actors, and how to get involved
◘ This afternoon, during the 2nd talk, we will see some practical examples of communities getting involved in the GRID

CERN: The European Organization for Nuclear Research
The European Laboratory for Particle Physics
◘ Fundamental research in particle physics
◘ Designs, builds and operates large accelerators
◘ Financed by 20 European countries (member states) plus others (US, Canada, Russia, India, etc.); about 2000 staff, with users from all over the world
◘ Next huge challenge: the LHC (starts in 2007); each experiment involves some 2000 physicists from 150 universities, with an operational life of more than 10 years

LHC – Physics Goals
◘ The Higgs particle: the key particle of the Standard Model, which could explain the elementary particle masses
◘ Search for supersymmetric particles and possible extra dimensions: their discovery would be a serious push for supersymmetric theories or "string theories" aiming at the unification of the fundamental forces of the Universe
◘ Anti-matter issues: why is the Universe made of matter instead of equal quantities of matter and antimatter?
◘ Understanding the early Universe, in its first fractions of a second, when the soup of quarks and gluons stabilized into nucleons and then nuclei and atoms

The LHC Experiments: ALICE, ATLAS, CMS and LHCb
◘ The LHC generates 40 million particle collisions (events) per second at the centre of each of the four experiments
◘ Online computers reduce this to a few hundred "good" events per second
◘ These are recorded on disk and magnetic tape: about 15 PB/year
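A rough plausibility check of that 15 PB/year figure (my own back-of-the-envelope, not from the slides; the recorded rate, per-event size and yearly running time are assumptions):

```latex
% Assumed: ~300 recorded events/s per experiment, ~1.5 MB/event,
% ~10^7 s of running per year, summed over the 4 experiments.
300\ \tfrac{\text{events}}{\text{s}}
\times 1.5\ \tfrac{\text{MB}}{\text{event}}
\times 10^{7}\ \tfrac{\text{s}}{\text{year}}
\approx 4.5\ \tfrac{\text{PB}}{\text{year}}\ \text{per experiment},
\qquad
4 \times 4.5 \approx 18\ \tfrac{\text{PB}}{\text{year}}
```

which lands in the same range as the 15 PB/year quoted above.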

The LHC Computing Environment
LCG (the LHC Computing Grid) has been developed to build and maintain a storage and analysis infrastructure for the entire high-energy physics community.
◘ The LHC begins data taking in summer 2007
➸ Enormous volume of data: a few PB/year at the beginning of machine operation, several hundred PB produced yearly across all experiments by 2012
➸ A large amount of processing power is needed
◘ The proposed solution is the LCG world-wide Grid
➸ Established as a world-wide distributed federal Grid
➸ Many components, services, software, etc. to coordinate
◘ This takes place at an unprecedented scale
➸ Many institutes, experiments and people working closely together
◘ LCG must be ready at full production capacity, functionality and reliability in less than one year!
◘ LCG is an essential part of the chain allowing physicists to perform their analyses
➸ It has to be a stable, reliable and easy-to-use service

LCG: The LHC Computing Grid
◘ Tier-0, the accelerator centre: data acquisition and initial processing of raw data; distribution of data to the Tier-1s
◘ Tier-1 centres, "online" to the data acquisition process, hence high availability: managed mass storage with a grid-enabled data service, data-heavy analysis, national and regional support
➸ Canada – TRIUMF (Vancouver)
➸ France – IN2P3 (Lyon)
➸ Germany – Forschungszentrum Karlsruhe
➸ Italy – CNAF (Bologna)
➸ Netherlands – NIKHEF/SARA (Amsterdam)
➸ Nordic countries – distributed Tier-1
➸ Spain – PIC (Barcelona)
➸ Taiwan – Academia Sinica (Taipei)
➸ UK – CLRC (Oxford)
➸ US – FermiLab (Illinois) and Brookhaven (NY)
◘ Tier-2: ~100 centres in ~40 countries, for simulation and end-user analysis, batch and interactive

LCG in the World (May 2005)
[Map slide; the LCG site and CPU totals were lost in transcription] Grid sites spread over 34 countries, storing 8 PetaBytes; also shown: an infrastructure of 30 sites and 3200 CPUs, and Grid3 (25 universities, 4 national labs, 2800 CPUs)

What is the structure we have at CERN?
◘ CERN is the T0
➸ Deployment: we pack and distribute the software to the sites
➸ Development: development of new projects and of part of the software
➸ Support: assistance to experiments and sites
[Diagram: EXPERIMENTS and SITES on one side, LCG DEPLOYMENT and APPLICATIONS on the other, with contact through ARDA-EIS]

Our Tier-1 centres

     Centre        Location     Country           ALICE  ATLAS  CMS  LHCb
 1   GridKa        Karlsruhe    Germany             X      X     X     X
 2   CCIN2P3       Lyon         France              X      X     X     X
 3   CNAF          Bologna      Italy               X      X     X     X
 4   NIKHEF/SARA   Amsterdam    Netherlands         X      X           X
 5   NDGF          distributed  Dk, No, Fi, Se      X      X
 6   PIC           Barcelona    Spain                      X     X     X
 7   RAL           Didcot       UK                  X      X     X     X
 8   TRIUMF        Vancouver    Canada                     X
 9   BNL           Brookhaven   US                         X
10   FNAL          Batavia      US                               X
11   ASCC          Taipei       Taiwan                     X     X

The Elements of the Middleware
➸ UI (User Interface): the machine the user connects to and submits jobs from
➸ RB/BDII (Resource Broker and information system): searches the resources and matches each job to a site
➸ CE (Computing Element): the site gateway; the job is sent to its batch system
➸ WN (Worker Node): the CPUs to which the batch system distributes the jobs
➸ SE (Storage Element): the storage resources to which outputs are copied
➸ LFC (LCG File Catalogue): the catalogue keeping track of the inputs
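As a concrete illustration of that chain, here is a minimal job-description sketch in JDL, the language used to submit work through these elements in the LCG-2 era; the executable, VO name and requirement expression are illustrative assumptions, not taken from the talk.

```
// hello.jdl – a minimal LCG-2 job description (illustrative sketch;
// the concrete values are assumptions, not taken from the talk)
Executable          = "/bin/hostname";         // what runs on a Worker Node (WN)
StdOutput           = "std.out";
StdError            = "std.err";
OutputSandbox       = {"std.out", "std.err"}; // small outputs shipped back to the UI
VirtualOrganisation = "geant4";                // the VO the job is accounted to
Requirements        = other.GlueCEPolicyMaxCPUTime > 60;  // matched against the BDII

// From the UI, the round trip then looks roughly like:
//   edg-job-submit hello.jdl    -> the RB matches the job to a CE via the BDII
//   edg-job-status <jobId>      -> the CE's batch system runs it on a WN
//   edg-job-get-output <jobId>  -> the OutputSandbox is returned to the UI
// Larger outputs are instead copied to an SE and registered in the LFC
// (e.g. with lcg-cr), so the catalogue keeps track of them.
```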

Current Status of the Experiments and the GRID
◘ The experiments have to get ready:
➸ how to take raw data and how to store and distribute them
➸ how to produce simulated data
➸ how to analyze data
◘ They are not alone; EGEE/LCG assists them through:
➸ the ARDA group
➸ the Application Area group
◘ The preparations have to be successfully completed 6 months before data taking, ready at a high stability level, with capacity ramped up to twice the nominal data rates expected for the production phase
[Diagram: the three actors – EXPERIMENTS, SITES and SERVICES (EGEE/LCG)]
THIS IS A FULL CHALLENGE FOR ALL OF US: THE SUPPORT AMONG THE 3 ACTORS IS FUNDAMENTAL

We have a fundamental challenge in front of us: the Service Challenge
As we have just seen, experiments and Grid teams are ramping up... but ramping up to what, exactly?
◘ Service Challenges (SC), the Grid part:
➸ we provide services to the users (done)
➸ but we have to test them to the necessary level of functionality, reliability and scale
➸ we are preparing, hardening and delivering the production LCG environment
➸ moreover, we have to run an environment as realistic as possible
◘ Data Challenges (DC), the experiments' part:
➸ experiments test their LCG-based production chains and the performance of the Grid fabric
➸ processing data from simulated events
➸ emulating step by step the scale they will have to face during real data taking
➸ from 2005 on, experiments include SC testing as part of their DCs

Assistance to Experiments
ARDA: A Realization of Distributed Analysis for LHC
◘ Main goal:
➸ coordination of the activities needed to prototype distributed analysis systems for the LHC
➸ it does NOT develop middleware; it goes one step further, closer to the users
◘ Main actors:
➸ people from each experiment (they know what they need)
➸ the middleware development team (they know what they have)
➸ the experiment support team, EIS (they can contact both)
◘ New gridifications:
➸ coordinated by this group (you will most likely go through us)
➸ application of its tools to new communities (we can help you)

OK, now you know the project, its elements and its actors... and you still want to go inside ☺ Good choice, let's see how.

New Contributions
◘ Several different communities are joining the GRID project
◘ The GRID has applications in all fields needing large amounts of computational and storage resources
◘ We are directly supporting (outside the LHC experiments):
➸ Biomed (medical applications)
➸ Geant4 (simulation toolkit) – covered in the next part of this 1st talk
➸ UNOSAT and ITU (UN projects) – covered this afternoon
◘ How to bring them inside EGEE/LCG is a HOT TOPIC at this moment

Before proceeding...
◘ Several points are under discussion at this moment about how to admit new Virtual Organizations arriving to the GRID
➸ the last discussion took place during the Grid Deployment Board meeting this Wednesday
◘ I am going to explain the procedure for requests that are associated in some way with CERN
➸ this is my own experience
◘ What to do in the remaining cases is still being discussed
➸ it should have maximal priority inside the EGEE environment
➸ from my point of view, it must be flexible
➸ I will also explain my proposal

What we are doing now for Geant4
◘ The full Geant4 gridification comes this afternoon; for now, just see it as a new community...
◘ ...but a "special" community, because:
➸ they intend to run twice per year (not the whole year), to validate their own software
➸ it is used as a simulation tool by many Grid VOs; a well-validated product makes the experiments' lives easier
➸ its software is very well known: stable, reliable, tested over a long time; LCG planned to use it as part of the LCG tests
➸ the tests inside LCG were requested by people based at CERN; with Geant4 at CERN and LCG support at CERN, it is easier to gridify and support this community

Immersion of Geant4 in LCG
◘ Like any other new VO, they will need support, provided by EGEE/LCG
◘ Support is fundamental:
➸ to involve you quickly and safely
➸ until you gain familiarity, it plays the role of software manager, production manager and VO manager
➸ we deal with the sites on your behalf
◘ They need to gain familiarity with the system and to test it before going through the whole EGEE/LCG VO procedure
➸ you have to learn before taking a decision
➸ this is a situation we will face more and more: UNOSAT and others
◘ EGEE/LCG has to decide:
➸ what to do with light new VOs arriving
➸ the solution should not be "dteam" (the deployment team VO)
➸ the most important point is already set up: YOU HAVE SUPPORT

Our Challenge
Provide a formal procedure to give them support
◘ Today it makes a difference whether the community contacts the support at CERN directly; it should not be like that, and a common policy should be found
➸ a regional delegation policy could be the solution
◘ Above all, a procedure should be fixed in terms of VO policy
◘ Something we still have to clarify:
➸ are the sites free enough to provide resources to any new VO as local users?
► well, this is what we have done for Geant4, and it does seem the method to follow

Current Procedure
◘ EGEE procedure: 1) new applications → 2) EGAAP (recommends the VO candidate) → 3) NA4/SA1 (VO requirements, resource proposal) → 4) CICs/ROCs (initial deployment configuration; they ask for changes)
◘ LCG procedure: the Geant4 application went directly to the sites, and the solution for us was to begin the support there
◘ What we did, and why:
1. Geant4 is quite well known
2. It is fully supported
3. It was the 3rd production
4. We had a short time before the production had to begin

What we have done
◘ We presented the Geant4 community at the last GDB and asked for local support
➸ in my ideal world, once you ask for entry, a support person is assigned to you, to discuss your hopes and their viability
➸ let that person deal with the LCG and site management
◘ At the same time, we were following the official procedure to become a VO
➸ after testing the product and deciding you want it, this is mandatory
◘ In the case of Geant4:
➸ we asked the sites directly, site by site, for Geant4 support
➸ we got it in time at 5 sites
➸ we got the total number of CPUs required for Geant4 (120 CPUs)
➸ LCG efficiency: 99%
➸ the Geant4 production was followed at every moment so as not to interfere in a wrong way with the sites
► this is fundamental for the sites
► your LCG support is responsible for your production
► many other communities are running, and you should not interfere with them (and they should not interfere with you!)

Light-weight VO registration
◘ We are considering an incubator VO to assist new communities
➸ this VO is standard for all new communities
➸ it allows you to play with the system
➸ during this time, your support plays with you
➸ you do not decide the policy of this VO; we do
➸ you will not run productions inside it; you just learn
➸ LCG already has experience; we know how to deal with this
◘ Warning: your time inside is limited
➸ then you become an integrated VO
➸ you go through the registration steps
➸ from then on, your support does exactly that: LCG support, no more, no less
► with strict LCG support, no problems should be seen
► experiments happy, sites happy, LCG happy

Final Message
◘ LCG was born with a clear objective: to assist the LHC experiments during their real data taking
◘ The project is nevertheless extensible, and quite attractive for any other community
◘ We are thinking at this moment about how to proceed with the light-weight VOs; this is foreseen, and it is part of the project
◘ SUPPORT is the most important point when you arrive to the GRID... and the infrastructure is already developed