Grid Computing for the ILC


Grid Computing for the ILC
Andreas Gellrich* DESY
LC Simulation Mini Workshop, 28/29 June 2005
*http://www.desy.de/~gellrich/

Overview
Why is the Grid interesting for ILC?
How can ILC benefit from the Grid?

The LCG Grid 24.06.2005 DESY

Grid Computing
The Grid is there – pioneered by LCG
(Almost) all HEP resources have been moved to the Grid
Grid tools are becoming the de facto standard for computing (local batch systems made globally accessible) and for data access (GridFTP, SRM, replica catalogues, metadata catalogues)
User management is done within Virtual Organizations (VOs)
Certificates are used for authentication rather than local site accounts
Embedded in the EU project Enabling Grids for E-sciencE (EGEE)
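To give a flavour of these tools, a job on the LCG-2 middleware of this era was described in the Job Description Language (JDL). The fragment below is a minimal sketch; the executable, file names, and requirement values are hypothetical examples, not taken from the slides:

```
# minimal JDL job description (all names hypothetical)
Executable          = "simulate.sh";         # wrapper script shipped with the job
Arguments           = "run001";
StdOutput           = "std.out";
StdError            = "std.err";
InputSandbox        = {"simulate.sh", "steer.cfg"};
OutputSandbox       = {"std.out", "std.err"};
VirtualOrganisation = "ilc";
# match only CEs that allow enough CPU time (GLUE schema attribute)
Requirements        = other.GlueCEPolicyMaxCPUTime > 720;
```

The Resource Broker matches the `Requirements` expression against the GLUE information published by each site and dispatches the job accordingly.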

Grid @ DESY
With the HERA-II luminosity upgrade, the demand for MC production increased rapidly, while outside collaborators moved their computing resources into LCG
H1 and ZEUS maintain VOs and use the Grid for MC production
The LQCD group develops a Data Grid to exchange data
DESY is about to become an LCG Tier-2 site
EGEE and D-GRID
dCache is a DESY / FNAL development
An LCG-2 Grid infrastructure has been in operation since spring 2004

Grid Infrastructure …
VOs hosted at DESY:
Global: ‘hone’, ‘ilc’, ‘zeus’ (registration via the LCG registrar system)
Regional: ‘calice’, ‘dcms’, ‘ildg’
Local: ‘baikal’, ‘desy’, ‘herab’, ‘hermes’, ‘icecube’
VOs supported at DESY:
Global: (‘atlas’), ‘cms’, ‘dteam’
Regional: ‘dech’
H1 experiment at HERA (‘hone’): DESY, U Dortmund, RAL, RAL-PP, Bham
ILC community (‘ilc’, ‘calice’): DESY, RHUL, QMUL, IC, …
ZEUS experiment at HERA (‘zeus’): DESY, U Dortmund, INFN, UKI, UAM, U Toronto, Cracow, U Wisconsin, U Weizmann

ZEUS @ Grid
> 270 M events have been produced on the Grid since Nov 2004, mainly outside DESY
32 sites (incl. Wisconsin and Toronto)

… Grid Infrastructure …
SL 3.04; Quattor (OS for all nodes; complete installation for WNs); Yaim (for all service nodes); LCG-2_4_0
Central VO Services:
VO server (LDAP) [grid-vo.desy.de]
Replica Location Service (RLS) [grid-cat.desy.de]
Distributed VO Services:
Resource Broker (RB) [grid-rb.desy.de]
Information Index (BDII) [grid-bdii.desy.de]
Proxy (PXY) [grid-pxy.desy.de]
Site Services [DESY-HH]:
GIIS: ldap://grid-giis.desy.de:2170/mds-vo-name=DESY-HH,o=grid
CE: 24 WNs (48 CPUs, XEON 3.06 GHz) (IT) [grid-ce.desy.de]
CE: 17 WNs (34 CPUs, XEON 1 GHz) (ZEUS) [zeus-ce.desy.de]
SE: dCache-based, with access to the entire DESY data space
Storage: 5 TB disk (+ 15 TB this summer); tape: 0.5 PB media (2 PB capacity)
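The GIIS endpoint published above is a plain LDAP server, so it could be queried with a standard LDAP client. The command below is a sketch (the attribute names follow the GLUE 1.x schema; the host was of course only reachable at the time):

```
# query the DESY-HH site GIIS for its computing elements (sketch)
ldapsearch -x -H ldap://grid-giis.desy.de:2170 \
    -b "mds-vo-name=DESY-HH,o=grid" \
    "(objectClass=GlueCE)" GlueCEUniqueID GlueCEInfoTotalCPUs
```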

… Grid Infrastructure …
[Architecture diagram: central VO services (VO server, RLS) and distributed VO services (RB, BDII, PXY) serve the site services (GIIS, CE with PBS/MAUI batch system and WNs, SE) at sites A/B/C such as DESY, RHUL, QMUL, …; the user submits a JDL description from a UI to the RB, which dispatches the job to a site CE; data access goes via lcg-* tools to the SEs; monitoring via R-GMA]
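The UI → RB → CE job flow in this slide maps onto the standard LCG-2 command sequence on the User Interface; a sketch, assuming a valid Grid certificate and a JDL file (file names hypothetical):

```
# authenticate: create a short-lived proxy from your certificate
grid-proxy-init
# submit the job description to the Resource Broker, saving the job ID
edg-job-submit -o jobids myjob.jdl
# follow the job through the CE and its batch system
edg-job-status -i jobids
# once done, retrieve the output sandbox
edg-job-get-output -i jobids
```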

… Grid Infrastructure
SuperMicro Superserver, rack-mounted 1U servers
dual Intel P4 XEON 2.8 / 3.06 GHz
2 GB ECC DDRAM
Gigabit Ethernet
80 GB (E)IDE system disk, 200 GB (E)IDE data disk
10 Gbit/s DESY backbone, 1 Gbit/s WAN (G-WIN)
19 Worker Nodes (WN), 10 service nodes
3 WNs ZEUS, 2 WNs U Hamburg, 17 WNs ZEUS

ILC @ Grid
The VOs ‘ilc’ and ‘calice’ have been hosted at DESY since the end of 2004
Initial MC studies (Peter Wienemann, DESY / U Freiburg)
Registration to ‘ilc’ is managed by LCG (http://lcg-registrar.cern.ch); it has become an official VO
‘ilc’ is currently supported by DESY, RHUL, QMUL
‘calice’ is supported by Imperial College (IC)
The CALICE testbeam data were moved between DESY and IC using Grid tools (GridFTP, SRM, RLS)
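A transfer like the CALICE one would be driven by the lcg-utils layer on top of GridFTP/SRM, which also keeps the RLS catalogue consistent. A sketch, with hypothetical logical file names and SE placeholders (the real hostnames are not given on the slides):

```
# copy a local file to a DESY SE and register it in the RLS (names hypothetical)
lcg-cr --vo calice -d <DESY-SE-hostname> \
    -l lfn:/calice/testbeam/run001.dat file:/data/run001.dat
# replicate the registered file to the IC SE
lcg-rep --vo calice -d <IC-SE-hostname> lfn:/calice/testbeam/run001.dat
# copy a replica to local disk for analysis
lcg-cp --vo calice lfn:/calice/testbeam/run001.dat file:/tmp/run001.dat
```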

Conclusions
DESY maintains a Grid infrastructure in production in the context of EGEE and D-GRID
DESY is about to become an LCG Tier-2 centre
H1 and ZEUS use the Grid heavily for MC production
VOs already exist for ILC and CALICE, with international support
Significant resources are available
The Grid is ready for ILC!

Grid @ Web
DESY Grid web sites:
http://grid.desy.de/
http://www.dcache.org/
http://www-zeus.desy.de/grid/
http://www.lqcd.org/ildg/
Grid computing web sites:
http://cern.ch/lcg/
http://lcg-registrar.cern.ch
http://www.eu-egee.org/
http://d-grid.de/