Grid Computing Status Report
Jeff Templon, PDP Group, NIKHEF
NIKHEF Scientific Advisory Committee, 20 May 2005

HEP Computing Model
- Tier-0: measurement center (CERN)
  - Dedicated computers (L2/L3 trigger farms)
  - Archival of raw data
- Tier-1: data centers
  - Archival of 2nd copy of raw data
  - Large-scale computing farms (e.g. reprocessing)
  - Spread geographically
  - Strong support
- Tier-2: user facilities for data analysis / Monte Carlo
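The tier hierarchy above is essentially a routing rule: each class of work has one tier responsible for it. A minimal sketch of that idea, purely illustrative (the dictionary, role names, and function below are hypothetical, not LCG software):

```python
# Illustrative model of the tiered LHC computing architecture on this slide.
# The tier/role assignments follow the slide; everything else is a sketch.
TIERS = {
    "Tier-0": {"site": "CERN",       "roles": {"trigger", "raw archive"}},
    "Tier-1": {"site": "Amsterdam",  "roles": {"raw copy", "reprocessing"}},
    "Tier-2": {"site": "institutes", "roles": {"analysis", "monte carlo"}},
}

def tier_for(role: str) -> str:
    """Return the tier responsible for a given role in this model."""
    for tier, info in TIERS.items():
        if role in info["roles"]:
            return tier
    raise ValueError(f"no tier handles {role!r}")

print(tier_for("reprocessing"))  # -> Tier-1
```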

Worldwide HEP Computing Needs [figure slide; no figures survive in the transcript]

Amsterdam: Tier-1 for LHC
- Three experiments: LHCb / ATLAS / ALICE
- Overall scale determined by estimating available funding in NL
- Contribution to experiments scaled by NIKHEF presence: 3:2:1
- Resulting NIKHEF share of total Tier-1 needs:
  - LHCb: 23%
  - ATLAS: 11.5%
  - ALICE: 5.75%
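As a worked example of how such percentage shares turn into a hardware pledge, here is a minimal sketch. The shares are the ones quoted above; the per-experiment total Tier-1 CPU requirements are placeholder figures for illustration, not numbers from this talk:

```python
# NIKHEF share of total Tier-1 needs, as quoted on the slide.
share = {"LHCb": 0.23, "ATLAS": 0.115, "ALICE": 0.0575}

# Hypothetical total Tier-1 CPU requirement per experiment (kSI2k).
# Placeholder values only, to show the arithmetic.
total_cpu = {"LHCb": 4000, "ATLAS": 24000, "ALICE": 12000}

for exp, frac in share.items():
    pledge = frac * total_cpu[exp]
    print(f"{exp}: pledge {pledge:.0f} kSI2k "
          f"({frac:.2%} of {total_cpu[exp]} kSI2k)")
```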

Amsterdam Tier-1 Numbers
Status: GOOD!
- Basic collaboration with SARA in place
  - Attitude adjustment needed (response time)
- Appropriate funding line in NCF long-term draft plan
  - Just enough; concerns about 'me-too' (grids are popular)
- Community building (VL-E project)
  - Pull me-too people into the same infrastructure

Overall Status of LHC Computing
- LCG a successful service
  - 14,000 CPUs and well-ordered operations, active community
  - Monte Carlo productions working well (next slide)
- Data Management a problem
  - Software never converged in EDG
  - May not be converging in EGEE (same team)
  - Risk losing the HEP community on DM
  - Makes community forming (generic middleware) difficult: "I'll just build my own, this one stinks"

Results of "Data Challenge '04"
- Monte Carlo tasks distributed to computers across the world
- Up to 3,000 simultaneous "jobs" per experiment
- 2.2 million CPU-hours (250 years) used in one month
- Total data volume > 25 TB
- For LHCb: NIKHEF ~ 6% of global total
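These numbers are internally consistent, which is worth a one-line check: 2.2 million CPU-hours is indeed about 250 CPU-years, and 3,000 jobs running continuously for a month would produce roughly that many CPU-hours (the second check is an inference, not a claim from the slide):

```python
# Sanity checks on the Data Challenge '04 figures quoted above.
cpu_hours = 2.2e6                # total CPU time used in one month
print(cpu_hours / (24 * 365))    # ~251 -> "250 years" checks out

# 3,000 simultaneous jobs for a 30-day month gives a similar total:
print(3000 * 24 * 30)            # 2,160,000 CPU-hours
```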

Transport of primary data to Tier-1s [diagram slide]

LCG Service Challenge II: "The Dutch Contribution"
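For context on what sustained data transport to a Tier-1 implies, a back-of-envelope sketch of the required network rate. The daily volume here is a placeholder, not a figure from this talk:

```python
# Hypothetical back-of-envelope: sustained rate needed to ship a given
# daily raw-data volume from Tier-0 to a Tier-1 site.
tb_per_day = 10                          # placeholder daily volume, TB
rate_mb_s = tb_per_day * 1e6 / 86400     # MB per second, sustained
print(f"{rate_mb_s:.0f} MB/s sustained") # ~116 MB/s for 10 TB/day
```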

Local Status
- Positioned well in LCG & EGEE
  - Present on the 'blessed' Tier-1 list
  - One of the 'best run' sites
  - One of the first sites (#3 in EDG; compare: #4 in WWW)
  - Membership on:
    - Middleware Design Team (US collaboration here too)
    - Project Technical Forum
    - LCG Grid Applications Group (too bad, almost defunct)
    - Middleware Security Group
    - etc.
- D. Groep chairs the world-recognized EUGridPMA
- K. Bos chairs the LHC Grid Deployment Board

Local Status #2
- NIKHEF "grid site"
  - Roughly 300 CPUs / 10 terabytes of storage
  - Several distinct components:
    - LCG / VL-E production
    - LCG pre-production
    - EGEE testing
    - VL-E certification
- Manpower: 8 staff; interviews this week for three more (project funding)

PDP Group Activities
- Middleware (3 FTE)
  - Mostly "security": best bang for the buck, plus a local world expert
- Operations (3 FTE)
  - How does one operate a terascale / kilocomputer site?
  - Knowledge transfer to SARA (they have the support mandate)
  - Contribute regularly to operational middleware
- Applications (3 FTE)
  - Strong ties to local HEP (ATLAS "Rome" production, LHCb Physics Performance Report, D0 "SAMGrid")
  - Community forming: LOFAR & KNMI; looking for others

Industrial Interest
- GANG @ NIKHEF: IBM, LogicaCMG, Philips, HPC, UvA, SARA, NIKHEF, …
- 16 industrial participants (24 total)