GridPP Presentation to PPARC e-Science Committee, 26 July 2001 - Steve Lloyd, Tony Doyle, John Gordon


Slide 2: Outline

– Component Model
– Resource Allocation and Funding Scenarios
– International Financial Comparisons
– International Grid Collaborations
– Grid Architecture(s)
– Links with Industry
– Summary
– Addendum: 1. VISTA and GridPP; 2. GridPP monitoring page

Slide 3: GridPP Proposal

GridPP = vertically integrated programme = component model. Input to the development of £15-20M funding scenarios.

Slide 4: GridPP Workgroups

Technical work is broken down into several workgroups, with broad overlap with the EU DataGrid:
– A - Workload Management: provision of software that schedules application processing requests amongst resources
– B - Information Services and Data Management: provision of software tools giving flexible, transparent and reliable access to the data
– C - Monitoring Services: all aspects of monitoring Grid services
– D - Fabric Management and Mass Storage: integration of heterogeneous resources into a common Grid framework
– E - Security: security mechanisms, from Certification Authorities to low-level components
– F - Networking: network fabric provision through to integration of network services into middleware
– G - Prototype Grid: implementation of a UK Grid prototype tying together new and existing facilities
– H - Software Support: services to enable the development, testing and deployment of middleware and applications at institutes
– I - Experimental Objectives: responsible for ensuring that the development of GridPP is driven by the needs of UK PP experiments
– J - Dissemination: ensure good dissemination of developments arising from GridPP into other communities, and vice versa

Slide 5: Components 1-4: £21M

[Pie chart of the £21M programme: CERN Staff 27.0%, UK Capital 15.3%, Experiment Objectives (I) 11.9%, Prototype Grid (G) 9.7%, CERN Hardware 6.8%, Software Support (H 3.2%, H* 5.4%), Dissemination (J) 2.6%, UK Managers 1.9%; Work Groups A-F: A 1.4%, A* 0.4%, B 1.5%, B* 1.9%, C 1.1%, C* 0.6%, D 2.7%, D* 1.5%, E 1.7%, F 1.9%, F* 1.5%.]

Slide 6: £20M Project

[Pie chart: the £21M component model rescaled to a £20M project; same Work Group A-J, CERN, UK Managers and UK Capital structure as Slide 5, with headline figures of £7.1m, £6.7m and £3.2m.]

Slide 7: £17M Project

[Pie chart: £17M scenario; Experiment Objectives (I) reduced from £2.49m to £1.2m, with the remaining component structure as in Slide 5 and headline figures of £7.1m, £6.7m, £6.0m, £3.2m, £2.9m and £2.45m.]

Slide 8: Experiment Objectives

Vertically integrated programme? A broken component model...
Specific experiments or overall reduction? To be determined by the Experiments Board.
50% reduction? 23 SY

Slide 9: CERN (Component 3)

Basic Grid functionality: a UK-CERN integrated programme, with synergies, but cuts here will impact...
10% reduction? 3.1 SY

Slide 10: CERN (Component 4)

Experiments support: similar conclusions to the UK-based programme. Non-UK funding dependencies?
50% reduction? 11 SY
Hardware: pro-rata reduction on disk, tape, CPU...
15% reduction? £0.2M

Slide 11: Workload/Data Management

Reduced long-term programme? e.g. scheduler optimisation (WG A), query optimisation (WG B)... or an overall reduction?
10% reduction? 1.2 SY

Slide 12: £15M Project

[Pie chart: £15M scenario; Experiment Objectives (I) reduced from £2.49m to £0, with the remaining component structure as in Slide 5.]

Slide 13: £15M Project Summary

Even a reduction from £21M to £20M is not trivial: EU DataGrid commitments are built in. Focus on CERN and UK Capital as the largest single items, then reduce workgroup allocations.
A £17M budget cuts hard into the project (examples are based on the original Component Model).
A £15M budget is impossible within the Component Model.
A fixed allocation would help in planning the start-up phase.

Slide 14: International Comparisons

PP Grids under development:

France
– Tier-1 RC for all 4 LHC experiments at CC-IN2P3 in Lyon
– BaBar TierA; an LHC prototype starting now
– National Core Grid (2M/year)

Germany
– Tier1 starting up at Karlsruhe
– BaBar TierB at Karlsruhe
– Tier2 for ALICE at Darmstadt
– No national Grid - project led

Italy
– INFN National Grid based round EU-DataGrid
– Tier-1 RC and a prototype starting now in CNAF, Bologna
– 15.9M allocated during … for Tier-1 hardware alone
– Tier1 staff rising to 25 FTE by …
– Tier2 centres at 1M/year

US
– CMS: Tier-1 at FNAL and 5 Tier-2 centres; prototype built during …, with full deployment during …; staff estimates for the Tier-1 centre are 14 FTE by 2003, reaching 35 FTE in …; integrated costs to 2006 are $54.7M excluding GriPhyN and PPDG
– Atlas: plans very similar to CMS, with costs foreseen to be the same; Tier1 at Brookhaven

Slide 15: International Comparisons Summary

Different countries, different models:
– France & Germany budget for hardware and assume staff
– Italy: lots of hardware and staff
– US: funds split between Tier1/2, Universities, infrastructure, and R&D
Italy > UK ~ France (EU) ~ US (GriPhyN, PPDG and iVDGL characteristics within GridPP: single UK programme)

Slide 16: GridPP Architecture

Based on EU DataGrid developments feeding into the GGF. Status: Version 2 (2/7/01).
Key elements:
– Evolutionary capability
– Service via Protocols and Client APIs
– Representation using UML (TogetherSoft)
– Defines responsibilities of Work Packages
– Built from Infrastructure
– Based on PP Use Cases (applies to GridPP)
"The DataGrid Architecture, Version 2", German Cancio (CERN), Steve M. Fisher (RAL), Tim Folkes (RAL), Francesco Giacomini (INFN), Wolfgang Hoschek (CERN), Dave Kelsey (RAL), Brian L. Tierney (LBL/CERN), July 2, 2001.

Slide 17: The Grid and Industry

Help us develop the Grid:
– supply hardware: PCs, disks, mass storage, networking etc.
– supply software, middleware, management systems, databases etc.
Use the Grid for themselves:
– collaborative engineering
– massive simulation
– federating their own worldwide databases
Sell or develop the Grid for others:
– computation services, data services etc.

Slide 18: Summary

A balanced exploitation programme costs £21M. £20M, £17M and £15M 3-year funding scenarios were examined:
– £20M = maintains the balanced programme
– £17M = reduced experimental objectives
– £15M = eliminates experimental objectives
The final balance depends on the funding allocation. Emphasis on vertical integration: the component model.
International comparisons: Italy > UK ~ France (EU) ~ US (GriPhyN, PPDG and iVDGL characteristics within GridPP: single UK programme).
Contacts established with GriPhyN, PPDG and iVDGL; an InterGrid Co-ordination Group is in development.
Architecture defined by the GGF, via a lead role in DataGrid.
Industry links: emphasis on partnership.

Slide 19: GridPP and VISTA

Astrogrid will federate VISTA data with other large databases elsewhere; this requires that the VISTA data has already been processed and that catalogues and images are available. VISTA have a proposal (e-VPAS) that concentrates on producing the databases on which the Astrogrid tools will work.
This work has much in common with GridPP:
– a similar timescale
– very large data flows from one remote site
– many distributed users
– reprocessing of data
– utilization of distributed computing resources
GridPP have started discussions with VISTA and EPCC (GenGrid) on how we can collaborate and share expertise and middleware.

Slide 20: GridPP Monitoring Page

Various sites are now set up with UK Globus certificates.
Grid monitoring:
– polls Grid test-bed sites via the globus-job-run command
– runs a basic script producing XML-encoded status information
– load average and timestamp information is retrieved
– current status and archived load information is plotted
To be done:
– Java CoG kit being investigated (more robust)
– simple monitoring system to verify test-bed timestamps (in case not everyone is using NTP)
– integrate with the Grid Monitoring Architecture
– incorporate current network bandwidth measurements into the graphical system
– automatic notification system
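To make the polling step concrete, here is a minimal sketch of the monitoring loop described above. It assumes the Globus Toolkit's globus-job-run command is installed with a valid grid proxy, and that each test-bed host exposes /proc/loadavg; the host names and XML element names are hypothetical, not the actual GridPP monitoring format.

```python
#!/usr/bin/env python3
"""Sketch of a test-bed poller: run a trivial remote job on each site,
read its load average, and emit XML-encoded status information."""
import subprocess
import time
import xml.etree.ElementTree as ET

# Hypothetical gatekeeper contact strings; the real test-bed list differs.
SITES = ["gridgate.site-a.ac.uk", "gridgate.site-b.ac.uk"]

def poll_site(host):
    """Return the 1-minute load average of a remote site, or raise."""
    # GT2 syntax: globus-job-run <contact> <executable> [args...]
    result = subprocess.run(
        ["globus-job-run", host, "/bin/cat", "/proc/loadavg"],
        capture_output=True, text=True, timeout=60, check=True,
    )
    return float(result.stdout.split()[0])

def status_xml(results):
    """Encode (host, load) pairs as XML with a timestamp per site."""
    root = ET.Element("teststatus")
    for host, load in results:
        site = ET.SubElement(root, "site", name=host)
        ET.SubElement(site, "load").text = "%.2f" % load
        # Local timestamp; comparing it with the site's own clock is one
        # way the NTP sanity check mentioned above could be done.
        ET.SubElement(site, "timestamp").text = str(int(time.time()))
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    results = []
    for host in SITES:
        try:
            results.append((host, poll_site(host)))
        except Exception as err:  # unreachable site, expired proxy, ...
            print("poll failed for %s: %s" % (host, err))
    print(status_xml(results))
```

The XML produced by each polling cycle could then be archived and plotted as the slide describes; swapping the subprocess call for Java CoG kit invocations would be the more robust variant noted under "To be done".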