Steve Lloyd and Tony Doyle, GridPP Presentation to PPARC e-Science Committee, 31 May 2001


Steve Lloyd and Tony Doyle, GridPP Presentation to PPARC e-Science Committee, 31 May 2001

GridPP History
- Collaboration formed by all UK PP experimental groups in 1999 to submit a £5.9M JIF bid for a prototype Tier-1 centre at RAL (later withdrawn)
- Added some Tier-2 support to become part of the PPARC LTSR, "The LHC Computing Challenge", as input to SR2000
- Formed GridPP in Dec 2000, now including CERN, CLRC and the UK PP theory groups
- From Jan 2001, handling PPARC's commitment to the UK part of the EU DataGrid

Physics Drivers
- Addresses one of PPARC's highest-priority science programmes
- The LHC experiments are key to understanding the origin of mass (the Higgs?), supersymmetry, CP violation, the quark-gluon plasma and possible new phenomena, e.g. extra dimensions
- Maximise the return from the substantial UK investment in the LHC detectors

The LHC Computing Challenge
The problem is huge:
- Total data per year from one experiment: ~1 to 8 PB (1 petabyte = 10^15 bytes)
- Estimated total requirement: ~8M SI-95 of CPU power (roughly 200,000 1 GHz PCs)
- ~28 PB of 'tape' storage
- ~10 PB of disk storage
The problem is complex:
- More than 10^8 electronic channels are read out for each event
- The LHC will produce 8x10^8 pp interactions per second
- The Higgs to γγ rate, for example, is expected to be 2x10^-4 per second: a 2x10^-4 needle in an 8x10^8 haystack
A distributed solution is needed to maximise the use of facilities and resources worldwide.
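As a rough consistency check of the "needle in a haystack" figure (a back-of-envelope calculation of ours, not from the original slide), the fraction of pp interactions containing the example signal is

\[
\frac{\text{Higgs}\to\gamma\gamma\ \text{rate}}{\text{pp interaction rate}}
\;\approx\;
\frac{2\times10^{-4}\ \mathrm{s^{-1}}}{8\times10^{8}\ \mathrm{s^{-1}}}
\;=\;
2.5\times10^{-13},
\]

i.e. of order one interaction in every four million million, which is why event selection, distributed storage and large-scale CPU capacity all have to scale together.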

LHC Computing Model
[Slide diagram: the multi-tier model, with the CERN Tier-0 centre linked to national Tier-1 centres (a UK lab, FermiLab and Brookhaven in the USA, plus centres in France, Italy, Germany and the Netherlands), regional Tier-2 centres, physics department facilities and individual desktops.]

Proposal Executive Summary
- £40M 3-year programme: the LHC computing challenge = Grid technology
- Five components: Foundation, Production, Middleware, Exploitation, Value-added
- Emphasis on Grid services and core middleware
- UK computing strength within CERN
- Integrated with the EU DataGrid, PPDG and GriPhyN
- Facilities at CERN, RAL and up to four UK Tier-2 sites; these centres are also the focus for dissemination
- LHC developments integrated into the current programme (BaBar, CDF, D0, ...)
- Robust management structure
- Deliverables in March 2002, 2003, 2004

GridPP Component Model (costs are cumulative)
- Component 1: Foundation - the key infrastructure at CERN and within the UK (£8.5M)
- Component 2: Production - built on Foundation to provide an environment for experiments to use (£12.7M)
- Component 3: Middleware - connecting the Foundation and Production environments to create a functional Grid (£17.0M)
- Component 4: Exploitation - the applications necessary to deliver Grid-based particle physics (£21.0M)
- Component 5: Value-Added - full exploitation of the Grid's potential for particle physics (£25.9M)

Major Deliverables
Prototype I - March 2002
- Performance and scalability testing of components
- Testing of the job scheduling and data replication software from the first DataGrid release
Prototype II - March 2003
- Prototyping of the integrated local computing fabric, with emphasis on scaling, reliability and resilience to errors
- Performance testing of LHC applications
- Distributed HEP and other science application models using the second DataGrid release
Prototype III - March 2004
- Full-scale testing of the LHC computing model with fabric management and Grid management software for Tier-0 and Tier-1 centres, with some Tier-2 components

Financial Summary
Components 1-4:

                 PPARC      External funds
UK Staff         £10.7M     £5.9M (EPSRC?)
UK Capital       £3.2M      £4.5M (SRIF?)
CERN Staff       £5.7M
CERN Capital     £1.4M
Total            £21.0M     £10.3M

[Slide diagram labels: Computing Science; LHC Tier-0; up to 4 Tier-2 centres; LHC Tier-1 / BaBar Tier-A]

GridPP Organisation
- Hardware development organised around a number of Regional Centres (the likely Tier-2 Regional Centres)
- These centres form the focus for dissemination and for collaboration with other disciplines and industry
- Clear mapping onto the core regional e-Science centres
- Software development organised around a number of Workgroups

GridPP Workgroups
Technical work is broken down into several workgroups, with broad overlap with the EU DataGrid:
- A - Workload Management: provision of software that schedules application processing requests amongst resources
- B - Information Services and Data Management: provision of software tools to provide flexible, transparent and reliable access to the data
- C - Monitoring Services: all aspects of monitoring Grid services
- D - Fabric Management and Mass Storage: integration of heterogeneous resources into a common Grid framework
- E - Security: security mechanisms from Certification Authorities to low-level components
- F - Networking: network fabric provision through to integration of network services into middleware
- G - Prototype Grid: implementation of a UK Grid prototype tying together new and existing facilities
- H - Software Support: services to enable the development, testing and deployment of middleware and applications at institutes
- I - Experimental Objectives: responsible for ensuring that the development of GridPP is driven by the needs of the UK PP experiments
- J - Dissemination: ensure good dissemination of developments arising from GridPP into other communities and vice versa

GridPP and CERN
For the UK to exploit the LHC to the full requires substantial investment at CERN to support LHC computing. UK involvement through GridPP will boost CERN investment in key areas:
- Fabric management software
- Grid security
- Grid data management
- Networking
- Adaptation of physics applications
- Computer Centre fabric (Tier-0)

GridPP and CERN (continued)
This investment will:
- Allow operation of a production-quality prototype of the distributed model prior to acquisition of the final LHC configuration
- Train staff for the management and operation of distributed computing centres
- Provide an excellent training ground for young people
- Enable the technology to be re-used by other sciences and industry

Staff at CERN
- The proposal is that staff are hired by UK universities or laboratories and sent on long-term mission to CERN
- Employed as CERN associates in the IT Division, an integrated part of the CERN LHC activity
- Staff assigned to CERN teams with operational and development responsibilities
- Each team is responsible for LHC development and prototyping as well as for operating current services: developers need hands-on operational experience, and this ensures that CERN experience is fully utilised
- A formal LHC Computing Project structure is being defined, to ensure an overseeing role for the funding bodies - to be agreed with CERN Council

Hardware at CERN

GridPP Management Structure

Management Status
- The Collaboration Board (CB): the governing body of the project, consisting of the Group Leaders of all institutes - established, and the Collaboration Board Chair elected
- The Peer Review Selection Committee (PRSC): pending approval of the project
- The Project Management Board (PMB): the executive board chaired by the Project Leader - Project Leader being appointed; a shadow board is in operation
- The Dissemination Board (DB): pending approval of the project
- The Technical Board (TB): the main working forum chaired by the Technical Board Chair - interim task force in place
- The Experiments Board (EB): the forum for experimental input into the project - nominations from the experiments underway

Information Flow

Meetings Schedule
- Yearly reporting to PPARC?
- Quarterly reporting to the EU

GridPP Collaboration Meeting
1st GridPP Collaboration Meeting - Cosener's House - 24/25 May 2001

UK Strengths
Wish to build on UK strengths:
- Information services
- Networking - world leaders in monitoring
- Security
- Mass storage
Major UK Grid leadership roles:
- Lead of the DataGrid Architecture Task Force (Steve Fisher)
- Lead of DataGrid WP3 Information Services (Robin Middleton)
- Lead of DataGrid WP5 Mass Storage (John Gordon)
- ATLAS Software Coordinator (Norman McCubbin)
- LHCb Grid Coordinator (Frank Harris)
Strong UK collaboration with Globus:
- Globus people gave a 2-day tutorial at RAL to the PP community
- Carl Kesselman attended a UK Grid technical meeting
- 3 UK people visited Globus at Argonne
Natural UK collaboration with the US PPDG and GriPhyN projects.

Funding Requirements
- Full exploitation requires £25.9M from PPARC plus £11.6M external funds
- The minimum programme requires £21.0M from PPARC plus £10.3M external funds
Profiling is driven by:
- Hardware: immediate need (buy now) vs Moore's law (buy later) - UK flat, CERN rising
- Manpower: immediate need plus the requirement for 3-year positions (hire now) vs availability (spread out) - UK and CERN want to front-load
This does not match the PPARC funding profile. Our proposal profiled the request as:

            2001/2    2002/3    2003/4
Proposal    £3.91M    £8.43M    £8.64M
PPARC       £3.0M     £8.0M     £15M
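A quick arithmetic check (ours, not shown on the slide): the proposed three-year profile sums to the £21.0M minimum programme, but is front-loaded relative to PPARC's indicative profile:

\[
£3.91\mathrm{M} + £8.43\mathrm{M} + £8.64\mathrm{M} = £20.98\mathrm{M} \approx £21.0\mathrm{M}.
\]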

Proposal Profile

Reprofiling
- One attempt to match the PPARC profile
- Too many staff hired in the 3rd year (for 1 year!)

First Year Deliverables
Each workgroup has detailed deliverables. These will be refined each year and build on the successes of the previous year. The global objectives for the first year are:
- Deliver the EU DataGrid middleware (first prototype [M9])
- Running experiments to integrate their data management systems into existing facilities (e.g. mass storage)
- Assessment of technological and sociological Grid analysis needs
- Experiments refine their data models for analyses
- Develop tools to allow bulk data transfer
- Assess and implement metadata definitions
- Develop relationships across multi-tier structures and countries
- Integrate Monte Carlo production tools
- Provide experimental software installation kits
- LHC experiments start Data Challenges
- Feedback assessment of middleware tools

External Resources
External funds (additional to PPARC grants and central facilities) have provided computing equipment for several experiments and institutes:
- BaBar (Birmingham, Bristol, Brunel, Edinburgh, Imperial, Liverpool, Manchester, QMUL, RAL, RHUL): £0.8M (JREI) + £1.0M (JIF)
- MAP (Liverpool): £0.3M (JREI)
- ScotGrid (Edinburgh, Glasgow): £0.8M (JREI)
- D0 (Lancaster): £0.4M (JREI) + £0.1M (Univ)
- Dark Matter (Sheffield): £0.03M (JIF)
- CDF/Minos (Glasgow, Liverpool, Oxford, UCL): £1.7M (JIF)
- CMS (Imperial): £0.15M (JREI)
- ALICE (Birmingham): £0.15M (JREI)
- Total: £5.4M
All of these resources will contribute directly to GridPP. Many particle physics groups are involved in large SRIF bids in collaboration with other disciplines, mostly to form e-Science centres. The amount of resource available to GridPP from this SRIF round could be several £M.

First Year Priorities
- Funding of PPARC's EU DataGrid staff commitment
- Staff to implement the initial Grid testbed in the UK
- Hardware to satisfy BaBar requirements and the EU DataGrid testbed commitment
- Staff for CERN LHC computing
- Contribution to the CERN Tier-0
(Staff costs assume only 6 months of salary in the first year.)
First-year costs:
- 15 EU DataGrid 3-year posts (already committed): £0.4M
- 15 GridPP 3-year posts: £0.4M
- Hardware for BaBar Tier-A / LHC Tier-1: £0.9M
- 15 CERN 3-year posts: £0.5M
- Hardware for the CERN Tier-0: £0.3M
- Total: £2.5M
This is the minimum viable programme to meet commitments.

Summary
- Have been working towards this project for ~2 years, building up hardware
- A balanced exploitation programme costing £21M
- Will put PPARC and the UK at the forefront of Grid development in Europe
- Funds the installation and operation of experimental testbeds, key infrastructure, generic middleware, and making application code Grid-aware
- Does NOT fund physics analysis or experiment-specific algorithms
- Does not match well with PPARC's funding profile: a mechanism for moving some money forward from the third year needs to be found
- Requires substantial investment NOW: £2.5M required this year to kick-start the programme