Tony Doyle, “GridPP – Project Elements”, UK e-Science All Hands Conference, Sheffield, 3 September 2002. Presentation transcript:

1 Tony Doyle a.doyle@physics.gla.ac.uk “GridPP – Project Elements” UK e-Science All Hands Conference, Sheffield 3 September 2002

2 Tony Doyle - University of Glasgow Outline GridPP – Project Elements: Who are we? Motivation Overview Project Map: 1. CERN 2. DataGrid 3. Applications 4. Infrastructure 5. Interoperability 6. Dissemination 7. Finances Achievements and Issues Summary

3 Tony Doyle - University of Glasgow Who are we? Nick White /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Nick White member Roger Jones /O=Grid/O=UKHEP/OU=lancs.ac.uk/CN=Roger Jones member Sabah Salih /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Sabah Salih member Santanu Das /O=Grid/O=UKHEP/OU=hep.phy.cam.ac.uk/CN=Santanu Das member Tony Cass /O=Grid/O=CERN/OU=cern.ch/CN=Tony Cass member David Kelsey /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=David Kelsey member Henry Nebrensky /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Henry Nebrensky member Paul Kyberd /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Paul Kyberd member Peter Hobson /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Peter R Hobson member Robin Middleton /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Robin Middleton member Alexander Holt /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Alexander Holt member Alasdair Earl /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Alasdair Earl member Akram Khan /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Akram Khan member Stephen Burke /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Stephen Burke member Paul Millar /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Paul Millar member Andy Parker /O=Grid/O=UKHEP/OU=hep.phy.cam.ac.uk/CN=M.A.Parker member Neville Harnew /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=Neville Harnew member Pete Watkins /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=Peter Watkins member Owen Maroney /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Owen Maroney member Alex Finch /O=Grid/O=UKHEP/OU=lancs.ac.uk/CN=Alex Finch member Antony Wilson /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Antony Wilson member Tim Folkes /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Tim Folkes member Stan Thompson /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=A. Stan Thompson member Mark Hayes /O=Grid/O=UKHEP/OU=amtp.cam.ac.uk/CN=Mark Hayes member Todd Huffman /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=B. Todd Huffman member Glenn Patrick /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=G N Patrick member Pete Gronbech /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=Pete Gronbech member Nick Brook /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Nick Brook member Marc Kelly /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Marc Kelly member Dave Newbold /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Dave Newbold member Kate Mackay /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Catherine Mackay member Girish Patel /O=Grid/O=UKHEP/OU=ph.liv.ac.uk/CN=Girish D. Patel member David Martin /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=David J. Martin member Peter Faulkner /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=Peter Faulkner member David Smith /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=David Smith member Steve Traylen /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Steve Traylen member Ruth Dixon del Tufo /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Ruth Dixon del Tufo member Linda Cornwall /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Linda Cornwall member /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Yee-Ting Li member Paul D. 
Mealor /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Paul D Mealor member /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Paul A Crosby member David Waters /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=David Waters member Bob Cranfield /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Bob Cranfield member Ben West /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Ben West member Rod Walker /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Rod Walker member /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Philip Lewis member Dave Colling /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Dr D J Colling member Alex Howard /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Alex Howard member Roger Barlow /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Roger Barlow member Joe Foster /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Joe Foster member Alessandra Forti /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Alessandra Forti member Peter Clarke /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Peter Clarke member Andrew Sansum /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Andrew Sansum member John Gordon /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=John Gordon member Andrew McNab /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Andrew McNab member Richard Hughes-Jones /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Richard Hughes-Jones member Gavin McCance /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Gavin McCance member Tony Doyle /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle admin Alex Martin /O=Grid/O=UKHEP/OU=ph.qmw.ac.uk/CN=A.J.Martin member Steve Lloyd /O=Grid/O=UKHEP/OU=ph.qmw.ac.uk/CN=S.L.Lloyd admin John Gordon /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=John Gordon member
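The entries above are X.509 certificate Distinguished Names in the slash notation used by Globus-based grids. A minimal sketch (Python; parse_dn is a hypothetical helper written for illustration, not GridPP code) of how such a DN breaks into its attribute-value pairs:

    def parse_dn(dn: str) -> dict:
        """Split an X.509 DN in slash notation, e.g.
        /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle,
        into attribute-value pairs. Repeated attributes (the two
        O= fields here) are collected into lists."""
        fields = {}
        for part in dn.strip("/").split("/"):
            key, _, value = part.partition("=")
            fields.setdefault(key, []).append(value)
        return fields

    dn = "/O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle"
    parsed = parse_dn(dn)
    print(parsed["CN"][0])   # -> Tony Doyle
    print(parsed["OU"][0])   # -> ph.gla.ac.uk (the member's institute)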

4 Tony Doyle - University of Glasgow Origin of Mass - Rare Phenomenon. The Higgs signal lies some 9 orders of magnitude below the rate of all interactions (figure labels: “The HIGGS”, “All interactions”).

5 Tony Doyle - University of Glasgow Matter-Antimatter Asymmetry: complex interactions (figure labels: Design, Simulation, Complexity, Understanding).

6 Tony Doyle - University of Glasgow GridPP Overview. EDG - UK contributions: Architecture, Testbed-1, Network Monitoring, Certificates & Security, Storage Element, R-GMA, LCFG, MDS deployment, GridSite, SlashGrid, Spitfire… Applications (start-up phase): BaBar, CDF/D0 (SAM), ATLAS/LHCb, CMS, (ALICE), UKQCD. A £17m 3-year project funded by PPARC through the e-Science Programme. CERN - LCG (start-up phase): funding for staff and hardware… Budget chart labels: £3.78m, £5.67m, £3.66m, £1.99m, £1.88m; CERN, DataGrid, Tier-1/A, Applications, Operations. http://www.gridpp.ac.uk

7 Tony Doyle - University of Glasgow GridPP Bridge - from running US experiments to future LHC experiments: provide architecture and middleware; use the Grid with simulated data; use the Grid with real data. Build Tier-A/prototype Tier-1 and Tier-2 centres in the UK and join the worldwide effort to develop middleware for the experiments.

8 Tony Doyle - University of Glasgow GridPP Vision. From Web to Grid - Building the next IT Revolution. Premise: The next IT revolution will be the Grid. The Grid is a practical solution to the data-intensive problems that must be overcome if the computing needs of many scientific communities and industry are to be fulfilled over the next decade. Aim: The GridPP Collaboration aims to develop and deploy a large-scale science Grid in the UK for use by the worldwide particle physics community. Many challenges… a shared, distributed infrastructure for all applications.

9 Tony Doyle - University of Glasgow GridPP Objectives
1. SCALE: GridPP will deploy open source Grid software (middleware) and hardware infrastructure to enable the testing of a prototype of the Grid for the LHC of significant scale.
2. INTEGRATION: The GridPP project is designed to integrate with the existing Particle Physics programme within the UK, thus enabling early deployment and full testing of Grid technology and efficient use of limited resources.
3. DISSEMINATION: The project will disseminate the GridPP deliverables in the multi-disciplinary e-Science environment and will seek to build collaborations with emerging non-PPARC Grid activities both nationally and internationally.
4. UK PHYSICS ANALYSES (LHC): The main aim is to provide a computing environment for the UK Particle Physics Community capable of meeting the challenges posed by the unprecedented data requirements of the LHC experiments.
5. UK PHYSICS ANALYSES (OTHER): The process of creating and testing the computing environment for the LHC will naturally provide for the needs of the current generation of highly data-intensive Particle Physics experiments: these will provide a live test environment for GridPP research and development.
6. DATAGRID: Open source Grid technology is the framework used to develop this capability. Key components will be developed as part of the EU DataGrid project and elsewhere.
7. LHC COMPUTING GRID: The collaboration builds on the strong computing traditions of the UK at CERN. The CERN working groups will make a major contribution to the LCG research and development programme.
8. INTEROPERABILITY: The proposal is also integrated with developments from elsewhere in order to ensure the development of a common set of principles, protocols and standards that can support a wide range of applications.
9. INFRASTRUCTURE: Provision is made for facilities at CERN (Tier-0), RAL (Tier-1) and use of up to four Regional Centres (Tier-2).
10. OTHER FUNDING: These centres will provide a focus for dissemination to the academic and commercial sector and are expected to attract funds from elsewhere such that the full programme can be realised.

10 Tony Doyle - University of Glasgow GridPP Organisation

11 Tony Doyle - University of Glasgow GridPP Project Map - Elements

12 Tony Doyle - University of Glasgow GridPP Project Map - Metrics and Tasks. Available from the web pages; provides the structure for this talk.

13 Tony Doyle - University of Glasgow LHC Computing Challenge (1. CERN). Figure: the tiered computing model. Detector: one bunch crossing per 25 ns, 100 triggers per second, each event ~1 MByte; ~PBytes/sec from the detector into the Online System. Online System -> Offline Farm (~20 TIPS) at ~100 MBytes/sec. Tier 0: CERN Computer Centre (>20 TIPS), connected to Tier 1 at ~Gbits/sec or by air freight. Tier 1: RAL Regional Centre, US Regional Centre, French Regional Centre, Italian Regional Centre. Tier 2: Tier2 Centres of ~1 TIPS each (e.g. ScotGRID++ ~1 TIPS), connected at ~Gbits/sec. Tier 3: Institutes (~0.25 TIPS) with a physics data cache, connected at 100-1000 Mbits/sec. Tier 4: workstations. Physicists work on analysis “channels”; each institute has ~10 physicists working on one or more channels; data for these channels should be cached by the institute server. (1 TIPS = 25,000 SpecInt95; a 1999 PC = ~15 SpecInt95.)
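The trigger and event-size numbers on this slide fix the raw data rate, which is a back-of-envelope check on why the model is sized in PBytes. A short sketch (Python) using only the figures quoted above, plus one labelled assumption for the length of an accelerator year:

    # Figures quoted on the slide above.
    trigger_rate_hz = 100        # triggers (events recorded) per second
    event_size_mb = 1.0          # each event is ~1 MByte

    rate_mb_per_s = trigger_rate_hz * event_size_mb   # = 100 MB/s off the farm

    # Assumption (not on the slide): ~1e7 seconds of LHC running per year,
    # the conventional figure for an accelerator year.
    seconds_per_year = 1e7
    volume_pb = rate_mb_per_s * seconds_per_year / 1e9

    # A 1-TIPS Tier-2 in the slide's units: 25,000 SpecInt95 at ~15
    # SpecInt95 per 1999-era PC is roughly 1,700 PCs.
    pcs_per_tips = 25000 / 15

    print(f"{rate_mb_per_s:.0f} MB/s -> ~{volume_pb:.0f} PB/year raw")
    print(f"1 TIPS ~ {pcs_per_tips:.0f} PCs (1999)")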

14 Tony Doyle - University of Glasgow LHC Computing Grid: High Level Planning, 2002-2005 by quarter (1. CERN). Milestones: Prototype of Hybrid Event Store (Persistency Framework); Hybrid Event Store available for general users; distributed production using grid services; First Global Grid Service (LCG-1) available; distributed end-user interactive analysis; Full Persistency Framework; LCG-1 reliability and performance targets; “50% prototype” (LCG-3) available; LHC Global Grid TDR; applications; Grid as a Service.

15 Tony Doyle - University of Glasgow DataGrid Middleware Work Packages (2. DataGrid).
Collect requirements for middleware, taking into account requirements from application groups.
Survey current technology, for all middleware.
Core Services testbed - Testbed 0: Globus (no EDG middleware).
Grid testbed releases - Testbed 1.2: current release.
WP1: workload - job resource specification & scheduling.
WP2: data management - data access, migration & replication.
WP3: grid monitoring services - monitoring infrastructure, directories & presentation tools.
WP4: fabric management - framework for fabric configuration management & automatic software installation.
WP5: mass storage management - common interface for mass storage.
WP7: network services - network services and monitoring.
Talk: “GridPP – Developing an Operational Grid”, Dave Colling.
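WP1's job resource specification is worth a concrete illustration. The sketch below (Python) writes a minimal EDG-style Job Description Language file, in the ClassAd syntax the WP1 resource broker matched against published resources, and hands it to the testbed submission command. The JDL attribute names follow EDG conventions, but the command name dg-job-submit and the Requirements value are recalled assumptions, not quoted from this talk:

    import subprocess
    import textwrap

    # Minimal EDG-style JDL (ClassAd syntax): what to run, where stdout/
    # stderr go, which files to bring back, and a Requirements expression
    # for the resource broker to match.
    jdl = textwrap.dedent("""\
        Executable    = "/bin/echo";
        Arguments     = "hello from the EDG testbed";
        StdOutput     = "hello.out";
        StdError      = "hello.err";
        OutputSandbox = {"hello.out", "hello.err"};
        Requirements  = other.OpSys == "RH 6.2";
    """)

    with open("hello.jdl", "w") as f:
        f.write(jdl)

    # Submission command on EDG Testbed 1 (assumption: the exact name
    # and options varied between EDG releases).
    subprocess.run(["dg-job-submit", "hello.jdl"], check=True)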

16 Tony Doyle - University of Glasgow EDG TestBed 1 Status, 30 Aug 2002 17:38. Web interface showing the status of the (~400) servers at Testbed 1 sites (map legend: Production Centres).

17 Tony Doyle - University of Glasgow GridPP Sites in Testbed: status 30 Aug 2002 17:38. Project Map: software releases at each site.

18 Tony Doyle - University of Glasgow 2003 CMS data grid system vision: a CMS Data Grid Job (3. Applications). Demo: current status of the system vision.

19 Tony Doyle - University of Glasgow ATLAS/LHCb Architecture (3. Applications). The Gaudi Framework - developed by LHCb, adopted by ATLAS (Athena). Figure: the Application Manager drives Algorithms, which exchange data through a Transient Event Store, a Transient Detector Store and a Transient Histogram Store; each store is served by a data service (Event Data Service, Detector Data Service, Histogram Service) and backed by a Persistency Service with Converters reading and writing Data Files. Common components: Message Service, JobOptions Service, Particle Properties Service and other services.
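Gaudi's central abstraction is the Algorithm, driven by the Application Manager through an initialize/execute/finalize lifecycle, with data exchanged via the transient stores rather than passed between algorithms directly. A schematic rendering of that pattern (Python; Gaudi itself is C++, and the class, method and path names here mirror the framework's concepts, not its exact API):

    class TransientEventStore:
        """Stands in for Gaudi's transient store: algorithms publish and
        retrieve event data here instead of calling each other."""
        def __init__(self):
            self._data = {}
        def register(self, path, obj):
            self._data[path] = obj
        def retrieve(self, path):
            return self._data[path]

    class Algorithm:
        """Mirrors the Gaudi Algorithm lifecycle."""
        def __init__(self, name, store):
            self.name, self.store = name, store
        def initialize(self): pass
        def execute(self): pass
        def finalize(self): pass

    class TrackFit(Algorithm):          # hypothetical concrete algorithm
        def execute(self):
            hits = self.store.retrieve("/Event/Hits")
            self.store.register("/Event/Tracks", sorted(hits))

    # The Application Manager schedules algorithms over the event loop.
    store = TransientEventStore()
    algs = [TrackFit("TrackFit", store)]
    for alg in algs:
        alg.initialize()
    for event in range(3):              # toy event loop
        store.register("/Event/Hits", [3, 1, 2])
        for alg in algs:
            alg.execute()
        print(event, store.retrieve("/Event/Tracks"))
    for alg in algs:
        alg.finalize()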

20 Tony Doyle - University of Glasgow GANGA: Gaudi ANd Grid Alliance (3. Applications). Making the Grid work for the experiments. Figure: a GANGA GUI sits between the GAUDI program (JobOptions, Algorithms) and the collective & resource Grid services, returning histograms, monitoring and results.

21 Tony Doyle - University of Glasgow Overview of SAM (3. Applications). Figure: Database Server(s) (central database), Name Server, Global Resource Manager(s) and Log Server; Station 1…n Servers; Mass Storage System(s); components are marked as shared globally, local, or shared locally; arrows indicate control and data flow.

22 Tony Doyle - University of Glasgow Overview of SAM SAM and DataGrid using common (lower) middleware 3. Applications

23 Tony Doyle - University of Glasgow BaBar and the Grid (3. Applications). A running experiment at SLAC (San Francisco), producing many Terabytes of useful data (500 TByte Objectivity Database). Computationally intense analysis. 500+ physicists spread over 72 institutes in 9 countries, 50+ in the UK. Scale forces a move from central to distributed computing: a 1/3-size prototype for the LHC experiments. In place - must respect existing practice. Running, and needing solutions today.

24 Tony Doyle - University of Glasgow Experiment Deployment

25 Tony Doyle - University of Glasgow Advertisement: Talk: “The Quantum ChromoDynamics Grid”, James Perry (3. Applications).

26 Tony Doyle - University of Glasgow Tier-0 - CERN (4. Infrastructure). Commodity processors + IBM (mirrored) EIDE disks… 2004 scale: ~1,000 CPUs, ~0.5 PBytes. Node types: Compute Element (CE), Storage Element (SE), User Interface (UI), Information Node (IN), storage systems…

27 Tony Doyle - University of Glasgow UK Tier-1 RAL (4. Infrastructure). New computing farm: 4 racks holding 156 dual 1.4GHz Pentium III CPUs; each box has 1GB of memory, a 40GB internal disk and 100Mb ethernet. 50TByte disk-based Mass Storage Unit after RAID 5 overhead. PCs are clustered on network switches with up to 8x1000Mb ethernet out of each rack. The tape robot, upgraded last year, uses 60GB STK 9940 tapes: 45TB current capacity, could hold 330TB. 2004 scale: 1000 CPUs, 0.5 PBytes.
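The disk and tape figures above are internally consistent, as a quick check shows. A sketch (Python) deriving the implied tape counts from the quoted numbers; the RAID 5 group size is an assumption for illustration, since the slide only quotes the post-overhead figure:

    # Figures quoted on the slide above.
    tape_size_gb = 60
    current_tb, max_tb = 45, 330

    tapes_loaded = current_tb * 1000 / tape_size_gb   # ~750 tapes
    robot_slots = max_tb * 1000 / tape_size_gb        # ~5,500 slots
    print(f"~{tapes_loaded:.0f} tapes loaded of ~{robot_slots:.0f} slots")

    # RAID 5 stores one disk's worth of parity per group, so usable
    # space is (n-1)/n of raw. With 8-disk groups (an assumption, not
    # stated on the slide), 50 TB usable implies ~57 TB of raw disk.
    group = 8
    raw_tb = 50 * group / (group - 1)
    print(f"50 TB usable -> ~{raw_tb:.0f} TB raw with {group}-disk groups")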

28 Tony Doyle - University of Glasgow Network Network Internal networking is currently a hybrid of –100Mb(ps) to nodes of cpu farms –1Gb to disk servers –1Gb to tape servers UK: academic network SuperJANET4 –2.5Gb backbone upgrading to 20Gb in 2003 EU: SJ4 has 2.5Gb interconnect to Geant US: New 2.5Gb link to ESnet and Abilene for researchers UK involved in networking development –internal with Cisco on QoS –external with DataTAG 4. Infrastructure
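These link speeds set the scale for moving physics datasets around, and explain the "~Gbits/sec or air freight" trade-off on the earlier tier diagram. A rough transfer-time table (Python; this assumes the full quoted line rate is usable, which sustained TCP transfers of the era rarely achieved):

    def transfer_hours(size_tb, link_gbps, efficiency=1.0):
        """Hours to move size_tb terabytes over a link_gbps link."""
        bits = size_tb * 8e12
        return bits / (link_gbps * 1e9 * efficiency) / 3600

    # 100 Mb/s farm node, 1 Gb/s disk server, 2.5 Gb/s SJ4 backbone.
    for gbps in (0.1, 1.0, 2.5):
        print(f"1 TB over {gbps:3} Gb/s: {transfer_hours(1, gbps):5.1f} h")
    # 1 TB over 0.1 Gb/s:  22.2 h
    # 1 TB over 1.0 Gb/s:   2.2 h
    # 1 TB over 2.5 Gb/s:   0.9 h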

29 Tony Doyle - University of Glasgow Regional Centres - SRIF Infrastructure (4. Infrastructure). Local perspective: consolidate research computing; optimisation of the number of nodes (4?); relative size dependent on funding dynamics. Global perspective: a very basic Grid skeleton.

30 Tony Doyle - University of Glasgow UK Tier-2 ScotGRID (4. Infrastructure). ScotGrid processing nodes at Glasgow: 59 IBM xSeries 330, dual 1GHz Pentium III, 2GB memory; 2 IBM xSeries 340, dual 1GHz Pentium III, 2GB memory, dual ethernet; 3 IBM xSeries 340, dual 1GHz Pentium III, 2GB memory, 100 + 1000 Mbit/s ethernet; 1TB disk; LTO/Ultrium tape library; Cisco ethernet switches. ScotGrid storage at Edinburgh: IBM xSeries 370, PIII Xeon with 512MB memory, 32 x 512MB RAM, 70 x 73.4GB IBM FC hot-swap HDD. CDF equipment at Glasgow: 8 x 700MHz Xeon IBM xSeries 370, 4GB memory, 1TB disk. Griddev test rig at Glasgow: 4 x 233MHz Pentium II. BaBar UltraGrid system at Edinburgh: 4 UltraSparc 80 machines in a rack, 450MHz CPUs with 4Mb cache, 1GB memory each, Fast Ethernet and Myrinet switching. 2004 scale: 300 CPUs, 0.1 PBytes.

31 Tony Doyle - University of Glasgow GridPP Context 5. Interoperability

32 Tony Doyle - University of Glasgow Grid Issues - Coordination (5. Interoperability). The technical part is not the only problem; there are sociological problems too. Resource sharing: short-term productivity loss but long-term gain. Key: communication and coordination between people, centres and countries. This kind of worldwide close coordination across multi-national collaborations has never been done in the past. We need mechanisms to make sure that all centres are part of a global plan, in spite of different conditions of funding, internal planning, timescales etc. The Grid organisation mechanisms should be complementary to, not parallel with or in conflict with, the existing experiment organisations: LCG-DataGRID-eSC-GridPP; BaBar-CDF-D0-ALICE-ATLAS-CMS-LHCb-UKQCD. Local perspective: build upon existing strong PP links in the UK to build a single Grid for all experiments.

33 Tony Doyle - University of Glasgow Authentication/Authorization (2. DataGrid; 5. Interoperability - built in). Authentication (CA Working Group): 11 national certification authorities; policies & procedures -> mutual trust; users identified by CAs' certificates. Authorization (Authorization Working Group): based on Virtual Organizations (VO); management tools for LDAP-based membership lists. 6+1 Virtual Organizations: ALICE, ATLAS, CMS, LHCb, Earth Obs., Biomedical (+ Guidelines). CAs: CERN, CESNET, CNRS, DataGrid-ES, GridPP, Grid-Ireland, INFN, LIP, NIKHEF, NorduGrid, Russian DataGrid.
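On a Globus-based testbed, authorization at a site ultimately comes down to mapping an authenticated certificate DN onto a local account via a grid-mapfile, which could be generated from a VO's LDAP membership list. A minimal sketch of that lookup (Python; the quoted-DN-then-username format is the standard Globus grid-mapfile layout, while the sample file name and the gridpp001 account are illustrative):

    import shlex

    # Conventional Globus location is /etc/grid-security/grid-mapfile;
    # a sample file is written here so the sketch runs standalone.
    SAMPLE = "grid-mapfile.sample"
    with open(SAMPLE, "w") as f:
        f.write('"/O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle" gridpp001\n')

    def local_user(subject_dn, mapfile=SAMPLE):
        """Return the local account mapped to a certificate DN,
        or None if the DN is not authorized at this site."""
        with open(mapfile) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                parts = shlex.split(line)   # handles the quoted DN
                if len(parts) >= 2 and parts[0] == subject_dn:
                    return parts[1]
        return None

    print(local_user("/O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle"))
    # -> gridpp001 (account name illustrative)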

34 Tony Doyle - University of Glasgow Current User Base (5. Interoperability). The GridPP (UKHEP) CA uses primitive technology: it works, but takes effort. 201 personal certs issued; 119 other certs issued. The Grid Support Centre will run a CA for the UK e-Science Certification Authority: it uses openCA, and the Registration Authority uses the web. We plan to use it; the namespace identifies the RA, not the project; it covers authentication, not authorisation. Through the GSC we have access to the skills of CLRC eSC; we will use the helpdesk to formalise support later in the rollout.

35 Tony Doyle - University of Glasgow Trust Relationships 5. Interoperability

36 Tony Doyle - University of Glasgow Dissemination: Gödel's Theorem? (6. Dissemination) Project Map Elements.

37 Tony Doyle - University of Glasgow From Grid to Web… using GridSite, from t0 to t1 (6. Dissemination).

38 Tony Doyle - University of Glasgow £17m++ 3-Year Project (7. Finances). Five components: Tier-1/A = hardware + CLRC ITD support staff; DataGrid = DataGrid posts + CLRC PPD staff; Applications = experiments posts; Operations = travel + management + early investment; CERN = LCG posts + Tier-0 + LTA.

39 Tony Doyle - University of Glasgow GridPP - Achievements and Issues.
1st Year Achievements: complete Project Map (Applications: Middleware: Hardware); fully integrated with the EU DataGrid and LCG projects; rapid middleware deployment and testing; integrated US-EU applications development, e.g. BaBar+EDG; roll-out document for all sites in the UK (Core Sites, Friendly Testers, User Only); testbed up and running at 15 sites in the UK; Tier-1 deployment; 200 GridPP certificates issued; first significant use of the Grid by an external user (LISA simulations) in May 2002; web page development (GridSite).
Issues for Year 2 (status: 30 Aug 2002 17:38 GMT): monitor and improve testbed deployment efficiency from Sep 1…; importance of EU-wide development of middleware; integrated testbed for use and testing by all applications; common “integration” layer between middleware and application software; integrated US-EU applications development; Tier-1 Grid production mode; Tier-2 definitions and deployment; integrated Tier-1 + Tier-2 testbed; transfer to the UK e-Science CA; integration with other UK projects, e.g. AstroGrid, MyGrid…

40 Tony Doyle - University of Glasgow Summary. Grid success is fundamental for PP. 1. CERN = LCG, Grid as a Service. 2. DataGrid = middleware built upon Globus and Condor-G; Testbed 1 deployed. 3. Applications - complex, need to interface to middleware: LHC analyses - ongoing feedback and development; other analyses have immediate requirements; integrated using Globus, Condor and EDG tools. 4. Infrastructure = tiered computing down to the physicist's desktop; scale in the UK: 1 PByte and 2,000 distributed CPUs for GridPP in Sept 2004. 5. Integration = ongoing… 6. Dissemination: co-operation required with other disciplines and industry. 7. Finances - under control. Year 1 was a good starting point; the first Grid jobs have been submitted. Looking forward to Year 2, with web services ahead…

41 Tony Doyle - University of Glasgow Holistic View: Multi-layered Issues

