
1 Tony Doyle a.doyle@physics.gla.ac.uk “GridPP – Year 1 to Year 2”, Collaboration Meeting, Imperial College, 16 September 2002

2 Tony Doyle - University of Glasgow Outline: GridPP – Year 1 to Year 2
- Who are we? Are we a Grid?
- Historical Perspective
- Philosophy of the Grid?
- Shared Distributed Resources 2003
- Cartology of the Grid
- Will the EDG middleware be robust?
- LHC Computing Grid Status Report
- Are we organised? Achievements and Issues
- (Not really a) Summary

3 Tony Doyle - University of Glasgow Who are we?
Nick White /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Nick White member
Roger Jones /O=Grid/O=UKHEP/OU=lancs.ac.uk/CN=Roger Jones member
Sabah Salih /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Sabah Salih member
Santanu Das /O=Grid/O=UKHEP/OU=hep.phy.cam.ac.uk/CN=Santanu Das member
Tony Cass /O=Grid/O=CERN/OU=cern.ch/CN=Tony Cass member
David Kelsey /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=David Kelsey member
Henry Nebrensky /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Henry Nebrensky member
Paul Kyberd /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Paul Kyberd member
Peter Hobson /O=Grid/O=UKHEP/OU=brunel.ac.uk/CN=Peter R Hobson member
Robin Middleton /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Robin Middleton member
Alexander Holt /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Alexander Holt member
Alasdair Earl /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Alasdair Earl member
Akram Khan /O=Grid/O=UKHEP/OU=ph.ed.ac.uk/CN=Akram Khan member
Stephen Burke /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Stephen Burke member
Paul Millar /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Paul Millar member
Andy Parker /O=Grid/O=UKHEP/OU=hep.phy.cam.ac.uk/CN=M.A.Parker member
Neville Harnew /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=Neville Harnew member
Pete Watkins /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=Peter Watkins member
Owen Maroney /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Owen Maroney member
Alex Finch /O=Grid/O=UKHEP/OU=lancs.ac.uk/CN=Alex Finch member
Antony Wilson /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=Antony Wilson member
Tim Folkes /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Tim Folkes member
Stan Thompson /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=A. Stan Thompson member
Mark Hayes /O=Grid/O=UKHEP/OU=amtp.cam.ac.uk/CN=Mark Hayes member
Todd Huffman /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=B. Todd Huffman member
Glenn Patrick /O=Grid/O=UKHEP/OU=pp.rl.ac.uk/CN=G N Patrick member
Pete Gronbech /O=Grid/O=UKHEP/OU=physics.ox.ac.uk/CN=Pete Gronbech member
Nick Brook /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Nick Brook member
Marc Kelly /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Marc Kelly member
Dave Newbold /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Dave Newbold member
Kate Mackay /O=Grid/O=UKHEP/OU=phy.bris.ac.uk/CN=Catherine Mackay member
Girish Patel /O=Grid/O=UKHEP/OU=ph.liv.ac.uk/CN=Girish D. Patel member
David Martin /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=David J. Martin member
Peter Faulkner /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=Peter Faulkner member
David Smith /O=Grid/O=UKHEP/OU=ph.bham.ac.uk/CN=David Smith member
Steve Traylen /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Steve Traylen member
Ruth Dixon del Tufo /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Ruth Dixon del Tufo member
Linda Cornwall /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Linda Cornwall member
Yee-Ting Li /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Yee-Ting Li member
Paul D. Mealor /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Paul D Mealor member
Paul A. Crosby /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Paul A Crosby member
David Waters /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=David Waters member
Bob Cranfield /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Bob Cranfield member
Ben West /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Ben West member
Rod Walker /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Rod Walker member
Philip Lewis /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Philip Lewis member
Dave Colling /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Dr D J Colling member
Alex Howard /O=Grid/O=UKHEP/OU=hep.ph.ic.ac.uk/CN=Alex Howard member
Roger Barlow /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Roger Barlow member
Joe Foster /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Joe Foster member
Alessandra Forti /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Alessandra Forti member
Peter Clarke /O=Grid/O=UKHEP/OU=hep.ucl.ac.uk/CN=Peter Clarke member
Andrew Sansum /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=Andrew Sansum member
John Gordon /O=Grid/O=UKHEP/OU=hepgrid.clrc.ac.uk/CN=John Gordon member
Andrew McNab /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Andrew McNab member
Richard Hughes-Jones /O=Grid/O=UKHEP/OU=hep.man.ac.uk/CN=Richard Hughes-Jones member
Gavin McCance /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Gavin McCance member
Tony Doyle /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle admin
Alex Martin /O=Grid/O=UKHEP/OU=ph.qmw.ac.uk/CN=A.J.Martin member
Steve Lloyd /O=Grid/O=UKHEP/OU=ph.qmw.ac.uk/CN=S.L.Lloyd admin
We need a New "Year 2" Group Photo
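The entries above are X.509 certificate distinguished names with a fixed /O=…/O=…/OU=…/CN=… layout; a minimal sketch (a hypothetical helper for illustration, not a GridPP tool) of splitting such a DN into its fields:

```python
# Minimal sketch: parse X.509 distinguished names of the form
# /O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle into their fields.
# Hypothetical helper for illustration only.

def parse_dn(dn: str) -> dict:
    """Split a slash-separated DN into a dict of attribute-value lists."""
    fields = {}
    for part in dn.strip("/").split("/"):
        key, _, value = part.partition("=")
        fields.setdefault(key, []).append(value)
    return fields

dn = "/O=Grid/O=UKHEP/OU=ph.gla.ac.uk/CN=Tony Doyle"
parsed = parse_dn(dn)
print(parsed["CN"][0])   # Tony Doyle
print(parsed["OU"][0])   # ph.gla.ac.uk -> the member's institute
```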

4 Tony Doyle - University of Glasgow Are we a Grid? (Foster's three-point checklist: http://www-fp.mcs.anl.gov/~foster/Articles/WhatIsTheGrid.pdf)
1. Coordinates resources that are not subject to centralized control… YES. This is why development and maintenance of a UK-EU-US testbed is important.
2. … using standard, open, general-purpose protocols and interfaces… YES. Globus/Condor-G/EDG ~meet this requirement. Common experiment application layers are also important here, e.g. SAM, GANGA.
3. … to deliver nontrivial qualities of service… NO(T YET). Experiments should define whether this is true via this year's data analyses and challenges.

5 Tony Doyle - University of Glasgow Historical Perspective
"I wrote in 1990 a program called "WorldWideWeb", a point and click hypertext editor which ran on the "NeXT" machine. This, together with the first Web server, I released to the High Energy Physics community at first, and to the hypertext and NeXT communities in the summer of 1991." Tim Berners-Lee
"The first three years were a phase of persuasion, aided by my colleague and first convert Robert Cailliau, to get the Web adopted… We needed seed servers to provide incentive and examples, and all over the world inspired people put up all kinds of things… Between the summers of 1991 and 1994, the load on the first Web server ("info.cern.ch") rose steadily by a factor of 10 every year…"

6 Tony Doyle - University of Glasgow News - Summer 2002
- GridPP Demonstrations at the UK e-Science All Hands Meeting (Fri 30 August 2002)
- EU DataGrid Testbed 1.2 released (Mon 12 August 2002)
- Computer Science Fellowships (Wed 31 July 2002)
- GGF5 in Edinburgh (Thu 25 July 2002)
- OGSA Early Adopters Workshop (Sat 6 July 2002)
- GridPP sponsors joint ATLAS, LHCb Workshop (Fri 31 May 2002)
- Getting started on the EDG Testbed (Fri 24 May 2002)
- First significant use of UK particle physics Grid (Sat 11 May 2002)

7 Tony Doyle - University of Glasgow News – Spring 2002
- GridPP demonstrated at NeSC opening (Thu 25 April 2002)
- First Tier-A/Prototype Tier-1 hardware delivered (Wed 13 March 2002)
- LHC Computing Grid Project launched (Mon 11 March 2002)
- Fourth EDG Conference in Paris (Mon 4 March 2002)
- The DataGrid project successfully passes the first year review (Fri 1 March 2002)
- RAL included in successful deployment of DataGrid 1.1 (Fri 1 March 2002)

8 Tony Doyle - University of Glasgow News – Winter 2002
- Second round of PPARC funded GRID Computing Opportunities at CERN announced (25 February 2002)
- Second round of PPARC e-Science Studentships announced (25 February 2002)
- X.509 Certificates authenticate file transfers across the Atlantic (21 February 2002)
- Fourth Global Grid Forum held in Toronto (17 February 2002)

9 Tony Doyle - University of Glasgow News – Autumn 2001
- Internet2, Geant and the Grid discussed in Guardian article (8 November 2001)
- IBM announces worldwide participation in grid initiatives (2 August 2001)
What might this tell us about Year 1?
1. The year has been busy for everyone..
2. (2-4-6-8) A lot has been happening inside and outside.. Linear growth.
3. In the end we demonstrated that we lead Grid development/deployment in the UK..
4. Which is being recognised externally, but we all need to plan for future success(es)..

10 Tony Doyle - University of Glasgow Interlude… Last week's news: E-Science testbed – we can (and should) help.
"The issue in question is how most effectively to encourage use of the emerging UK Grid infrastructure by scientists and engineers. The proposal we discussed at our meeting this week was that JCSR might fund a 'Grid Computing Testbed' with the specific purpose of accelerating development and deployment of the research Grid. This would be of a scale which would provide a significant level of computational resource and it would be available to researchers only through the use of digital certificates and Globus, but it would be free at the point of use. Through access to this testbed, users would provide convincing evidence of the usability and usefulness of the Grid. It has been suggested that the most useful and cost-effective type of resource to provide would be one or more Beowulf clusters (commodity processor clusters with high-throughput, low-latency interconnect) of a size not readily available to researchers at their home institution, say with 256 processors." David Boyd (JCSR, 13/9/02)
"We needed seed servers to provide incentive and examples…"

11 Tony Doyle - University of Glasgow Philosophy of the Grid?
"Everything is becoming, nothing is." – Plato
"Common sense is the best distributed commodity in the world. For every (wo)man is convinced (s)he is well supplied with it." – Descartes
"The superfluous is very necessary." – Voltaire
"Heidegger, Heidegger was a boozy beggar, I drink therefore I am." – Monty Python
"Only daring speculation can lead us further, and not accumulation of facts." – Einstein
"The real, then, is that which, sooner or later, information and reasoning would finally result in." – C. S. Peirce
"The philosophers have only interpreted the world in various ways; the point is to change it." – Marx
(some of) these may be relevant to your view of "The Grid"…

12 Tony Doyle - University of Glasgow Another Grid? .NET
March 2001: "My Services" (i.e. Microsoft services) – Alerts, Application settings, Calendar, Categories, Contacts, Devices, Documents, FavouriteWebSites, Inbox, Lists, Location, Presence, Profile, Services, Wallet(?!)
June 2002: Microsoft drops My Services – centralised planning.
July 2002: Microsoft unveils new identification plans… TrustBridge = "Grid": distributed authentication using .NET Passport (and other) servers (Kerberos 5.0); authorisation defined by a "Federation" = VO; OO using C#.
Microsoft increased its research budget by 20% in 2003 (from £2.8B to £3.4B) to develop .NET.

13 Tony Doyle - University of Glasgow Red pill or blue pill?
"Microsoft® .NET is a set of Microsoft software technologies for connecting your world of information, people, systems and devices. It enables an unprecedented level of software integration through the use of XML web services: small, discrete, building-block applications that connect to each other - as well as to other, larger applications - via the Internet."
versus
"The three criteria apply most clearly to the various large-scale Grid deployments being undertaken within the scientific community... Each of these systems integrates resources from multiple institutions, each with their own policies and mechanisms; uses open, general-purpose (Globus Toolkit) protocols to negotiate and manage sharing; and addresses multiple quality of service dimensions, including security, reliability, and performance."

14 Tony Doyle - University of Glasgow GridPP Vision: From Web to Grid - Building the next IT Revolution
Premise: The next IT revolution will be the Grid. The Grid is a practical solution to the data-intensive problems that must be overcome if the computing needs of many scientific communities and industry are to be fulfilled over the next decade.
Aim: The GridPP Collaboration aims to develop and deploy a large-scale science Grid in the UK for use by the worldwide particle physics community.
Many challenges: a shared distributed infrastructure, for all applications.

15 Tony Doyle - University of Glasgow GridPP Overview (http://www.gridpp.ac.uk)
A £17m 3-year project funded by PPARC through the e-Science Programme, covering five areas: CERN - LCG (start-up phase: funding for staff and hardware), DataGrid, Tier-1/A, Applications and Operations; component allocations £3.78m, £5.67m, £3.66m, £1.99m and £1.88m.
EDG - UK Contributions: Architecture, Testbed-1, Network Monitoring, Certificates & Security, Storage Element, R-GMA, LCFG, MDS deployment, GridSite, SlashGrid, Spitfire…
Applications (start-up phase): BaBar, CDF/D0 (SAM), ATLAS/LHCb, CMS, (ALICE), UKQCD.

16 Tony Doyle - University of Glasgow GridPP Bridge [diagram: a bridge from Running US Experiments to Future LHC Experiments - provide architecture and middleware; use the Grid with simulated data; use the Grid with real data] Build Tier-A/prototype Tier-1 and Tier-2 centres in the UK and join the worldwide effort to develop middleware for the experiments.

17 Tony Doyle - University of Glasgow Grid issues – Coordination
The technical part is not the only problem; there are sociological problems too.
- Resource sharing: short-term productivity loss but long-term gain.
- The key is communication/coordination between people/centres/countries: this kind of worldwide close coordination across multi-national collaborations has never been done in the past.
- We need mechanisms to make sure that all centres are part of a global planning, in spite of different conditions of funding, internal planning, timescales etc.
- The Grid organisation mechanisms should be complementary, not parallel or conflicting, to existing experiment organisation (LCG-DataGRID-eSC-GridPP; BaBar-CDF-D0-ALICE-ATLAS-CMS-LHCb-UKQCD).
Local perspective: build upon existing strong PP links in the UK to build a single Grid for all experiments.

18 Tony Doyle - University of Glasgow Shared Distributed Resources 2003
Tier-1: 600 CPUs + 150 TB. Tier-2 (total across sites): 4000 CPUs + 200 TB. These could be shared by the experiments, first to test the concept and then to meet particular deadlines. The key is to create a Grid. Steps could be:
1. Agree to allow other experiments' software to be installed;
2. Agree to share on a limited basis, e.g. limited tests on allocated days;
3. Aim to extend this capability... to increase efficiency.
"The real, then, is that which, sooner or later, information and reasoning would finally result in…"

19 Tony Doyle - University of Glasgow A Grid in 2003? This will all be a bit ad-hoc… Just as it is in Testbed-1… … but Testbed-2 will be different Information Services will be increasingly important Tier-1 “ … to deliver nontrivial qualities of service …”

20 Tony Doyle - University of Glasgow Distributed Resources in 2003? This will be less ad-hoc… …should also consider other applications Tier-1 “The first three years were a phase of persuasion…”

21 Tony Doyle - University of Glasgow Cartology of the Grid. A dynamic version, including: resource discovery? network monitoring? CPU load average? disk resource availability? What can we learn by looking at a few maps?…

22 Tony Doyle - University of Glasgow Connectivity of UK Grid Sites (BW to campus, BW to site, limit/notes):
Site          To campus   To site        Limit / notes
Manchester    1G          100M           1G soon
Liverpool     155M        100M           4*155M soon; to HEP?
Lancaster     155M        100M           move to C&NLMAN at 155Mbit
Glasgow       1G          100M           30M?
Edinburgh     1G          100M
UCL           155M        1G             30M?
RAL           622M        100M           Gig on site soon
Oxford        622M        100M
Durham        155M        ??             100M
Birmingham    622M        ??             100M
IC            155M        34M            then 1G to HEP
QMW           155M        ??
Cambridge     1G          16M?
RHBNC         34M         155M soon ??   100M
Brunel        155M        ??
Bristol       622M        100M
Sheffield     155M        ??             100M
Swansea       155M        100M
Portsmouth    155M        100M
Southampton   155M        100M
Sussex        155M        100M
DL            155M        100M

23 Tony Doyle - University of Glasgow EDG TestBed 1 Status (13 Sep 2002 14:46): web interface showing the status of ~400 servers at Testbed 1 sites; production centres marked.

24 Tony Doyle - University of Glasgow EDG TestBed 1 Status (13 Sep 2002 14:46). Spot the difference? Dynamic version, regenerated each night from MDS.
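MDS is LDAP-based, so a status page like this can be regenerated by querying a GIIS. A rough sketch using the modern ldap3 package; the host name is invented, port 2135 and base "mds-vo-name=local, o=grid" are the conventional MDS2 defaults, and the attribute names are from memory and should be treated as illustrative:

```python
# Rough sketch: query a Globus MDS2 GIIS (an LDAP server, conventionally
# port 2135, base "mds-vo-name=local, o=grid") for registered resources.
# Host name is hypothetical; ldap3 is a modern stand-in for the LDAP
# clients actually used in 2002; attribute names are illustrative.
from ldap3 import Server, Connection, ALL

server = Server("giis.example.ac.uk", port=2135, get_info=ALL)  # hypothetical host
conn = Connection(server, auto_bind=True)                       # anonymous bind

# Ask for every entry and keep a couple of attributes for a status table.
conn.search(
    search_base="mds-vo-name=local, o=grid",
    search_filter="(objectClass=*)",
    attributes=["Mds-Host-hn", "Mds-Cpu-Total-count"],
)
for entry in conn.entries:
    print(entry.entry_dn)
```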

25 Tony Doyle - University of Glasgow We’re getting there.. Status 13 Sep 2002 14:46 Integrated Information Service inc. resource discovery

26 Tony Doyle - University of Glasgow GridPP Sites in Testbed: Status 13 Sep 2002 14:46?? Dynamic version, regenerated from R-GMA, including: resource discovery, network monitoring, CPU load average, disk resources. More work needed: part of Testbed and Network Monitoring; input from Information Service; part of GUIDO? (What's GUIDO?)

27 Tony Doyle - University of Glasgow Network 2003
Internal networking is currently a hybrid of:
- 100Mb(ps) to nodes of CPU farms
- 1Gb to disk servers
- 1Gb to tape servers
UK: academic network SuperJANET4 - 2.5Gb backbone upgrading to 20Gb in 2003.
EU: SJ4 has a 2.5Gb interconnect to Geant.
US: new 2.5Gb link to ESnet and Abilene for researchers.
UK involved in networking development: internal with Cisco on QoS; external with DataTAG.
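For context, a back-of-envelope sketch of what these bandwidths mean for moving a 1 TB dataset (idealised line rates, ignoring protocol overhead):

```python
# Back-of-envelope transfer times for a 1 TB dataset at the line rates
# above (idealised: full link utilisation, no protocol overhead).
SIZE_BITS = 1e12 * 8  # 1 TB in bits

for label, mbps in [("100 Mb/s farm node", 100),
                    ("1 Gb/s disk server", 1000),
                    ("2.5 Gb/s SJ4 backbone", 2500),
                    ("20 Gb/s SJ4 (2003)", 20000)]:
    seconds = SIZE_BITS / (mbps * 1e6)
    print(f"{label:>24}: {seconds / 3600:6.1f} hours")
# 100 Mb/s -> ~22 h; 1 Gb/s -> ~2.2 h; 2.5 Gb/s -> ~0.9 h; 20 Gb/s -> ~7 min
```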

28 Tony Doyle - University of Glasgow Robust? Development Infrastructure
- CVS Repository: management of DataGrid source code; all code available (some mirrored).
- Bugzilla.
- Package Repository: public access to packaged DataGrid code.
- Development of Management Tools: statistics concerning DataGrid code; auto-building of DataGrid RPMs; publishing of generated API documentation; latest build = Release 1.2 (August 2002).
140,506 lines of code in 10 languages (Release 1.0).
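The code statistics above boil down to a per-language line count over the repository; a toy sketch of that kind of tally (the extension map and directory name are illustrative, not the DataGrid tooling):

```python
# Toy line-counter in the spirit of the DataGrid code-statistics tools:
# tally lines of code per language by file extension. The extension map
# and the checkout directory are illustrative assumptions.
from pathlib import Path
from collections import Counter

LANGS = {".c": "C", ".cpp": "C++", ".java": "Java",
         ".py": "Python", ".sh": "Shell", ".pl": "Perl"}

def count_loc(root: str) -> Counter:
    totals = Counter()
    for path in Path(root).rglob("*"):
        lang = LANGS.get(path.suffix)
        if lang and path.is_file():
            totals[lang] += sum(1 for _ in path.open(errors="ignore"))
    return totals

print(count_loc("edg-src"))  # hypothetical checkout directory
```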

29 Tony Doyle - University of Glasgow ComponentETTUTITNINFFMBSD Resource Broker vvvl Job Desc. Lang. vvvl Info. Index vvvl User Interface vvvl Log. & Book. Svc. vvvl Job Sub. Svc. vvvl Broker Info. API vvl SpitFire vvl GDMP l Rep. Cat. API vvl Globus Rep. Cat. vvl ETTExtensively Tested in Testbed UTUnit Testing ITIntegrated Testing NINot Installed NFFSome Non-Functioning Features MBSome Minor Bugs SDSuccessfully Deployed ComponentETTUTITNINFFMBSD Schema vvvl FTree vvl R-GMA vvl Archiver Module vvl GRM/PROVE vvl LCFG vvvl CCM vl Image Install. vl PBS Info. Prov. vvvl LSF Info. Prov. vvl ComponentETTUTITNINFFMBSDSD SE Info. Prov. Vvl File Elem. Script l Info. Prov. Config. Vvl RFIO Vvl MSS Staging l Mkgridmap & daemon vl CRL update & daemon vl Security RPMs vl EDG Globus Config. vvl ComponentETTUTITNINFFMBSD PingER vvl UDPMon vvl IPerf vvl Globus2 Toolkit vvl Robust? Software Evaluation

30 Tony Doyle - University of Glasgow Robust? Middleware Testbed(s) Validation/ Maintenance =>Testbed(s) EU-wide development

31 Tony Doyle - University of Glasgow Robust? Code Development Issues
- Reverse engineering (C++ code analysis and restructuring; coding standards) => abstraction of existing code to UML architecture diagrams.
- Language choice (currently 10 used in DataGrid): Java = C++ minus "features" (global variables, pointer manipulation, goto statements, etc.); constraints include performance, libraries, legacy code.
- Testing (automation, object-oriented testing).
- Industrial strength? OGSA-compliant? O(20 year) future-proof??

32 Tony Doyle - University of Glasgow LHC Computing Grid: High Level Planning
[timeline chart, 2002-2005 by quarter: Prototype of Hybrid Event Store (Persistency Framework); Hybrid Event Store available for general users; distributed production using grid services; First Global Grid Service (LCG-1) available; distributed end-user interactive analysis; Full Persistency Framework; LCG-1 reliability and performance targets; "50% prototype" (LCG-3) available; LHC Global Grid TDR; applications; Grid as a Service] (1. CERN)

33 Tony Doyle - University of Glasgow LCG Level 1 Milestones proposed to LHCC
- M1.1 (June 03): First Global Grid Service (LCG-1) available - this milestone and M1.3 defined in detail by end 2002.
- M1.2 (June 03): Hybrid Event Store (Persistency Framework) available for general users.
- M1.3a (November 03): LCG-1 reliability and performance targets achieved.
- M1.3b (November 03): Distributed batch production using grid services.
- M1.4 (May 04): Distributed end-user interactive analysis - detailed definition of this milestone by November 03.
- M1.5 (December 04): "50% prototype" (LCG-3) available - detailed definition of this milestone by June 04.
- M1.6 (March 05): Full Persistency Framework.
- M1.7 (June 05): LHC Global Grid TDR.

34 Tony Doyle - University of Glasgow LCG Level 1 Milestones [the same milestones shown as a quarterly timeline chart, 2002-2005: Hybrid Event Store available for general users; distributed production using grid services; First Global Grid Service (LCG-1) available; distributed end-user interactive analysis; Full Persistency Framework; LCG-1 reliability and performance targets; "50% prototype" (LCG-3) available; LHC Global Grid TDR] (1. CERN)

35 Tony Doyle - University of Glasgow LCG Process [organisation diagram]: the SC2 passes requirements to the PEB; the Architects Forum takes design decisions and sets the implementation strategy for physics applications; the Grid Deployment Board handles coordination, standards and management policies for operating the LCG Grid Service.

36 Tony Doyle - University of Glasgow LCG Project Execution Board Membership
- Les Robertson: Project Leader / Chairperson
Area managers:
- Torre Wenaus: Applications
- Bernd Panzer: Fabrics
- Fabrizio Gagliardi: Grid Technology
- Ian Bird: Grid Deployment
- Mirco Mazzucato: Grid Deployment Board Chair
LHC collaboration delegates:
- Alberto Masoni: ALICE
- Gilbert Poulard: ATLAS
- Vincenzo Innocente: CMS
- Philippe Charpentier: LHCb
Additional members:
- Tony Doyle: GridPP Project Leader
- François Etienne: French HEP Grid Projects
- Miron Livny: University of Wisconsin
- Peter Malzacher: German HEP Grid Projects
- Mirco Mazzucato: INFN Grid Project Leader
- Ruth Pordes: iVDGL/PPDG
Project office & ex officio:
- Chris Eck: Resource Manager
- David Foster: Chief Technical Officer
- Matthias Kasemann: SC2 Chair
- Massimo Lamanna: SC2 Secretary
- Miguel Marquina: Planning Officer, Recruitment, PEB Secretary
All UK-funded posts now filled (~20 people).
Management areas: Applications; Fabric (PASTA 2002 Report); Grid Technology; Grid Deployment (Grid Deployment Board). Each report is available via http://lhcgrid.web.cern.ch/LHCgrid/peb/. The process is open, clear and intuitive.

37 Tony Doyle - University of Glasgow Recruitment status

38 Tony Doyle - University of Glasgow

39 Events.. to Files.. to Events
[diagram: RAW, ESD, AOD and TAG data replicated as files across Tier-0 (International), Tier-1 (National), Tier-2 (Regional) and Tier-3 (Local); an "Interesting Events List" is built from the TAG, and the selected events (Event 1, Event 2, Event 3) are retrieved from the replicated RAW/ESD/AOD data files]
Not all pre-filtered events are interesting… non-pre-filtered events may be… hence a file replication overhead.

40 Tony Doyle - University of Glasgow Events.. to Events: Event Replication and Query Optimisation
[diagram: the same RAW/ESD/AOD/TAG hierarchy across Tier-0 (International) to Tier-3 (Local), but with individual events (Event 1, Event 2, Event 3) replicated and queried from a distributed (replicated) database, "Stars in Stripes", turning data into knowledge]
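These two slides contrast file-level and event-level access; a toy sketch of the difference, with all names and numbers invented for illustration:

```python
# Sketch of the file-level versus event-level access patterns on these
# two slides. Names (Event, tag_db, file_catalog) are illustrative.
from dataclasses import dataclass

@dataclass
class Event:
    event_id: int
    met: float          # e.g. missing transverse energy, stored in the TAG

# TAG: compact per-event summary, replicated widely and cheap to scan.
tag_db = [Event(1, 12.0), Event(2, 87.5), Event(3, 140.2)]

# "Interesting Events List": a query over the TAG.
interesting = [e.event_id for e in tag_db if e.met > 50.0]

# File-level model (slide 39): fetch every file containing a hit, even if
# only one event in it is wanted -- the "file replication overhead".
file_catalog = {"raw_001.root": [1, 2], "raw_002.root": [3]}
files_needed = [f for f, evts in file_catalog.items()
                if any(e in interesting for e in evts)]

# Event-level model (slide 40): replicate/query individual events from a
# distributed database, moving only what the selection actually touches.
events_needed = [e for e in tag_db if e.event_id in interesting]
print(files_needed, [e.event_id for e in events_needed])
```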

41 Tony Doyle - University of Glasgow POOL Persistency Framework

42 Tony Doyle - University of Glasgow Giggle [diagram: hierarchical indexing - a higher-level RLI (Replica Location Index) contains pointers to lower-level RLIs or LRCs (Local Replica Catalogs), which in turn point to Storage Elements] "Scalable?" Trade-off: consistency versus efficiency.
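A minimal sketch of the two-level Giggle lookup described above, with invented catalogue contents; `locate` is a hypothetical helper, not the Globus RLS API:

```python
# Minimal sketch of the Giggle two-level lookup: an RLI points to the
# LRCs that index a logical file name (LFN); each LRC maps LFNs to
# physical replicas on its storage elements. All data values invented.

lrc_ral = {"lfn:higgs_0042": ["srm://ral.ac.uk/store/higgs_0042"]}
lrc_gla = {"lfn:higgs_0042": ["srm://gla.ac.uk/disk/higgs_0042"],
           "lfn:zmumu_0007": ["srm://gla.ac.uk/disk/zmumu_0007"]}

# The RLI holds only LFN -> LRC pointers (possibly stale: the trade-off
# between index consistency and update efficiency noted on the slide).
rli = {"lfn:higgs_0042": [lrc_ral, lrc_gla],
       "lfn:zmumu_0007": [lrc_gla]}

def locate(lfn: str) -> list:
    """Resolve an LFN to physical replicas via the RLI, then the LRCs."""
    replicas = []
    for lrc in rli.get(lfn, []):
        replicas.extend(lrc.get(lfn, []))  # tolerate stale RLI entries
    return replicas

print(locate("lfn:higgs_0042"))
```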

43 Tony Doyle - University of Glasgow LCG SC2 (See Nick’s Talk)

44 Tony Doyle - University of Glasgow LCG SC2 (See Nick’s Talk) e.g. Persistency (POOL): The RTAG convened in ten three-hour sessions during the weeks of 28 January, 18 February, and 11 March, and delivered an interim report to the SC2 on 8 March. An additional report was provided during the LCG Launch Workshop on 12 March. A final report to the SC2 is expected on 5 April 2002. Developer Release: November 2003.

45 Tony Doyle - University of Glasgow Grid Technology & Deployment
Close collaboration between LCG and EDG on integration and certification of grid middleware: common teams being established; prepares the ground for long-term LCG support of grid middleware.
Importance of a common grid middleware toolkit: compatible with implementations in Europe and the US; flexible enough to evolve with mainline grid developments; responding to the needs of the experiments.
GLUE: a common US-European activity to achieve compatible solutions, supported by DataTAG, iVDGL, ..
Grid Deployment Board: first task is the detailed definition of LCG-1, the initial LCG Global Grid Service, including the set of grid middleware tools to be deployed; target - full definition of LCG-1 by the end of the year, LCG-1 in operation mid-2003.

46 Tony Doyle - University of Glasgow Are we (sufficiently) well-organised? Is each part of this structure working? Is the whole working? Comments welcome.

47 Tony Doyle - University of Glasgow GridPP Project Map - Elements

48 Tony Doyle - University of Glasgow GridPP Project Map - Metrics and Tasks. Available from the web pages; provides structure for the PMB (and Dave's talk).

49 Tony Doyle - University of Glasgow This year’s high point?

50 Tony Doyle - University of Glasgow Things Missing, apparently… i.e. not ideal… …but it works

51 Tony Doyle - University of Glasgow Next year's high point? Which experiments will use the Grid most efficiently? (Experiment monitoring throughout the year - a common way of measuring experiments' data-handling metrics… CPU, disk, #users, #jobs, memory requirements, etc.) What middleware will be used? How efficient will the UK testbed be? How well integrated will it be? How well will we share resources? We need to anticipate questions in order to answer them…
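One concrete form a "common way of measuring" could take is a per-experiment usage record over the metrics listed above; the field names here are an assumption for illustration, not an agreed GridPP schema:

```python
# Illustrative record for the common data-handling metrics listed above;
# field names are an assumption, not an agreed GridPP schema.
from dataclasses import dataclass

@dataclass
class ExperimentUsage:
    experiment: str
    cpu_hours: float
    disk_tb: float
    n_users: int
    n_jobs: int
    peak_memory_mb: int

    def efficiency(self, cpu_hours_allocated: float) -> float:
        """Fraction of the allocated CPU actually used."""
        return self.cpu_hours / cpu_hours_allocated

# Invented numbers, purely to show the shape of such a record.
babar = ExperimentUsage("BaBar", cpu_hours=5200, disk_tb=3.5,
                        n_users=14, n_jobs=1200, peak_memory_mb=512)
print(f"{babar.experiment}: {babar.efficiency(8000):.0%} of allocation used")
```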

52 Tony Doyle - University of Glasgow GUI – last meeting

53 Tony Doyle - University of Glasgow GUI – this meeting. GridPP User Interface: GUIDO (talk: Demonstration of GridPP portal - Sarah Marr and Dave Colling), a generic user interface to enable experiments to access resources efficiently..

54 Tony Doyle - University of Glasgow GridPP User Interface Pulling it all together… GridPP Web Links 2003?: Demonstration elements: Gridsite, GUIDO, R-GMA,… Short and Long-Term Monitoring, Local and WAN Monitoring.

55 Tony Doyle - University of Glasgow GridPP – Achievements and Issues
1st Year Achievements:
- Complete Project Map (Applications : Middleware : Hardware)
- Fully integrated with EU DataGrid, LCG and SAM projects
- Rapid middleware deployment/testing
- Integrated US-EU applications development, e.g. BaBar+EDG
- Roll-out document for all sites in the UK (Core Sites, Friendly Testers, User Only)
- Testbed up and running at 15 sites in the UK
- Tier-1 deployment
- 200 GridPP Certificates issued
- First significant use of Grid by an external user (LISA simulations) in May 2002
- Web page development (GridSite)
Issues for Year 2:
- Status: 13 Sep 2002 14:46 GMT - monitor and improve testbed deployment efficiency, short-term (10 min) and long-term (monthly)…
- Importance of EU-wide development of middleware and integration with the US-led approach
- Integrated Testbed for use/testing by all applications
- Common "integration" layer between middleware and application software
- Integrated US-EU applications development
- Tier-1 Grid production mode
- Tier-2 definitions and deployment
- Integrated Tier-1 + Tier-2 Testbed
- Transfer to UK e-Science CA
- Integration with other UK projects, e.g. AstroGrid, MyGrid…
- Publication of YOUR work

56 Tony Doyle - University of Glasgow Summary (by Project Map areas). Grid success is fundamental for PP.
1. CERN = LCG, Grid as a Service.
2. DataGrid = middleware built upon Globus and Condor-G. Testbed 1 deployed.
3. Applications - complex, need to interface to middleware. LHC analyses: ongoing feedback/development. Other analyses have immediate requirements. Integrated using Globus, Condor and EDG/SAM tools.
4. Infrastructure = tiered computing to the physicist desktop. Scale in UK? 1 PByte and 2,000 distributed CPUs for GridPP in Sept 2004.
5. Integration = ongoing with UK e-Science…
6. Dissemination: co-operation required with other disciplines/industry.
7. Finances - under control, but need to start looking to Year 4..
Year 1 was a good starting point; the first Grid jobs have been submitted. Looking forward to Year 2, with web services ahead.. but the experiments will define whether this experiment is successful (or not).

57 Tony Doyle - University of Glasgow Holistic View: Multi-layered Issues (Not a Status Report)

58 Tony Doyle - University of Glasgow GridPP5 - Welcome
Opening Session (Chair: Dave Britton)
11:00-11:30 Welcome and Introduction - Steve Lloyd
11:30-12:00 GridPP Project Status - Tony Doyle
12:00-12:30 Project Management - Dave Britton
Experiment Developments I (Chair: Roger Barlow)
13:30-13:50 EB News and LCG SC2 Activities - Nick Brook
13:50-14:10 WP8 Status - Frank Harris
14:10-14:35 ATLAS/LHCb GANGA Development - Alexander Soroko
14:35-15:00 ATLAS Installation and Validation Tools - Roger Jones
15:00-15:15 UK e-Science Grid for ATLAS - Matt Palmer
15:15-15:40 CMS Status and Future Plans - Peter Hobson
Experiment Developments II (Chair: Nick Brook)
16:00-16:25 UKQCD Status and Future Plans - James Perry
16:25-16:50 BaBar Status and Future Plans - David Smith
16:50-17:00 SAM - Introduction - Rick St Denis
17:00-17:20 SAM Status and Future Plans - Stefan Stonjek
17:20-17:40 SAM-Grid Status and Future Plans - Rod Walker
17:40-17:55 Demonstration of GridPP portal - Sarah Marr and Dave Colling
LeSC Perspective, Middleware and Testbed (Chair: Pete Clarke)
9:00-9:30 London eScience Centre - Steven Newhouse, Technical Director of LeSC
9:30-10:00 EDG Overview, inc. Feedback from EDG Retreat and Plans for Testbed 2 - Steve Fisher
10:00-10:30 Tier 1/A Status Report - Andrew Sansum
10:30-11:00 Testbed Deployment Status - Andrew McNab
Testbed Installation Experiences, Issues and Plans (Chair: John Gordon)
11:30-11:45 Meta Directory Service - Steve Traylen
11:45-12:00 Virtual Organisation - Andrew McNab
12:00-12:15 Replica Catalog - Owen Moroney
12:15-12:30 Resource Broker - Dave Colling
Grid Middleware Status and Plans (Chair: Steve Lloyd)
13:30-13:50 WP1 Overview - Dave Colling
13:50-14:10 WP2 Overview - Gavin McCance
14:10-14:30 WP3 Overview - Steve Fisher
14:30-14:50 WP4 Status - Tony Cass
14:50-15:10 WP5 Overview - John Gordon
15:10-15:30 WP7: Networking and Security - Paul Mealor

