
Slide 1: Intersecting UK Grid and EGEE/LCG/GridPP
UK eScience BoF Session, AHM – Nottingham – September 2003
R. Middleton, GridPP8 – Bristol – 23rd September 2003

Slide 2: BoF Agenda
–Applications & Requirements
–Technical Exchanges & Collaboration
–Common Strategies / Roadmap
–Discussion

Slide 3: Applications…

Slide 4: What does the eScience Grid currently look like?
–Globus v2 installed at all regional eScience centres
–Heterogeneous resources (Linux clusters, SGI O2/3000, SMP Sun machines)
–eScience certificate authority
(Mark Hayes)

Slide 5: GridPP & EDG
–Dedicated Linux clusters running EDG middleware (globus++)
–Very homogeneous resources
–Resource broker (based on Condor)
–LDAP-based VO management
(Mark Hayes)
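The EDG resource broker took jobs described in the Job Description Language (JDL), a Condor ClassAd-style format that the broker matched against resource attributes published in the information system. A minimal sketch of what an EDG 2.x-era job description looked like; the file names and the CE hostname in the Requirements expression are hypothetical:

```
# hello.jdl -- illustrative EDG JDL; the CE name below is invented
Executable    = "/bin/echo";
Arguments     = "Hello from the Grid";
StdOutput     = "hello.out";
StdError      = "hello.err";
OutputSandbox = {"hello.out", "hello.err"};
Requirements  = other.GlueCEUniqueID == "ce.example.ac.uk:2119/jobmanager-pbs-short";
```

Submission then went through the EDG command-line tools: edg-job-submit hello.jdl returned a job identifier, with edg-job-status and edg-job-get-output used to track the job and retrieve the output sandbox.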

Slide 6: Applications
Applications on the eScience Grid:
–E-Minerals: Monte Carlo simulations of radiation damage to crystal structures (Condor-G, home-grown shell scripts)
–Geodise: genetic algorithm for optimisation of satellite truss design (Java COG plugins in Matlab)
–GENIE: ocean-atmosphere modelling (flocked Condor pools)
–Other tools in use: HPCPortal, InfoPortal, Nimrod/G, SunDCG
GridPP & EDG (e.g.):
–ATLAS data challenge: Monte Carlo event generation, tracking & reconstruction
–Large FORTRAN/C++ codes, optimised for Linux (and packaged as RPMs)
–Runs scripted using EDG job submit tools; GUIs under development (e.g. GANGA)
(Mark Hayes)
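Condor-G, used above for E-Minerals, drives a remote Globus GT2 gatekeeper through an ordinary Condor submit file via the "globus" universe. A sketch of what such a submission might have looked like in the Condor 6.x era; the gatekeeper host, executable name and arguments are invented for illustration:

```
# mc_run.sub -- illustrative Condor-G submit file (GT2 era);
# hostname, executable and arguments are hypothetical
universe            = globus
globusscheduler     = gatekeeper.example.ac.uk/jobmanager-pbs
executable          = mc_damage_sim
arguments           = --seed 42 --steps 100000
transfer_executable = true
output              = mc_run.out
error               = mc_run.err
log                 = mc_run.log
queue
```

condor_submit mc_run.sub queues the job locally; Condor-G then authenticates to the gatekeeper with the user's GSI proxy (created with grid-proxy-init) and manages the remote job through the site's PBS jobmanager.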

Slide 7: Technical…

Slide 8: Comparison (diagram)
A side-by-side comparison of the two stacks:
–GridPP/EDG: GT2 plus EDG 2.0 value-added components (EDG Info, VOMS, RB, Data Management); only Linux 7.3 in places; PP applications
–UK L2 Grid: GT2; simple user management (???); UK MDS; value-added components; applications; network monitoring; eScience CA
(Peter Clarke)

Slide 9: UK Grid: Deployment Phases
–Level 0: Resources with Globus GT2 registering with the UK MDS at ginfo.grid-support.ac.uk
–Level 1: Resources capable of running the Grid Integration Test Script
–Level 2: Resources with one or more application packages pre-installed and capable of offering a service, with local accounting and tools for simple user management, discovery of applications and description of resources in addition to MDS
–Level 3: GT2 production platform with widely accessible application base, distributed user and resource management, auditing and accounting. Resources signing up to Level 3 will be monitored to establish their availability and the service level offered. (7 Centres of Excellence; Globus GT3 testbed – later talks and mini-workshop; JISC JCSR resources)
–Level 4: TBD, probably OGSA-based
(Rob Allan)

Slide 10: UK Grid: Key Components
–ETF Coordination: activities are coordinated through regular Access Grid meetings and the Web site
–Resources: the components of this Grid are the computing and data resources contributed by the UK e-Science Centres, linked through the SuperJanet4 backbone to regional networks
–Middleware: many of the infrastructure services available on this Grid are provided by Globus GT2 software
–Directory Services: a national Grid directory service using MDS links the information servers operated at each site and enables tasks to call on resources at any of the e-Science Centres
–Security and User Authentication: the Grid operates a security infrastructure based on X.509 certificates issued by the e-Science Certificate Authority at the UK Grid Support Centre at CCLRC
–Access Grid: on-line meeting facilities with dedicated rooms and multicast network access
(Rob Allan)
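Because GT2's MDS is an LDAP service (a GRIS/GIIS hierarchy, conventionally on port 2135 under an mds-vo-name=…,o=grid suffix), the national directory service described above could be browsed with standard LDAP tools. A hedged sketch: the hostname comes from slide 9, but the VO name in the base DN is an assumption and varied by deployment:

```
# Anonymous LDAP query against a GT2-era MDS index
# (base DN "mds-vo-name=UK,o=grid" is assumed, not confirmed)
ldapsearch -x -h ginfo.grid-support.ac.uk -p 2135 \
    -b "mds-vo-name=UK,o=grid" "(objectclass=*)"
```

Read queries like this were typically anonymous; user authentication on the Grid itself used short-lived GSI proxies derived from the eScience CA's X.509 certificates via grid-proxy-init.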

Slide 11: UK Grid: GT3 Testbeds
Two testbeds funded:
–Edinburgh / Newcastle / UCL / Imperial: OGSA-DAI; E-Materials e-Science pilot demonstrator; AstroGrid application demonstrator
–Portsmouth / Daresbury / Westminster / Manchester / Reading: tackle issues of inter-working between OGSI implementations; report on deployment and ease of use; HPCPortal services demonstrator; CCP application demonstrator; E-HTPX e-Science pilot demonstrator; InfoPortal demonstrator using OGSA-DAI
(Rob Allan)

Slide 12: UK Grid: Issues to be Tackled
A number of areas significant for a production Grid environment have hardly been tackled using GT2. Issues include:
–Grid information systems: service registration, discovery and definition of facilities (schema important)
–Security, in particular role-based authorisation and security of middleware
–Portable parallel job specifications
–Meta-scheduling, resource reservation and on-demand provision
–Linking and interacting with remote data sources
–Wide-area visualisation and computational steering
–Workflow composition and optimisation
–Distributed user, software and application management
–Data management and replication services
–Grid programming environments, PSEs and user interfaces
–Auditing, advertising and billing in a Grid-based resource market
–Semantic and autonomic tools; etc.
(Rob Allan)

Slide 13: Coordination & Management

Slide 14: SR2004 e-Science Soft-Landing
Key e-Science infrastructure components:
–Persistent National e-Science Research Grid
–Grid Operations Centre
–UK e-Science Middleware Infrastructure Repository
–National e-Science Institute (cf. Newton Institute)
–National Digital Curation Centre
–AccessGrid Support Service
–e-Science/Grid collaboratories Legal Service
–International Standards Activity
(Paul Jeffreys)

Slide 15: Post-2006 e-Science Soft-Landing
Components foreseen:
–Baseline OST Core Programme, collaborating with JISC/JCSR on long-term support issues
–OST should support the Persistent Research Grid and the e-Science Institute
–JISC should support the Grid Operations Centre and the AccessGrid Support Service
–OST and JISC should jointly support the Repository, Curation Centre, e-Science Legal Service and International Standards Activity
(Paul Jeffreys)

Slide 16: Intersecting UK Grid and EGEE/LCG/GridPP – Coordination and Management
GridPP:
–Very substantial resources (computing and infrastructure)
–Built-in international connections: EGEE, LCG
–Need to create a production service
Possible points of intersection of the UK programme with GridPP/EGEE/LCG:
–Resources: shared sites, shared personnel
–Grid operations and support
–Security (operational level: firewalls etc.)
–Change management of the software suite
–OMII
–CA
–Interface to Europe
–Training, dissemination
–Collection of requirements
(Paul Jeffreys)

Slide 17: Discussion

Slide 18: Discussion
–EDG not yet heterogeneous; OGSA; timelines; packaging issues
–Get people working together as an investment for the future
–Working parties: Security, Resource Brokering, Information Services, Workflow, Ops & Support
–Avoid being too ambitious or complex
–Need an over-arching strategic / roadmap view
Outcomes:
–Pursue grass-roots collaboration and the strategic view in parallel
–Set up a working party to recommend common elements of the roadmap
–Set up a small number of technical working parties; definition and details being taken forward by Andy Parker (report in preparation)

Slide 19: Status (diagram)
Two parallel stacks are shown: the L2 Grid (GT2 with its applications) and the PP Grid (GT2 with PP applications), resting on a common core layer built through shared engineering. Open questions (???) are marked against LCG-1/EGEE-0 (GT2) and OGSA.

Slide 20: Conclusions
–Possible route for technical collaboration: shared engineering (working together); common objective in OGSA (similar timescales)
–No overall co-ordination yet
–Much still to do to make it happen!

