The National Grid Service Mike Mineter NeSC-TOE

Goal of talk
Revision of concepts of the NGS
Current developments
Overview of the course content

The National Grid Service
The core UK grid, resulting from the UK's e-Science programme.
Production use of computational and data grid resources.
Supported by JISC.
NGS-2 began in Q3 2006.

NGS Compute Facilities
Leeds and Oxford (core compute nodes): 64 dual-CPU Intel 3.06 GHz nodes (1 MB cache). Each node: 2 GB memory, 2 x 120 GB disk, Red Hat ES 3.0. Gigabit Myrinet connection. 2 TB data server.
Manchester and Rutherford Appleton Laboratory (core data nodes): 20 dual-CPU nodes (as above). 18 TB SAN.
Bristol: initially 20 2.3 GHz Athlon processors in 10 dual-CPU nodes.
Cardiff: 1000 hrs/week on an SGI Origin system comprising 4 dual-CPU Origin 300 servers with a Myrinet interconnect.
Lancaster: 8 Sun Blade 1000 execution nodes, each with dual UltraSPARC IIICu processors, connected via a Dell 1750 head node. Upgrade in near future!
Westminster: 32 Sun V60 compute nodes.
HPCx, …
For more details: http://www.ngs.ac.uk/resources.html

NGS Vision
(Diagram: NGS core nodes at RAL, Oxford, Leeds and Manchester, surrounded by partner and affiliated sites including HPCx, HECToR, universities, PSREs and commercial providers.)
NGS Core Nodes: host core services; coordinate integration, deployment and support; free-to-access resources for all VOs. Monitored interfaces and services.
NGS Partner Sites: integrated with the NGS, with some services/resources available for all VOs. Monitored interfaces and services.
NGS Affiliated Sites: integrated with the NGS, with support for some VOs. Monitored interfaces (plus security etc.).
General principle: establish the core and grow it, in compute, data and operational services.

NGS software
Computation services based on the Globus Toolkit:
Use compute nodes for sequential or parallel jobs, primarily from batch queues.
Can run multiple jobs concurrently (be reasonable!).
Data services:
SRB vaults: a virtual filesystem.
OGSA-DAI: allows projects, data services, … to expose their data to NGS users (and more).
GridFTP for fast data transfer.
Also an Oracle database.
(Data services are covered in Friday's talks; a job-submission and data-transfer sketch follows below.)
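As a concrete illustration of the computation and data services above, here is a minimal sketch, assuming the Globus Toolkit pre-WS clients (globus-job-submit, globus-url-copy) are installed and that a valid grid proxy has already been created with grid-proxy-init. The gatekeeper contact ngs.example.ac.uk/jobmanager-pbs and the file paths are hypothetical placeholders, not real NGS endpoints.

```python
# Minimal sketch: submitting a batch job and moving data on a Globus-based grid.
# Assumes the pre-WS Globus clients are on PATH and a valid proxy exists
# (grid-proxy-init).  Host names and paths below are hypothetical placeholders.
import subprocess

GATEKEEPER = "ngs.example.ac.uk/jobmanager-pbs"   # hypothetical core compute node


def submit_job(executable, *args):
    """Submit a batch job via pre-WS GRAM and return the job contact string."""
    cmd = ["globus-job-submit", GATEKEEPER, executable, *args]
    result = subprocess.run(cmd, check=True, capture_output=True, text=True)
    return result.stdout.strip()


def copy_with_gridftp(src_url, dst_url):
    """Transfer a file with GridFTP (client-server or third-party transfer)."""
    subprocess.run(["globus-url-copy", src_url, dst_url], check=True)


if __name__ == "__main__":
    contact = submit_job("/bin/hostname")
    print("Job contact:", contact)
    copy_with_gridftp("file:///tmp/input.dat",
                      "gsiftp://ngs.example.ac.uk/home/user/input.dat")
```

The contact string returned by globus-job-submit can then be checked with globus-job-status; on the real NGS the gatekeeper address would be taken from the published list of core nodes rather than the placeholder used here.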

NGS Software - 2
Middleware recently deployed, developed by partners:
Portal v2
GridSAM: alternative job submission and monitoring
Application Hosting Environment (AHE)
P-GRADE portal and GEMLCA
Being deployed:
VOMS support
WS-GRAM: GT4 job submission (see the sketch after this slide)
Resource Broker
Under development:
Shibboleth integration
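Since WS-GRAM is the GT4 successor to the pre-WS submission shown earlier, a minimal sketch of how a user might exercise it is given below. It assumes the GT4 client globusrun-ws is installed and a valid proxy exists; the factory host is a hypothetical placeholder, and option details can vary between GT4 releases.

```python
# Minimal sketch of a WS-GRAM (GT4) submission, the newer interface listed
# above as "being deployed".  Assumes globusrun-ws is on PATH and a valid
# proxy exists; the factory host is a hypothetical placeholder.
import subprocess

FACTORY_HOST = "ngs.example.ac.uk"   # hypothetical container hosting the GRAM factory


def submit_ws_gram(command, *args):
    """Submit a simple command-line job to WS-GRAM and stream its output back."""
    cmd = ["globusrun-ws", "-submit",
           "-s",                      # stream stdout/stderr back to this terminal
           "-F", FACTORY_HOST,        # factory (service container) to contact
           "-c", command, *args]      # simple command-line job description
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    submit_ws_gram("/bin/hostname")
```

For more complex jobs, WS-GRAM also accepts an XML job description file rather than the command-line shorthand used here.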

NGS development
The next three slides are a personal view, not (yet) reviewed by the Development Board.
NGS-2 has a goal of 3000 users.
Embracing current and new VOs is vital to achieving this.

Grids: Where are we now?
(Diagram: the maturity of networks, the web and grids, from research and pilot projects through early adopters to routine production and unimagined possibilities, across disciplines from the sciences and engineering to the arts, humanities and e-social science. Types of use range from service-oriented, workflow and "legacy" data to high-throughput, new data.)
The picture is simplistic but seeks to highlight that:
1. The more compute-intensive, IT-aware disciplines have established user communities, while awareness of the possible contributions to the arts, humanities and social sciences is growing, in part through pilot projects.
2. There is a long way to go before the use of grids is as easy as that of networks; the maturity of the applications and the grid middleware will bring that about. But grids remain, at their foundations, based on agreements between people (resource providers and VOs), so there is always effort in setting that up.
3. Early production grids are now established (internationally, e.g. EGEE): "unimagined" by many, but imagined by some for decades.

NGS growth - 1
Current users are in general seeking access to compute power; the NGS has had a focus on gaining users to build the number of grid-aware researchers.
Now there are initiatives to expand the use cases along two of the axes in which grids can be located (computation, data and collaboration):
Data: the ability to expose data to grid use (national, project, institute, …).
Collaborative research: project support (including VOMS) and the portal.

NGS growth - 2
(Diagram: expanding from current use cases along the computation and data axes, towards workflow and service orchestration, trivial parallelism and high throughput, and VO support.)

This course
Also: Condor and Condor-G, for campus grid – NGS interoperability (see the Condor-G sketch below).
Middleware recently deployed:
Portal v2
GridSAM: alternative job submission and monitoring
Developed by partners: Application Hosting Environment (AHE); P-GRADE portal and GEMLCA
Being deployed:
VOMS support
WS-GRAM: GT4 job submission
Resource Broker
Under development:
Shibboleth integration
(Course coverage: practicals, plus a talk on Friday afternoon.)
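The campus grid – NGS interoperability mentioned above can be illustrated with a hedged sketch: Condor-G lets a campus Condor pool route a grid-universe job to a Globus gatekeeper instead of a local worker node. The submit-description keywords are standard Condor-G usage, but the gatekeeper contact ngs.example.ac.uk/jobmanager-pbs is a hypothetical placeholder, not a real NGS endpoint.

```python
# Minimal sketch: bridging a campus Condor pool to an NGS-style resource
# with Condor-G.  A grid-universe job is routed to a Globus gatekeeper
# rather than a local campus node.  Assumes condor_submit is installed and
# a valid grid proxy exists; the gatekeeper contact is a hypothetical
# placeholder.
import subprocess
import tempfile

SUBMIT_DESCRIPTION = """\
universe      = grid
grid_resource = gt2 ngs.example.ac.uk/jobmanager-pbs
executable    = /bin/hostname
output        = job.out
error         = job.err
log           = job.log
queue
"""


def submit_condor_g_job():
    """Write a Condor-G submit description to a file and hand it to condor_submit."""
    with tempfile.NamedTemporaryFile("w", suffix=".sub", delete=False) as f:
        f.write(SUBMIT_DESCRIPTION)
        path = f.name
    subprocess.run(["condor_submit", path], check=True)


if __name__ == "__main__":
    submit_condor_g_job()
```

The same submit description could instead point at a local campus scheduler, which is what makes Condor-G a convenient bridge between campus grids and the NGS; the course practicals cover the details.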