The National Grid Service

The National Grid Service
Richard Hopkins (rph@nesc.ac.uk)

Acknowledgements
Some NGS slides are taken from talks by Stephen Pickles and Andy Richards; other slides are from Malcolm Atkinson on the UK e-Science programme.

Overview
- e-Infrastructure in the UK
- The National Grid Service

e-Science Centres in the UK (+ ~2 years)
Coordination & Leadership: NeSC & e-Science Directors' Forum.
[Map showing:]
- e-Science Centres: NeSC, Belfast, Cambridge, Cardiff, Daresbury, London, Manchester, Newcastle, Oxford, RAL, Southampton
- Centres of Excellence: Birmingham, Bristol, Lancaster, Leicester, Reading, York
- Other Centres: Access Grid Support Centre, Digital Curation Centre, National Grid Service, National Centre for e-Social Science, Text Mining, National Institute for Environmental e-Science, Open Middleware Infrastructure Institute

[Map slide, + 5 years: Edinburgh]

OMII-UK nodes (+ 3 years)
- Edinburgh: EPCC & National e-Science Centre
- Manchester: School of Computer Science, University of Manchester
- Southampton: School of Electronics and Computer Science, University of Southampton
OMII-UK is funded by EPSRC through the UK e-Science Core programme. It is a collaboration between the School of Electronics and Computer Science at the University of Southampton, the OGSA-DAI project at the National e-Science Centre and EPCC, and the myGrid project at the School of Computer Science at the University of Manchester.

The National Grid Service

The National Grid Service
- The core UK grid, resulting from the UK's e-Science programme.
- Grid: virtual computing across administrative domains.
- Production use of computational and data grid resources.
- Supported by JISC. October 2006: entered 2nd phase of funding (2 years).

[Diagram: NGS core nodes (RAL, Oxford, Leeds, Manchester) alongside HPCx, CSAR, universities (U of A, B, C, D), PSREs and a commercial provider, coordinated by GOSC.]
- NGS Core Nodes: host core services; coordinate integration, deployment and support; free-to-access resources for all VOs. Monitored interfaces + services.
- NGS Partner Sites: integrated with NGS; some services/resources available for all VOs. Monitored interfaces + services.
- NGS Affiliated Sites: integrated with NGS; support for some VOs. Monitored interfaces (+ security etc.).

National Grid Service and partners (+ 2.5 years)
[Map slide showing NGS sites, NGS Partners (including HPCx, Bristol, Cardiff, Lancaster, Westminster) and NGS Affiliates (including the National e-Science Centre), at Edinburgh, Lancaster, Manchester, York, Didcot (CCLRC Rutherford Appleton Laboratory), Bristol, Cardiff and Westminster.]

NGS Use
[Charts: files stored, CPU time by user, users by institution, and users by discipline (biology, large facilities, engineering + physical sciences, environmental science, particle physics + astronomy, medicine, humanities, sociology).]

Supporting Services
UK Grid Services:
- National Services: authentication, authorisation, certificate management, VO registration, security, network monitoring, help desk + support centre.
- NGS Services and interfaces: job submission, simple registry, data transfer, data access and integration, resource brokering, monitoring and accounting, grid management services, workflow, notification, operations centre.
- NGS core-node Services: CPU, (meta-)data storage, key software.
- Services coordinated with others (e.g. OMII, NeSC, EGEE, LCG): integration testing, compatibility & validation tests, user management, training.
Administration:
- Policies and acceptable use
- Service Level Agreements and Definitions
- Coordinate deployment and operations
- Operational security

Applications (2)
- Systems Biology
- Econometric analysis
- Neutron Scattering
- Climate modelling

Membership options
Two levels of membership (for sharing resources):
- Affiliates:
  - run a compatible stack, integrate support arrangements
  - adopt NGS security policies
  - all access to an affiliate's resources is up to the affiliate, except allowing NGS to insert probes for monitoring purposes
- Partners also:
  - make "significant resources" available to NGS users
  - enforce NGS acceptable use policies
  - provide accounting information
  - define commitments through formal Service Level Descriptions
  - influence NGS direction through representation on the NGS Technical Board

NGS Facilities
- Leeds and Oxford (core compute nodes): 64 dual-CPU Intel 3.06 GHz nodes (1 MB cache), each with 2 GB memory, 2 x 120 GB disk, Red Hat ES 3.0; Gigabit Myrinet connection; 2 TB data server.
- Manchester and Rutherford Appleton Laboratory (core data nodes): 20 dual-CPU nodes (as above); 18 TB SAN.
- Bristol: initially 20 x 2.3 GHz Athlon processors in 10 dual-CPU nodes.
- Cardiff: 1000 hrs/week on an SGI Origin system comprising 4 dual-CPU Origin 300 servers with a Myrinet interconnect.
- Lancaster: 8 Sun Blade 1000 execution nodes, each with dual UltraSPARC IIICu processors, connected via a Dell 1750 head node.
- Westminster: 32 Sun V60 compute nodes.
- HPCx: …
For more details: http://www.ngs.ac.uk/resources.html

NGS software
- Computation services based on Globus Toolkit 2 (see the job-submission sketch below):
  - use compute nodes for sequential or parallel jobs, primarily from batch queues
  - multiple jobs can be run concurrently (be reasonable!)
- Data services:
  - Storage Resource Broker: primarily for file storage and access; a virtual filesystem with replicated files
  - "OGSA-DAI" (Data Access and Integration): grid-enabling data (relational, XML; files)
  - NGS Oracle service
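A minimal sketch of how a sequential job might be submitted using the Globus Toolkit 2 command-line client, wrapped in Python purely for illustration. It assumes the GT2 client tools are installed, that a valid Grid proxy has already been created with grid-proxy-init, and that the gatekeeper contact string below is a hypothetical placeholder, not a real NGS endpoint.

```python
# Sketch: submit a simple job via the GT2 GRAM client (globus-job-run).
# Assumptions: Globus Toolkit 2 client tools on PATH, a valid proxy already
# created with grid-proxy-init, and a hypothetical gatekeeper address.
import subprocess

def run_grid_job(gatekeeper, executable, *args):
    """Run an executable on a GT2 gatekeeper and return its standard output."""
    cmd = ["globus-job-run", gatekeeper, executable, *args]
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

if __name__ == "__main__":
    # Example: ask a (hypothetical) core compute node which host ran the job.
    print(run_grid_job("compute.example-ngs.ac.uk/jobmanager-pbs", "/bin/hostname"))
```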

Gaining Access
- Free (at point of use) access to core and partner NGS nodes:
  - obtain a digital X.509 certificate from the UK e-Science CA or a recognised peer (see the certificate-inspection sketch below)
  - apply for access to the NGS
- National HPC services (HPCx):
  - must apply separately to research councils
  - digital certificate and conventional (username/password) access supported
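As a small illustration (not from the original slides), the sketch below shows one way to check the subject and validity period of an e-Science X.509 certificate before applying for access. It assumes the certificate has been exported in PEM format, that the file name is a hypothetical placeholder, and that a recent version of the third-party cryptography package is installed.

```python
# Sketch: inspect a PEM-encoded X.509 user certificate.
# Assumptions: `pip install cryptography`; the file name below is hypothetical.
from cryptography import x509

def describe_certificate(path):
    """Print the subject, issuer and validity period of a certificate."""
    with open(path, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())
    print("Subject:", cert.subject.rfc4514_string())
    print("Issuer: ", cert.issuer.rfc4514_string())
    print("Valid:  ", cert.not_valid_before, "to", cert.not_valid_after)

if __name__ == "__main__":
    describe_certificate("usercert.pem")  # hypothetical path to the exported certificate
```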

Key facts
- Production: deploying middleware after selection and testing
- Evolving: middleware; number of sites
- Organisation:
  - VO management for collaborative projects
  - policy negotiation: sites, VOs
  - international commitment
- Gathering users' requirements for the National Grid Service

Web Sites
NGS:
- http://www.ngs.ac.uk
- To see what's happening: http://ganglia.ngs.rl.ac.uk/
- New wiki service: http://wiki.ngs.ac.uk
- Training events: http://www.nesc.ac.uk/training
HPCx:
- http://www.hpcx.ac.uk

Summary
- NGS is a production service:
  - therefore it cannot include the latest research prototypes!
  - formalised commitments: service level agreements
  - core sites provide computation and data services
- NGS is evolving:
  - OMII, EGEE and the Globus Alliance all have middleware under assessment for the NGS
  - selected, deployed middleware currently provides "low-level" tools
  - new deployments will follow
  - new sites and resources are being added