
1 UK e-Infrastructure
Mike Mineter, mjm@nesc.ac.uk

2 Acknowledgements
For slides and information:
NGS and GOSC – Stephen Pickles, Technical Director of GOSC
OMII – Steven Newhouse
JISC – Ann Borda, Sarah Porter, Sara Hassan, Shirley Wood
Data centres – Peter Burnhill
Integrative Biology

3 Overview
The UK e-Science programme
The National Grid Service
GOSC – Grid Operations Support Centre
UK e-Science Certification Authority (CA)
NGS middleware
OMII-UK – Open Middleware Infrastructure Institute
JISC – Joint Information Systems Committee

4 UK e-Science Budget (2001–2006)
Total: £213M, plus £100M via JISC, plus industrial contributions of £25M.
EPSRC breakdown covers staff costs only – grid resources (computers & network) are funded separately.
Main purpose of this slide: to set context – there is massive investment in e-Science in the UK.
Source: Science Budget 2003/4 – 2005/6, DTI(OST)

5 The e-Science Centres
National e-Science Centre (NeSC), e-Science Institute, Digital Curation Centre, National Centre for e-Social Science, Grid Operations Support Centre, Open Middleware Infrastructure Institute (OMII-UK), National Institute for Environmental e-Science, CeSC (Cambridge), Arts and Humanities e-Science Support Centre, Globus Alliance, EGEE.
This shows the National Centre, 8 regional centres and 2 laboratories in blue – the original set-up at the start of UK e-Science in August 2001. NeSC is jointly run by the Universities of Edinburgh and Glasgow. In 2003 several smaller centres were added (vermilion).
(1st call-out) The e-Science Institute is run by the National e-Science Centre. It runs a programme of events and hosts visiting international researchers. It was established in 2001. The Open Middleware Infrastructure Institute was established in 2004 to provide support and direction for Grid middleware developed in the UK; it is based at the University of Southampton. The Grid Operations Centre will be established in 2004.
(2nd call-out) The Digital Curation Centre was established in 2004 by the Universities of Edinburgh and Glasgow, the UK Online Library Network at the University of Bath, and the Central Laboratories at Daresbury and Rutherford. Its job is to provide advice on curating scientific data and on preserving digital media, formats, and access software.
(3rd call-out) Edinburgh is one of the four founders of the Globus Alliance (September 2003), which will take responsibility for the future of the Globus Toolkit. The other founders are the University of Chicago (Argonne National Lab), the University of Southern California (Information Sciences Institute) and the PDC, Stockholm, Sweden.
(4th call-out) The EU EGEE project (Enabling Grids for E-sciencE) is establishing a common framework for Grids in Europe. The UK e-Science programme has several connections with EGEE; NeSC leads the training component for the whole of Europe.

6 The National Grid Service

7 The National Grid Service
The core UK grid, resulting from the UK's e-Science programme.
Production use of computational and data grid resources.
Supported by JISC and run by the Grid Operations Support Centre (GOSC).

8 The National Grid Service
Launched April 2004; full production from September 2004.

9 NGS structure (diagram)
Core nodes (RAL, Oxford, Leeds, Manchester), the national HPC services HPCx and CSAR, and partner and affiliate sites (universities, PSREs, commercial providers), coordinated by GOSC.
NGS Core Nodes: host core services, coordinate integration, deployment and support; free-to-access resources for all VOs; monitored interfaces and services.
NGS Partner Sites: integrated with NGS; some services/resources available for all VOs; monitored interfaces and services.
NGS Affiliated Sites: integrated with NGS; support for some VOs; monitored interfaces (+ security etc.).

10 Applications: 1
Molecular Dynamics (diagram: substrate complex → product complex, ATP → ADP), Lattice Boltzmann, text mining.
This slide is simply a breakdown of the top users (in terms of jobs submitted) from the first site we reviewed. It is interesting mainly to show the diversity of users.

11 Applications
Systems biology, econometric analysis, neutron scattering, climate modelling.

12 Applications…
Applications (incomplete list!): nano-particles, protein folding, ab-initio protein structure prediction, radiation transport (radiotherapy), IXI (medical imaging), biological membranes, micromagnetics, archaeology, text mining, lattice QCD (analysis), astronomy (VO services).
Many, but not all, applications cover traditional computational sciences; both user-supplied and pre-installed software.
Common features: distributed data and/or collaborators; not just pre-existing large collaborations – new users are explicitly encouraged.

13 User Registration (Process)
User applies (via website) → application QC → application submitted to peer review panel → approved?
If approved: user notified; user added to the NGS VO (includes SRB account); user added to the NGS-USER and NGS-ANNOUNCE mailing lists.
If rejected: user notified.
Schematic overview of the user registration process, including the peer review now in place. The process can take up to 7 days (maximum); if no reject statements have been received from anyone on the peer review panel in that time, the user is accepted, granted an account, and subscribed to the user-related NGS mailing lists. (Note: an SRB account is also automatically generated at this stage, but until we know the licensing issue we are not making this publicly known!)

14 NGS Overview: Organisational view
Management: GOSC Board (strategic direction); Technical Board (technical coordination and policy).
Grid Operations Support Centre: manages the NGS; operates the UK CA plus over 30 RAs; operates the central helpdesk; defines the minimum software stack; sets policies and procedures; manages and monitors partners.

15 NGS Overview: User view
Resources: 4 core clusters; the UK's national HPC services; a range of partner contributions (clusters, shared memory, portals, data, …); partners and affiliates.
Access: UK academic researchers; free at the point of use; lightweight peer review for limited "free" resources; partners can provide larger commitments as required.
Support: central help desk.

16 How Do I Get A Certificate?
You need a valid UK certificate before applying for an NGS account: apply at https://ca.grid-support.ac.uk, the UK Certificate Authority.
You will probably need to provide non-electronic proof of identity to your local representative of the CA – for example, show your passport.
Always keep this certificate secure. Do NOT leave it on any NGS core node! Do store it on a USB drive that you keep safe.
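Not from the slide, but to make "keep it secure" concrete: a minimal Python sketch of checking and protecting a certificate on your own machine, assuming standard OpenSSL tooling and the conventional ~/.globus file layout (the file names are the usual Globus convention, not something the slide mandates).

```python
import os
import stat
import subprocess
from pathlib import Path

# Conventional Globus layout for the certificate and private key --
# an assumption; follow your CA's own instructions if they differ.
globus_dir = Path.home() / ".globus"
cert = globus_dir / "usercert.pem"
key = globus_dir / "userkey.pem"

# Print the certificate's subject and expiry date with standard OpenSSL.
subprocess.run(
    ["openssl", "x509", "-in", str(cert), "-noout", "-subject", "-enddate"],
    check=True,
)

# Keep the private key owner-readable only (mode 0400); grid tools
# commonly refuse to use a key with looser permissions.
os.chmod(key, stat.S_IRUSR)
print(f"{key}: permissions tightened to 0400")
```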

17 Projects and VOs
Just need access to compute and data resources for users in your project? Currently we deal mainly with applications from individuals; project-based applications are handled case-by-case as procedures are established – for these, talk to GOSC!
Want to host your data on the NGS? Consider SRB, Oracle, or OGSA-DAI: NGS maintains the infrastructure; you populate and manage the data.
Want to use NGS resources to provision services or portals for a community of users? Want researchers to access your data?

18 NGS Users

19 Overview
The UK e-Science programme
The National Grid Service
GOSC – Grid Operations Support Centre
UK e-Science Certification Authority (CA)
NGS middleware
OMII-UK – Open Middleware Infrastructure Institute
JISC – Joint Information Systems Committee

20 NGS software
Computation services based on Globus Toolkit 2: use compute nodes for sequential or parallel jobs, primarily from batch queues; multiple jobs can run concurrently.
Data services:
Storage Resource Broker – primarily for file storage and access; a virtual filesystem with replicated files.
"OGSA-DAI" – Data Access and Integration; grid-enabling databases (relational, XML).
NGS Oracle service.
GridFTP for efficient file transfer.
Authorisation and authentication using GSI.
Portal to support collaboration and ease of use.
GSISSH for Windows users.
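As an illustration of the GT2-era workflow behind these services, here is a hedged sketch driving the standard Globus Toolkit 2 command-line clients from Python: grid-proxy-init for GSI authentication, globus-job-run for job execution, and globus-url-copy for a GridFTP transfer. The host name and file paths are placeholders, and the Globus clients must already be installed.

```python
import subprocess

# Hypothetical NGS head node -- substitute a real host from your VO.
HEAD_NODE = "grid-compute.example.ac.uk"

def run(cmd):
    """Echo a command, then run it, failing loudly on error."""
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Create a short-lived GSI proxy credential from your certificate.
run(["grid-proxy-init", "-valid", "12:00"])

# 2. Run a simple job via the head node's GT2 gatekeeper (GRAM).
run(["globus-job-run", HEAD_NODE, "/bin/hostname"])

# 3. Stage an input file to the node over GridFTP.
run([
    "globus-url-copy",
    "file:///tmp/input.dat",
    f"gsiftp://{HEAD_NODE}/tmp/input.dat",
])
```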

21 Data services on NGS
Simple data files – Storage Resource Broker (version 3.3.1 since December 2005), middleware supporting: replica files; logical filenames; a catalogue mapping logical names to physical storage devices/files; virtual filesystems with POSIX-like I/O.
Structured data – RDBMS and XML databases, often pre-existing, which we do NOT want to replicate. These require extendable middleware tools that move computation near to the data and underpin integration and federation: OGSA-DAI (Data Access and Integration).
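To make the catalogue idea concrete, here is a toy sketch of the logical-to-physical mapping an SRB catalogue performs. It is purely illustrative – the names and URLs are invented, and this is not SRB's actual interface.

```python
# Illustrative only: a toy replica catalogue, not the SRB API.
# Logical filenames map to one or more physical replicas.
catalogue = {
    "/ngs/home/alice/results.dat": [
        "gsiftp://srb-store1.example.ac.uk/vol1/f8812",
        "gsiftp://srb-store2.example.ac.uk/vol7/f0231",
    ],
}

def resolve(logical_name, preferred_site=None):
    """Map a logical name to a physical replica, as a catalogue does."""
    replicas = catalogue[logical_name]
    if preferred_site:
        for url in replicas:
            if preferred_site in url:   # prefer a replica close to the job
                return url
    return replicas[0]                  # else the first registered replica

print(resolve("/ngs/home/alice/results.dat", preferred_site="srb-store2"))
```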

22 OGSA-DAI In One Slide
An extensible framework for data access and integration: expose heterogeneous data resources to a grid through web services.
Interact with data resources: queries and updates; data transformation/compression; data delivery.
Customise for your project using: additional Activities; Client Toolkit APIs; data resource handlers.
A base for higher-level services: federation, mining, visualisation, …
Slide from Neil Chue Hong
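The activity-pipeline idea can be sketched schematically. The real OGSA-DAI client toolkit is Java and its activity names differ; the Python below simply mirrors the query → transform → deliver flow, with an in-memory SQLite database standing in for a grid-exposed data resource.

```python
import gzip
import sqlite3

# Conceptual sketch of an OGSA-DAI-style pipeline: a request chains
# activities (query -> compress -> deliver) over a data resource.
# Function names and flow are illustrative, not the real API.

def sql_query(conn, sql):
    """Query activity: run SQL against the exposed data resource."""
    return conn.execute(sql).fetchall()

def compress(rows):
    """Transformation activity: serialise and gzip the result set."""
    text = "\n".join(",".join(map(str, r)) for r in rows)
    return gzip.compress(text.encode())

def deliver(blob, path):
    """Delivery activity: hand the data to its destination."""
    with open(path, "wb") as f:
        f.write(blob)

# A stand-in data resource (in-memory SQLite instead of, say, Oracle).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (id INTEGER, value REAL)")
conn.executemany("INSERT INTO samples VALUES (?, ?)", [(1, 0.5), (2, 1.25)])

deliver(compress(sql_query(conn, "SELECT * FROM samples")), "results.csv.gz")
```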

23 NGS: Managing middleware evolution
Important to coordinate and integrate with deployment and operations work in EGEE and similar projects. The Engineering Task Force (ETF) makes recommendations for deployment on NGS.
(Diagram) Prototypes and specifications from EGEE, UK campus and other grids, and other software sources feed the ETF for deployment, testing and advice; software with proven capability and realistic deployment experience passes to NGS operations ('gold' services); feedback and future requirements flow back.

24 Enhancing usability – 1
NGS has deployed low-level tools: these are reliable and give a production service, but the user interacts with resources at a low level, and GOSC had no adequate alternatives. The need for higher abstractions and tools is evident.
Example: P-GRADE and GEMLCA, developed at SZTAKI (Hungary) and the University of Westminster, are made available to NGS users.

25 P-GRADE Portal and GEMLCA
Grid Execution Management for Legacy Code Applications
Tamas Kiss, Gabor Terstyanszky – Centre for Parallel Computing, University of Westminster
Peter Kacsuk – SZTAKI, Hungary

26 The P-GRADE solution
User-friendly web interface; portal server at UoW; jobs reach the UK NGS sites (Manchester, Leeds, Oxford, Rutherford) via GT2.

27 Ultra-short range weather forecast (Hungarian Meteorology Service)
Forecasting dangerous weather situations (storms, fog, etc.) is a crucial task in the protection of life and property.
Processed information: surface-level measurements, high-altitude measurements, radar, satellite, lightning, results of previously computed models.
Requirements: execution time < 10 min; high resolution (1 km).

28 What is a P-GRADE Portal workflow?
A directed acyclic graph (DAG) where:
Nodes represent jobs (executable batch programs).
Ports represent the input/output files the jobs expect/produce.
Arcs represent file transfer between the jobs.
Semantics of the workflow: a job can be executed if all of its input files are available (local input files on the portal server; remote input files at Grid storage service providers).
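These semantics are simple enough to state in code. A minimal sketch (job and file names invented) that executes such a DAG: repeatedly pick any job whose input files are all available, run it, and mark its output files as available for downstream jobs.

```python
# Sketch of the workflow semantics above: jobs are nodes, arcs carry
# files, and a job may run once all of its input files are available.
workflow = {
    # job: (input files it expects, output files it produces)
    "preprocess": ([], ["grid.dat"]),
    "simulate":   (["grid.dat"], ["raw.out"]),
    "visualise":  (["raw.out"], ["forecast.png"]),
}

available = set()        # files currently present (local or remote)
done = set()

while len(done) < len(workflow):
    ready = [
        job for job, (inputs, _) in workflow.items()
        if job not in done and all(f in available for f in inputs)
    ]
    if not ready:
        raise RuntimeError("cycle or missing input -- not a valid DAG")
    for job in ready:
        inputs, outputs = workflow[job]
        print(f"running {job} (inputs: {inputs or 'none'})")
        available.update(outputs)   # arcs: outputs feed downstream jobs
        done.add(job)
```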

29 Multi-Grid P-GRADE Portal
Different jobs of a workflow can be executed in different grids: the portal can be connected to multiple grids, e.g. the UK NGS and an EGEE grid such as VOCE (sites such as London, Rome, Athens).

30 P-GRADE portal in a nutshell
Proxy management; grid resource management; workflow creation; mapping of jobs to grid resources; workflow management and execution visualisation.

31 GEMLCA & P-GRADE Portal Integration
GEMLCA objectives:
To deploy legacy code applications as Grid services without reengineering the original code and with minimal user effort.
To create Grid workflows in which components can also be legacy code applications.
To make these functions available from a Grid portal.

32 GEMLCA repository (diagram)
A central repository of legacy codes (legacy code 1 … legacy code n) is hosted by a 3rd-party service provider (UoW) on a GEMLCA resource (GT4 + GEMLCA classes). The user defines a workflow in the P-GRADE Portal; job submission goes to NGS sites 1…n (GT2).

33 Enhancing usability – 2
Some VOs have distinct user and application-developer communities – e.g. bioinformatics, medical imaging. The need is for use of grids without individual certificates, which are hard to manage for non-specialists and can make the grid itself vulnerable.
BRIDGES and GOSC worked out a solution in which the VO logs who did what and when, but users simply access a portal.

34

35 Security in BRIDGES – summary
The end user authenticates at the BRIDGES web portal with a username and password only.
The portal gets the user's authorisations from a NeSC machine running the PERMIS authorisation service (GT3.3).
The NeSC grid server, holding host credentials, makes a host proxy, authenticates with the NGS and submits the job; the job request is passed on securely, with the username, to the NGS clusters (Leeds, Oxford, RAL, Manchester).
Slide by Micha Bayer, NeSC
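A schematic of this pattern in Python, with every name hypothetical: the portal checks the user's authorisation (a stand-in for the PERMIS query), writes a who-did-what-when audit record for the VO, and only then submits using the host credential rather than a per-user certificate.

```python
import subprocess
from datetime import datetime, timezone

# Schematic only: the policy table, host name and log path are all
# hypothetical stand-ins for the portal, PERMIS and NGS components.
AUDIT_LOG = "bridges-audit.log"

def authorised(username, action):
    """Stand-in for the portal's PERMIS authorisation query."""
    policy = {"alice": {"submit_job"}}      # toy policy table
    return action in policy.get(username, set())

def submit_on_behalf(username, command):
    """Portal-side submission: log who did what when, then submit."""
    if not authorised(username, "submit_job"):
        raise PermissionError(f"{username} may not submit jobs")
    with open(AUDIT_LOG, "a") as log:       # the VO's audit trail
        stamp = datetime.now(timezone.utc).isoformat()
        log.write(f"{stamp} {username} {command}\n")
    # Submission is backed by the server's host proxy, not the user's
    # own certificate -- the user never handles grid credentials.
    subprocess.run(["globus-job-run", "grid.example.ac.uk", command],
                   check=True)

submit_on_behalf("alice", "/bin/hostname")
```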

36 Commercial Break
Attend an NGS Induction course run by NeSC – 10 people are enough to bring us back here!

37 Migration paths
Desktop administrators: a migration path from provision within a campus to provision as a national service.
Condor users: a migration path from simpler distributed computing to full multi-site grid use.
Campus Grid → National Grid Service.
Based on a slide by Jonathan Giddy, Welsh e-Science Centre

38 Overview
The UK e-Science programme
The National Grid Service
GOSC – Grid Operations Support Centre
UK e-Science Certification Authority (CA)
NGS middleware
OMII-UK – Open Middleware Infrastructure Institute
JISC – Joint Information Systems Committee

39 OMII-UK: Open Middleware Infrastructure Institute

40 Building e-Research
(Diagram) Research → pilot projects → early adopters → routine production → unimagined possibilities.
Researchers are not funded to provide production-quality software for others to use. OMII-UK exists to help bridge this gap!

41 Open Middleware Infrastructure Institute
Aim: to be a leading provider of reliable, interoperable, open-source Grid middleware components, services and tools to support advanced Grid-enabled solutions in academia and industry.
Formed at the University of Southampton (2004); focus on an easy-to-install e-Infrastructure solution utilising existing software and standards.
Expanded with new partners in 2006: the OGSA-DAI team at Edinburgh and the myGrid team at Manchester.

42 Activity
By providing a software repository of Grid components and tools from e-Science projects.
By re-engineering software, hardening it, and providing support for components sourced from the community.
By a managed programme to contract the development of "missing" software components necessary in grid middleware.
By providing an integrated grid middleware release of the sourced software components.

43 The Managed Programme
Integrated with the OMII distribution:
OGSA-DAI (data access service)
GridSAM (job submission & monitoring service)
Grimoires (registry service based on UDDI)
GeodiseLab (Matlab & Jython environments)
FINS (notification services using WS-Eventing)
Delivering into the repository:
BPEL (workflow service)
MANGO (managing workflows with BPEL)
FIRMS (reliable messaging)
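GridSAM accepts jobs described in JSDL (the Job Submission Description Language). As a hedged sketch, the snippet below builds a minimal JSDL document with Python's standard library; the namespaces are the JSDL 1.0 ones to the best of my knowledge, and the executable, argument and output names are placeholders – check everything against your GridSAM release.

```python
import xml.etree.ElementTree as ET

# JSDL 1.0 namespaces (believed correct; verify against your release).
JSDL = "http://schemas.ggf.org/jsdl/2005/11/jsdl"
POSIX = "http://schemas.ggf.org/jsdl/2005/11/jsdl-posix"

ET.register_namespace("jsdl", JSDL)
ET.register_namespace("jsdl-posix", POSIX)

# Minimal job: run /bin/echo with one argument, capture stdout.
job = ET.Element(f"{{{JSDL}}}JobDefinition")
desc = ET.SubElement(job, f"{{{JSDL}}}JobDescription")
app = ET.SubElement(desc, f"{{{JSDL}}}Application")
posix = ET.SubElement(app, f"{{{POSIX}}}POSIXApplication")
ET.SubElement(posix, f"{{{POSIX}}}Executable").text = "/bin/echo"
ET.SubElement(posix, f"{{{POSIX}}}Argument").text = "hello-from-gridsam"
ET.SubElement(posix, f"{{{POSIX}}}Output").text = "stdout.txt"

print(ET.tostring(job, encoding="unicode"))
```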

44 Overview
The UK e-Science programme
The National Grid Service
GOSC – Grid Operations Support Centre
OMII – Open Middleware Infrastructure Institute
JISC – Joint Information Systems Committee: services (NGS, networking, data centres) and programmes

45 SuperJANET
JISC provides the budget to UKERNA; JISC guides UKERNA through the JCN; UKERNA provides and manages JANET.
The present incarnation is SuperJANET4 (SJ4); SJ5 is en route.

46 UKLight
Funded by HEFCE (the Higher Education Funding Council for England, through SRIF, the Scientific Research Investment Fund) and managed by UKERNA, who also manage the operation and development of the JANET network.
The UK's first national switched-circuit optical network; complements the SuperJanet4 production network.
A national dark-fibre facility for use by the photonics research community.
10 Gbit/s backbone to selected points in the UK; channels can be multiplexed, e.g. 4 x 2.5 Gbit/s. Currently a collection of 1 Gbit/s links from the London hub to selected points in the UK, enabling many projects to be supported simultaneously.
Connects to global optical networks via 10 Gbit/s links to Chicago (StarLight) and Amsterdam (NetherLight).
Provides participating research centres throughout the UK with the infrastructure necessary to develop optical networking, and dedicated point-to-point paths/channels for projects. Projects can undertake potentially disruptive work – e.g. applications with large data rates (1 Gbit/s and above), or work with new equipment and protocols – without impact on the production IP service.
ESLEA is the first widely scoped project to exploit UKLight for a range of scientific applications.

47

48 What are (JISC) Data Centres?
Examples: EDINA (multimedia, geospatial), MIMAS (census).
What do they do? Content delivery with authentication and authorisation (currently Athens); licensing authority.
OGSA-DAI territory: access databases as services on the National Grid Service.
Amongst the challenges: Athens / Shibboleth / X.509 integration.

49 JISC Programmes
Research is only the start! Results need to be curated and fed back into new research: if you can find data, you can build knowledge – so digital libraries are needed.
(Figure derived from JISC: e-Learning, digital libraries and e-Research all rest on the e-Infrastructure.)

50 JISC Programmes
British Library/JISC Online Audio Usability Evaluation Workshop; Core Middleware Infrastructure; Core Middleware: Technology Development Programme; Digital Libraries in the Classroom Programme; Digital Preservation and Records Management; Digital Repositories Programme; Digitisation Programme; Distributed e-Learning Strand; e-Learning Programme; e-Learning Tools Projects – Phase 2; Exchange for Learning (X4L) Phase 2; Exchange for Learning (X4L) Programme; Focus on Access to Institutional Resources (FAIR) Programme; JISC Framework Programme; JISC-SURF Partnering on Copyright; Network Development Programme; Portals Programme; Presentation Programme; Semantic Grid and Autonomic Computing Programme; Shared Services Programme; Supporting Digital Preservation and Asset Management in Institutions; Virtual Research Environments Programme.
JISC programmes – thanks to Ann Borda for this list!
See also: support for e-Research (Building Collaborative Environments for eResearch; the UK e-Science Programme); the Semantic Grid (Building the Semantic Grid); the Core Middleware Technology and Core Middleware Infrastructure Programmes; Virtual Research Environments; the JISC e-Frameworks Programme and initiative; the Common Information Environment (and an interview with its Director, Paul Miller); Digital Repositories; Digital Preservation and Records Management.

51 VRE development for Integrative Biology
"Whereas the existing IB Grid services have focussed on supporting the core IB experimental workflow – moving, processing and visualising data on the Grid – the IBVRE will support the research process in its widest sense, i.e. activities such as identifying research areas and funding sources, building and managing projects/consortia, real-time communication, disseminating results, and provision of training to new researchers entering the field (learning and teaching support tools)."

52

53 e-Frameworks Programme
Everyone (e-Learning, research, core middleware, …) is developing services. The Frameworks programme is an attempt to engender a coherent approach across all JISC programmes where possible.
(Diagram) Users and user agents (tools, applications, portlets, rich clients, etc.) sit above domain services (learning services, research services, administration services, etc.), which in turn sit above common services (resource services, security services, messaging services, etc.).

54 UK e-Infrastructure
The UK e-Science programme
The National Grid Service
GOSC – Grid Operations Support Centre
OMII – Open Middleware Infrastructure Institute
JISC – Joint Information Systems Committee

55 Web Sites
To see what's happening: NGS, GOSC (Grid Operations Support Centre), OMII, CSAR, HPCx, the National e-Science Centre, UK training events.

56 JANET: http://www.ja.net/
UKLight: http://www.uklight.ac.uk
ESLEA

