UK Campus Grid Special Interest Group
Dr. David Wallom, University of Oxford

Agenda
– Who we are
– Aims
– Outputs
– Events

Who
Chaired by:
– David Wallom, Oxford
– Clare Gryce, UCL
Members so far…
– Reading, Cardiff, Edinburgh, Bristol, Cambridge, KCL, Liverpool, Lancaster, Newcastle, OMII, Birmingham (in no particular order!)

Aims of the SIG
– A group for the promotion of Campus Grid facilities throughout the UK academic community
– Solidify best practice throughout the UK community
– Lay the groundwork for larger collaboration, including regional grids
– Representation of Campus Grid providers at the e-Science Directors meeting

The story so far
1st meeting in Oxford, 16th April
– Presentations from each campus grid represented
– Discussion of the top issues facing campus grid providers: fEC & business models, user interfaces, middleware and software evolution
2nd meeting over Access Grid, 28th June
– Presentation on the OMII roadmap
3rd meeting at the UK e-Science AHM, 10th September
– Presentation of the first draft of the OMII requirements document

OMII Requirements Document
Purpose: give OMII a concrete list of developments needed by our community.
Separated into five areas:
– Applications
– Security
– Accounting
– Monitoring
– Storage

Storage, Jon Blower
– Support reading from and writing to SRB systems, as these appear to be fairly widely used.
– Some groups appear to be moving towards WebDAV-based systems, so it is recommended that OMII consider supporting this.
– SFTP/SCP is also commonly used; this may already be supported, but if it isn't, adding it would be very useful. It is available to most users on campus (since they can access files on any server they can SSH to), so it is a good fallback even where other options are available (see the sketch below).
– Filesystem-based solutions (e.g. AFS, Parrot) are independent of the middleware and so are probably not important for OMII.
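A minimal sketch of the SFTP fallback mentioned above: staging an input file to a remote resource and pulling a result back over SSH. This assumes the paramiko library; the hostname, username, key file and paths are placeholders, not part of any OMII component.

```python
# Illustrative only: stage an input file to a remote resource over SFTP and
# retrieve a result file. Hostname, username, key and paths are placeholders.
import os
import paramiko

def stage_files(host, user, key_file, local_in, remote_in, remote_out, local_out):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, key_filename=os.path.expanduser(key_file))
    try:
        sftp = client.open_sftp()
        sftp.put(local_in, remote_in)    # upload the input data set
        sftp.get(remote_out, local_out)  # pull the results back
        sftp.close()
    finally:
        client.close()

# Example call (placeholder values):
# stage_files("compute.example.ac.uk", "griduser", "~/.ssh/id_rsa",
#             "input.dat", "scratch/input.dat",
#             "scratch/output.dat", "output.dat")
```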

Accounting, David Wallom
Driven by:
– fEC
– Tracing of donated resources to show fair-share usage, especially as we move towards inter-institutional sharing and regional grids
It is important that accounting be auditable and standards based.
OMII should develop a lightweight RUS (Resource Usage Service) server and client set that:
– Is platform independent and requires no external software (e.g. Tomcat)
– Covers more than just processing usage: storage and services as well (an illustrative client sketch follows below)
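To illustrate what "lightweight" could mean in practice, here is a hypothetical sketch of a RUS-style client that builds a simple usage record and POSTs it to an accounting endpoint. The endpoint URL, field names and record layout are assumptions (only loosely inspired by the OGF Usage Record format) and do not describe any existing OMII interface.

```python
# Hypothetical sketch of a lightweight resource-usage client.
# The endpoint URL and record layout are illustrative assumptions only.
import json
import urllib.request

def report_usage(endpoint, user, project, wall_seconds, cpu_seconds, storage_gb):
    record = {
        "user": user,
        "project": project,              # maps to the fEC cost centre
        "wall_duration_s": wall_seconds,
        "cpu_duration_s": cpu_seconds,
        "storage_gb": storage_gb,        # more than just processing usage
    }
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example call against a hypothetical campus accounting service:
# report_usage("https://accounting.example.ac.uk/rus", "dwallom", "oxgrid",
#              3600, 3400, 2.5)
```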

Security, Ian Bland
Users do not like using grid certificates:
– They prefer the same authentication methods they use for more traditional computing/IT systems
– Mainly SSO systems such as Kerberos, but increasingly Shibboleth
OMII should actively support the addition of Shibboleth support to its middleware.
Support for authentication via SSO systems needs to be matched with support for authorisation via directory services, notably LDAP (see the sketch below).
Integrity issues for users are that data sets remain consistent and uncorrupted and that code executes in a predictable manner; the need for a trusted platform extends to one that is robust and cannot be compromised. This is something OMII should consider, possibly in the context of a trusted computing platform.
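To make the SSO-plus-directory-services point concrete, the following is a hedged sketch of authorising an already-authenticated user by checking group membership in LDAP, using the ldap3 library. The server name, base DN, group DN and anonymous bind are hypothetical choices for illustration, not a prescribed OMII mechanism.

```python
# Illustrative only: authorise an already-authenticated (SSO) user by checking
# LDAP group membership. Server name, base DN and group DN are hypothetical.
from ldap3 import ALL, Connection, Server

def user_in_group(username, group_dn,
                  ldap_host="ldap.example.ac.uk",
                  base_dn="ou=people,dc=example,dc=ac,dc=uk"):
    server = Server(ldap_host, get_info=ALL)
    # Anonymous bind for illustration; a real deployment would bind with a
    # service account (and escape the username before building the filter).
    with Connection(server, auto_bind=True) as conn:
        conn.search(base_dn, f"(uid={username})", attributes=["memberOf"])
        if not conn.entries:
            return False
        entry = conn.entries[0]
        if "memberOf" not in entry.entry_attributes:
            return False
        return group_dn in entry.memberOf.values

# e.g. user_in_group("abcd1234",
#                    "cn=campus-grid-users,ou=groups,dc=example,dc=ac,dc=uk")
```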

Monitoring, Mark Calleja
GridSAM should provide a means of monitoring running jobs through connectivity to the underlying scheduler/system:
– By providing a standard view of the contents of the scratch space on a remote resource
– By being able to pipe back stdout/stderr on request (a sketch of this kind of plumbing follows below)
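A small hedged sketch, not GridSAM code itself, of the plumbing the two bullets imply: listing a job's scratch directory and tailing its stdout file on the remote resource over SSH so the output can be piped back on request. The host, scratch layout and job identifier are placeholder assumptions.

```python
# Illustrative only: view a running job's scratch space and stream the tail of
# its stdout from a remote resource. Host, job id and paths are placeholders;
# this is not GridSAM's own implementation.
import subprocess

def list_scratch(host, scratch_dir, job_id):
    # A "standard view" of the scratch space contents for one job
    result = subprocess.run(
        ["ssh", host, "ls", "-l", f"{scratch_dir}/{job_id}"],
        capture_output=True, text=True, check=True)
    return result.stdout

def tail_job_stdout(host, scratch_dir, job_id, lines=50):
    # Pipe the last N lines of stdout back to the caller on request
    remote_path = f"{scratch_dir}/{job_id}/stdout"
    result = subprocess.run(
        ["ssh", host, "tail", "-n", str(lines), remote_path],
        capture_output=True, text=True, check=True)
    return result.stdout

# e.g. print(tail_job_stdout("compute.example.ac.uk", "/scratch/jobs", "12345"))
```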

HTC Workshop
National e-Science Centre, Edinburgh, 26th–30th November
Including:
– Condor
– United Devices (possibly!)
– Digipede
Attendance from users, vendors and IT managers.
Practical demonstrations and exercises, as well as talks on research that has been made possible using HTC.

Future meeting topics
Visualization
– Talk by the RAVE team
Networks
– Connecting high performance networks
– Firewall settings
– Network performance
Access methods
– Portals: NGS Application Repository
– Application Hosting Environment: standard command line applications; API
Single sign-on and common authentication for disparate components
Energy usage and tools…

Communication
– Website
– Maillist

Grid Data Storage UK