Intersecting UK Grid & EGEE/LCG/GridPP Activities
Applications & Requirements
Mark Hayes, Technical Director, CeSC

What does the eScience Grid currently look like?
- Globus v2 installed at all regional eScience centres
- Heterogeneous resources (Linux clusters, SGI O2/3000, SMP Sun machines)
- eScience certificate authority
- Operational & network monitoring
- Virtual organisation management
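In practice, using this Grid means talking directly to the Globus v2 gatekeepers. As a rough illustration, the sketch below obtains a proxy credential and runs a command on a remote resource, assuming the GT2 client tools are on PATH and an eScience CA certificate is already set up; the gatekeeper contact string is a hypothetical placeholder.

import subprocess

# Hypothetical gatekeeper contact string (host plus jobmanager).
GATEKEEPER = "grid-compute.example.ac.uk/jobmanager-pbs"

# Create a short-lived proxy from the user's eScience certificate
# (prompts interactively for the private key passphrase).
subprocess.run(["grid-proxy-init"], check=True)

# Run a simple command on the remote resource via the gatekeeper.
result = subprocess.run(
    ["globus-job-run", GATEKEEPER, "/bin/hostname"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)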

Applications on the eScience Grid
- E-Minerals: Monte Carlo simulations of radiation damage to crystal structures (Condor-G & home-grown shell scripts)
- Geodise: genetic algorithm for optimisation of satellite truss design (Java CoG plugins in Matlab)
- GENIE: ocean-atmosphere modelling (flocked Condor pools)
- Other tools in use: HPCPortal, InfoPortal, Nimrod/G, SunDCG
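For a flavour of the E-Minerals "Condor-G & home-grown shell scripts" approach, here is a minimal sketch (wrapped in Python rather than shell): it writes a classic Condor-G submit description and hands it to condor_submit. The gatekeeper contact string and executable name are hypothetical.

import subprocess
import textwrap

# Classic Condor-G submit description: the globus universe routes the
# job through a remote Globus gatekeeper. Host and script are made up.
submit_description = textwrap.dedent("""\
    universe        = globus
    globusscheduler = grid-compute.example.ac.uk/jobmanager-pbs
    executable      = radiation_damage_mc.sh
    arguments       = --seed 42
    output          = mc_42.out
    error           = mc_42.err
    log             = mc_42.log
    queue
""")

with open("mc.submit", "w") as f:
    f.write(submit_description)

# Condor-G now manages the job's lifecycle on the remote resource.
subprocess.run(["condor_submit", "mc.submit"], check=True)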

GridPP & EDG
- Dedicated Linux clusters running EDG middleware ("globus++")
- Very homogeneous resources
- Resource broker (based on Condor)
- LDAP-based VO management
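On the EDG side, the equivalent submission step goes through the resource broker: the user describes the job in JDL and the broker matches it against published resource information. A minimal sketch, assuming the EDG User Interface tools are installed and a valid proxy exists; the VO name and the requirement expression are illustrative.

import subprocess
import textwrap

# A minimal JDL description; the Requirements expression is matched by
# the resource broker against the information system.
jdl = textwrap.dedent("""\
    Executable    = "/bin/hostname";
    StdOutput     = "job.out";
    StdError      = "job.err";
    OutputSandbox = {"job.out", "job.err"};
    Requirements  = other.GlueCEPolicyMaxCPUTime > 60;
""")

with open("hello.jdl", "w") as f:
    f.write(jdl)

# Submit through the broker; the VO name here is illustrative.
subprocess.run(["edg-job-submit", "--vo", "atlas", "hello.jdl"], check=True)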

Particle Physics Applications
- e.g. ATLAS data challenge: Monte Carlo event generation, tracking & reconstruction
- Large FORTRAN/C++ codes, optimised for Linux (and packaged as RPMs)
- Runs scripted using EDG job submit tools
- GUIs under development (e.g. GANGA)
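The scripted runs mentioned above amount to looping over edg-job-submit, polling edg-job-status, and collecting output sandboxes with edg-job-get-output. A rough sketch under those assumptions; the flags follow the edg-job-* conventions of the time, and a real production script would generate a distinct JDL (e.g. a different random seed) per job rather than reusing one file.

import subprocess
import time

N_JOBS = 10  # illustrative batch size

# Submit N jobs; each returned job ID is appended to jobids.txt.
# "jobs.jdl" is assumed to exist (e.g. the JDL sketched earlier).
for _ in range(N_JOBS):
    subprocess.run(
        ["edg-job-submit", "--noint", "-o", "jobids.txt", "jobs.jdl"],
        check=True,
    )

# Crude completion check: poll until no job reports an active state,
# then fetch every job's output sandbox.
ACTIVE_STATES = ("Submitted", "Waiting", "Ready", "Scheduled", "Running")
while True:
    status = subprocess.run(
        ["edg-job-status", "--noint", "-i", "jobids.txt"],
        capture_output=True, text=True,
    )
    if not any(state in status.stdout for state in ACTIVE_STATES):
        break
    time.sleep(60)

subprocess.run(["edg-job-get-output", "--noint", "-i", "jobids.txt"], check=True)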

Questions
- Could I run an ATLAS data challenge on the eScience Grid?
- Could I run an e-Minerals Monte Carlo run on the EDG?
- The EDG has a few standard user interfaces (EDG UI, GANGA, etc.); the eScience Grid relies on lower-level Globus protocols only.
- Do we rely on leaving Globus gatekeepers/MDS/etc. exposed, or develop common user interfaces?