GridPP e-Science Presentation
Robin Middleton (RAL/PPD), DG Co-ordination, Rome, 23rd June 2001


Slide 2: GridPP Collaboration Meeting
1st GridPP Collaboration Meeting - Coseners House - May 24/

Slide 3: GridPP History
Collaboration formed by all UK PP experimental groups in 1999 to submit a £5.9M JIF bid for a prototype Tier-1 centre at RAL (later superseded)
Added some Tier-2 support to become part of the PPARC LTSR, "The LHC Computing Challenge" - input to SR2000
GridPP formed in Dec 2000, now including CERN, CLRC and the UK PP theory groups
From Jan 2001, handling PPARC's commitment to the UK contribution to the EU DataGrid

Slide 4: Proposal Executive Summary
£40M 3-year programme
LHC Computing Challenge = Grid technology
Five components:
–Foundation
–Production
–Middleware
–Exploitation
–Value-added Exploitation
Emphasis on Grid Services and Core Middleware
Integrated with EU DataGrid, PPDG and GriPhyN
Facilities at CERN, RAL and up to four UK Tier-2 sites
Centres = Dissemination
LHC developments integrated into current programme (BaBar, CDF, D0, ...)
Robust Management Structure
Deliverables in March 2002, 2003, 2004

Slide 5: GridPP Component Model
[Component-model diagram: Foundation, Production, Middleware, Exploitation, Value-added; cost figures £21.0M and £25.9M]
Component 1: Foundation - the key infrastructure at CERN and within the UK
Component 2: Production - built on Foundation to provide an environment for experiments to use
Component 3: Middleware - connecting the Foundation and the Production environments to create a functional Grid
Component 4: Exploitation - the applications necessary to deliver Grid-based Particle Physics
Component 5: Value-added - full exploitation of Grid potential for Particle Physics

Slide 6: Major Deliverables
Prototype I - March 2002
Performance and scalability testing of components. Testing of the job scheduling and data replication software from the first DataGrid release.
Prototype II - March 2003
Prototyping of the integrated local computing fabric, with emphasis on scaling, reliability and resilience to errors. Performance testing of LHC applications. Distributed HEP and other science application models using the second DataGrid release.
Prototype III - March 2004
Full-scale testing of the LHC computing model with fabric management and Grid management software for Tier-0 and Tier-1 centres, with some Tier-2 components.

Slide 7: First Year Deliverables
Each workgroup has detailed deliverables. These will be refined each year and build on the successes of the previous year. The global objectives for the first year are:
Deliver EU DataGrid middleware (first prototype [M9])
Running experiments to integrate their data management systems into existing facilities (e.g. mass storage)
Assessment of technological and sociological Grid analysis needs
Experiments refine data models for analyses
Develop tools to allow bulk data transfer
Assess and implement metadata definitions
Develop relationships across multi-Tier structures and countries
Integrate Monte Carlo production tools
Provide experimental software installation kits
LHC experiments start Data Challenges
Feedback assessment of middleware tools

Slide 8: GridPP Organisation
Software development organised around a number of Workgroups
Hardware development organised around a number of Regional Centres
Likely Tier-2 Regional Centres
Focus for Dissemination and Collaboration with other disciplines and Industry
Clear mapping onto Core Regional e-Science Centres

Slide 9: GridPP Workgroups
Technical work is broken down into several workgroups, with broad overlap with the EU DataGrid:
A - Workload Management: provision of software that schedules application processing requests amongst resources (a toy illustration follows this list)
B - Information Services and Data Management: provision of software tools to provide flexible, transparent and reliable access to the data
C - Monitoring Services: all aspects of monitoring Grid services
D - Fabric Management and Mass Storage: integration of heterogeneous resources into a common Grid framework
E - Security: security mechanisms, from Certification Authorities to low-level components
F - Networking: network fabric provision through to integration of network services into middleware
G - Prototype Grid: implementation of a UK Grid prototype tying together new and existing facilities
H - Software Support: provide services to enable the development, testing and deployment of middleware and applications at institutes
I - Experimental Objectives: responsible for ensuring development of GridPP is driven by the needs of UK PP experiments
J - Dissemination: ensure good dissemination of developments arising from GridPP into other communities and vice versa
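As a rough illustration of what Workgroup A's remit means in practice, the Python sketch below matches a job's requirements against a list of advertised site resources. It is purely illustrative: the class names, fields and figures are invented for this example and are not taken from GridPP or DataGrid middleware.

```python
# Toy "resource broker" sketch: match a job's requirements against sites.
# Illustrative only; names, fields and numbers are invented, not GridPP/DataGrid code.
from dataclasses import dataclass, field
from typing import List, Optional, Set

@dataclass
class Site:
    name: str
    free_cpus: int
    free_disk_gb: int
    installed_sw: Set[str] = field(default_factory=set)

@dataclass
class Job:
    required_cpus: int
    required_disk_gb: int
    required_sw: str

def match(job: Job, sites: List[Site]) -> Optional[Site]:
    """Return the first site satisfying the job's CPU, disk and software needs."""
    for site in sites:
        if (site.free_cpus >= job.required_cpus
                and site.free_disk_gb >= job.required_disk_gb
                and job.required_sw in site.installed_sw):
            return site
    return None

sites = [
    Site("Tier1-RAL", free_cpus=300, free_disk_gb=10_000, installed_sw={"babar", "atlas"}),
    Site("Tier2-A", free_cpus=128, free_disk_gb=2_000, installed_sw={"lhcb", "atlas"}),
]
job = Job(required_cpus=50, required_disk_gb=100, required_sw="lhcb")
print(match(job, sites))   # -> the Tier2-A entry
```

A real broker also has to weigh data location, queue state and site policy; the sketch only shows the matchmaking core.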

Slide 10: GridPP and CERN
UK involvement through GridPP will boost CERN investment in key areas:
–Fabric management software
–Grid security
–Grid data management
–Networking
–Adaptation of physics applications
–Computer Centre fabric (Tier-0)
For the UK to exploit the LHC to the full, substantial investment at CERN is required to support LHC computing.

Slide 11: GridPP Management Structure

Slide 12: Management Status
The Project Management Board (PMB): the executive board, chaired by the Project Leader - Project Leader being appointed; Shadow Board in operation
The Collaboration Board (CB): the governing body of the project, consisting of the Group Leaders of all institutes - established, and Collaboration Board Chair elected
The Technical Board (TB): the main working forum, chaired by the Technical Board Chair - interim task force in place
The Experiments Board (EB): the forum for experimental input into the project - nominations from experiments underway
The Peer Review Selection Committee (PRSC): pending approval of project
The Dissemination Board (DB): pending approval of project

Slide 13: GridPP is not just LHC

Slide 14: Tier1&2 Plans
RAL already has 300 CPUs, 10 TB of disk and an STK tape silo which can hold 330 TB
Install significant capacity at RAL this year to meet BaBar Tier-A Centre requirements
Integrate with worldwide BaBar work
Integrate with the DataGrid testbed
Integrate Tier-1 and Tier-2 within GridPP
Upgrade Tier-2 centres through SRIF (UK university funding programme)

Slide 15: Tier1 Integrated Resources

Slide 16: Liverpool
MAP: CPUs + several TB of disk
–delivered simulation for LHCb and others for several years
Upgrades of CPUs and storage planned for 2001 and 2002
–currently adding Globus
–develop to allow analysis work also

Slide 17: Imperial College
Currently:
–180 CPUs
–4 TB disk
2002:
–adding new cluster in 2002
–shared with Computational Engineering
–850 nodes
–20 TB disk
–24 TB tape
CMS, BaBar, D0

Slide 18: Lancaster farm (not fully installed)
[Cluster diagram: 196 worker CPUs behind switches and controller nodes; 500 GB bulkservers; 100 MB/s and 1000 MB/s (fibre) Ethernet; tape library capacity ~30 TB at k£11 per 30 TB; finalising installation of the mass storage system, ~2 months]

Slide 19: Lancaster
Currently D0:
–analysis data from FNAL for the UK
–simulation
Future:
–upgrades planned
–Tier-2 RC
–ATLAS-specific

Slide 20: ScotGrid
Tendering now:
–128 CPUs at Glasgow
–5 TB datastore + server at Edinburgh
ATLAS/LHCb
Plans for future upgrades to 2006
Linked with UK Grid National Centre

Slide 21: Network
UK Academic Network: SuperJANET entered phase 4; GB backbone, December 2000 - April; Mbit to RAL, April 2001
Most MANs have plans for 2.5GB on their backbones
Peering with GEANT planned at 2.5GB
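For scale, a back-of-envelope estimate (mine, not from the slide) of what a 2.5 Gbit/s link such as the planned GEANT peering means for bulk data movement, assuming a hypothetical 1 TB dataset moved at full line rate:

```python
# Back-of-envelope estimate, not from the original slides: time to move a
# dataset over a 2.5 Gbit/s link, ignoring protocol overhead and contention.

link_gbit_per_s = 2.5                       # planned backbone / GEANT peering rate
dataset_tb = 1.0                            # hypothetical 1 TB dataset

bytes_to_move = dataset_tb * 1e12           # 1 TB = 10^12 bytes
link_bytes_per_s = link_gbit_per_s * 1e9 / 8

seconds = bytes_to_move / link_bytes_per_s
print(f"{seconds / 60:.0f} minutes")        # ~53 minutes at full line rate
```

Real transfers see protocol overhead, shared links and disk limits, so sustained rates are lower; that is part of why the bulk data transfer tools in the first-year deliverables matter.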

Slide 22: Wider UK Grid
Prof Tony Hey leading the Core Grid Programme
UK National Grid:
–National Centre
–9 Regional Centres
Computer Science lead includes many sites with PP links
–Grid Support Centre (CLRC)
–Grid Starter Kit version 1 based on Globus, Condor
Common software
e-Science Institute
Grid Network Team
Strong Industrial Links
All Research Areas have their own e-Science plans

Slide 23: Summary
UK has plans for a national Grid for particle physics
–to deliver the computing for several virtual organisations (LHC and non-LHC)
Collaboration established, proposal approved, plan in place
Will deliver:
–UK commitment to DataGrid
–prototype Tier-1 and Tier-2
–UK commitment to US experiments
Work closely with other disciplines
Have been working towards this project for ~2 years, building up hardware
Funds installation and operation of experimental testbeds, key infrastructure and generic middleware, and making application code Grid-aware

Slide 24: The End

Slide 25: UK Strengths
Wish to build on UK strengths:
–Information Services
–Networking
–Security
–Mass Storage
UK major Grid leadership roles:
–Lead DataGrid Architecture Task Force (Steve Fisher)
–Lead DataGrid WP3 Information Services (Robin Middleton)
–Lead DataGrid WP5 Mass Storage (John Gordon)
–Strong networking role in WP7 (Peter Clarke, Robin Tasker)
–ATLAS Software Coordinator (Norman McCubbin)
–LHCb Grid Coordinator (Frank Harris)
Strong UK collaboration with Globus:
–Globus people gave a 2-day tutorial at RAL to the PP community
–Carl Kesselman attended a UK Grid technical meeting
–3 UK people visited Globus at Argonne
Natural UK collaboration with US PPDG and GriPhyN

Slide 26: BaBar
8 x 80-CPU farms; 10 sites with 12 TB disk and Suns
simulation
data mirroring from SLAC - Kanga, Objectivity
data movement and mirroring across UK
data location discovery across UK - mySQL
remote job submission - Globus and PBS
common usernames across UK - GSI gridmapfiles (sketched below)
Find data - submit job to data - register output
BaBar planning a distributed computing model
–Tier-A centres
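The "common usernames across UK - GSI gridmapfiles" item refers to the Globus grid-mapfile, which maps a certificate Distinguished Name (DN) to a local account. The Python sketch below shows the idea only; the DNs and account names are invented for illustration, and the parsing is simplified (real entries can map a DN to several accounts).

```python
# Minimal sketch of what a GSI grid-mapfile does: map a certificate
# Distinguished Name (DN) to a local username. The entries below are
# invented examples, not real GridPP DNs.

import shlex

EXAMPLE_GRID_MAPFILE = '''
"/O=Grid/O=UKHEP/CN=Jane Bloggs" babar001
"/O=Grid/O=UKHEP/CN=John Smith"  babar002
'''

def parse_grid_mapfile(text):
    """Return {DN: local_user} from grid-mapfile style lines."""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        dn, user = shlex.split(line)     # shlex keeps the quoted DN as one token
        mapping[dn] = user
    return mapping

users = parse_grid_mapfile(EXAMPLE_GRID_MAPFILE)
print(users["/O=Grid/O=UKHEP/CN=Jane Bloggs"])   # -> babar001
```

With a shared mapping like this at each site, the same certificate can be recognised consistently across the UK, which is what "common usernames" on the slide is getting at.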

Slide 27: CDF
Similar model to BaBar, with disk and CPU resources at RAL and universities plus a farm for simulation
Development of Grid access to CDF databases
Data replication from FNAL to the UK and around the UK
Data location discovery through metadata
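"Data location discovery through metadata" amounts to asking a catalogue which sites hold replicas of a given dataset. A minimal, purely illustrative Python sketch follows; the dataset names, catalogue layout and site list are invented for the example.

```python
# Illustrative only: the kind of metadata-to-location lookup the CDF and
# BaBar slides describe ("data location discovery"). The catalogue schema
# and dataset names here are invented for the example.

CATALOGUE = {
    # dataset name          -> sites holding a replica
    "cdf/stream-a/run12345": ["FNAL", "RAL"],
    "cdf/stream-b/run12400": ["FNAL", "RAL", "Glasgow"],
}

def find_replicas(dataset, prefer=("RAL",)):
    """Return replica sites for a dataset, preferred (e.g. UK) sites first."""
    sites = CATALOGUE.get(dataset, [])
    return sorted(sites, key=lambda s: s not in prefer)

print(find_replicas("cdf/stream-b/run12400"))   # -> ['RAL', 'FNAL', 'Glasgow']
```

The mySQL-based data location discovery on the BaBar slide is the same idea with the catalogue held in a database rather than an in-memory dictionary.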

Slide 28: D0
Large data centre at Lancaster
Ship data from FNAL to the UK; simulation in the UK, and ship data back to FNAL
Gridify SAM access to data
–data at FNAL and Lancaster