GridPP1 Overview
David Britton, Imperial College, 28/7/2003

Project Overview

GridPP in Context (not to scale!): GridPP sits between the Core e-Science Programme and CERN/LCG, and spans the Institutes, Tier-1/A, Tier-2, Middleware, Experiments, Applications Development, Applications Integration, the Grid Support Centre and EGEE.

The Project Map

LHC Computing Grid (LCG)
- Helped create the LCG project
- Components shared by the LHC experiments, e.g. POOL
- Prototyping the CERN Tier-0/1 centre (scalability issue)
- Requirements; monitoring; recommendations
- Deploying and operating the LHC Computing Grid
FTE figures: GridPP 24.0 of a 91.1 total; 6.2* (*LCG management FTE in 2002)

LHC Computing Grid: LCG-1 Release
Milestone M1.1: First Global Service (LCG-1) - Initial Availability. Target date: July 2003.
This comprises the construction and commissioning of the first LHC Computing service suitable for physics usage. The service must reliably offer 24x7 availability to all four LHC experiments and include some ten Regional Centres from Europe, North America and Asia. The milestone includes delivery of the associated Technical Design, containing a description of the architecture and functionality and quantified technical specifications of performance (capacity, throughput, reliability, availability). It must also include middleware specifications, agreed as a common toolkit by Europe and the US. The service must prove functional, providing a batch service for event production and analysis of the simulated data set. For the milestone to be met, operation must be sustained reliably during a 7-day period; stress tests and user productions will be executed, with a failure rate below 1%.

LHC Computing Grid Service
- Certification and distribution process established
- Middleware package with components from the European DataGrid (EDG) and the US (Globus, Condor, PPDG, GriPhyN): the Virtual Data Toolkit
- Agreement reached on principles for registration and security
- RAL to provide the initial grid operations centre
- FZK to operate the call centre
- Initial service being deployed now to 10 centres in the US, Europe and Asia: Academia Sinica Taipei, BNL, CERN, CNAF, FNAL, FZK, IN2P3 Lyon, Moscow State Univ., RAL, Univ. Tokyo
- Expand to other centres as soon as the service is stable

LCG Service: Target for 2004
Establish the LHC Grid as a service for data challenges and computing-model evaluation:
- the basic infrastructure for distribution, coordination and operation
- building collaboration between the people who manage and operate the Regional Centres
- integrating the majority of the resources in Regional Centres needed for the LHC data challenges of 2004
- reliability: this is the priority, and essential for providing measurable value for experiment production teams and attracting end-users to the grid
Resources committed for 1Q04 (CPU kSI2K / Disk TB / Support FTE / Tape TB) by CERN, the Czech Republic, France, Germany, Holland, Italy, Japan, Poland, Russia, Taiwan, Spain, Sweden, Switzerland, the UK and the USA.

European DataGrid (EDG): EDG 2.0, EDG 2.1

European DataGrid: UK Roles (FTE)
- Quality Assurance Representative: 0.5
- Deputy Leader (ITeam): 2.0
- Group Leader, Deputy Leader, ITeam, ATF, QA EU: 2.0
- CA Group Manager: 3.0
- Deputy Leader, Security Group Leader, ATF: 3.7*
*Includes Security Group Leader: 4.0 FTE

European DataGrid – WP1: Workload Management
Deploy and support Resource Brokers at IC

European DataGrid – WP1: Workload Management
Job checkpointing in EDG 2.0: a running job saves its checkpoint state (state.saveState()) to the Logging & Bookkeeping (LB) server, from which the state can later be retrieved.
- The LB server is also used (even in release 1) as the repository of job status information
- It has already proved to be robust and reliable
- The load can be distributed between multiple LB servers to address scalability problems
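The checkpointing pattern above can be sketched as follows. This is a minimal illustration, not the real EDG API: only saveState() appears in the slide, and every other class and method name here is hypothetical.

```python
# Hypothetical sketch of the EDG job-checkpointing pattern: a job periodically
# saves its state to the Logging & Bookkeeping (LB) server and, after a
# restart, resumes from the last saved checkpoint.

class JobState:
    """Holds a job's resumable state as simple key/value pairs."""
    def __init__(self):
        self.data = {}

    def save_int(self, key, value):
        self.data[key] = value

    def get_int(self, key, default=0):
        return self.data.get(key, default)


class LoggingBookkeepingServer:
    """Stand-in for the LB server that stores saved checkpoint states."""
    def __init__(self):
        self.checkpoints = {}

    def save_state(self, job_id, state):
        # Each save overwrites the previous checkpoint for that job.
        self.checkpoints[job_id] = dict(state.data)

    def retrieve_state(self, job_id):
        state = JobState()
        state.data = dict(self.checkpoints.get(job_id, {}))
        return state


def run_job(job_id, server, n_events):
    """Process events, checkpointing after each; a restarted job resumes
    from the last checkpoint rather than reprocessing from event zero."""
    state = server.retrieve_state(job_id)
    first = state.get_int("next_event")
    for event in range(first, n_events):
        # ... process one event ...
        state.save_int("next_event", event + 1)
        server.save_state(job_id, state)  # analogous to state.saveState()
    return state.get_int("next_event")
```

Because each save overwrites the previous checkpoint, a restarted job repeats at most the single event that was in flight, which is what makes long production jobs safe against node failures.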

European DataGrid – WP2: Data Management
UK contributions; Replica Manager (RM) in EDG 2.0. Components: Storage Element, Replica Manager, Replica Location Service, Replica Optimization Service, Replica Metadata Catalog, SE Monitor, Network Monitor, Information Service, Resource Broker, User Interface or Worker Node, Virtual Organization Membership Service.
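The division of labour between the Replica Location Service and the Replica Manager can be sketched as below. This is an illustrative toy under assumed names, not the EDG interfaces: the RLS maps a logical file name to its physical copies, and the Replica Manager picks one copy for a job.

```python
# Hypothetical sketch of replica management: the Replica Location Service
# (RLS) records where copies of each logical file live; the Replica Manager
# selects a copy, preferring one local to the job.

class ReplicaLocationService:
    """Maps logical file names (LFNs) to (site, physical file name) pairs."""
    def __init__(self):
        self.replicas = {}  # lfn -> list of (site, pfn)

    def register(self, lfn, site, pfn):
        self.replicas.setdefault(lfn, []).append((site, pfn))

    def lookup(self, lfn):
        return self.replicas.get(lfn, [])


class ReplicaManager:
    """Chooses the 'best' replica; here simply one at the preferred site,
    falling back to any available copy (the real service also uses network
    and SE monitoring to optimise the choice)."""
    def __init__(self, rls):
        self.rls = rls

    def best_replica(self, lfn, preferred_site):
        copies = self.rls.lookup(lfn)
        for site, pfn in copies:
            if site == preferred_site:
                return pfn
        return copies[0][1] if copies else None
```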

European DataGrid – WP3: Information and Monitoring Services
UK product: R-GMA in EDG 2.0. R-GMA publishes information using the GLUE schema in push mode, with updates every 30 s, across >70 (simulated) sites. Components include Stream and Latest Producers, Consumers (CE, SE, site info), an Archiver (Latest Producer), LDAP info-providers (GIN/GOUT, LDAP server) and an RDBMS backend.
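The producer/consumer model behind R-GMA can be sketched in miniature. All names here are illustrative, and the predicate stands in for the SQL-style queries the real system uses; the real R-GMA API is a web-service interface, not these classes.

```python
# Toy sketch of the R-GMA relational monitoring model: producers publish
# tuples, consumers query them. A Stream Producer keeps history; a Latest
# Producer retains only the newest tuple per primary key.

class StreamProducer:
    """Publishes every tuple it receives (history retained)."""
    def __init__(self):
        self.tuples = []

    def insert(self, row):
        self.tuples.append(row)


class LatestProducer:
    """Retains only the most recent tuple for each primary-key value."""
    def __init__(self, key):
        self.key = key
        self.latest = {}

    def insert(self, row):
        self.latest[row[self.key]] = row


class Consumer:
    """Queries a producer with a predicate, standing in for an SQL SELECT."""
    def __init__(self, producer):
        self.producer = producer

    def select(self, predicate):
        rows = getattr(self.producer, "tuples", None)
        if rows is None:
            rows = list(self.producer.latest.values())
        return [r for r in rows if predicate(r)]
```

The distinction matters operationally: a site-status consumer wants only the latest tuple per site (Latest Producer), while an archiver consuming the stream keeps the full history in an RDBMS.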

European DataGrid – WP4: Fabric Management
UK product: the LCFG configuration software from the University of Edinburgh was used from month 12 of the EDG project onwards. A newer version, LCFGng, is in EDG 2.0 and in LCG-1.

European DataGrid – WP5: Mass Storage Management
UK product: the Storage Element (SE) in EDG 2.0. The design of the SE follows a layered model with a central core handling all paths between client and MSS. The core is flexible and extensible, making it easy to support new protocols, features and MSS. Layers include client applications and APIs (Java client, C client), an Apache/Tomcat web service with AXIS, SSL socket and HTTP libraries, and the SE core.
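The layered design can be illustrated with a toy core that routes every request to a pluggable storage backend. This is a sketch of the design principle only; the class and method names are hypothetical, not the EDG SE code.

```python
# Sketch of the SE's layered model: all client requests pass through one
# central core, and new mass-storage systems are supported by registering
# a new backend rather than changing the core or the client protocols.

class MassStorageBackend:
    """Interface each MSS implementation must provide."""
    def get(self, path):
        raise NotImplementedError

    def put(self, path, data):
        raise NotImplementedError


class DiskBackend(MassStorageBackend):
    """Trivial in-memory 'disk' backend for illustration."""
    def __init__(self):
        self.files = {}

    def get(self, path):
        return self.files[path]

    def put(self, path, data):
        self.files[path] = data


class StorageElementCore:
    """Central core: every client path goes through here, so adding a new
    MSS type (tape, HSM, ...) only means registering another backend."""
    def __init__(self):
        self.backends = {}

    def register(self, scheme, backend):
        self.backends[scheme] = backend

    def handle(self, op, scheme, path, data=None):
        backend = self.backends[scheme]
        if op == "put":
            backend.put(path, data)
            return None
        return backend.get(path)
```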

European DataGrid – WP6: Testbed
EDG application testbed: more than 40 sites, more than 1000 CPUs, 5 terabytes of storage. The testbed was successfully demonstrated during the 2nd EU review in February 2003. Large UK participation.

European DataGrid – WP7: Network Services

Applications

Applications: GANGA
A common interface to the Grid for ATLAS and LHCb. Multilayered Grid architecture: GUI interface; application-specific layer (Athena/Gaudi, ...); Grid middleware (EDG, PPDG, ...); underlying Grid services (Globus toolkit); OS and network services. GANGA components include a Python software bus connecting the GUI, a job configuration DB, GANGA and OS modules, Athena/Gaudi via GaudiPython and PythonROOT, an XML-RPC server and module for remote users (clients), a local job DB, and access over LAN/WAN and the Grid to the EDG UI, production and bookkeeping DBs, and the LRMS.
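The core GANGA idea, one job definition that can be sent to different backends, can be sketched as below. GANGA really is Python-based, but this is a simplified illustration: the actual GANGA API is richer, and these particular class names are assumptions.

```python
# Toy sketch of the GANGA pattern: the same Job object is submitted
# unchanged to a local batch system or to the Grid, by swapping backends.

class Backend:
    def submit(self, job):
        raise NotImplementedError


class LocalBackend(Backend):
    """Runs the job on local resources."""
    def submit(self, job):
        return "local:" + job.name


class EDGBackend(Backend):
    """In the real system this path would go through the EDG User Interface."""
    def submit(self, job):
        return "edg:" + job.name


class Job:
    """One job definition (e.g. an Athena/Gaudi application) that runs
    unchanged on whichever backend it is given."""
    def __init__(self, name, application, backend):
        self.name = name
        self.application = application
        self.backend = backend
        self.status = "new"
        self.id = None

    def submit(self):
        self.id = self.backend.submit(self)
        self.status = "submitted"
        return self.id
```

This is why a common interface pays off for ATLAS and LHCb: physicists debug a job locally, then resubmit the identical configuration to the Grid by changing only the backend.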

Applications: CHEP03 Papers
Contributions covered ATCom, GANGA, DIRAC and Grid tests: a total of 14 application papers (plus 7 middleware papers).

Applications: CMS
GUIDO portal demonstrated at All-Hands; a new, generic version is to be unveiled this year. Adding R-GMA to BOSS.

Applications: CDF/D0

Applications: CDF/D0
D0 plans to reprocess 22 TB of DST data using SAMGrid between 1 September and 25 November.

Tier-1/A

Tier-2


Testbed

All testbed sites can be said to be truly on a Grid: they are registered in a comprehensive resource-information publication scheme, they are accessible via a set of globally enabled resource brokers, and they use one of the first scalable mechanisms for supporting distributed virtual communities (VOMS). There are few such Grids in operation in the world today.

Data Challenges

LHCb Data Challenge: 1/3 of events produced in the UK

ATLAS DC1 Phase 2: the UK was the largest producer

CMS Data Challenge: total number of MC events produced/processed (in millions) for Q1 and Q2 of 2002

Interoperability and Dissemination


Resources

Summary
Status: at the midpoint of GridPP1, 95 of 182 tasks (52%) have been successfully completed. The LCG-1 release is imminent: a landmark moment.
Achievements:
- A major factor in establishing the LCG project
- Leadership roles within 5 of the 8 applicable EDG workpackages
- Significant middleware development
- Deployed the UK's largest testbed (16 sites and >100 servers)
- Integration with the EU-wide programme, linked to worldwide efforts
- Active Tier-1/A production centre, meeting demands
- Tier-2 resources identified and a structure being developed
- Established productive links with the experiment applications
Responded to the LHC computing challenge: defined and initiated a programme that will be the basis for LHC computing.