GridPP 12th Collaboration Meeting
Networking: Current Status
Robin Tasker, 31 January 2005

Network Objectives

1. To bring the technology of very high-rate, long-distance data transport into practical use in experimental production environments (both current and LHC experiments), and to demonstrate one or more UK HEP experiments regularly transporting data at rates in excess of 500 Mbit/s, and preferably 1 Gbit/s (see the throughput sketch after this list).
2. To exploit the UKLight infrastructure to utilise switched dedicated circuits between major centres, including the UK Tier-1 and Tier-2s, CERN and FNAL. [Barney Garrett]
3. To participate in EGEE-oriented network monitoring service development and deployment within the UK, and in the development of diagnostic engines for Grid operations. [Mark Leese]
4. To maintain the strategic relationship which HEP holds with all relevant major network authorities globally.
5. To provide PPNCG support and other work as specified.
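For scale, a minimal sketch (Python, not from the original slides) of what those target rates mean in volume terms over a day of sustained transfer:

# Rough sizing sketch, illustrative only: volume delivered per day
# at the 500 Mbit/s and 1 Gbit/s targets named in objective 1.

def tb_per_day(rate_mbit_s: float) -> float:
    """Terabytes moved in 24 hours at a sustained rate given in Mbit/s."""
    bytes_per_s = rate_mbit_s * 1e6 / 8
    return bytes_per_s * 86400 / 1e12

for rate in (500, 1000):
    print(f"{rate} Mbit/s sustained = {tb_per_day(rate):.1f} TB/day")
# 500 Mbit/s sustained = 5.4 TB/day
# 1000 Mbit/s sustained = 10.8 TB/day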

Network Performance Monitoring

Current partners and current collaborators: [partner and collaborator logos, not reproduced in the transcript]

Current Activities #1

GGF NM-WG
1. The group is looking at new, more powerful schemas.
2. It is also pushing the stabilisation of the current schemas. This is very important for early adopters such as EGEE JRA4, DANTE, Internet2 and NCSA, as well as GridMon.

EGEE-JRA4
1. Collaboration with JRA4 resulted in their network performance monitoring prototype.
2. The JRA4 "mediator" software was developed together with a sample human client. The mediator provided a single interface to two underlying monitoring infrastructures via the NM-WG interfaces (a minimal sketch of the pattern follows below). The prototype was important because it demonstrated, albeit in a simplistic manner, obtaining network performance data from multiple domains using the same method. The deliverable detailing this approach, DJRA4.2 "Specification of Interfaces for Network Performance Monitoring", was submitted on time and was successful in its EGEE review.
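The mediator described above is essentially a single facade over heterogeneous monitoring domains. A minimal sketch of that pattern, assuming hypothetical class and method names rather than the actual JRA4 interfaces:

# Illustrative sketch of the "mediator" idea. The names (MonitoringDomain,
# Mediator, query) are hypothetical, not the JRA4 API.

from abc import ABC, abstractmethod

class MonitoringDomain(ABC):
    """One underlying monitoring infrastructure behind a common interface."""
    @abstractmethod
    def query(self, src: str, dst: str, metric: str) -> list[dict]:
        ...

class DomainA(MonitoringDomain):
    def query(self, src, dst, metric):
        # In reality this would translate the request into the domain's
        # native NM-WG request/response exchange.
        return [{"src": src, "dst": dst, "metric": metric, "value": 42.0}]

class DomainB(MonitoringDomain):
    def query(self, src, dst, metric):
        return [{"src": src, "dst": dst, "metric": metric, "value": 37.5}]

class Mediator:
    """Single client-facing interface that fans a query out to all domains."""
    def __init__(self, domains: list[MonitoringDomain]):
        self.domains = domains

    def query(self, src: str, dst: str, metric: str) -> list[dict]:
        results = []
        for domain in self.domains:
            results.extend(domain.query(src, dst, metric))
        return results

mediator = Mediator([DomainA(), DomainB()])
print(mediator.query("ral.ac.uk", "cern.ch", "one-way-delay"))

The point the prototype made is visible even at this scale: the client issues one query in one format and never learns how many domains, or which technologies, sit behind the interface.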

Current Activities #2

GridMon

The Web Service: a first pass of an NM-WG-compliant web services interface onto the relational database by March. It will allow more complex queries than the original JRA4 prototype, and thus demonstrate the power of this idea even further.

Monitoring Nodes: moving to a model where the tools store their results in a central database at DL. Web services and human (web) access to the data will also be via services running at DL. Storing and providing access to the data from a central location allows us to reduce the complexity of the individual monitoring nodes. Critical here is the move to a relational database model, speeding up access to the data and allowing far more advanced queries to be made (a sketch of the central store follows below).
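A minimal sketch of the central-store idea, using SQLite in place of the database at DL; the table layout and column names are assumptions for illustration, not the GridMon schema:

# Monitoring nodes push raw results into one central relational store;
# the web service then layers richer queries on top.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE measurements (
        src    TEXT,
        dst    TEXT,
        metric TEXT,    -- e.g. 'throughput_mbit_s', 'rtt_ms'
        value  REAL,
        taken  TEXT     -- ISO 8601 timestamp
    )
""")

# A monitoring node records one result...
conn.execute("INSERT INTO measurements VALUES (?, ?, ?, ?, ?)",
             ("ral.ac.uk", "cern.ch", "throughput_mbit_s", 512.3,
              "2005-01-31T12:00:00"))

# ...and the web service can answer queries the per-node model could not,
# e.g. per-path averages across all nodes' results.
row = conn.execute("""
    SELECT src, dst, AVG(value)
    FROM measurements
    WHERE metric = 'throughput_mbit_s'
    GROUP BY src, dst
""").fetchone()
print(row)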

Networking for LHC

T0/T1 Meetings: 8 October 2004 and 20 January 2005, between CERN, T1 representatives, NRENs and DANTE.

LHC T1 requirements: these provide a ballpark understanding of the raw data requirements. The prosaic view is that provision of 10 Gbit/s access from a T1 will, in the first instance (2007), be sufficient. On-going refinement of this work: Jamie Shiers (CERN).

T1/T1: as yet only limited estimates have been developed, but if the capacity is not sufficient then quality will suffer while usage remains possible.

T1/T2: the model for delivery from a T1 to its T2s, and for restoring datasets back from the T2s to the T1, is not well developed. Also to be understood is the relationship between a T1 and the set of T2s accessing it, particularly across national boundaries.

GN2 architecture: at one extreme a 10 Gbit/s full mesh (T0/T1 and T1/T1) would solve the problem, but other architectures are at least as effective, particularly when the differing quality requirements are considered, e.g. 10 Gbit/s access into a "star" centred on CERN (a link-count comparison follows below).

GN2 procurement and deployment through 2005 and early 2006.
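One way to see the mesh-versus-star trade-off is the raw circuit count. A small sketch, with the number of sites chosen purely for illustration:

# Dedicated circuits needed to interconnect n sites, full mesh vs star.

def full_mesh_links(n: int) -> int:
    """Circuits needed to connect every site pair directly: n(n-1)/2."""
    return n * (n - 1) // 2

def star_links(n: int) -> int:
    """Circuits needed if every site homes onto one hub (e.g. CERN)."""
    return n - 1

n = 12  # T0 plus ~11 T1 centres; an assumed count for illustration
print(f"full mesh: {full_mesh_links(n)} links, star: {star_links(n)} links")
# full mesh: 66 links, star: 11 links

The star is far cheaper in circuits, at the cost of routing all T1/T1 traffic through the hub; hence the slide's point that the choice hinges on the quality requirements of each traffic class.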

Service Challenge Issues

Service Challenge Workshop, 27/28 January 2005.

Nominal LHC data rates (from Les Robertson): [table of per-experiment rates for ALICE, ATLAS, CMS and LHCb, with the total at CERN in GBytes/s and the average per T1 in MBytes/s; the figures did not survive the transcript]

SC3: 2Q2005 (disk - network - disk), with the aim of 60 MB/s at each T1 and 500 MB/s aggregate at CERN, with modest T2 traffic. The aim is that by end 2005 this is available as a service for the experiments to use (see the conversion sketch after this slide).

SC4: 1Q2006, running at the nominal LHC data rate by summer 2006, available as a service by September 2006, and then ramping the operational service out to the full rate, i.e. twice the nominal rate.

Purpose:
1. To stress-test the system and ensure that it is operationally capable.
2. To ensure that the planned T0 - T1 connectivity is actually in place!
3. Needs to be sorted out.
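To relate the SC3 disk rates to network capacity, a small conversion sketch; the 10% protocol overhead allowance is an assumption for illustration, not a figure from the workshop:

# Payload rate in MB/s -> approximate wire rate in Mbit/s.

def mbit_s(mbyte_s: float, overhead: float = 1.1) -> float:
    """Network rate in Mbit/s for a payload rate in MB/s.
    overhead: assumed ~10% allowance for headers and retransmissions."""
    return mbyte_s * 8 * overhead

print(f"60 MB/s per T1   = {mbit_s(60):.0f} Mbit/s on the wire")
print(f"500 MB/s at CERN = {mbit_s(500)/1000:.1f} Gbit/s aggregate")
# 60 MB/s per T1   = 528 Mbit/s on the wire
# 500 MB/s at CERN = 4.4 Gbit/s aggregate

On these numbers the SC3 per-T1 target already fills roughly half of a 1 Gbit/s path, which is why the slides stress verifying that the planned T0 - T1 connectivity is actually in place.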

RAL Connectivity - Now

[Diagram: UKLight Phases 1 & 2, as of January 2005 (David Salmon, UKERNA). C-PoPs at Warrington, Leeds, Reading and ULCC/St Pancras (London); 10G links to Amsterdam and Chicago; connections to NNW (Manchester), EastNet (Cambridge), C&NLMAN (Lancaster), YHMAN (Leeds) and CLRC-RAL.]

RAL Connectivity - UKLight

RAL Connectivity

Questions?