Grid Applications for High Energy Physics and Interoperability
Dominique Boutigny, CC-IN2P3
June 24, 2006
Centre de Calcul de l’IN2P3 et du DAPNIA

[Diagram: the W-LCG tiered architecture. Tier-0 at CERN; Tier-1 centres: ASCC/Taipei, RAL/UK, CC-IN2P3/FR, TRIUMF/CA, GridKa/DE, CNAF/IT, PIC/SP, BNL/US, FNAL/US, SARA/NL; Tier-2 centres such as GRIF, LAPP, CPPM, IN2P3-LPC, IN2P3-Subatech, USC, Krakow, CIEMAT, Rome, Taipei, CSCS, UB, IFCA, IC, MSU, Cambridge, IFIC; Tier-3 centres and desktops/laptops at the laboratories and universities.]
The LHC experiments will produce about 15 PB of data per year. A computing Grid architecture at the world level is mandatory to process them: the W-LCG Grid infrastructure.
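A rough back-of-the-envelope view of what this means for one Tier-1 such as CC-IN2P3 (my own illustration, assuming an even split among roughly ten Tier-1 sites; the real shares differ per experiment):

    # Back-of-the-envelope scale of the Tier-1 task (illustrative only;
    # the even split among ~10 Tier-1 sites is an assumption).
    PB = 1e15                       # bytes
    YEAR = 365 * 24 * 3600          # seconds

    share = 15 * PB / 10            # ~15 PB/year shared among ~10 Tier-1s
    avg_rate = share / YEAR
    print(f"~{share / PB:.1f} PB/year per Tier-1, "
          f"i.e. a sustained average of ~{avg_rate / 1e6:.0f} MB/s")
    # -> ~1.5 PB/year per Tier-1, sustained ~48 MB/s on average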

Framework
Grid activities at IN2P3 are mainly related to LHC computing. We are deeply involved in the EGEE project:
– Grid Operation: ~10 engineers involved at CC-IN2P3
– Network
The main goal is to set up a Tier-1 node at CC-IN2P3 for the worldwide Grid W-LCG, for physics-oriented production:
– LHC will start in 2007; the Grid infrastructure must be up and running by then.
– Gaining experience with the Grid through Service and Data Challenges, ramping up the data throughput and the number of jobs, step by step, to the LHC nominal values.

[Network monitoring plots: output from CERN, 1.6 GB/s; input to CC-IN2P3, 250 MB/s.]
Optical network connection to CERN: 10 Gbps
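A quick unit check on these numbers (my own illustration, not from the slides): links are quoted in bits per second while the monitoring plots show bytes per second, so a 10 Gbps link tops out at 1.25 GB/s, and the observed 250 MB/s input corresponds to about 20% utilization.

    # Sanity check on the monitoring figures above (illustrative only).
    link_gbps = 10
    link_capacity = link_gbps * 1e9 / 8      # bits/s -> bytes/s: 1.25 GB/s

    observed_input = 250e6                   # 250 MB/s into CC-IN2P3
    print(f"Link capacity: {link_capacity / 1e9:.2f} GB/s")
    print(f"Utilization:   {observed_input / link_capacity:.0%}")
    # -> Link capacity: 1.25 GB/s
    # -> Utilization:   20%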

W-LCG = EGEE (gLite) + OSG (+ Nordic)
– Both Grids need to interoperate.
At the job level:
– Some level of interoperability has been reached between LCG-2 and OSG.
– Still some work to do, especially for gLite / OSG interoperation.
– Not clear how interoperation works at the level of the data catalog.
At the Operation level:
– Work is just starting.
– Crucial in the context of LHC computing.
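To make the job-level problem concrete, here is a minimal conceptual sketch (my own illustration, not actual W-LCG middleware): interoperability means the same logical job must be expressible both as a gLite/LCG-2 JDL and as a Condor-G submit file for OSG.

    # Conceptual sketch of job-level interoperability (illustrative only):
    # one grid-neutral job description, rendered into each Grid's native format.
    from dataclasses import dataclass

    @dataclass
    class Job:
        executable: str
        arguments: str
        input_files: list

    def to_glite_jdl(job):
        # gLite/LCG-2 expect a ClassAd-style JDL
        files = ", ".join(f'"{f}"' for f in job.input_files)
        return (f'Executable = "{job.executable}";\n'
                f'Arguments = "{job.arguments}";\n'
                f'InputSandbox = {{{files}}};')

    def to_condor_g(job):
        # OSG submission commonly goes through Condor-G
        return (f"executable = {job.executable}\n"
                f"arguments  = {job.arguments}\n"
                f"transfer_input_files = {','.join(job.input_files)}\n"
                "universe   = grid\nqueue")

    job = Job("analysis.sh", "run42.cfg", ["analysis.sh", "run42.cfg"])
    print(to_glite_jdl(job))
    print(to_condor_g(job))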

IT / CC-IN2P3 relationships
ANR: a 3-year project: CC-IN2P3 – LIP (RESO) – RENATER + FNAL (Chicago)
– Two-fold: OSG / EGEE interoperability (CC-IN2P3); high-bandwidth network transfers (LIP)
– Understand data transfer patterns on a long-distance network, with data from real applications
– Optimize data transfers
– Will get a 2×1 Gbps dedicated link between Chicago and CC-IN2P3 (RENATER)
– Hope to have it upgraded to 10 Gbps before the end of the project
– The project started 3–4 months ago; quite some infrastructure to set up before actual work can begin.
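Part of what makes such long-distance transfers hard is the bandwidth-delay product of the path. A rough sketch (my own illustration; the 100 ms round-trip time for Chicago-Lyon is an assumed, order-of-magnitude figure):

    # Bandwidth-delay product on the Chicago - CC-IN2P3 link (illustrative).
    # The 100 ms RTT is an assumed order-of-magnitude value, not a measurement.
    rtt = 0.100                  # seconds, assumed round-trip time
    bandwidth = 1e9              # one 1 Gbps link, in bits/s

    bdp_bytes = bandwidth * rtt / 8
    print(f"TCP window needed to keep the pipe full: {bdp_bytes / 1e6:.1f} MB")
    # -> 12.5 MB, far above default TCP buffer sizes, which is why such
    #    transfers need tuned windows or many parallel streams.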

CC-IN2P3 and the W-LCG Grid:
– A large infrastructure with operational support
– Provide real applications involving a huge amount of data
Computer Science:
– Develops Grid software from a more fundamental point of view
– Is able to handle workflows, bottlenecks, data access, networks, etc. without the pressure of HEP data to analyze
I am convinced that both worlds could benefit from each other, and that the next generation of operational HEP Grids will come from the IT world.

France / Japan relationship
Two computing projects within the Associated International Laboratory (AIL):
– One focused on LHC computing
– One focused on interoperability between NAREGI and EGEE, with a large overlap with NEGST
Close relationship between Tokyo University and CC-IN2P3 around the ATLAS experiment:
– Wish to connect the Tokyo Tier-2 (ICEPP) to the CC-IN2P3 Tier-1, with data exchange between both sites
Strong interest in developing relationships in view of the future International Linear Collider project, where France and Japan have a common interest.

SRB
The Storage Resource Broker and its successor iRODS, both developed at SDSC, are also a subject of common interest. SRB is a distributed file catalog with a sophisticated metadata management system:
– Data distribution / replication is very efficient (heavily used in the BaBar experiment between Stanford and Lyon).
SRB is not part of EGEE / OSG:
– Japanese colleagues at KEK are working on SRB / EGEE interoperation.
– CC-IN2P3 is very interested in joining this effort.
Recent interoperability test between several SRB server federations located in Japan, New Zealand, the USA, France, Italy, the UK and Australia:
– Performance was very good, and the worldwide SRB configuration was especially easy to set up.
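To illustrate the core idea (a conceptual sketch in plain Python; this is not SRB's actual API, and all names and URLs are hypothetical): a logical file maps to queryable metadata and to several physical replicas in different zones, and a read picks whichever replica is closest.

    # Conceptual model of an SRB-like replica catalog (illustrative only).
    catalog = {
        "/babar/run5/stream3.root": {
            "metadata": {"experiment": "BaBar", "run": 5},
            "replicas": ["srb://slac/stream3.root", "srb://lyon/stream3.root"],
        },
    }

    def find(**criteria):
        # Query by metadata rather than by physical location.
        return [name for name, entry in catalog.items()
                if all(entry["metadata"].get(k) == v for k, v in criteria.items())]

    def pick_replica(logical_name, preferred_zone="lyon"):
        # Prefer a replica in the local zone; fall back to any other copy.
        replicas = catalog[logical_name]["replicas"]
        local = [r for r in replicas if f"//{preferred_zone}/" in r]
        return (local or replicas)[0]

    for name in find(experiment="BaBar", run=5):
        print(name, "->", pick_replica(name))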