Centre de Calcul de l'IN2P3, 12-14 Boulevard Niels Bohr, F-69622 VILLEURBANNE


Centre de Calcul de l'IN2P3. A computing service centre, national and HEP-centric (i.e. deeply coupled to one research community). Main customers: 40+ physics experiments. Created 1973 as CCPN, Paris; moved 1986 to CC-IN2P3, Lyon.

15 March / 10
One computing resource centre common to IN2P3-CNRS & DSM-CEA.
- National: 20 HEP laboratories, 40 experiments (PP, PN, Astro), users
- International: Tier-1 / Tier-A status
- 0.8 petabyte used; 60 tape reader units, cartridges; DB, hierarchical system
- ~1600 CPUs, 1.4 M SI2K (~3-4 Teraflops effective); ~100 TB disk
- Network & QoS; projects and many services "à la carte"
- ~60 people (45 computer engineers)
- Annual budget: ~6 M€ (2005), plus ~2.5 M€ salaries
- National: open to Biology (~6 groups)
- Grids: R&D, know-how, grid culture dissemination
- LHC, BaBar, D0, Auger, HESS, Virgo, Antares, ...

Monthly Computing (chart)

3 basic services:
1 - Computing: >1500 processors in farms, ~3.5 TeraHz or 1.3 M SI2K (100% Unix, 99% Linux, majority SL3). Doubling every year for the past 10 years. Efficiency (effective/peak) >80%, i.e. 3 Teraflops effective.
2 - Storage: 100 TB disk, 800 TB cartridges (scalable to 7 PB), served by ~100 processors, with the HPSS system. Average monthly tape-disk traffic in 2004: ~0.5 Gbps (70 MB/s).
3 - Network (LAN+WAN): WAN = 20 labs + French experimental sites' access to the Internet, QoS.
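A quick back-of-the-envelope check on the figures above (a sketch; the Mflops-per-SI2K ratio below is only what the slide itself implies, not an official conversion):

```python
def gbps_to_mb_per_s(gbps: float) -> float:
    """Convert gigabits per second to megabytes per second (1 MB = 10**6 bytes)."""
    return gbps * 1e9 / 8 / 1e6

# ~0.5 Gbps average tape-disk traffic is on the order of the quoted 70 MB/s.
print(gbps_to_mb_per_s(0.5))  # 62.5

# The slide equates 1.3 M SI2K with ~3 Teraflops effective, i.e. roughly
# 2.3 Mflops per SI2K unit (an implied ratio, used here only as a sanity check).
mflops_per_si2k = 3.0e12 / 1.3e6 / 1e6
print(round(mflops_per_si2k, 1))  # 2.3
```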

CPU sharing 2004 & 2005 (chart): LHC experiments are still far from reaching the target share of ~2/3.


Other services:
- Databases, software & OS (licences, support, maintenance)
- Website hosting (~150 sites)
- Many types of shared services (photo library, mail, forums, PCs, videoconferencing & webcast for our community and others, EDH service, collaborative tools, etc.)
- Data back-up; hosting servers for external clients
- Hosting network nodes: Renater, RMU, Amplivia, Lyonix
- Software tools, grid development and testing, grid services
- A separate unit created for an electronic scientific publishing service
- Specific human support for some clients (large experiments + Biology)
- Hotline ("user support"), security, computing schools, etc.
All services on a "best effort++" basis, 24h/24.

15 March / M€ salaries

Plans are to be a T1 + a T2 hosting an analysis facility (≥2 other T2s are planned among the ~10 EGEE.fr sites). Rough estimates (CC Today → CC target):
- CPU: 1.4 MSI2K → 16 MSI2K
- Disk: → 3 PB / 2 PB (tender 2005: 0.25 PB more)
- Tape: 0.8 PB → 6-8 PB / 4-6 PB
- People: 65 → ~40 (stabilize most fixed-term people)
(What experiments are calling "disk" could well be a mixture of cache disk & tape with a good hierarchical system.)
CC is hosting a CIC, ROC, GOC, see:
Infrastructure? Heavy AC (air conditioning) work underway (started in 2004, reshuffling the room).
CC-IN2P3 is deeply involved in the operational aspects, through EGEE SA1 (~10 people are paid by the EU). In a nutshell: (diagram)
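The CPU target lines up with the growth trend quoted earlier ("doubling every year for the past 10 years"). A sketch of the implied timescale, assuming the 1.4 MSI2K today and 16 MSI2K target quoted above and a continued yearly doubling:

```python
import math

today_msi2k = 1.4    # installed capacity today (from the slide)
target_msi2k = 16.0  # planned capacity (from the slide)

# With capacity doubling every year, the target is reached after
# log2(target / today) years of sustained growth.
years_needed = math.log2(target_msi2k / today_msi2k)
print(round(years_needed, 1))  # 3.5
```

So the stated target corresponds to roughly three and a half more years at the historical growth rate.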

Questions? Otherwise: lunch, then a visit of the computer room.