Polish Contribution to the Worldwide LHC Computing Grid (WLCG)
M. Witek, on behalf of the team of the Polish distributed Tier-2

Outline: Introduction; History and status of the Polish Tier-2 Federation; Examples of physics results from the LHC experiments

CERN – European Organisation for Nuclear Research; LHC – Large Hadron Collider; experiments: ALICE, ATLAS, CMS, LHCb

Polish WLCG Tier-2 team
– A. Binczewski (3), M. Bluj (5), A. Cyz (2), M. Dwużnik (1), M. Filocha (4), L. Flis (1), R. Gokieli (4,5), J. Iwaszkiewicz (4), M. Kowalski (2), P. Lasoń (1), R. Lichwała (3), M. Łopuszyński (4), M. Magryś (1), P. Malecki (2), N. Meyer (3), K. Nawrocki (4,5), A. Olszewski (2), A. Oziębło (1), A. Padee (4,5), H. Pałka (2), M. Pospieszny (3), M. Radecki (1), R. Rowicki (4), D. Stojda (6,4), M. Stolarek (4), T. Szepieniec (1), T. Szymocha (1,2), M. Turała (1,2), K. Wawrzyniak (4,5), W. Wiślicki (4,5), M. Witek (2), P. Wolniewicz (3)

Institutions
– (1) AGH University of Science and Technology, ACC Cyfronet AGH, Krakow
– (2) Institute of Nuclear Physics PAN, IFJ PAN, Krakow
– (3) Poznan Supercomputing and Networking Center, PSNC, Poznan
– (4) Interdisciplinary Centre for Mathematical and Computational Modelling, ICM, Warsaw
– (5) Soltan Institute for Nuclear Studies, IPJ, Warsaw
– (6) Copernicus Science Centre, CSC, Warsaw

More details will be available in the article in the PL-Grid Book.

Experiment requirements

Example of on-line conditions:
– ~10^8 electronic channels to read out at 40 MHz
– ~1 PB/sec of raw information from the detector
– ~few TB/sec of zero-suppressed data (channels with signal only)

Multi-level trigger system (overall reduction factor ~4x10^5):
– Level 1: 40 MHz (1000 TB/sec) reduced to 75 kHz (75 GB/sec)
– Level 2: 75 kHz reduced to 5 kHz (5 GB/sec)
– Level 3: 5 kHz reduced to ~100 Hz (100 MB/sec) for data recording and offline analysis in the WLCG

With an output rate of >100 MB/sec and ~10^7 sec of running per year, each experiment produces a few PB of raw data per year.
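These data-volume claims follow from a few lines of arithmetic; the Python sketch below uses only the rates quoted on the slide (the per-event sizes are implied by rate x bandwidth).

```python
# Back-of-the-envelope check of the trigger-chain numbers quoted above.
# All rates are taken from the slide.

RATES_HZ = {"Level 1 input": 40e6, "Level 2 input": 75e3,
            "Level 3 input": 5e3, "Offline (recorded)": 100.0}

recorded_rate_mb_s = 100.0   # MB/s written to permanent storage
seconds_per_year = 1e7       # effective LHC running time per year

# Overall trigger reduction: 40 MHz in, ~100 Hz out.
overall_reduction = RATES_HZ["Level 1 input"] / RATES_HZ["Offline (recorded)"]

# Yearly raw-data volume: MB/s * s/year, converted MB -> PB.
raw_data_per_year_pb = recorded_rate_mb_s * seconds_per_year / 1e9

print(f"Overall trigger reduction factor: {overall_reduction:.0e}")        # ~4e+05
print(f"Raw data per experiment per year: {raw_data_per_year_pb:.1f} PB")  # ~1 PB, i.e. order of 'a few PB' at >100 MB/s
```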

(Pre)History: MONARC (Models of Networked Analysis at Regional Centres for LHC Experiments)

The goal was to simulate data processing (a kind of Monte Carlo model) covering:
– a complex set of networked computing systems (CERN, Tier-1 and Tier-2 regional centres)
– the analysis process, composed of a dynamic workload of reconstruction and analysis
– the behaviour of key elements of the system, such as distributed database servers and the networks

One conclusion of the CERN-LCB summary of 2000: data sharing is difficult due to the prohibitively high cost of networking. This was even more pronounced in Poland at that time!
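MONARC itself was a dedicated simulation toolkit; as a toy illustration of the kind of question it answered, the sketch below (with assumed, roughly year-2000 numbers) estimates how long it takes to replicate a dataset to a regional centre at different link speeds, showing why networking cost dominated the model.

```python
# Toy illustration (not the actual MONARC code) of a MONARC-style question:
# given finite WAN bandwidth between CERN and a regional centre, how long
# does replicating a dataset for local analysis take?

dataset_tb = 10.0       # dataset size in TB (assumed, illustrative)
wan_mbit_s = 34.0       # a typical year-2000 international link, e.g. 34 Mb/s
lambda_gbit_s = 10.0    # a 10 Gb/s lambda, as PIONIER later provided

def transfer_days(size_tb, rate_bit_s):
    """Transfer time in days: TB -> bits, divided by line rate."""
    return size_tb * 8e12 / rate_bit_s / 86400.0

print(f"At {wan_mbit_s} Mb/s:  {transfer_days(dataset_tb, wan_mbit_s * 1e6):.0f} days")    # ~27 days
print(f"At {lambda_gbit_s} Gb/s: {transfer_days(dataset_tb, lambda_gbit_s * 1e9):.2f} days")  # ~0.09 days
```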

From 2000 to 2010
– 2000: DataGrid, an HEP-oriented EU project
– 2000: CERN–Poland meeting (initiated by M. Turala & P. Malecki)
– 2002: CrossGrid, the first Polish EU project with an HEP task
– 2002: start of LCG (LHC Computing Grid)
– 2003: first LCG testbed of 14 sites (with Cyfronet participation)
– 2004–2010: EGEE, EGEE II, EGEE III
– Participation in Data Challenges
– BalticGrid, BalticGrid II
– 2007: Poland signs the MoU for WLCG
– 2009: start of the PL-Grid project (WLCG infrastructure)
– 2009–2010: LHC start; smooth running of the Polish WLCG sites

Polish participation in networking and grid projects

The Polish part of WLCG emerged out of many different projects, some of them not directly related to HEP; nevertheless, they provided infrastructure and common tools. PIONIER, CrossGrid, EGEE and PL-Grid were essential for the development of the Polish WLCG infrastructure.

[Map: Polish infrastructure in WLCG, overlaid on the PIONIER fibre network. The Tier-2 sites PCSS Poznań, ICM Warszawa and ACK Cyfronet Kraków connect to the Tier-1 at FZK Karlsruhe over dedicated HEP VLANs (1 Gb/s). The backbone links the MAN cities (Gdańsk, Poznań, Warszawa, Kraków and others) at 10 Gb/s per lambda (2 x 10 Gb/s on core routes), with external connectivity via GÉANT (2 x 2.5 Gb/s / 10 Gb/s), DFN (10 Gb/s), CESNET/SANET, BASNET (34 Mb/s), GTS (1.6 Gb/s) and cross-border dark fibre (CBDF, 10 Gb/s).]

PIONIER – Polish Optical Internet

A nationwide broadband optical network serving as a base for research and development:
– Europe's first national academic network using its own dark fibre and DWDM 10GE transmission
– Connects 21 academic network centres of the Metropolitan Area Networks (MAN) and 5 HPC (High Performance Computing) centres
– Cross-border links with Germany (DFN), the Czech Republic (CESNET), Slovakia (SANET), Ukraine (UARNET), Belarus and Lithuania
– Started in 2000; 5917.5 km of fibre (2010)

The data-sharing problem caused by the networking limits identified by MONARC has been solved!

Towards physics results – that is what the WLCG was built for

Every experiment (ALICE, ATLAS, CMS, LHCb) has its own computing model, but all are based on a multi-tier structure. After the successful LHC restart in 2009, both the LHC and the experiments have performed excellently: 3.5x10^14 collisions were delivered in 2011.

The RAW data reside at Tier-0 and the Tier-1s. The final physics result is the effect of many processing steps, and some steps have to be repeated when better detector calibration and alignment become available. The role of the Tier-2 sites (for Poland, the distributed Tier-2 Kraków–Poznań–Warszawa), as sketched below:
– Monte Carlo production
– analysis centre (larger sites)
– limited reprocessing activity
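A minimal sketch of this division of labour (the role assignments come from the slide; the data structure and helper function are ours, for illustration only):

```python
# Tier roles in the multi-tier computing model, as described on the slide.
TIER_ROLES = {
    "Tier-0 (CERN)": ["RAW data custody", "first-pass reconstruction"],
    "Tier-1":        ["RAW data copy", "reprocessing with new calibration/alignment"],
    "Tier-2":        ["Monte Carlo production", "user analysis (larger sites)",
                      "limited reprocessing"],
}

def where_can_run(activity):
    """Return the tiers whose role list mentions the given activity."""
    return [tier for tier, roles in TIER_ROLES.items()
            if any(activity.lower() in role.lower() for role in roles)]

print(where_can_run("Monte Carlo"))  # ['Tier-2']
print(where_can_run("RAW"))          # ['Tier-0 (CERN)', 'Tier-1']
```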

LHCb computing model – the role of DIRAC

[Diagram: the LHCb computing model, with DIRAC distributing work and data across the tiers.]
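Since the slide itself is a diagram, a minimal sketch of what DIRAC does in practice may help: DIRAC is LHCb's workload and data management system, which steers jobs to Tier-1 and Tier-2 sites. The calls below exist in DIRAC's Python API, but initialization details differ between DIRAC versions, and the destination site name is illustrative only; treat this as a sketch, not the production workflow.

```python
# Sketch: submitting a grid job through the DIRAC Python API.
from DIRAC.Core.Base import Script
Script.parseCommandLine()  # initialise the DIRAC configuration/environment

from DIRAC.Interfaces.API.Dirac import Dirac
from DIRAC.Interfaces.API.Job import Job

job = Job()
job.setName("mc_production_test")
job.setCPUTime(3600)                                     # requested CPU time, seconds
job.setExecutable("/bin/echo", arguments="hello from a Tier-2")
job.setDestination("LCG.Krakow.pl")                      # illustrative site name

result = Dirac().submitJob(job)                          # S_OK/S_ERROR dict with job ID
print(result)
```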

Polish WLCG resources

Polish sites provide about 2% of the total computing power, in proportion to the engagement of Polish scientists in the LHC experiments. Most of the resources come from the PL-Grid infrastructure.

Polish Tier-2 Federation (WLCG pledges, July 2011):
– ACK Cyfronet AGH, Kraków – Tier-2 for ATLAS and LHCb
– ICM, Warsaw – Tier-2 for CMS and LHCb
– PSNC, Poznań – Tier-2 for ALICE and ATLAS
– Tier-3 at IFJ PAN

[Table: CPU (HS06), disk (TB), reliability and availability per site.]

Poland in LCG accounting

At present there are 69 Tier-2 federations in 33 countries; the Polish Tier-2 is in the top 20 by size. Fractions delivered by Poland (Tier-1/2): ATLAS 1.6%, CMS 2%, LHCb 3.9%.

Performance of the Polish Tier-2 (from WLCG accounting):

  Month (2011):          May   June  July  Aug   Sept
  Usage as % of pledge:  165%  266%  299%  173%  229%

The increase over 100% is mainly due to additional resources at Cyfronet.

Highlight from LHCb: a week with a 9% contribution of Cyfronet to the LHCb reprocessing of 2011 data, and congratulations for the site performance from the LHCb computing team.

The Standard Model – particles and forces: the building blocks of our Universe.

ATLAS and CMS – direct discovery potential
– Find the last undiscovered element of the Standard Model: the Higgs boson
– Direct discovery of New Physics: supersymmetric particles, or any objects/phenomena beyond the Standard Model that we have not even thought of yet

[Plots: the Higgs boson search in ATLAS and CMS.]

LHCb – the LHC beauty experiment: hunting for signs of new physics through precision measurements

The decay B_s^0 -> mu+ mu-: the B_s^0 (bs) meson contains two quarks which are not present in the world around us, b (beauty) and s (strange). The decay is extremely rare, about 3 out of 10^9 decays in the Standard Model, while New Physics (e.g. supersymmetry) can significantly increase the decay rate.

[Diagrams: the decay in the Standard Model and with supersymmetric contributions.]

Result: 2 decays observed with 1.4 expected – no sign of New Physics yet. The 2 selected events were picked out of 3.5x10^14 initial collisions recorded and processed with the WLCG.
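The needle-in-a-haystack arithmetic behind this result is worth spelling out; the sketch below uses only the numbers quoted above.

```python
# Arithmetic behind the B_s -> mu+ mu- search, using the slide's numbers.

sm_branching_fraction = 3e-9    # ~3 decays per 10^9 B_s decays (Standard Model)
collisions_delivered = 3.5e14   # LHC collisions delivered in 2011
candidates_selected = 2         # events passing the full selection
background_expected = 1.4       # events expected without New Physics

selection_power = collisions_delivered / candidates_selected
print(f"One selected event per {selection_power:.1e} collisions")  # ~1.8e+14

# 2 observed vs 1.4 expected: no significant excess, hence no sign yet of
# New Physics enhancing the decay rate.
print(f"Excess over expectation: {candidates_selected - background_expected:.1f} events")
```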

Summary

Polish physicists and computer scientists have created a distributed WLCG Tier-2 whose availability and efficiency are high; the delivered resources are in the range of 2-3% of the WLCG total. The performance of the WLCG matches the demanding data-processing requirements of the LHC experiments. Many physics results of the highest quality have been shown recently, and many more are on the way, hopefully revealing New Physics.

Acknowledgments

We would like to thank all the people who contributed to the successful creation and operation of the Polish Tier-2 Federation, in particular the management and staff of the three computing centres, ACC Cyfronet AGH in Krakow, ICM UW in Warsaw and PSNC in Poznan, for their constructive cooperation. The supportive approach of the directorates of IFJ PAN Krakow and IPJ Warsaw is also appreciated. We are also grateful to many people from CERN, FZK Karlsruhe and other WLCG collaborating institutions for their friendly advice and support.