Grid Tier-2
HEPHY Scientific Advisory Board
Dietrich Liko, 21 October 2010

Computing Group
Leader: G. Walzel
Group: S. Fichtinger, U. Kwapil, D. Liko, N. Hörmann, B. Wimmer
Grid effort: 2.5 FTE
General computing: 3 FTE

Overview
Status
– CMS operations
– Other activities
Current issues
– Operational costs
– EGI membership
New developments
– Austrian Center for Scientific Computing (ACSC)
– Hosting at the HPC site of TU Vienna
Summary

Status of CMS Tier-2 Operation
With the start of LHC physics operation, the grid has been challenged with the real use case – it was a success!
– The data could be transported out of CERN
– The users could perform their analysis on the Tier-2 clusters
[Plot: CMS grid jobs per week; up to 500 concurrent users]

Tier-2 in Vienna
Vienna provides 2% of the overall CMS Tier-2 capacity
450 CPU cores running Linux SLC5
– An average Tier-2 site
About 3000 jobs/day
Availability always close to 98%
– Among the best sites

Data Transfers
About 1 TB/day incoming and outgoing
300 TB storage capacity
[Plot: transfer rate in TB/day; spikes due to the fast link to CERN; dip due to a power cut]

Vienna CMS Center
Commissioned and approved for CMS computing shifts from Vienna
The first shifts have already been taken from Vienna
[Photos: Open Day at LHC startup; computing shift]

Why the Tier-2 is Important for the CMS Vienna Group
Contributes to overall CMS computing
– 2% of the CMS Tier-2 capacity
Part of the CMS SUSY cluster
– Together with RWTH Aachen, Bari, Imperial College London, U Florida
– We can influence how the common disk space is used
– We are a partner on the same level as our colleagues
Enables our own specific analyses
– Example: development of the OSET analysis required 40 TB of disk space
– A very visible contribution to CMS; not possible without the Tier-2
Gives us resources we can control
– Important in case of significant findings and conferences
– Competition between the institutes is severe
All Vienna contributions to the physics publications profited from the Tier-2 resources

Other Activities: Theory Group
Collaboration with the HEPHY theory group
Scan of the CP-violation parameters of a BSM model – a two-Higgs-doublet model
Could this model explain the matter/antimatter asymmetry observed in the Universe?
A full scan would involve 122 million points; the actual scan covers ~1%
One point uses ~9.5 h of CPU time -> jobs are distributed over the grid
Publication in preparation
[Plot: CP-violating effects have been found to be small]
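To make the scale of the scan concrete, here is a back-of-the-envelope estimate as a hypothetical Python sketch; only the point count, scan fraction and per-point CPU time come from the slide, while the 8-core workstation used for comparison is an illustrative assumption:

```python
# Rough estimate of the CPU budget for the partial parameter scan,
# using only the approximate figures quoted on this slide.

FULL_SCAN_POINTS = 122_000_000    # full parameter-space scan
SCAN_FRACTION = 0.01              # ~1% actually scanned
CPU_HOURS_PER_POINT = 9.5         # ~9.5 h CPU time per point
WORKSTATION_CORES = 8             # illustrative single-machine comparison (assumption)

points = FULL_SCAN_POINTS * SCAN_FRACTION
cpu_hours = points * CPU_HOURS_PER_POINT
workstation_years = cpu_hours / WORKSTATION_CORES / 24 / 365

print(f"points scanned : {points:,.0f}")
print(f"total CPU time : {cpu_hours:,.0f} CPU-hours")
print(f"on one {WORKSTATION_CORES}-core machine: ~{workstation_years:,.0f} years")
# ~1.2 million points and ~11.6 million CPU-hours: a workload that has to be
# split into many independent jobs and distributed over grid resources.
```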

Other Activities: Radiobiology
Collaboration with MedUni Vienna
Study of the linear energy transfer (LET) of various ions
– Optimisation of cancer treatment at new radiation facilities
More than 10^8 particles have to be simulated
Submitted to Z Med Phys
[Figure: example of LET with different ions as simulated]

Current Issues
Electricity costs have been charged to the Institute since 2010
– A significant load on our stretched budget
– We are not able to operate all of our equipment
– In 2011 we will not operate at the foreseen size
– We have asked the Academy to retender our electricity contract; discussions are ongoing
EGEE has been replaced by a new structure, the European Grid Initiative (EGI)
– Other partners in grid computing have left the field in Austria
– No National Grid Initiative has been established
– The EGI membership fee is prohibitive for us (40 kEuro per year)
– We are in discussion with EGI and CERN
The current hardware will last us until 2011/2012
– End of lifetime

Regional WLCG Tier-2 Workshop
Innsbruck, Vienna, Budapest, Debrecen
About 30 participants
Intense discussion on the future
Requirements and plans for the coming years were presented

Future Requirements
CMS requirements will grow over the years
Belle II will also require computing
Main assumption: constant budget, 30% increase in capacity each year (following Moore's law)
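A minimal sketch of what the constant-budget assumption implies, compounding capacity at 30% per year; the baseline year, baseline value and five-year horizon are illustrative choices, not figures from the slides:

```python
# Project Tier-2 capacity under the constant-budget assumption:
# each year the same budget buys ~30% more capacity (the Moore's-law
# figure quoted on the slide). Baseline and horizon are illustrative.

GROWTH_PER_YEAR = 0.30
baseline_year = 2010
baseline_capacity = 1.0   # relative units: 1.0 = capacity in the baseline year

for year in range(baseline_year, baseline_year + 6):
    capacity = baseline_capacity * (1 + GROWTH_PER_YEAR) ** (year - baseline_year)
    print(f"{year}: {capacity:.2f}x of {baseline_year} capacity")
# After 5 years: 1.3**5 ≈ 3.7x the baseline capacity for the same budget.
```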

Consolidation in the Austrian Computing Landscape
Vienna Scientific Cluster (VSC)
– University of Vienna, TU Vienna, BOKU Vienna
Austrian Center for Scientific Computing (ACSC)
– Universities of Innsbruck, Linz and Salzburg
The ACSC aims to be a common framework
– We joined the ACSC as guests and aim for full membership
– The relation between VSC and ACSC still has to be clarified
– I lead the Grid & Cloud working group

Consolidation
High energy physics requires a somewhat different setup than HPC
– Large data storage
– Collaboration with international partners (grid)
Nevertheless there are many common aspects
– We aim to integrate our Tier-2 into a larger HPC centre
We are looking for partners
– Stefan Meyer Institute: PANDA experiment at FAIR
– Others might be possible
Specific funding for particle physics computing will be required

Hosting at the VSC
We have discussed a hosting scenario with the VSC group at the Vienna University of Technology
Space, power, networking and cooling would be available
– 5 racks
– Up to 50 kW of power
– 10 Gbit/s network (2 to 3 Gbit/s sustained)
With new hardware we could move there by 2012
– The open issue is the funding
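As a rough cross-check that the quoted connection would cover the transfer volumes mentioned on the data-transfer slide (~1 TB/day in and out), a back-of-the-envelope calculation; the 2 Gbit/s figure is simply the lower end of the sustained rate quoted above:

```python
# Back-of-the-envelope: does 2-3 Gbit/s sustained cover ~1 TB/day each way?

SUSTAINED_GBIT_PER_S = 2.0     # lower end of the quoted sustained rate
SECONDS_PER_DAY = 86_400

tb_per_day = SUSTAINED_GBIT_PER_S * SECONDS_PER_DAY / 8 / 1000  # Gbit -> GB -> TB
print(f"{SUSTAINED_GBIT_PER_S} Gbit/s sustained ≈ {tb_per_day:.0f} TB/day")
# ≈ 22 TB/day, well above the ~1 TB/day each way quoted for CMS transfers.
```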

Summary
The Tier-2 is operational and allows us to participate in the analysis of data
– Important for our physics analyses
– Important for our position in the CMS collaboration
Electricity costs make it prohibitive to operate the Tier-2 at its available size
We need a new solution for 2012, when the hardware becomes obsolete
We are studying the issue in the context of the ACSC
The VSC could be an ideal place to host a particle physics cluster