Dr. Andreas Wagner Deputy Group Leader - Operating Systems and Infrastructure Services CERN IT Department The IT Department & The LHC Computing Grid –


CERN IT Department, CH-1211 Genève 23, Switzerland
The IT Department & The Worldwide LHC Computing Grid (WLCG)
Dr. Andreas Wagner, Deputy Group Leader, Operating Systems and Infrastructure Services, IT Department

Information Technology Department
Mission:
– Physics computing
– General purpose and administrative computing
Projects:
– with the physics community (WLCG)
– with organisations and industry
Andreas Wagner, CERN, IT Department

General Purpose Computing
Significant numbers:
– 7,500 PCs in offices, 2,500 Macs
– Mail: ~20,000 mailboxes, 6,000 mailing lists, 3 million messages/day
– Web: ~15,000 web sites, ~1 billion hits per month
– These services are widely used by people at CERN and by physicists from their institutes!

The LHC Data Challenge (I)

> 150 million sensors deliver data … 40 million times per second
View of the ATLAS detector (during construction)

The LHC Data Challenge (II)

The LHC Data Challenge (III)

Tier 0 at CERN: acquisition, first-pass reconstruction, storage & distribution. 1.25 GB/sec (ions)

The LHC Data Challenge (III)
– The accelerator will run for 20 years
– Experiments will produce >25 million gigabytes of data each year (about 5 million DVDs; stacked, they would form a tower ~6 km high!)
– Total data stored at CERN: 100 PB (~700 years of HD-quality movies)
– More than 480 million experiment files are stored in the data centre
– LHC data analysis requires a computing power equivalent to ~250,000 of today's CPU cores
– Many cooperating computer centres are required (>160 data centres), as CERN can only provide ~15% of the resources
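The headline figures above can be sanity-checked with simple arithmetic. The sketch below assumes round values for single-layer DVD capacity (4.7 GB) and disc thickness (1.2 mm), which are not stated on the slide:

```python
# Back-of-the-envelope check of the slide's figures.
# DVD capacity and thickness are assumed round values, not from the slide.

ANNUAL_DATA_GB = 25e6        # >25 million gigabytes of data per year
DVD_CAPACITY_GB = 4.7        # single-layer DVD (assumption)
DVD_THICKNESS_MM = 1.2       # standard disc thickness (assumption)

dvds = ANNUAL_DATA_GB / DVD_CAPACITY_GB        # DVDs needed per year
stack_km = dvds * DVD_THICKNESS_MM / 1e6       # mm -> km

print(f"~{dvds / 1e6:.1f} million DVDs, stacked ~{stack_km:.1f} km high")
```

The result (~5.3 million DVDs, a stack of roughly 6.4 km) is consistent with the "5 million DVDs, 6 km tower" quoted on the slide.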

Solution: the Grid
Use the Grid to unite the computing resources of particle physics institutes around the world.
The World Wide Web provides seamless access to information stored in many millions of different geographical locations. The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe.

How does the Grid work?
It makes multiple computer centres look like a single system to the end-user.
Advanced software, called middleware, automatically finds the data the scientist needs and the computing power to analyse it. Middleware balances the load on different resources. It also handles security, accounting, monitoring and much more.
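The matchmaking idea described above can be illustrated with a minimal sketch: pick, for each job, a site that currently holds the needed dataset and has the most free CPU slots. Site names, datasets, and slot counts are purely hypothetical; real middleware does far more (security, accounting, monitoring):

```python
# Minimal illustration of grid matchmaking and load balancing.
# All site names and numbers below are invented for the example.

sites = {
    "CERN":      {"datasets": {"run2012A"}, "free_slots": 120},
    "RAL":       {"datasets": {"run2012A", "run2012B"}, "free_slots": 300},
    "Karlsruhe": {"datasets": {"run2012B"}, "free_slots": 50},
}

def match_site(dataset):
    """Return the least-loaded site that hosts the dataset, or None."""
    candidates = [(info["free_slots"], name)
                  for name, info in sites.items()
                  if dataset in info["datasets"]]
    if not candidates:
        return None
    slots, name = max(candidates)          # site with most free slots wins
    sites[name]["free_slots"] -= 1         # crude load accounting
    return name

print(match_site("run2012A"))  # prints "RAL": it hosts the data, 300 > 120 slots
```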

WLCG Organization
Tier-0 (CERN): data recording, initial data reconstruction, data distribution
Tier-1 (11 centres): permanent storage, re-processing, analysis
Tier-2 (~130 centres): simulation, end-user analysis
Austrian Federated Tier-2:
– Wien: HEPHY, Austrian Academy of Sciences, … (for CMS)
– Innsbruck: University of Innsbruck (for ATLAS)
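The tier model on this slide can be written down as a small data structure; roles and centre counts are taken from the slide, and this is an illustration, not WLCG software:

```python
# The WLCG tier model from the slide, as a simple table.

tiers = {
    "Tier-0": {"centres": 1,
               "roles": ["data recording", "initial data reconstruction",
                         "data distribution"]},
    "Tier-1": {"centres": 11,
               "roles": ["permanent storage", "re-processing", "analysis"]},
    "Tier-2": {"centres": 130,
               "roles": ["simulation", "end-user analysis"]},
}

for name, info in tiers.items():
    print(f"{name}: ~{info['centres']} centre(s): {', '.join(info['roles'])}")
```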

Grid in Austria: Hagenberg, Gregor Mendel Institut

Broader Impact of the LHC Computing Grid
WLCG has been leveraged on both sides of the Atlantic, to benefit the wider scientific community:
– Europe: Enabling Grids for E-sciencE (EGEE); European Grid Infrastructure (EGI) since 2010
– USA: Open Science Grid (OSG) since 2004
Many scientific applications: archeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high energy physics, life sciences, multimedia, material sciences, …

Collaborations with Institutions, Organisations and the Public
Collaboration with institutions:
– UNOSAT: satellite image analysis for crisis response
Collaboration with industry:
– CERN openlab: evaluates state-of-the-art technologies in a very complex environment and improves them; tests in a research environment today what will be used in industry tomorrow
Citizen Cyberscience Centre:
– Computing for Clean Water: optimising nanotube-based water filters by large-scale simulation on volunteer PCs
– AfricaMap: volunteer thinking to generate maps of regions of Africa from satellite images, with UNOSAT
– A volunteer project for public participation in LHC collision simulations, using VM technology
Health e-Child:
– Studies biomedical information for clinical practice, medical research, and personalised healthcare for the citizens of the EU

The CERN Tier-0 in Numbers
Data Centre Operations (Tier 0):
– 24x7 operator support and system administration services to support 24x7 operation of all IT services
– Hardware installation & retirement: ~7,000 hardware movements/year; ~1,800 disk failures/year
– Management and automation framework for large-scale Linux clusters

Scaling the CERN Data Centre(s) to Anticipated Physics Needs
– The CERN Data Centre dates back to the 70's; it was upgraded in 2005 to support the LHC (2.9 MW)
– Still optimising the current facility (cooling automation, temperatures, infrastructure)
– Exploitation of 100 kW of a remote facility downtown: understanding costs, remote dynamic management, ensuring business continuity
– Exploitation of a remote data centre in Hungary: max. 2.7 MW (N+1 redundancy), business continuity, 100 Gbps connections
– Renovation of the "barn" to accommodate 450 kW of "critical" IT loads (increasing the Building 513 total to 3.5 MW)

Computer Centre Tour
CC Visit Point: visualisations, computer museum, visitors' gallery, view into the upper machine room
Foyer: computer museum, GridView
Walk through the computer centre machine rooms: ground floor – physics clusters; basement – tape robots

For more information about the Grid: Thank you for your kind attention!
