
The LHC Computing Grid
Frédéric Hemmer, IT Department Head
CERN IT Department, CH-1211 Genève 23, Switzerland

Visit of Professor Jerzy Szwed, Under Secretary of State, Ministry of Science and Higher Education, Poland
Tuesday 23rd February 2010

The ATLAS experiment: 7,000 tons, 150 million sensors generating data 40 million times per second, i.e. a petabyte per second.

A collision at the LHC (image slide)

The Data Acquisition (image slide)

Tier 0 at CERN: acquisition, first-pass processing, storage & distribution. 1.25 GB/sec (ions)

The LHC Data Challenge

- The accelerator will run for years
- Experiments will produce about 15 million gigabytes of data each year (about 20 million CDs!)
- LHC data analysis requires a computing power equivalent to ~100,000 of today's fastest PC processors
- Requires many cooperating computer centres, as CERN can only provide ~20% of the capacity
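These figures are easy to sanity-check. A minimal sketch in Python; the 700 MB CD capacity is my assumption, the other numbers come from the slide:

    # Back-of-envelope check of the slide's data-volume figures (a sketch).
    ANNUAL_DATA_GB = 15e6      # ~15 million gigabytes per year, from the slide
    CD_CAPACITY_GB = 0.7       # assumption: a standard 700 MB CD

    print(f"{ANNUAL_DATA_GB / 1e6:.0f} PB/year")                        # 15 PB/year
    print(f"~{ANNUAL_DATA_GB / CD_CAPACITY_GB / 1e6:.0f} million CDs")  # ~21 million,
    # consistent with the slide's "about 20 million CDs"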

Computing Resources (chart: CPU, disk and tape capacity)

Solution: the Grid

- Use the Grid to unite the computing resources of particle physics institutes around the world
- The World Wide Web provides seamless access to information stored in many millions of different geographical locations
- The Grid is an infrastructure that provides seamless access to computing power and data storage capacity distributed over the globe

How does the Grid work?

- It makes multiple computer centres look like a single system to the end user
- Advanced software, called middleware, automatically finds the data the scientist needs, and the computing power to analyse it
- Middleware balances the load on different resources; it also handles security, accounting, monitoring and much more
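To make the matchmaking idea concrete, here is a toy sketch in Python. It is not the actual gLite middleware API; all names and site data are invented for illustration. The broker simply routes a job to the least-loaded site that already holds the dataset it needs:

    # Toy model of grid middleware matchmaking (illustrative only).
    sites = [
        {"name": "CERN",      "datasets": {"run2009A", "run2009B"}, "load": 0.9},
        {"name": "IN2P3",     "datasets": {"run2009A"},             "load": 0.4},
        {"name": "Karlsruhe", "datasets": {"run2009B"},             "load": 0.2},
    ]

    def broker(dataset, sites):
        """Return the least-loaded site holding `dataset`, or None."""
        candidates = [s for s in sites if dataset in s["datasets"]]
        return min(candidates, key=lambda s: s["load"], default=None)

    job_site = broker("run2009A", sites)
    print(job_site["name"])  # IN2P3: it has the data and a lower load than CERN

Real middleware layers security, accounting, monitoring and retries on top of this basic matchmaking step.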

Tier 0 – Tier 1 – Tier 2

- Tier-0 (CERN): data recording, initial data reconstruction, data distribution
- Tier-1 (11 centres): permanent storage, re-processing, analysis
- Tier-2 (~130 centres): simulation, end-user analysis

LCG Service Hierarchy

Tier-0: the accelerator centre
- Data acquisition & initial processing
- Long-term data curation
- Distribution of data to the Tier-1 centres

Tier-1: "online" to the data acquisition process, high availability
- Managed mass storage, grid-enabled data service
- Data-heavy analysis
- National, regional support
- Centres: Canada – TRIUMF (Vancouver); France – IN2P3 (Lyon); Germany – Forschungszentrum Karlsruhe; Italy – CNAF (Bologna); Netherlands – NIKHEF/SARA (Amsterdam); Nordic countries – distributed Tier-1; Spain – PIC (Barcelona); Taiwan – Academia Sinica (Taipei); UK – CLRC (Oxford); US – FermiLab (Illinois) and Brookhaven (NY)

Tier-2: ~140 centres in ~35 countries
- Simulation
- End-user analysis, batch and interactive
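The division of labour across tiers can be summarised in a small data structure (a sketch; the site counts and roles are taken from this slide, nothing more is implied):

    # The WLCG tier model from this slide as a tiny Python structure.
    TIERS = {
        "Tier-0": (1,   ["data acquisition & initial processing",
                         "long-term data curation", "distribution to Tier-1s"]),
        "Tier-1": (11,  ["managed mass storage", "re-processing",
                         "data-heavy analysis", "national/regional support"]),
        "Tier-2": (140, ["simulation", "end-user analysis"]),
    }

    for tier, (sites, roles) in TIERS.items():
        print(f"{tier} ({sites} centre(s)): {'; '.join(roles)}")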

The CERN Tier-0

- 24x7 operator support and system administration services to support 24x7 operation of all IT services
- Hardware installation & retirement (~7,000 hardware movements/year)
- Management and automation framework for large-scale Linux clusters
- Installed capacity:
  - 6,300 systems, 39,000 processing cores (CPU servers, disk servers, infrastructure servers); tenders planned or in progress: 2,400 systems, 16,000 processing cores
  - 13,900 TB usable on 42,600 disk drives; tenders planned or in progress: 19,000 TB usable on 20,000 disk drives
  - 34,000 TB on 45,000 tape cartridges (56,000 slots), 160 tape drives
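A quick sanity check on these capacity figures (a sketch; all numbers come from the slide above):

    # Average per-device capacity implied by the Tier-0 figures.
    disk_tb, disk_drives = 13_900, 42_600
    tape_tb, cartridges = 34_000, 45_000

    print(f"~{disk_tb * 1000 / disk_drives:.0f} GB usable per disk drive")  # ~326 GB
    print(f"~{tape_tb * 1000 / cartridges:.0f} GB per tape cartridge")      # ~756 GB
    # Both are plausible per-device capacities for hardware of the 2009/2010 era.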


The European Network Backbone

- LCG working group with Tier-1s and national/regional research network organisations
- New GÉANT2 research network backbone, strongly correlated with the major European LHC centres
- Swiss PoP at CERN

Overall summary

- November: ongoing productions, cosmics data taking
- November to December: beam data and collisions, productions + analysis
- December to February: ongoing productions, cosmics
- The WLCG service has been running according to the defined procedures
- Reporting and follow-up of problems at the same level
- Middleware process (updates & patches) as planned

2009 Physics Data Transfers (chart: final readiness test STEP'09; preparation for LHC startup; LHC physics data)

Nearly 1 petabyte/week; more than 8 GB/s peak transfers from CASTOR file servers at CERN.
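The weekly volume and the peak rate can be compared directly (a sketch; the 1 PB/week figure is taken from the slide):

    # Average transfer rate implied by ~1 PB/week, vs the quoted >8 GB/s peaks.
    SECONDS_PER_WEEK = 7 * 24 * 3600      # 604,800 s

    avg_gb_per_s = 1_000_000 / SECONDS_PER_WEEK   # 1 PB = 1,000,000 GB
    print(f"average ~{avg_gb_per_s:.1f} GB/s, vs peaks of >8 GB/s")  # ~1.7 GB/s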

Reliabilities

This is not the full picture: experiment-specific measures give a complementary view and need to be used together with some understanding of the underlying issues.

Accounting, Aug '08 to Jan '09 (from the APEL accounting portal; numbers in MSI2k): table of CPU delivered to ALICE, ATLAS, CMS and LHCb by the Tier-1s and Tier-2s, with totals and percentages. Main outstanding issues are related to service/site reliability.

GRID COMPUTING NOW

Impact of the LHC Computing Grid in Europe

LCG has been the driving force for the European multi-science Grid EGEE (Enabling Grids for E-sciencE). EGEE is now a global effort, and the largest Grid infrastructure worldwide, co-funded by the European Commission (cost: ~170 M€ over 6 years, of which ~100 M€ funded by the EU).

Scale: >250 sites, 48 countries, >50,000 CPUs, >20 petabytes, >10,000 users, >150 VOs, >150,000 jobs/day.

EGEE is already used for >100 applications, spanning archeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high-energy physics, life sciences, multimedia, material sciences and more, including the examples that follow.

Health-e-Child: similarity search; temporal modelling; visual data mining; genetics profiling; treatment response; inferring outcome; biomechanical models; tumor growth modelling; semantic browsing; personalised simulation; surgery planning; automatic RV and LV modelling; measurement of the pulmonary trunk.

Example: The Grid Attacks Avian Flu

The Grid has been used to analyse 300,000 potential drug compounds against the bird flu virus, H5N1. 2,000 computers at 60 computer centres in Europe, Russia, Asia and the Middle East ran for four weeks, the equivalent of 100 years on a single computer. BioSolveIt donated 6,000 FlexX licenses.

Results:
- Avian flu: 20% of compounds better than Tamiflu
- Malaria: 6/30 compounds similar to or better than Pepstatin A
- Ongoing tests with compounds from later calculations

(Image: neuraminidase, one of the two major surface proteins of influenza viruses, which facilitates the release of virions from infected cells. Courtesy Ying-Ta Wu, Academia Sinica.)

Example: GeoCluster industrial application

GeoCluster is the first industrial application to run successfully on EGEE. Developed by the Compagnie Générale de Géophysique (CGG) in France, it performs geophysical simulations for the oil, gas, mining and environmental industries. EGEE technology helps CGG federate its computing resources around the globe.

Sustainability

- Need to prepare for a permanent Grid infrastructure
- Ensure a high quality of service for all user communities
- Independent of short project funding cycles
- Infrastructure managed in collaboration with National Grid Initiatives (NGIs)
- European Grid Initiative (EGI)

For more information about the Grid:

Thank you for your kind attention!